Deep transfer learning usually assumes that feature distributions are similar across training datasets, which amounts to treating the distance between them as constant. However, this standpoint obscures the relation between features and accuracy, and also weakens the network's ability to prevent overfitting. To achieve better results in transfer learning, we propose a dynamic model of a deep transfer learning network that accounts for the influence of features on the learning task. First, we formally define the distance and dropout-rate functions. Second, we present our model and its algorithm for deep transfer learning networks. Using preprocessed MNIST and CIFAR-10 data, we conduct experiments comparing the performance of transfer learning networks against conventional ones. The results show that the model presented in this paper performs better in both accuracy and efficiency.
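The abstract's core idea, tying the dropout rate to the distance between source and target feature distributions, can be illustrated with a minimal sketch. The specific distance measure (here, the Euclidean distance between mean feature vectors), the mapping from distance to dropout rate, and all function names are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def feature_distance(source_feats, target_feats):
    """Euclidean distance between the mean feature vectors of two datasets.
    (Assumed distance measure; the paper's own definition may differ.)"""
    return np.linalg.norm(source_feats.mean(axis=0) - target_feats.mean(axis=0))

def dynamic_dropout_rate(distance, base_rate=0.5, scale=1.0, floor=0.1):
    """Map feature distance to a dropout rate in (floor, floor + base_rate):
    a larger domain gap yields stronger regularization. Hypothetical mapping."""
    return base_rate * (1.0 - np.exp(-scale * distance)) + floor

def apply_dropout(activations, rate, rng):
    """Standard inverted dropout using the dynamically chosen rate."""
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(100, 16))  # source-domain features
tgt = rng.normal(0.5, 1.0, size=(100, 16))  # shifted target-domain features
rate = dynamic_dropout_rate(feature_distance(src, tgt))
out = apply_dropout(tgt, rate, rng)
```

Under this sketch, a larger feature-distribution gap produces a higher dropout rate, so the network is regularized more aggressively when the domains differ more.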
Li, F., Yang, H., & Wang, J. (2016). A dropout distribution model on deep transfer learning networks. In Lecture Notes in Electrical Engineering (Vol. 376, pp. 431–439). Springer Verlag. https://doi.org/10.1007/978-981-10-0557-2_43