Telecommunications Science ›› 2023, Vol. 39 ›› Issue (3): 124-134.doi: 10.11959/j.issn.1000-0801.2023045

• Research and Development •

Substructure correlation adaptation transfer learning method based on K-means clustering

Haoshuang LIU1, Yong ZHANG2,3, Yingbo CAO1   

  1. 1 School of Computer & Information Technology, Liaoning Normal University, Dalian 116081, China
    2 School of Information Engineering, Huzhou University, Huzhou 313000, China
    3 Zhejiang Province Key Laboratory of Smart Management & Application of Modern Agricultural Resources, Huzhou 313000, China
  • Revised: 2023-03-15 Online: 2023-03-20 Published: 2023-03-01
  • Supported by:
    The National Natural Science Foundation of China (61772252); Scientific Research Foundation of the Education Department of Liaoning Province (LJKZ0965)


Domain drift severely degrades the performance of traditional machine learning methods, and existing domain adaptation methods mainly adjust cross-domain distributions at the global, class, or sample level. However, overly coarse global and class-level matching can lead to insufficient adaptation, while sample-level adaptation is sensitive to noise and can lead to excessive adaptation. A substructure correlation adaptation (SCOAD) transfer learning algorithm based on K-means clustering was proposed. Firstly, multiple subdomains of the source and target domains were obtained by K-means clustering. Then, the second-order statistics of the subdomain centers were matched. Finally, the target domain samples were classified using the subdomain structure. The proposed method further improves knowledge transfer between the source and target domains over traditional approaches. Experimental results on common transfer learning datasets demonstrate its effectiveness.
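The three steps above can be sketched as follows. This is a minimal illustration in the spirit of the described pipeline, not the authors' implementation: the number of clusters, the CORAL-style covariance alignment used for second-order matching, and the nearest-center labeling rule are all assumptions.

```python
# Sketch of substructure-level domain adaptation:
# 1) K-means substructures in each domain, 2) second-order statistics
# alignment (CORAL-style, assumed), 3) classify target samples via subdomains.
import numpy as np
from sklearn.cluster import KMeans


def coral_align(Xs, Xt, eps=1e-6):
    """Align second-order statistics of Xs to Xt (whiten, then re-color)."""
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])

    def mat_pow(C, p):
        # Symmetric matrix power via eigendecomposition.
        w, V = np.linalg.eigh(C)
        return V @ np.diag(np.clip(w, eps, None) ** p) @ V.T

    return Xs @ mat_pow(Cs, -0.5) @ mat_pow(Ct, 0.5)


def scoad_sketch(Xs, ys, Xt, k=3, seed=0):
    # Step 1: obtain subdomains of source and target via K-means.
    km_s = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(Xs)
    km_t = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(Xt)
    # Step 2: match second-order statistics of the source to the target.
    Xs_aligned = coral_align(Xs, Xt)
    # Step 3 (assumed rule): label each target subdomain by the majority
    # source label of the nearest aligned source cluster center.
    centers_s = np.array(
        [Xs_aligned[km_s.labels_ == c].mean(axis=0) for c in range(k)]
    )
    cluster_label = np.array(
        [np.bincount(ys[km_s.labels_ == c]).argmax() for c in range(k)]
    )
    yt_pred = np.empty(len(Xt), dtype=int)
    for c in range(k):
        d = np.linalg.norm(centers_s - km_t.cluster_centers_[c], axis=1)
        yt_pred[km_t.labels_ == c] = cluster_label[d.argmin()]
    return yt_pred
```

On well-separated synthetic data with a small covariate shift, the subdomain centers carry the class structure across domains, which is why matching at the substructure level avoids both the coarseness of global matching and the noise sensitivity of per-sample matching.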

Key words: transfer learning, domain adaptation, substructural adaptation, clustering
