Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 61872094 and 62272105), the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX01), the ZJ Lab, and the Shanghai Research Center for Brain Science and Brain-Inspired Intelligence Technology. Shaojun Wang and Ronghui You have been supported by the 111 Project (Grant No. B18015), the Shanghai Municipal Science and Technology Major Project (Grant No. 2017SHZDZX01), and the Information Technology Facility, CAS-MPG Partner Institute for Computational Biology, Shanghai Institute for Biological Sciences, Chinese Academy of Sciences. Yi Xiong has been supported by the National Natural Science Foundation of China (Grant Nos. 61832019 and 62172274).
Abstract: As one of the state-of-the-art automated function prediction (AFP) methods, NetGO 2.0 integrates multi-source information to improve performance. However, it mainly utilizes proteins with experimentally supported functional annotations and does not leverage the valuable information in the vast number of unannotated proteins. Recently, protein language models have been proposed to learn informative representations [e.g., the Evolutionary Scale Modeling (ESM)-1b embedding] from protein sequences via self-supervision. Here, we represented each protein by its ESM-1b embedding and used logistic regression (LR) to train a new model, LR-ESM, for AFP. The experimental results showed that LR-ESM achieved performance comparable to that of the best-performing component of NetGO 2.0. Therefore, by incorporating LR-ESM into NetGO 2.0, we developed NetGO 3.0 to improve the performance of AFP substantially.
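To make the LR-ESM idea concrete, the sketch below shows how a protein sequence can be embedded with ESM-1b and the resulting vectors fed to a one-vs-rest logistic regression over GO terms. It is a minimal illustration assuming the fair-esm and scikit-learn packages; the mean-pooling step, the toy sequences, the tiny GO label matrix, and the hyperparameters are assumptions for demonstration and not the actual NetGO 3.0 training setup.

```python
import torch
import esm  # fair-esm package (pip install fair-esm)
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Load the pretrained ESM-1b model and its alphabet/tokenizer.
model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()
batch_converter = alphabet.get_batch_converter()
model.eval()

def esm1b_embedding(name, sequence):
    """Return a mean-pooled per-protein ESM-1b embedding (1280-dim)."""
    _, _, tokens = batch_converter([(name, sequence)])
    with torch.no_grad():
        out = model(tokens, repr_layers=[33])
    reps = out["representations"][33]  # shape (1, L + 2, 1280)
    # Average over residue positions, skipping the BOS/EOS tokens.
    return reps[0, 1 : len(sequence) + 1].mean(dim=0).numpy()

# Hypothetical training data: sequences plus a binary GO-term matrix Y
# (rows = proteins, columns = GO terms), e.g. built from Swiss-Prot.
sequences = [("P1", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"),
             ("P2", "MEEPQSDPSVEPPLSQETFSDLWKLLPENNVLS")]
Y = [[1, 0, 1],
     [0, 1, 0]]

X = [esm1b_embedding(name, seq) for name, seq in sequences]

# One independent logistic regression per GO term (one-vs-rest).
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, Y)
print(clf.predict_proba(X))  # per-GO-term annotation probabilities
```

In NetGO 3.0 such per-term probabilities would then be combined with the other component methods via learning to rank, as in NetGO 2.0.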