Location Prediction Model Based on ST-LSTM Network

Cited by: 13
Abstract: Most existing research on location prediction ignores the correlation between time and space. To address this problem, this paper proposes a Long Short-Term Memory model based on spatial-temporal features (ST-LSTM). Building on the LSTM network, a spatiotemporal gate that separately processes the spatiotemporal information of user movement is added, and the temporal and spatial factors of user check-ins are taken into account, so that the model acquires spatiotemporal characteristics. A personal correction factor is introduced into the ST-LSTM network to adjust the output for each class of users, highlighting individuality while preserving the basic characteristics and better learning the trajectory features of each user class. In addition, two simplified variants of the ST-LSTM network are given under the premise of preserving its characteristics. Test results on a public dataset show that, compared with mainstream location prediction methods, the proposed model achieves clear improvements in precision, recall, and F1 score.
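To make the components named in the abstract concrete, below is a minimal sketch, assuming a PyTorch-style formulation: an LSTM cell extended with a gate fed by the time gap and spatial distance between consecutive check-ins, and an output layer that applies a learnable correction factor per user class. All identifiers (STLSTMCell, st_gate, PersonalizedHead) and the exact gating equations are illustrative assumptions, not the paper's definitive formulation; only the abstract is available here.

```python
# Hypothetical sketch of the two ideas named in the abstract: an LSTM cell
# with an added spatiotemporal gate, and a per-user-class output correction.
import torch
import torch.nn as nn


class STLSTMCell(nn.Module):
    """LSTM cell with an extra gate driven by the time gap and spatial
    distance between consecutive check-ins (assumed formulation)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Standard LSTM gates: input, forget, output, candidate.
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Spatiotemporal gate: maps the input plus the (time gap, distance)
        # pair of the current check-in to a per-unit gate in (0, 1).
        self.st_gate = nn.Linear(input_size + 2, hidden_size)

    def forward(self, x, dt, dd, state):
        # x: (batch, input_size) check-in embedding;
        # dt, dd: (batch, 1) time gap and distance since the last check-in.
        h, c = state
        i, f, o, g = self.gates(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        # The spatiotemporal gate scales how much of the new candidate enters
        # the cell state, depending on how far apart the check-ins are.
        s = torch.sigmoid(self.st_gate(torch.cat([x, dt, dd], dim=1)))
        c = f * c + i * s * g
        h = o * torch.tanh(c)
        return h, (h, c)


class PersonalizedHead(nn.Module):
    """Output layer with a learnable correction factor per user class, one
    hypothetical realization of the paper's personal correction factor."""

    def __init__(self, hidden_size: int, num_locations: int, num_classes: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, num_locations)
        # One correction vector per user class, initialized to ones so that
        # training starts from the unmodified shared model.
        self.correction = nn.Parameter(torch.ones(num_classes, num_locations))

    def forward(self, h, user_class):
        # user_class: (batch,) integer class index of each user.
        return self.proj(h) * self.correction[user_class]


# Usage sketch: one prediction step over a batch of 8 check-ins.
cell = STLSTMCell(input_size=64, hidden_size=128)
head = PersonalizedHead(hidden_size=128, num_locations=500, num_classes=4)
x = torch.randn(8, 64)                       # check-in embeddings
dt, dd = torch.rand(8, 1), torch.rand(8, 1)  # time gap, spatial distance
state = (torch.zeros(8, 128), torch.zeros(8, 128))
h, state = cell(x, dt, dd, state)
logits = head(h, torch.randint(0, 4, (8,)))  # scores over candidate locations
```

The two simplified variants mentioned in the abstract are not specified there, so they are omitted; one plausible direction would be to tie or drop parts of the extra gate.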
Authors: XU Fangfang; YANG Junjie; LIU Hongzhi (School of Software and Microelectronics, Peking University, Beijing 102600, China; State Grid Zhongxing Co., Ltd., Beijing 100761, China)
Source: Computer Engineering (《计算机工程》), 2019, No. 9, pp. 1-7 (7 pages); indexed in CAS, CSCD, and the Peking University Core Journals list
Fund: National Key Research and Development Program of China (2017YFB1002000)
Keywords: location prediction; Long Short-Term Memory (LSTM) model; spatiotemporal information; individuality; behavioral trajectory

