Journal Articles
4 articles found
1. A label noise filtering and label missing supplement framework based on game theory
Authors: Yuwen Liu, Rongju Yao, Song Jia, Fan Wang, Ruili Wang, Rui Ma, Lianyong Qi
Journal: Digital Communications and Networks (SCIE, CSCD), 2023, Issue 4, pp. 887-895 (9 pages)
Labeled data is widely used in various classification tasks. However, labels are often added manually, and wrong labels added by malicious users degrade a model's training; this unreliability of labeled data has hindered research. To address these problems, we propose a framework for Label Noise Filtering and Missing Label Supplement (LNFS), and we take location labels in Location-Based Social Networks (LBSN) as an example to implement it. For label noise filtering, we first use FastText to transform a restaurant's labels into vectors and then, based on the assumption that the label most similar to all other labels of the location is the most representative, use cosine similarity to select that label. For missing labels, we use simple common-word similarity to judge the similarity of users' comments and then use the labels of similar restaurants to supplement the missing ones. To optimize performance, we introduce game theory into the model to simulate the game between malicious users and the model, improving its reliability. Finally, a case study illustrates the effectiveness and reliability of LNFS.
Keywords: Label noise, FastText, Cosine similarity, Game theory, LSTM
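The selection rule described in the abstract (pick the label whose embedding has the highest cosine similarity to all other labels of the same location) can be sketched as follows. This is only an illustrative reconstruction, not the paper's code: the toy vectors stand in for real FastText embeddings, and all labels and names are hypothetical.

```python
import numpy as np

def most_representative_label(labels, vectors):
    """Return the label whose vector has the highest mean cosine
    similarity to all the other label vectors (the selection rule
    described in the abstract)."""
    # Normalize rows so plain dot products equal cosine similarities.
    unit = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = unit @ unit.T              # pairwise cosine similarities
    np.fill_diagonal(sims, 0.0)      # exclude self-similarity
    mean_sim = sims.sum(axis=1) / (len(labels) - 1)
    return labels[int(np.argmax(mean_sim))]

# Toy 3-d vectors standing in for FastText embeddings of location labels;
# the last one is deliberately dissimilar, mimicking label noise.
labels = ["hotpot", "sichuan_food", "spicy", "parking_lot"]
vectors = np.array([
    [0.90, 0.80, 0.10],
    [0.85, 0.75, 0.15],
    [0.80, 0.90, 0.20],
    [0.10, 0.00, 0.95],   # outlier: likely noise
])
print(most_representative_label(labels, vectors))
```

With these toy vectors the outlier gets a low mean similarity, so one of the three food-related labels is selected as representative.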
2. Edge Intelligence with Distributed Processing of DNNs: A Survey
Authors: Sizhe Tang, Mengmeng Cui, Lianyong Qi, Xiaolong Xu
Journal: Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 7, pp. 5-42 (38 pages)
With the rapid development of deep learning, the size of data sets and deep neural network (DNN) models is also booming. As a result, the intolerably long time for model training or inference under conventional strategies can no longer meet the demands of modern tasks. Moreover, devices often stay idle in edge computing (EC) scenarios, which wastes resources: they could share the load of busy devices but do not. To address this problem, distributed processing has been applied to spread computation from a single processor across a group of devices, accelerating the training or inference of DNN models and promoting high device utilization in edge computing. Compared with existing papers, this paper presents an enlightening and novel review of applying distributed processing with data and model parallelism to improve deep learning tasks in edge computing. Considering practicalities, commonly used lightweight models in distributed systems are introduced as well. As the key technique, parallel strategies are described in detail. Then some typical applications of distributed processing are analyzed. Finally, the challenges of distributed processing with edge computing are described.
Keywords: Distributed processing, Edge computing, Parallel strategies, Acceleration of DNN processing
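The data-parallel strategy surveyed above can be illustrated with a minimal synchronous sketch (not the survey's own code): each worker computes a gradient on its shard of the batch, and a size-weighted average of the shard gradients emulates an all-reduce, reproducing the single-device full-batch step exactly. The linear model and all function names are illustrative assumptions.

```python
import numpy as np

def batch_gradient(w, X, y):
    """Gradient of the mean squared error 0.5*||Xw - y||^2 / n w.r.t. w."""
    return X.T @ (X @ w - y) / len(y)

def data_parallel_step(w, X, y, n_workers, lr=0.1):
    """One synchronous data-parallel SGD step: split the batch into shards,
    let each 'worker' compute its local gradient, then average (all-reduce)
    and apply the update once."""
    X_shards = np.array_split(X, n_workers)
    y_shards = np.array_split(y, n_workers)
    grads = [batch_gradient(w, Xs, ys) for Xs, ys in zip(X_shards, y_shards)]
    # Weight each worker's gradient by its shard size so the average
    # matches the single-device full-batch gradient exactly.
    sizes = np.array([len(ys) for ys in y_shards])
    g = sum(s * gr for s, gr in zip(sizes, grads)) / sizes.sum()
    return w - lr * g

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = X @ np.array([1.0, -2.0, 0.5])
w0 = np.zeros(3)

w_single = w0 - 0.1 * batch_gradient(w0, X, y)   # one full-batch step
w_parallel = data_parallel_step(w0, X, y, n_workers=4)
print(np.allclose(w_single, w_parallel))  # True: the averaged step matches
```

Because the per-shard gradients are weighted by shard size, the synchronous update is mathematically identical to the single-device step; the speedup in a real system comes from the shard gradients being computed concurrently on separate devices.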
3. Guest editorial: Special issue on security and privacy for AI-powered smart IoT applications
Authors: Lianyong Qi, Jin Li, Mehdi Elahi, Keshav Sood, Yuan Yuan, Mohammad Khosravi
Journal: Digital Communications and Networks (SCIE, CSCD), 2022, Issue 4, pp. 411-414 (4 pages)
The prevalence of Internet of Things (IoT) applications has generated an unprecedented volume of industrial data, a main source of big data [1,2]. How to effectively and efficiently preprocess, integrate, and analyze big IoT data from multiple sources remains a fundamental challenge [3,4]. Fortunately, Artificial Intelligence (AI) has recently emerged as a key technology for intelligent data analysis and scientific business decision-making. AI algorithms can process the streaming data generated by distributed IoT devices and provide powerful tools for complex big data analytics [5,6]. Therefore, the adaptation of AI-based methods is in high demand to achieve their full potential in smart IoT applications. However, IoT devices, which run in unstable environments and have limited computing capabilities, face a range of application requirements such as quick response, secure communication, and privacy protection [7,8]. Currently, lightweight security and privacy solutions specifically designed for devices and servers operating in the IoT environment are still lacking.
Keywords: IoT, Smart, Unstable
4. Editorial
Authors: Lianyong Qi
Journal: Intelligent and Converged Networks (EI), 2023, Issue 2, pp. I0001-I0003 (3 pages)
The Internet of Things (IoT) is an innovative technology in the field of information technology. The goal of IoT is to connect various devices and objects through the internet to enable information sharing and interconnection, thereby achieving automation, intelligence, and more efficient communication. The development of IoT has significantly transformed our way of life in multiple aspects, including entertainment, sports, agriculture, manufacturing, and more. In IoT applications, a large number of computing devices, sensors, and infrastructure components are connected via the internet, generating virtually limitless data. Effectively and efficiently processing, analyzing, and mining IoT data has become a realistic challenge that requires in-depth research.
Keywords: IoT, Interconnection, Thereby