In rice production, the prevention and management of pests and diseases have always received special attention. Traditional methods require human experts, which is costly and time-consuming. Due to the complexity of the structure of rice diseases and pests, quickly and reliably recognizing and locating them is difficult. Recently, deep learning (DL) technology has been employed to detect and identify rice diseases and pests. This paper introduces common publicly available datasets; summarizes the applications on rice diseases and pests from the aspects of image recognition, object detection, image segmentation, attention mechanism, and few-shot learning methods according to the network structure differences; and compares the performances of existing studies. Finally, the current issues and challenges are explored from the perspective of data acquisition, data processing, and application, providing possible solutions and suggestions. This study aims to review various DL models and provide improved insight into DL techniques and their cutting-edge progress in the prevention and management of rice diseases and pests.
Sparse representation plays an important role in the research of face recognition. As a deformable sample classification task, face recognition is often used to test the performance of classification algorithms. In face recognition, differences in expression, angle, posture, and lighting conditions have become key factors that affect recognition accuracy. Essentially, there may be significant differences between different image samples of the same face, which makes image classification very difficult. Therefore, how to build a robust virtual image representation becomes a vital issue. To solve the above problems, this paper proposes a novel image classification algorithm. First, to better retain the global features and contour information of the original sample, the algorithm uses an improved non-linear image representation method to highlight the low-intensity and high-intensity pixels of the original training sample, thus generating a virtual sample. Second, by the principle of sparse representation, the linear expression coefficients of the original sample and the virtual sample are calculated, respectively. After obtaining these two types of coefficients, the distances between the original sample and the test sample and between the virtual sample and the test sample are calculated and converted into distance scores. Finally, a simple and effective weight fusion scheme is adopted to fuse the classification scores of the original image and the virtual image; the fused score determines the final classification result. The experimental results show that the proposed method outperforms other typical sparse representation classification methods.
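The distance-to-score conversion and weighted fusion described in this abstract can be sketched as follows. This is an illustrative sketch only: the normalization formula and the fusion weight are assumptions, not the paper's exact definitions.

```python
def distances_to_scores(distances):
    """Convert per-class distances to similarity scores.

    Smaller distance -> larger score. This share-of-total inversion is one
    common choice; the paper's exact conversion may differ.
    """
    total = sum(distances)
    return [1.0 - d / total for d in distances]

def fuse_scores(orig_scores, virt_scores, weight=0.5):
    """Weighted fusion of original-sample and virtual-sample scores."""
    return [weight * o + (1.0 - weight) * v
            for o, v in zip(orig_scores, virt_scores)]

def classify(orig_dists, virt_dists, weight=0.5):
    """Return the index of the class with the highest fused score."""
    fused = fuse_scores(distances_to_scores(orig_dists),
                        distances_to_scores(virt_dists), weight)
    return max(range(len(fused)), key=fused.__getitem__)
```

For example, with per-class distances `[0.9, 0.2, 0.7]` from the original sample and `[0.8, 0.3, 0.6]` from the virtual sample, class 1 wins because it is closest under both representations.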
With the increasing popularity of artificial intelligence applications, machine learning is also playing an increasingly important role in the Internet of Things (IoT) and the Internet of Vehicles (IoV). As an essential part of the IoV, smart transportation relies heavily on information obtained from images. However, inclement weather, such as snow, can hinder the regular operation of imaging equipment and the acquisition of conventional image information. Moreover, snow can cause intelligent transportation systems to misjudge road conditions, adversely affecting the entire IoV. This paper describes the single-image snow removal task and the use of a vision transformer within a generative adversarial network. The residual structure is used in the algorithm, and the Transformer structure is used in the generator of the generative adversarial network, which improves the accuracy of the snow removal task. Moreover, the vision transformer has good scalability and versatility for larger models and a stronger fitting ability than the previously popular convolutional neural networks. The Snow100K dataset is used for training, testing, and comparison, with peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) as evaluation indicators. The experimental results show that the improved snow removal algorithm performs well and can obtain high-quality snow-removed images.
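PSNR, one of the evaluation indicators used in this work, has a standard definition: 10·log10(MAX² / MSE). A minimal sketch (the flat-list image representation is a simplification for illustration):

```python
import math

def psnr(reference, restored, max_val=255.0):
    """Peak signal-to-noise ratio between two equal-sized images,
    given as flat lists of pixel intensities."""
    mse = sum((r - s) ** 2 for r, s in zip(reference, restored)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)
```

Higher PSNR indicates a restored image closer to the snow-free reference; values are unbounded for identical images.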
Accurate forecasting of time series is crucial across various domains. Many prediction tasks rely on effectively segmenting, matching, and aligning time series data. For instance, even for time series with the same granularity, segmenting them into events of different granularity can effectively mitigate the impact of varying time scales on prediction accuracy. However, these events of varying granularity frequently intersect with each other and may have unequal durations, and even minor differences can result in significant errors when matching time series with future trends. Besides, directly using matched but unaligned events as state vectors in machine learning-based prediction models can lead to insufficient prediction accuracy. Therefore, this paper proposes a short-term forecasting method for time series based on multi-granularity events, MGE-SP (multi-granularity event-based short-term prediction). First, a methodological framework for MGE-SP is established to guide the implementation steps. The framework consists of three key steps: multi-granularity event matching based on the LTF (latest time first) strategy, multi-granularity event alignment using a piecewise aggregate approximation based on the compression ratio, and a short-term prediction model based on XGBoost. Data from a nationwide online car-hailing service in China ensures the method's reliability. The average RMSE (root mean square error) and MAE (mean absolute error) of the proposed method are 3.204 and 2.360, lower than the respective values of 4.056 and 3.101 obtained using the ARIMA (autoregressive integrated moving average) method, as well as the values of 4.278 and 2.994 obtained using the k-means-SVR (support vector regression) method. Another experiment is conducted on stock data from a public data set. The proposed method achieved an average RMSE and MAE of 0.836 and 0.696, lower than the respective values of 1.019 and 0.844 obtained using the ARIMA method, as well as the values of 1.350 and 1.172 obtained using the k-means-SVR method.
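The alignment step in this method uses piecewise aggregate approximation (PAA). A minimal sketch of plain PAA is below; the paper's variant derives the number of segments from a compression ratio, which this sketch simply takes as a given segment count.

```python
def paa(series, n_segments):
    """Piecewise aggregate approximation: compress a series to n_segments
    values by averaging (roughly) equal-width windows."""
    n = len(series)
    result = []
    for i in range(n_segments):
        start = i * n // n_segments          # window boundaries spread evenly
        end = (i + 1) * n // n_segments
        window = series[start:end]
        result.append(sum(window) / len(window))
    return result
```

For a desired compression ratio r, one would call `paa(series, max(1, round(len(series) / r)))`.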
The accurate simulation of regional-scale winter wheat yield is important for national food security and the balance of grain supply and demand in China. Presently, most remote sensing process models use the "biomass × harvest index (HI)" method to simulate regional-scale winter wheat yield. However, spatiotemporal differences in HI contribute to inaccuracies in yield simulation at the regional scale. Time-series dry matter partition coefficients (Fr) can dynamically reflect the dry matter partition of winter wheat. In this study, Fr equations were fitted for each organ of winter wheat using site-scale data. These equations were then coupled into a process-based and remote sensing-driven crop yield model for wheat (PRYM-Wheat) to improve the regional simulation of winter wheat yield over the North China Plain (NCP). The improved PRYM-Wheat model integrated with the fitted Fr equations (PRYM-Wheat-Fr) was validated using data obtained from provincial yearbooks. A 3-year (2000-2002) averaged validation showed that PRYM-Wheat-Fr had a higher coefficient of determination (R^2 = 0.55) and lower root mean square error (RMSE = 0.94 t ha^-1) than PRYM-Wheat with a stable HI (abbreviated as PRYM-Wheat-HI), which had R^2 and RMSE values of 0.30 and 1.62 t ha^-1, respectively. The PRYM-Wheat-Fr model also performed better than PRYM-Wheat-HI for simulating yield in verification years (2013-2015). In conclusion, the PRYM-Wheat-Fr model exhibited better accuracy than the original PRYM-Wheat model, making it a useful tool for the simulation of regional winter wheat yield.
As one of the most effective techniques for finding software vulnerabilities, fuzzing has become a hot topic in software security. It feeds potentially syntactically or semantically malformed test data to a target program to mine vulnerabilities and crash the system. In recent years, considerable efforts have been dedicated by researchers and practitioners towards improving fuzzing, resulting in a growing variety of methods and forms that make it difficult to gain a comprehensive understanding of the technique. This paper conducts a thorough survey of fuzzing, focusing on its general process, classification, common application scenarios, and some state-of-the-art techniques that have been introduced to improve its performance. Finally, this paper puts forward key research challenges and proposes possible future research directions that may provide new insights for researchers.
The globalization of hardware designs and supply chains, as well as the integration of third-party intellectual property (IP) cores, has led to an increased focus from malicious attackers on computing hardware. However, existing defense or detection approaches often require additional circuitry to perform security verification, and are thus constrained by time and resource limitations. Considering the scale of actual engineering tasks and tight project schedules, it is usually difficult to implement protection designs for all modules in field programmable gate array (FPGA) circuits. Some studies have pointed out that the failure of key modules tends to cause greater damage to the network. Therefore, under limited conditions, priority protection designs need to be made for key modules to improve protection efficiency. We have conducted research on FPGA designs, including single-FPGA and multi-FPGA systems, to identify key modules in FPGA systems. For single-FPGA designs, considering the topological structure, network characteristics, and directionality of FPGA designs, we propose a node importance evaluation method based on the technique for order preference by similarity to an ideal solution (TOPSIS). For multi-FPGA designs, considering the influence of nodes within and across layers, the designs are constructed as an interdependent network, and we propose a method based on connection strength to identify the important modules. Finally, we conduct empirical research using actual FPGA designs as examples. The results indicate that, compared to other traditional indexes, the node importance indexes proposed for the different designs better characterize the importance of nodes.
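The TOPSIS procedure underlying the single-FPGA node-importance method follows a standard pattern: normalize the decision matrix, weight it, and score each alternative by closeness to the ideal best and worst solutions. The sketch below shows plain TOPSIS with benefit criteria only; the paper's specific criteria and weights are not reproduced here.

```python
import math

def topsis(matrix, weights):
    """Rank alternatives (rows) against benefit criteria (columns).
    Returns closeness scores in [0, 1]; higher means more important."""
    n_crit = len(matrix[0])
    # 1. Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # 2. Ideal best and worst values per criterion.
    best = [max(col) for col in zip(*v)]
    worst = [min(col) for col in zip(*v)]
    # 3. Closeness to the ideal solution.
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))
    return scores
```

An alternative that dominates all others on every criterion receives a score of 1.0, and one dominated on every criterion receives 0.0.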
The sixth-generation (6G) mobile network is a kind of multi-network interconnection and multi-scenario coexistence network, where multiple network domains break the original fixed boundaries to form connections and convergence. With the optimization objective of maximizing network utility while ensuring performance-centric weighted fairness among flows, this paper designs a reinforcement learning-based cloud-edge autonomous multi-domain data center network architecture that achieves single-domain autonomy and multi-domain collaboration. Due to the conflict between the utilities of different flows, the bandwidth fairness allocation problem for various types of flows is formulated by considering different defined reward functions. Regarding the tradeoff between fairness and utility, this paper designs corresponding reward functions for the cases where the flows undergo abrupt changes and smooth changes. In addition, to accommodate the Quality of Service (QoS) requirements of multiple types of flows, this paper proposes a multi-domain autonomous routing algorithm called LSTM+MADDPG. By introducing a Long Short-Term Memory (LSTM) layer into the actor and critic networks, more information about temporal continuity is captured, further enhancing adaptability to changes in the dynamic network environment. The LSTM+MADDPG algorithm is compared with the latest reinforcement learning algorithms through experiments on real network topology and traffic traces; the experimental results show that LSTM+MADDPG improves the delay convergence speed by 14.6% and delays the start of packet loss by 18.2% compared with other algorithms.
An effective algorithm based on the signal coverage of effective communication and a local energy-consumption saving strategy is proposed for wireless sensor networks. The algorithm consists of two sub-algorithms. One is a multi-hop partition subspaces clustering algorithm that ensures locally energy-balanced consumption, relying on the deployment produced by the other, a distributed locating deployment algorithm based on efficient communication coverage probability (DLD-ECCP). DLD-ECCP makes use of the characteristics of Markov chains and probabilistic optimization to obtain the optimum topology and number of sensor nodes. Simulation data demonstrate the advantages of the proposed approaches in saving hardware resources and network energy consumption.
Focused on the Klinkenberg effect on gas seepage, the independently developed triaxial experimental system of gas seepage was applied to conduct research on the seepage characteristics of coal seam gas. By means of experimental data analysis and theoretical derivation, a calculation method of coal seam gas permeability was proposed, which synthesized the respective influences of gas dynamic viscosity, compressibility factor and Klinkenberg effect. The study results show that the Klinkenberg effect has a significant influence on the coal seam gas seepage, the permeability estimated with the method considering the Klinkenberg effect is correct, and this permeability can fully reflect the true seepage state of the gas. For the gas around the standard conditions, the influences of dynamic viscosity and compressibility factor on the permeability may be ignored. For the gas deviating far away from the standard conditions, the influences of dynamic viscosity and compressibility factor on the permeability must be considered. The research results have certain guiding significance in forming a correct understanding of the Klinkenberg effect and selecting a more accurate calculation method for the permeability of coal containing gas.
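The standard Klinkenberg relation expresses the apparent gas permeability as k_a = k_inf(1 + b/p_mean), so apparent permeability is linear in 1/p_mean: the intercept of a least-squares fit gives the intrinsic (Klinkenberg-corrected) permeability k_inf, and the slope gives k_inf·b. The sketch below fits this standard relation; it is not the paper's full calculation method, which additionally accounts for dynamic viscosity and the compressibility factor.

```python
def klinkenberg_fit(mean_pressures, apparent_perms):
    """Least-squares fit of k_a = k_inf * (1 + b / p_mean).
    Treats k_a as linear in 1/p_mean; returns (k_inf, b)."""
    xs = [1.0 / p for p in mean_pressures]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(apparent_perms) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, apparent_perms))
             / sum((x - mx) ** 2 for x in xs))
    k_inf = my - slope * mx          # intercept = intrinsic permeability
    return k_inf, slope / k_inf      # slope = k_inf * b
```

With synthetic measurements generated from k_inf = 2.0 and b = 0.5, the fit recovers both parameters exactly, since the data lie on the line.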
Background: Glutamine and glutamate are known to play important roles in cancer biology. However, no detailed information is available in terms of their levels of involvement in various biological processes across different cancer types, whereas such knowledge could be critical for understanding the distinct characteristics of different cancer types. Our computational study aimed to examine the functional roles of glutamine and glutamate across different cancer types. Methods: We conducted a comparative analysis of gene expression data of cancer tissues versus normal control tissues of 11 cancer types to understand glutamine and glutamate metabolism in cancer. Specifically, we developed a linear regression model to assess differential contributions by glutamine and/or glutamate to each of seven biological processes in cancer versus control tissues. Results: While our computational predictions were consistent with some of the previous observations, multiple novel predictions were made: (1) glutamine is generally not involved in purine synthesis in cancer except for breast cancer, and is similarly not involved in pyrimidine synthesis except for kidney cancer; (2) glutamine is generally not involved in ATP production in cancer; (3) glutamine's contribution to nucleotide synthesis is minimal if any in cancer; (4) glutamine is not involved in asparagine synthesis in cancer except for bladder and lung cancers; and (5) glutamate does not contribute to serine synthesis except for bladder cancer. Conclusions: We comprehensively predicted the roles of glutamine and glutamate metabolism in selected metabolic pathways in cancer tissues versus control tissues, which may lead to novel approaches to therapeutic development targeting glutamine and/or glutamate metabolism. However, our predictions need further functional validation.
A global planning algorithm for intelligent vehicles is designed based on the A* algorithm, which provides intelligent vehicles with a global path towards their destinations. A distributed real-time multiple vehicle collision avoidance (MVCA) algorithm is proposed by extending the reciprocal n-body collision avoidance method. MVCA enables intelligent vehicles to choose their destinations and control inputs independently, without needing to negotiate with each other or with the coordinator. Compared to centralized trajectory-planning algorithms, MVCA reduces computation costs and greatly improves the robustness of the system. Because the destination of each intelligent vehicle can be regarded as private and is protected by MVCA, while MVCA still provides real-time trajectory planning, MVCA can better improve the safety of intelligent vehicles. The simulation was conducted in MATLAB, including a crossroads scene and a circular position-exchange scene. The results show that MVCA behaves safely and reliably. The effects of latency and packet loss on MVCA are also statistically investigated by theoretically formulating the broadcasting process as a one-dimensional Markov chain. The results show that the tolerable delay should not exceed half of the deciding cycle of trajectory planning, and that shortening the sending interval can alleviate, to an extent, the negative effects caused by packet loss. Short delay (<100 ms) and low packet loss (<5%) have little influence on trajectory planning algorithms that depend only on V2V to sense the context, but unpredictable collisions may occur if the delay and packet loss worsen further. The MVCA was also tested on a real intelligent vehicle; the test results prove the operability of MVCA.
To address the high bit-rate cost of motion vector encoding and the heavy time load of full-search strategies, a multi-resolution motion estimation and compensation algorithm based on adjacent prediction of the frame difference is proposed. Differential motion detection is applied to image sequences, and a proper threshold is adopted to identify the connected region. The motion region is then extracted, and motion estimation and motion compensation are carried out on it. The experimental results show that the encoding efficiency of motion vectors is promoted, the complexity of motion estimation is reduced, and the quality of the reconstructed image at the same bit-rate as Multi-Resolution Motion Estimation (MRME) is improved.
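The differential motion detection step can be sketched as follows: threshold the absolute inter-frame difference to mark changed pixels, then bound the motion region. This is a minimal sketch (the paper additionally identifies connected regions; here the bounding box of all changed pixels stands in for the extracted region).

```python
def motion_region(prev_frame, curr_frame, threshold):
    """Differential motion detection: threshold the absolute frame
    difference and return the bounding box (top, left, bottom, right)
    of changed pixels, or None if nothing moved. Frames are 2-D lists."""
    rows, cols = [], []
    for i, (p_row, c_row) in enumerate(zip(prev_frame, curr_frame)):
        for j, (p, c) in enumerate(zip(p_row, c_row)):
            if abs(c - p) > threshold:
                rows.append(i)
                cols.append(j)
    if not rows:
        return None
    return min(rows), min(cols), max(rows), max(cols)
```

Motion estimation and compensation would then be run only inside the returned box, rather than over the full frame.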
The main advantages of role-based access control (RBAC) are its support for well-known security principles and role inheritance. However, RBAC still lacks a specific definition and the necessary formalization, which makes it hard to realize in practical work. Our contribution here is to formalize the main relations of RBAC and to take a first step by proposing the concepts of the action closure and data closure of a role, from which we obtain the specification and an algorithm for the least privileges of a role. We propose that role inheritance should consist of inheritance of actions and inheritance of data; from this we derive the inheritance of privileges among roles, which can also be supported by existing exploit tools.
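Computing a role's privileges under inheritance amounts to a transitive closure over the role hierarchy: a role holds its direct (action, data) pairs plus those of every ancestor. A minimal sketch under assumed data structures (the paper's formalization is set-theoretic; the names here are illustrative):

```python
def role_closure(role, parents, direct_privs):
    """Privilege closure of a role under inheritance: the union of its
    direct privileges and those of every ancestor role.
    `parents` maps role -> roles it inherits from;
    `direct_privs` maps role -> set of (action, data) pairs."""
    seen, stack, privs = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:          # guard against cycles in the hierarchy
            continue
        seen.add(r)
        privs |= direct_privs.get(r, set())
        stack.extend(parents.get(r, []))
    return privs
```

Separate closures over actions and over data, as the abstract proposes, would follow the same traversal with the pair sets projected onto their first or second components.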
The concept and definition of the trust degree of a Web service are given. A method for evaluating the trust degree of Web services with a neural network, based on the trust history sequence, is proposed. The principle of the method, the applicable neural network structure, neural network construction, input standardization, training sample construction, and the procedure of evaluating the trust degree of Web services with the trained neural network are described. Experiments show that it is feasible and effective to evaluate the trust degree of a Web service with a neural network.
The scattered fields of plane waves in a solid from a cylinder or sphere are critical in determining its acoustic characteristics as well as in engineering applications. This paper investigates the scattered field distributions of different incident waves created by elastic cylinders embedded in an elastic isotropic medium. Scattered waves, including longitudinal and transverse waves both inside and outside the cylinder, are described with specific modalities under an incident plane wave. A model with a scatterer embedded in a structural steel matrix and filled with aluminum is developed for comparison with the theoretical solution. The frequency of the plane wave ranged from 235 kHz to 2348 kHz, which corresponds to scaling factors from 0.5 to 5. Scattered field distributions in matrix materials blocked by an elastic cylindrical solid have been obtained by simulation or calculated using existing parameters. The simulation results are in good agreement with the theoretical solution, which supports the correctness of the simulation analysis. Furthermore, ultrasonic phased arrays are used to study scattered fields by changing the characteristics of the incident wave. On this foundation, a partial preliminary study of the scattered field distribution of double cylinders in a solid has been carried out, and the scattered field distribution at a given distance has been found to exhibit particular behaviors at different moments. Further studies on directivities and scattered fields are expected to improve the quantification of scattered images in isotropic solid materials by the phased array technique.
This paper proposes an algorithm applied in a semantic P2P network based on description logics, with the purpose of realizing the distribution of resource concepts, which makes semantic resource locating easy. Following the idea of consistent hashing in Chord, our algorithm stores the addresses and resources with values of the same type to select instances. In addition, each peer has its own ontology, which is completed by knowledge distributed over the network during the exchange of CHGs (classification hierarchy graphs). The hierarchical classification of concepts allows matching resources to be found by querying the upper-level concept, because all concepts described in the CHG have the same
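The Chord-style consistent hashing this algorithm borrows can be sketched as follows: keys and nodes are hashed onto the same identifier ring, and a key is served by its successor, the first node clockwise from the key's identifier. The identifier size and hash choice below are illustrative, not taken from the paper.

```python
import hashlib

def chord_id(key, bits=16):
    """Hash a key onto a 2**bits identifier ring, as in Chord."""
    digest = hashlib.sha1(key.encode("utf-8")).digest()
    return int.from_bytes(digest, "big") % (1 << bits)

def successor(node_ids, key_id):
    """The node responsible for key_id: the first node at or after it
    on the ring, wrapping around past the largest identifier."""
    ring = sorted(node_ids)
    for n in ring:
        if n >= key_id:
            return n
    return ring[0]  # wrap around to the smallest identifier
```

Because only the keys between a departing node and its predecessor move when membership changes, resource placement stays stable as peers join and leave.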
Because the web is huge and web pages are updated frequently, the index maintained by a search engine has to refresh web pages periodically. This is extremely resource-consuming because the search engine needs to crawl the web and download pages to refresh its index. Based on present web refreshing technologies, we present a cooperative schema between web servers and search engines for maintaining the freshness of the web repository. The web server provides meta-data, defined through the XML standard, to describe web sites. Before updating a web page, the crawler visits the meta-data files; if the meta-data indicates that the page has not been modified, the crawler will not update it, so the schema can save bandwidth resources. A primitive model based on the schema is implemented, and the cost and efficiency of the schema are analyzed.
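The crawler's meta-data check can be sketched as follows. The XML element and attribute names are hypothetical; the paper does not specify the schema, only that server-side XML meta-data describes page freshness.

```python
import xml.etree.ElementTree as ET

# Hypothetical meta-data format: element and attribute names are
# illustrative, not taken from any published schema.
META = """<site>
  <page url="/index.html" last-modified="2006-03-01"/>
  <page url="/news.html" last-modified="2006-04-15"/>
</site>"""

def pages_to_refresh(meta_xml, last_crawl_date):
    """Return URLs whose last-modified date is newer than the last crawl,
    so the crawler skips unchanged pages and saves bandwidth."""
    root = ET.fromstring(meta_xml)
    return [p.get("url") for p in root.findall("page")
            if p.get("last-modified") > last_crawl_date]  # ISO dates sort lexically
```

Only the returned pages are re-downloaded; everything else keeps its cached copy in the repository.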
WISA 2006 received 581 submissions and accepted 65 papers for publication in this issue. These papers cover 8 research areas: Web Information Mining and Retrieval, Semantic Web and Intelligent Web, Web Data Management and Information Integration, Web Application Framework and Architecture, Web Information Security, Web Services and Workflow Models, Text Processing and Decision Support, and Grid and Networking Technology. This paper gives an introduction to previous WISA conferences and a survey of the papers to be published in this issue.
Funding: Hunan Provincial Natural Science Foundation of China (Grants 2022JJ50016 and 2023JJ50096); Innovation Platform Open Fund of Hengyang Normal University (Grant 2021HSKFJJ039); Hengyang Science and Technology Plan Guiding Project (No. 202222025902).
Funding: Research Foundation for Advanced Talents of Guizhou University (Grant (2016) No. 49); Key Disciplines of Guizhou Province, Computer Science and Technology (ZDXK[2018]007); Research Projects of Innovation Group of Education (QianJiaoHeKY[2021]022); National Natural Science Foundation of China (Grant 62062023).
Funding: School of Computer Science and Technology, Shandong University of Technology; Shandong Provincial Natural Science Foundation, China (Grant ZR2019BF022); National Natural Science Foundation of China (Grant 62001272).
Funding: Funded by the Fujian Province Science and Technology Plan, China (Grant No. 2019H0017).
Abstract: Accurate forecasting of time series is crucial across various domains. Many prediction tasks rely on effectively segmenting, matching, and aligning time series data. For instance, even for time series with the same granularity, segmenting them into events of different granularity can effectively mitigate the impact of varying time scales on prediction accuracy. However, these events of varying granularity frequently intersect with each other and may have unequal durations. Even minor differences can result in significant errors when matching time series with future trends. Besides, directly using matched but unaligned events as state vectors in machine learning-based prediction models can lead to insufficient prediction accuracy. Therefore, this paper proposes a short-term forecasting method for time series based on multi-granularity events, MGE-SP (multi-granularity event-based short-term prediction). First, a methodological framework for MGE-SP is established to guide the implementation steps. The framework consists of three key steps: multi-granularity event matching based on the LTF (latest time first) strategy, multi-granularity event alignment using piecewise aggregate approximation based on the compression ratio, and a short-term prediction model based on XGBoost. Data from a nationwide online car-hailing service in China ensures the method's reliability. The average RMSE (root mean square error) and MAE (mean absolute error) of the proposed method are 3.204 and 2.360, lower than the respective values of 4.056 and 3.101 obtained using the ARIMA (autoregressive integrated moving average) method, as well as the values of 4.278 and 2.994 obtained using the k-means-SVR (support vector regression) method. Another experiment was conducted on stock data from a public data set. The proposed method achieved an average RMSE and MAE of 0.836 and 0.696, lower than the respective values of 1.019 and 0.844 obtained using the ARIMA method, as well as the values of 1.350 and 1.172 obtained using the k-means-SVR method.
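The piecewise aggregate approximation used in the alignment step can be sketched as follows (a generic PAA, not the paper's compression-ratio-based variant):

```python
import numpy as np

def paa(series, n_segments):
    """Piecewise aggregate approximation: split the series into
    (near-)equal-length segments and represent each by its mean."""
    segments = np.array_split(np.asarray(series, dtype=float), n_segments)
    return [float(seg.mean()) for seg in segments]

print(paa([1, 2, 3, 4, 5, 6, 7, 8], 4))   # -> [1.5, 3.5, 5.5, 7.5]
```

Reducing two events to the same number of PAA segments gives them equal-length representations, which is what makes events of unequal duration comparable.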
Funding: Supported by the National Natural Science Foundation of China (42101382 and 42201407) and the Shandong Provincial Natural Science Foundation, China (ZR2020QD016 and ZR2022QD120).
Abstract: The accurate simulation of regional-scale winter wheat yield is important for national food security and the balance of grain supply and demand in China. Presently, most remote sensing process models use the "biomass × harvest index (HI)" method to simulate regional-scale winter wheat yield. However, spatiotemporal differences in HI contribute to inaccuracies in yield simulation at the regional scale. Time-series dry matter partition coefficients (Fr) can dynamically reflect the dry matter partition of winter wheat. In this study, Fr equations were fitted for each organ of winter wheat using site-scale data. These equations were then coupled into a process-based and remote sensing-driven crop yield model for wheat (PRYM-Wheat) to improve the regional simulation of winter wheat yield over the North China Plain (NCP). The improved PRYM-Wheat model integrated with the fitted Fr equations (PRYM-Wheat-Fr) was validated using data obtained from provincial yearbooks. A 3-year (2000-2002) averaged validation showed that PRYM-Wheat-Fr had a higher coefficient of determination (R^(2) = 0.55) and lower root mean square error (RMSE = 0.94 t ha^(-1)) than PRYM-Wheat with a stable HI (abbreviated as PRYM-Wheat-HI), which had R^(2) and RMSE values of 0.30 and 1.62 t ha^(-1), respectively. The PRYM-Wheat-Fr model also performed better than PRYM-Wheat-HI for simulating yield in the verification years (2013-2015). In conclusion, the PRYM-Wheat-Fr model exhibited better accuracy than the original PRYM-Wheat model, making it a useful tool for the simulation of regional winter wheat yield.
Funding: Supported in part by the National Natural Science Foundation of China under Grants 62273272, 62303375, and 61873277; in part by the Key Research and Development Program of Shaanxi Province under Grant 2023-YBGY-243; in part by the Natural Science Foundation of Shaanxi Province under Grant 2020JQ-758; in part by the Youth Innovation Team of Shaanxi Universities; and in part by the Special Fund for Scientific and Technological Innovation Strategy of Guangdong Province under Grant 2022A0505030025.
Abstract: As one of the most effective techniques for finding software vulnerabilities, fuzzing has become a hot topic in software security. It feeds potentially syntactically or semantically malformed test data to a target program to mine vulnerabilities and crash the system. In recent years, considerable efforts have been dedicated by researchers and practitioners to improving fuzzing, so there are more and more methods and forms, which makes it difficult to gain a comprehensive understanding of the technique. This paper conducts a thorough survey of fuzzing, focusing on its general process, classification, common application scenarios, and some state-of-the-art techniques that have been introduced to improve its performance. Finally, this paper puts forward key research challenges and proposes possible future research directions that may provide new insights for researchers.
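The mutate-execute-observe loop at the core of the general fuzzing process can be illustrated with a toy sketch; the `target` parser and all names here are invented for the demo and are not from any real fuzzer:

```python
import random

def mutate(data: bytes) -> bytes:
    """Flip random bits of one byte of the seed input."""
    buf = bytearray(data)
    pos = random.randrange(len(buf))
    buf[pos] ^= random.randrange(1, 256)   # nonzero XOR always changes the byte
    return bytes(buf)

def target(data: bytes):
    """A deliberately fragile parser: raises when the leading length
    byte disagrees with the payload length."""
    if len(data) < 2:
        raise ValueError("too short")
    if data[0] != len(data) - 1:
        raise ValueError("bad length field")

def fuzz(seed: bytes, rounds=1000):
    """Feed mutated inputs to the target and record the ones that crash it."""
    random.seed(0)                          # reproducible demo
    crashes = []
    for _ in range(rounds):
        case = mutate(seed)
        try:
            target(case)
        except ValueError:
            crashes.append(case)
    return crashes

found = fuzz(b"\x03abc")                    # a valid seed: length byte 3, payload "abc"
print(len(found) > 0)                       # -> True
```

Real fuzzers add the refinements the survey classifies: coverage feedback, input scheduling, grammar awareness, and crash triage.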
Funding: Supported by the Natural Science Foundation of China under Grant Nos. 62362008, 61973163, 61972345, and U1911401.
Abstract: The globalization of hardware designs and supply chains, as well as the integration of third-party intellectual property (IP) cores, has led to an increased focus from malicious attackers on computing hardware. However, existing defense or detection approaches often require additional circuitry to perform security verification, and are thus constrained by time and resource limitations. Considering the scale of actual engineering tasks and tight project schedules, it is usually difficult to implement protection designs for all modules in field programmable gate array (FPGA) circuits. Some studies have pointed out that the failure of key modules tends to cause greater damage to the network. Therefore, under limited conditions, priority protection designs need to be made for key modules to improve protection efficiency. We have conducted research on FPGA designs, including single-FPGA systems and multi-FPGA systems, to identify key modules in FPGA systems. For single-FPGA designs, considering the topological structure, network characteristics, and directionality of FPGA designs, we propose a node importance evaluation method based on the technique for order preference by similarity to an ideal solution (TOPSIS). Then, for multi-FPGA designs, considering the influence of nodes within and across layers, the designs are constructed into an interdependent network, and we propose a method based on connection strength to identify the important modules. Finally, we conduct empirical research using actual FPGA designs as examples. The results indicate that, compared to other traditional indexes, the node importance indexes proposed for different designs can better characterize the importance of nodes.
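A minimal TOPSIS computation of the kind the node-importance method builds on might look as follows; the decision matrix, weights, and the all-benefit-criteria assumption are illustrative, not the paper's actual indexes:

```python
import numpy as np

def topsis(matrix, weights):
    """Rank alternatives (rows) against weighted criteria (columns),
    treating every criterion as a benefit criterion."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = w * m / np.linalg.norm(m, axis=0)
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_plus = np.linalg.norm(v - ideal, axis=1)   # distance to ideal solution
    d_minus = np.linalg.norm(v - anti, axis=1)   # distance to anti-ideal
    return d_minus / (d_plus + d_minus)          # closeness: higher = better

# Toy nodes scored on three hypothetical criteria (e.g. degree,
# betweenness, fan-out), with made-up weights.
scores = topsis([[0.8, 10, 3],
                 [0.6, 30, 5],
                 [0.9, 20, 4]],
                [0.5, 0.3, 0.2])
print(int(np.argmax(scores)))   # index of the most important node -> 2
```

In the paper's setting the rows would be FPGA modules and the columns the structural indexes extracted from the design netlist.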
Abstract: The 6th generation mobile network (6G) is a kind of multi-network interconnection and multi-scenario coexistence network, where multiple network domains break the original fixed boundaries to form connections and convergence. With the optimization objective of maximizing network utility while ensuring performance-centric weighted fairness among flows, this paper designs a reinforcement learning-based cloud-edge autonomous multi-domain data center network architecture that achieves single-domain autonomy and multi-domain collaboration. Due to the conflict between the utilities of different flows, the bandwidth fairness allocation problem for various types of flows is formulated by considering differently defined reward functions. Regarding the tradeoff between fairness and utility, this paper designs corresponding reward functions for the cases where the flows undergo abrupt changes and smooth changes. In addition, to accommodate the Quality of Service (QoS) requirements of multiple types of flows, this paper proposes a multi-domain autonomous routing algorithm called LSTM+MADDPG. By introducing a Long Short-Term Memory (LSTM) layer into the actor and critic networks, more information about temporal continuity is added, further enhancing the ability to adapt to changes in the dynamic network environment. The LSTM+MADDPG algorithm is compared with the latest reinforcement learning algorithms through experiments on real network topology and traffic traces, and the experimental results show that LSTM+MADDPG improves the delay convergence speed by 14.6% and delays the onset of packet loss by 18.2% compared with other algorithms.
Funding: Supported by the Major State Basic Research Program of China (B1420080204), the National Science Fund for Distinguished Young Scholars (60725415), and the National Natural Science Foundation of China (60606006).
Abstract: An effective algorithm based on the signal coverage of effective communication and a local energy-consumption saving strategy is proposed for application in wireless sensor networks. This algorithm consists of two sub-algorithms. One is a multi-hop partition subspaces clustering algorithm for ensuring locally energy-balanced consumption, which relies on the deployment produced by the other sub-algorithm: distributed locating deployment based on efficient communication coverage probability (DLD-ECCP). DLD-ECCP makes use of the characteristics of Markov chains and probabilistic optimization to obtain the optimum topology and number of sensor nodes. Through simulation, the relative data demonstrate the advantages of the proposed approaches in saving hardware resources and the energy consumption of networks.
Funding: Projects (51104059, 51204067) supported by the National Natural Science Foundation of China; Project (2012CB723103) supported by the National Basic Research Program of China; Project (IRT1235) supported by the Innovation Team Development Plan of the Ministry of Education of China; Project (2013M531674) supported by the China Postdoctoral Science Foundation; Project (132300413203) supported by the Basic and Frontier Technology Research Program of Henan Province, China; Project (WS2012B07) supported by the Open Project of the State Key Laboratory Cultivation Base for Gas Geology and Gas Control (Henan Polytechnic University), China.
Abstract: Focusing on the Klinkenberg effect on gas seepage, an independently developed triaxial experimental system for gas seepage was applied to study the seepage characteristics of coal seam gas. By means of experimental data analysis and theoretical derivation, a calculation method for coal seam gas permeability was proposed, which synthesizes the respective influences of gas dynamic viscosity, compressibility factor, and the Klinkenberg effect. The study results show that the Klinkenberg effect has a significant influence on coal seam gas seepage; the permeability estimated with the method considering the Klinkenberg effect is correct, and this permeability can fully reflect the true seepage state of the gas. For gas near standard conditions, the influences of dynamic viscosity and compressibility factor on the permeability may be ignored. For gas deviating far from standard conditions, the influences of dynamic viscosity and compressibility factor on the permeability must be considered. The research results have certain guiding significance for forming a correct understanding of the Klinkenberg effect and selecting a more accurate calculation method for the permeability of gas-bearing coal.
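For reference, the classical Klinkenberg correction underlying such methods relates the apparent gas permeability $k_g$ measured at mean pore pressure $\bar{p}$ to the intrinsic (liquid-equivalent) permeability $k_\infty$ through a slip factor $b$ that depends on the gas and the porous medium. This is the textbook form; the paper's full formulation, which additionally accounts for dynamic viscosity and the compressibility factor, may differ:

```latex
k_g = k_\infty \left( 1 + \frac{b}{\bar{p}} \right)
```

As $\bar{p}$ grows, the slip term vanishes and $k_g$ approaches $k_\infty$, which is why neglecting the effect overestimates permeability at low pressures.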
Funding: Supported by the Georgia Research Alliance, the National Natural Science Foundation of China (Grant Nos. 81320108025, 61402194, 61572227), and the Science-Technology Development Project of Jilin Province (Nos. 20160101259JC, 20160204022GX, 20170520063JH).
Abstract: Background: Glutamine and glutamate are known to play important roles in cancer biology. However, no detailed information is available in terms of their levels of involvement in various biological processes across different cancer types, whereas such knowledge could be critical for understanding the distinct characteristics of different cancer types. Our computational study aimed to examine the functional roles of glutamine and glutamate across different cancer types. Methods: We conducted a comparative analysis of gene expression data of cancer tissues versus normal control tissues of 11 cancer types to understand glutamine and glutamate metabolism in cancer. Specifically, we developed a linear regression model to assess the differential contributions of glutamine and/or glutamate to each of seven biological processes in cancer versus control tissues. Results: While our computational predictions were consistent with some previous observations, multiple novel predictions were made: (1) glutamine is generally not involved in purine synthesis in cancer except for breast cancer, and is similarly not involved in pyrimidine synthesis except for kidney cancer; (2) glutamine is generally not involved in ATP production in cancer; (3) glutamine's contribution to nucleotide synthesis is minimal if any in cancer; (4) glutamine is not involved in asparagine synthesis in cancer except for bladder and lung cancers; and (5) glutamate does not contribute to serine synthesis except for bladder cancer. Conclusions: We comprehensively predicted the roles of glutamine and glutamate metabolism in selected metabolic pathways in cancer tissues versus control tissues, which may lead to novel approaches to therapeutic development targeting glutamine and/or glutamate metabolism. However, our predictions need further functional validation.
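The regression idea can be sketched on synthetic data; the variable names, numbers, and single-pathway setup are illustrative assumptions, not the study's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50                                    # hypothetical number of tissue samples

# Stand-ins for per-sample expression summaries of the two substrates.
glutamine = rng.normal(size=n)
glutamate = rng.normal(size=n)

# Synthetic "pathway activity" that, by construction, depends mostly on
# glutamine; real analyses would use measured marker-gene expression.
pathway = 2.0 * glutamine + 0.1 * glutamate + rng.normal(scale=0.1, size=n)

# Least-squares fit: the coefficients estimate each substrate's contribution.
X = np.column_stack([glutamine, glutamate, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, pathway, rcond=None)
print(coef[0] > coef[1])                  # glutamine dominates here -> True
```

Comparing the fitted coefficients between cancer and control tissues is what yields statements of the form "glutamine is (not) involved in process X for cancer type Y."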
Funding: Supported by the National Natural Science Foundation of China (61572229, 6171101066), the Key Scientific and Technological Projects of the Jilin Province Development Plan (20170204074GX, 20180201068GX), and the Jilin Provincial International Cooperation Foundation (20180414015GH).
Abstract: A global planning algorithm for intelligent vehicles is designed based on the A* algorithm, which provides intelligent vehicles with a global path towards their destinations. A distributed real-time multiple vehicle collision avoidance (MVCA) algorithm is proposed by extending the reciprocal n-body collision avoidance method. MVCA enables intelligent vehicles to choose their destinations and control inputs independently, without needing to negotiate with each other or with a coordinator. Compared to centralized trajectory-planning algorithms, MVCA reduces computation costs and greatly improves the robustness of the system. Because the destination of each intelligent vehicle can be regarded as private information, MVCA protects it while providing real-time trajectory planning, thereby better improving the safety of intelligent vehicles. The simulation was conducted in MATLAB, including a crossroads scene and a circular position-exchange scene. The results show that MVCA behaves safely and reliably. The effects of latency and packet loss on MVCA are also statistically investigated by theoretically formulating the broadcasting process as a one-dimensional Markov chain. The results uncover that the tolerable delay should not exceed half of the deciding cycle of trajectory planning, and shortening the sending interval can alleviate the negative effects caused by packet loss to an extent. Short delay (<100 ms) and low packet loss (<5%) have little influence on trajectory planning algorithms that depend only on V2V to sense the context, but unpredictable collisions may occur if the delay and packet loss are further worsened. MVCA was also tested on a real intelligent vehicle, and the test results prove the operability of MVCA.
Funding: Supported by the National Natural Science Foundation of China (No. 60803036) and the Scientific Research Fund of the Heilongjiang Provincial Education Department (No. 11531013).
Abstract: Aiming at the high bit-rate occupation of motion vector encoding and the heavy time load of full-search strategies, a multi-resolution motion estimation and compensation algorithm based on adjacent prediction of frame difference is proposed. Differential motion detection is applied to image sequences, and a proper threshold is adopted to identify the connected region. The motion region is then extracted, and motion estimation and motion compensation are carried out on it. The experimental results show that the encoding efficiency of motion vectors is promoted, the complexity of motion estimation is reduced, and the quality of the reconstructed image at the same bit-rate is improved compared with Multi-Resolution Motion Estimation (MRME).
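The differential motion-detection step can be sketched as follows, assuming a simple bounding box over thresholded frame differences; the threshold value and toy images are illustrative:

```python
import numpy as np

def motion_region(prev, curr, thresh=20):
    """Threshold the absolute frame difference and return the bounding
    box (row_min, row_max, col_min, col_max) of the changed region,
    or None if nothing moved."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    mask = diff > thresh
    if not mask.any():
        return None
    rows, cols = np.where(mask)
    return (int(rows.min()), int(rows.max()),
            int(cols.min()), int(cols.max()))

prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:5, 3:6] = 100                      # a block that "moved" into frame
print(motion_region(prev, curr))          # -> (2, 4, 3, 5)
```

Restricting motion estimation to this extracted region is what reduces the search complexity relative to a full-frame search.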
Funding: Supported by the National Natural Science Foundation of China (60403027).
Abstract: The main advantages of role-based access control (RBAC) are its ability to support well-known security principles and role inheritance. But because RBAC still lacks a specific definition and the necessary formalization, it is hard to realize in practical work. Our contribution here is to formalize the main relations of RBAC and take a first step towards the concepts of the action closure and data closure of a role, based on which we obtain the specification and algorithm for the least privileges of a role. We propose that role inheritance should consist of the inheritance of actions and the inheritance of data, from which we derive the inheritance of privileges among roles, which can also be supported by existing tools.
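The action/data closure of a role under inheritance might be computed as in this sketch; the role hierarchy and permission names are invented for the demo:

```python
# Toy role hierarchy: each role has its own actions and data, and
# inherits from its parent roles. A role's action closure / data closure
# is the union over the role and everything it inherits.
ROLES = {
    "employee": {"actions": {"read"},    "data": {"wiki"},   "parents": set()},
    "engineer": {"actions": {"commit"},  "data": {"repo"},   "parents": {"employee"}},
    "lead":     {"actions": {"approve"}, "data": {"review"}, "parents": {"engineer"}},
}

def closure(role, key):
    """Union of the `key` sets ("actions" or "data") over the role and
    all roles reachable through the inheritance relation."""
    seen, stack, acc = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue
        seen.add(r)
        acc |= ROLES[r][key]
        stack.extend(ROLES[r]["parents"])
    return acc

print(sorted(closure("lead", "actions")))   # -> ['approve', 'commit', 'read']
```

The pair `(closure(r, "actions"), closure(r, "data"))` then bounds the least privileges that must be granted to role `r`.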
Funding: Supported by the National Natural Science Foundation of China (60403027), the Natural Science Foundation of Hubei Province (2005ABA258), and the Open Foundation of the State Key Laboratory of Software Engineering (SKLSE05-07).
Abstract: The concept and definition of the trust degree of a Web service are given. Evaluating the trust degree of Web services with a neural network, from the aspect of the trust history sequence, is proposed. The principle of the method, the applicable neural network structure, neural network construction, input standardization, training sample construction, and the procedure for evaluating the trust degree of Web services with the trained neural network are described. Experiments show that it is feasible and effective to evaluate the trust degree of Web services with a neural network.
Funding: Supported by the National Key R&D Program of China (Grant No. 2016YFF0203000), the State Key Program of the National Natural Science Foundation of China (Grant No. 11834008), the National Natural Science Foundation of China (Grant Nos. 11774167, 61571222), the Fundamental Research Funds for the Central Universities (Grant No. 020414380001), the State Key Laboratory of Acoustics, Chinese Academy of Sciences (Grant No. SKLA201809), the Key Laboratory of Underwater Acoustic Environment, Chinese Academy of Sciences (Grant No. SSHJ-KFKT-1701), the AQSIQ Technology R&D Program (Grant No. 2017QK125), and the Innovative Talents Program of the Far East NDT New Technology & Application Forum.
Abstract: The scattered fields of plane waves in a solid from a cylinder or sphere are critical in determining its acoustic characteristics as well as in engineering applications. This paper investigates the scattered field distributions of different incident waves created by elastic cylinders embedded in an elastic isotropic medium. Scattered waves, including longitudinal and transverse waves both inside and outside the cylinder, are described with specific modalities under an incident plane wave. A model with a scatterer embedded in a structural steel matrix and filled with aluminum is developed for comparison with the theoretical solution. The frequency of the plane wave ranged from 235 kHz to 2348 kHz, which corresponds to scaling factors from 0.5 to 5. Scattered field distributions in matrix materials blocked by an elastic cylindrical solid have been obtained by simulation or calculated using existing parameters. The simulation results are in good agreement with the theoretical solution, which supports the correctness of the simulation analysis. Furthermore, ultrasonic phased arrays are used to study scattered fields by changing the characteristics of the incident wave. On this foundation, a preliminary study of the scattered field distribution of double cylinders in a solid has been carried out, and the scattered field distribution at a given distance has been found to exhibit particular behaviors at different moments. Further studies on directivities and scattered fields are expected to improve the quantification of scattered images in isotropic solid materials by the phased array technique.
Funding: Supported by the National Natural Science Foundation of China (60403027).
Abstract: This paper proposes an algorithm applied in a semantic P2P network based on description logics, with the purpose of realizing the distribution of resource concepts, which makes semantic resource locating easy. Following the idea of consistent hashing in Chord, our algorithm stores addresses and resources with values of the same type at the selected instance. In addition, each peer has its own ontology, which is completed by the knowledge distributed over the network during the exchange of CHGs (classification hierarchy graphs). The hierarchical classification of concepts allows finding a matching resource by querying the upper-level concept, because all the concepts described in the CHG have the same
Funding: Supported by the National Natural Science Foundation of China (60403027).
Abstract: Because the web is huge and web pages are updated frequently, a search engine has to refresh the web pages in its index periodically. This is extremely resource-consuming because the search engine needs to crawl the web and download pages to refresh its index. Based on present web-refreshing technologies, we present a cooperative schema between web servers and search engines for maintaining the freshness of the web repository. The web server provides meta-data, defined through the XML standard, to describe web sites. Before updating a web page, the crawler visits the meta-data files. If the meta-data indicates that the page has not been modified, the crawler will not update it, so this schema can save bandwidth resources. A primitive model based on the schema is implemented. The cost and efficiency of the schema are analyzed.
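The crawler-side check might look like the following sketch; the XML schema, attribute names, and timestamps are all invented, as the paper's actual meta-data format is not specified here:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

# Hypothetical server-provided site description: one entry per page
# with its last-modification time.
META = """<site>
  <page url="/index.html" lastmod="2024-05-01T12:00:00+00:00"/>
  <page url="/news.html" lastmod="2024-06-15T08:30:00+00:00"/>
</site>"""

def pages_to_refresh(meta_xml, crawled_at):
    """Return the URLs modified since the last crawl; everything else
    can be skipped, saving bandwidth."""
    root = ET.fromstring(meta_xml)
    stale = []
    for page in root.iter("page"):
        lastmod = datetime.fromisoformat(page.get("lastmod"))
        if lastmod > crawled_at:              # modified since last crawl
            stale.append(page.get("url"))
    return stale

last_crawl = datetime(2024, 6, 1, tzinfo=timezone.utc)
print(pages_to_refresh(META, last_crawl))     # -> ['/news.html']
```

Only `/news.html` is re-downloaded; the unchanged `/index.html` is skipped, which is the bandwidth saving the schema targets.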
Abstract: WISA 2006 received 581 submissions and accepted 65 papers for publication in this issue. These papers cover 8 research areas: Web Information Mining and Retrieval, Semantic Web and Intelligent Web, Web Data Management and Information Integration, Web Application Framework and Architecture, Web Information Security, Web Services and Workflow Models, Text Processing and Decision Support, and Grid and Networking Technology. This paper gives an introduction to previous WISA conferences and a survey of the papers published in this issue.