Geomechanical assessment using coupled reservoir-geomechanical simulation is becoming increasingly important for analyzing the potential geomechanical risks in subsurface geological developments. However, a robust and efficient geomechanical upscaling technique for heterogeneous geological reservoirs is lacking to advance the applications of three-dimensional (3D) reservoir-scale geomechanical simulation considering detailed geological heterogeneities. Here, we develop convolutional neural network (CNN) proxies that reproduce the anisotropic nonlinear geomechanical response caused by lithological heterogeneity, and compute upscaled geomechanical properties from CNN proxies. The CNN proxies are trained using a large dataset of randomly generated spatially correlated sand-shale realizations as inputs and simulation results of their macroscopic geomechanical response as outputs. The trained CNN models can provide the upscaled shear strength (R² > 0.949), stress-strain behavior (R² > 0.925), and volumetric strain changes (R² > 0.958) that highly agree with the numerical simulation results while saving over two orders of magnitude of computational time. This is a major advantage in computing the upscaled geomechanical properties directly from geological realizations without the need to perform local numerical simulations to obtain the geomechanical response. The proposed CNN proxy-based upscaling technique has the ability to (1) bridge the gap between the fine-scale geocellular models considering geological uncertainties and the computationally efficient geomechanical models used to assess the geomechanical risks of large-scale subsurface development, and (2) improve the efficiency of numerical upscaling techniques that rely on local numerical simulations, which lead to significantly increased computational time for uncertainty quantification using numerous geological realizations.
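As an illustration of the kind of training inputs the abstract describes, the sketch below generates a spatially correlated binary sand/shale realization by smoothing white noise with a moving-average kernel and thresholding at a target shale fraction. The kernel-smoothing approach, grid size, and correlation length are illustrative assumptions, not the authors' workflow.

```python
import numpy as np

def sand_shale_realization(n=64, corr_len=5, shale_fraction=0.4, seed=0):
    """Generate a spatially correlated binary sand(0)/shale(1) field by
    smoothing white noise and thresholding at a target shale fraction."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    # Moving-average smoothing imposes spatial correlation of scale ~corr_len.
    kernel = np.ones((corr_len, corr_len)) / corr_len**2
    pad = corr_len // 2
    padded = np.pad(noise, pad, mode="wrap")
    smooth = np.zeros_like(noise)
    for i in range(n):
        for j in range(n):
            smooth[i, j] = (padded[i:i + corr_len, j:j + corr_len] * kernel).sum()
    # Threshold so that roughly shale_fraction of cells end up as shale.
    thresh = np.quantile(smooth, 1.0 - shale_fraction)
    return (smooth > thresh).astype(int)

field = sand_shale_realization()
print(field.shape, field.mean())
```

Each such realization would serve as one CNN input; the corresponding output labels would come from a geomechanical simulation of that realization.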
Research on discrete memristor-based neural networks has received much attention. However, current research mainly focuses on memristor-based discrete homogeneous neuron networks, while memristor-coupled discrete heterogeneous neuron networks are rarely reported. In this study, a new four-stable discrete locally active memristor is proposed, and its nonvolatile and locally active properties are verified by its power-off plot and DC V–I diagram. Based on the two-dimensional (2D) discrete Izhikevich neuron and the 2D discrete Chialvo neuron, a heterogeneous discrete neuron network is constructed by using the proposed discrete memristor as a coupling synapse connecting the two heterogeneous neurons. Considering the coupling strength as the control parameter, chaotic firing, periodic firing, and hyperchaotic firing patterns are revealed. In particular, multiple coexisting firing patterns are observed, which are induced by different initial values of the memristor. Phase synchronization between the two heterogeneous neurons is discussed, and it is found that they can achieve perfect synchronization at large coupling strength. Furthermore, the effect of Gaussian white noise on synchronization behaviors is also explored. We demonstrate that the presence of noise not only leads to the transition of firing patterns, but also achieves phase synchronization between the two heterogeneous neurons under low coupling strength.
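For concreteness, the 2D discrete Chialvo neuron mentioned in the abstract can be iterated as below. The parameter values are a commonly cited excitable-regime set chosen here only for illustration, and the memristive coupling to the Izhikevich neuron is omitted.

```python
import math

def chialvo_step(x, y, a=0.89, b=0.6, c=0.28, k=0.03):
    """One iteration of the 2D Chialvo map: activation x, recovery y."""
    x_next = x * x * math.exp(y - x) + k
    y_next = a * y - b * x + c
    return x_next, y_next

x, y = 0.1, 0.1
traj = []
for _ in range(200):
    x, y = chialvo_step(x, y)
    traj.append(x)
print(min(traj), max(traj))  # trajectory stays bounded for these parameters
```

In the paper's setup, two such heterogeneous maps would be coupled through the memristor synapse, with the coupling strength as the bifurcation parameter.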
In Beyond the Fifth Generation (B5G) heterogeneous edge networks, numerous users are multiplexed on a channel or served on the same frequency resource block, in which case the transmitter applies coding and the receiver uses interference cancellation. Unfortunately, uncoordinated radio resource allocation can reduce system throughput and lead to user inequity. For this reason, in this paper, channel allocation and power allocation problems are formulated to maximize the system sum rate and the minimum user achievable rate. Since the constructed model is non-convex and the decision variables are high-dimensional, a distributed Deep Reinforcement Learning (DRL) framework based on distributed Proximal Policy Optimization (PPO) is proposed to allocate resources. Specifically, several simulated agents are trained in a heterogeneous environment to find robust behaviors that perform well in channel assignment and power allocation. Moreover, agents in the collection stage may slow down, which hinders the learning of other agents. Therefore, a preemption strategy is further proposed in this paper to optimize the distributed PPO, forming DP-PPO and successfully mitigating the straggler problem. The experimental results show that the proposed DP-PPO mechanism outperforms other DRL methods.
In this paper, we analyze a hybrid Heterogeneous Cellular Network (HCNet) framework that deploys millimeter Wave (mmWave) small cells coexisting with traditional sub-6 GHz macro cells to achieve improved coverage and high data rates. We consider randomly deployed macro base stations throughout the network, whereas mmWave Small Base Stations (SBSs) are deployed in areas with high User Equipment (UE) density. Such user-centric deployment of mmWave SBSs inevitably incurs correlation between UEs and SBSs. We consider a realistic scenario where the UEs are distributed according to a Poisson cluster process and directional beamforming with line-of-sight and non-line-of-sight transmissions is adopted for mmWave communication. Using tools from stochastic geometry, we develop an analytical framework to analyze various performance metrics in the downlink hybrid HCNets under biased received power association. For UE clustering, we consider the Thomas cluster process and derive expressions for the association probability, coverage probability, area spectral efficiency, and energy efficiency. We also provide Monte Carlo simulation results to validate the accuracy of the derived expressions. Furthermore, we analyze the impact of mmWave operating frequency, antenna gain, small cell biasing, and BS density to obtain useful engineering insights into the performance of hybrid mmWave HCNets. Our results show that network performance is significantly improved by deploying mmWave SBSs instead of microwave BSs in hot spots.
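The Thomas cluster process used for UE clustering in the abstract can be sampled in a few lines: Poisson-distributed parent points (cluster centers, which motivate the user-centric SBS placement), each with a Poisson number of Gaussian-displaced daughter points (UEs). The intensities and displacement scale below are arbitrary illustrative values.

```python
import numpy as np

def thomas_cluster_process(lam_parent=10, mean_daughters=5, sigma=0.02, seed=1):
    """Sample a Thomas cluster process on the unit square: Poisson parents,
    Poisson-many daughters per parent, isotropic Gaussian displacements."""
    rng = np.random.default_rng(seed)
    n_parents = rng.poisson(lam_parent)
    parents = rng.uniform(0, 1, size=(n_parents, 2))
    points = []
    for p in parents:
        n_d = rng.poisson(mean_daughters)
        offsets = rng.normal(0, sigma, size=(n_d, 2))
        points.append(p + offsets)  # daughters scattered around the parent
    ues = np.vstack(points) if points else np.empty((0, 2))
    return parents, ues

parents, ues = thomas_cluster_process()
print(len(parents), len(ues))
```

In a coverage analysis, SBSs would be placed near the parent points and the daughter points would play the role of clustered UEs.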
Interference management is one of the most important issues in device-to-device (D2D)-enabled heterogeneous cellular networks (HetCNets) due to the coexistence of massive cellular and D2D devices, in which D2D devices reuse the cellular spectrum. To alleviate the interference, an efficient interference management approach is to set exclusion zones around the cellular receivers. In this paper, we adopt a stochastic geometry approach to analyze the outage probabilities of cellular and D2D users in D2D-enabled HetCNets. The main difficulties involve three aspects: 1) how to model the location randomness of base stations and cellular and D2D users in practical networks; 2) how to capture the randomness and interrelation of cellular and D2D transmissions due to the existence of random exclusion zones; and 3) how to characterize the different types of interference and their impacts on the outage probabilities of cellular and D2D users. We then run extensive Monte Carlo simulations, which show that our theoretical model is very accurate.
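A minimal sketch of the exclusion-zone idea from the abstract: D2D transmitters that fall inside a disc of a chosen radius around any cellular receiver are removed (thinned), which suppresses the strongest cross-tier interference. The radius and point counts are illustrative, not values from the paper.

```python
import numpy as np

def thin_by_exclusion(d2d_tx, cellular_rx, radius):
    """Keep only D2D transmitters outside every exclusion disc around
    the cellular receivers."""
    if len(d2d_tx) == 0 or len(cellular_rx) == 0:
        return d2d_tx
    # Pairwise distance matrix of shape (n_tx, n_rx).
    d = np.linalg.norm(d2d_tx[:, None, :] - cellular_rx[None, :, :], axis=2)
    keep = (d > radius).all(axis=1)
    return d2d_tx[keep]

rng = np.random.default_rng(0)
tx = rng.uniform(0, 1, (200, 2))   # candidate D2D transmitters
rx = rng.uniform(0, 1, (5, 2))     # cellular receivers
survivors = thin_by_exclusion(tx, rx, radius=0.1)
print(len(survivors))
```

The randomness of these exclusion zones is exactly what makes the surviving D2D point process correlated and analytically harder, as the abstract notes.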
The structural optimization of wireless sensor networks is a critical issue because it impacts energy consumption and hence the network's lifetime. Many studies have been conducted for homogeneous networks, but few have been performed for heterogeneous wireless sensor networks. This paper utilizes Rao algorithms to optimize the structure of heterogeneous wireless sensor networks according to node locations and their initial energies. The proposed algorithms lack algorithm-specific parameters and metaphorical connotations. They examine the search space based on the relations of the population with the best, worst, and randomly assigned solutions. The proposed algorithms can be evaluated using any routing protocol; however, we have chosen well-known routing protocols from the literature: Low Energy Adaptive Clustering Hierarchy (LEACH), Power-Efficient Gathering in Sensor Information Systems (PEGASIS), Partitioned-based Energy-efficient LEACH (PE-LEACH), and the recent Power-Efficient Gathering in Sensor Information Systems Neural Network (PEGASIS-NN) routing protocol. We compare our optimized method with the Jaya algorithm, the Particle Swarm Optimization-based Energy Efficient Clustering (PSO-EEC) protocol, and the hybrid Harmony Search Algorithm and PSO (HSA-PSO) algorithm. The efficiencies of our proposed algorithms are evaluated by conducting experiments in terms of network lifetime (first dead node, half dead nodes, and last dead node), energy consumption, packets to cluster head, and packets to the base station. The experimental results were compared with those obtained using the Jaya optimization algorithm, and the proposed algorithms exhibited the best performance. The proposed approach successfully prolongs the network lifetime by 71% for the PEGASIS protocol, 51% for the LEACH protocol, 10% for the PE-LEACH protocol, and 73% for the PEGASIS-NN protocol; moreover, it enhances other criteria such as energy conservation, fitness convergence, packets to cluster head, and packets to the base station.
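The Rao algorithms cited in the abstract are metaphor-free and have no algorithm-specific tuning parameters beyond population size and iteration count. Below is a sketch of the simplest variant (Rao-1), whose update moves each candidate along the best-minus-worst direction, shown on a toy sphere function; this is a generic illustration, not the paper's WSN-specific objective.

```python
import numpy as np

def rao1_minimize(f, bounds, pop_size=20, iters=200, seed=0):
    """Rao-1: X' = X + r * (X_best - X_worst), with greedy replacement."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(iters):
        best = pop[fit.argmin()]
        worst = pop[fit.argmax()]
        r = rng.uniform(0, 1, size=pop.shape)
        cand = np.clip(pop + r * (best - worst), lo, hi)
        cfit = np.apply_along_axis(f, 1, cand)
        better = cfit < fit          # keep a candidate only if it improves
        pop[better], fit[better] = cand[better], cfit[better]
    return pop[fit.argmin()], float(fit.min())

x, fx = rao1_minimize(lambda v: float((v ** 2).sum()),
                      (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(fx)
```

In the paper's setting, `f` would score a candidate network structure (e.g., cluster-head assignment) by simulated energy consumption under the chosen routing protocol.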
A heterogeneous information network, which is composed of various types of nodes and edges, has a complex structure and rich information content, and is widely used in social networks, academic networks, e-commerce, and other fields. Link prediction, as a key task for revealing unobserved relationships in a network, is of great significance in heterogeneous information networks. This paper reviews the application of representation-learning-based methods to link prediction in heterogeneous information networks. It introduces the basic concepts of heterogeneous information networks and the theoretical basis of representation learning, and discusses in detail the specific application of deep learning models to node embedding learning and link prediction. The effectiveness and superiority of these methods on multiple real data sets are demonstrated by experimental verification.
Blockchain-enabled cybersecurity systems to ensure and strengthen decentralized digital transactions are gradually gaining popularity in the digital era for various areas such as finance, transportation, healthcare, education, and supply chain management. Blockchain interactions in heterogeneous networks have attracted more attention due to the authentication of their digital application exchanges. However, the exponential growth of storage space requirements across blockchain-based heterogeneous networks has become an important issue in preventing blockchain distribution and the extension of blockchain nodes. The biggest challenges are data integrity and scalability, including significant computing complexity and inapplicable latency across regional network diversity, operating system diversity, bandwidth diversity, node diversity, etc., for decision-making on data transactions across blockchain-based heterogeneous networks. Data security and privacy have also become main concerns across heterogeneous networks for building smart IoT ecosystems. To address these issues, researchers have explored potential solutions based on the capability of heterogeneous network devices to perform data transactions, where the system stimulates their integration reliably and securely with blockchain. The key goal of this paper is to conduct a state-of-the-art and comprehensive survey on cybersecurity enhancement using blockchain in heterogeneous networks. This paper proposes a full-fledged taxonomy to identify the main obstacles, research gaps, future research directions, effective solutions, and the most relevant blockchain-enabled cybersecurity systems. In addition, a blockchain-based heterogeneous network framework with cybersecurity is proposed to meet the goal of maintaining optimal performance of data transactions among organizations. Overall, this paper provides an in-depth description based on critical analysis to overcome the existing work gaps for future research, and it presents a potential cybersecurity design with key requirements of blockchain across a heterogeneous network.
The increased adoption of Internet of Medical Things (IoMT) technologies has resulted in the widespread use of Body Area Networks (BANs) in medical and non-medical domains. However, the performance of IEEE 802.15.4-based BANs is impacted by challenges related to heterogeneous data traffic requirements among nodes, including contention during finite backoff periods, association delays, and traffic channel access through clear channel assessment (CCA) algorithms. These challenges lead to increased packet collisions, queuing delays, retransmissions, and the neglect of critical traffic, thereby hindering performance indicators such as throughput, packet delivery ratio, packet drop rate, and packet delay. Therefore, we propose Dynamic Next Backoff Period and Clear Channel Assessment (DNBP-CCA) schemes to address these issues. The DNBP-CCA schemes combine the Dynamic Next Backoff Period (DNBP) scheme and the Dynamic Next Clear Channel Assessment (DNCCA) scheme. The DNBP scheme employs the inference system of a fuzzy Takagi, Sugeno, and Kang (TSK) model to quantitatively analyze backoff exponent, channel clearance, collision ratio, and data rate as input parameters. On the other hand, the DNCCA scheme dynamically adapts the CCA process based on requested data transmission to the coordinator, considering input parameters such as buffer status ratio and acknowledgement ratio. As a result, simulations demonstrate that our proposed schemes outperform some existing representative approaches: they enhance data transmission, reduce node collisions, improve average throughput and packet delivery ratio, and decrease average packet drop rate and packet delay.
On multilingual online social networks with global information sharing, the wanton spread of rumors has an enormous negative impact on people's lives. Thus, it is essential to explore the rumor-spreading rules in multilingual environments and formulate corresponding control strategies to reduce the harm caused by rumor propagation. In this paper, considering the multilingual environment and the intervention mechanism in the rumor-spreading process, an improved ignorants–spreaders-1–spreaders-2–removers (I2SR) rumor-spreading model with time delay and nonlinear incidence is established in heterogeneous networks. Firstly, based on the mean-field equations corresponding to the model, the basic reproduction number is derived to ensure the existence of the rumor-spreading equilibrium. Secondly, by applying Lyapunov stability theory and graph theory, the global stability of the rumor-spreading equilibrium is analyzed in detail. In particular, aiming at the lowest control cost, an optimal control scheme is designed to optimize the intervention mechanism, and the optimal control conditions are derived using Pontryagin's minimum principle. Finally, some illustrative examples are provided to verify the effectiveness of the theoretical results. The results show that optimizing the intervention mechanism can effectively reduce the densities of spreaders-1 and spreaders-2 within the expected time, which provides guiding insights for public opinion managers to control rumors.
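To make the I2SR compartment flow concrete, here is a deliberately simplified mean-field Euler integration with two spreader classes. The time delay, nonlinear incidence, degree heterogeneity, and optimal intervention control from the paper are all omitted, and the rates are illustrative.

```python
def i2sr_step(state, beta1, beta2, mu, dt=0.01):
    """One Euler step of a simplified I2SR mean-field model:
    ignorants I, spreaders S1/S2 (two languages), removers R."""
    I, S1, S2, R = state
    new1 = beta1 * I * S1   # ignorants recruited by spreaders-1
    new2 = beta2 * I * S2   # ignorants recruited by spreaders-2
    rem1, rem2 = mu * S1, mu * S2   # spreaders become removers
    dI = -(new1 + new2)
    dS1, dS2 = new1 - rem1, new2 - rem2
    dR = rem1 + rem2
    return tuple(v + dt * d for v, d in zip(state, (dI, dS1, dS2, dR)))

state = (0.9, 0.05, 0.05, 0.0)
for _ in range(5000):
    state = i2sr_step(state, beta1=0.8, beta2=0.6, mu=0.2)
print([round(v, 3) for v in state])
```

Note that the four derivatives sum to zero, so the total population density is conserved along the trajectory; the intervention mechanism in the paper would effectively increase `mu` (or reduce the betas) at some control cost.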
This paper studies the coordinated planning of transmission tasks in heterogeneous space networks to enable efficient sharing of ground stations across satellite systems. Specifically, we first formulate the coordinated planning problem as a mixed integer linear programming (MILP) problem based on a time-expanded graph. The problem is then transformed and reformulated into a consensus optimization framework that can be solved by the satellite systems in parallel. Using the alternating direction method of multipliers (ADMM), a semi-distributed coordinated transmission task planning algorithm is proposed, in which each satellite system plans its own tasks based on local information and limited communication with the coordination center. Simulation results demonstrate that, compared with centralized and fully distributed methods, the proposed semi-distributed coordinated method strikes a better balance among task completion rate, complexity, and the amount of information required to be exchanged.
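The consensus-ADMM structure in the abstract (local solves in parallel plus a coordination-center consensus step) can be sketched on a toy problem with scalar quadratic local costs. The cost form, penalty `rho`, and iteration count are illustrative assumptions, not the paper's MILP formulation.

```python
import numpy as np

def consensus_admm(costs, rho=1.0, iters=100):
    """Consensus ADMM for min_z sum_i (a_i/2)(z - b_i)^2: each 'satellite
    system' i solves its local subproblem; z is the consensus variable."""
    a = np.array([c[0] for c in costs], float)
    b = np.array([c[1] for c in costs], float)
    x = np.zeros(len(costs))
    z = 0.0
    u = np.zeros(len(costs))   # scaled dual variables
    for _ in range(iters):
        # Local x-updates (closed form for quadratics), done in parallel.
        x = (a * b + rho * (z - u)) / (a + rho)
        z = (x + u).mean()     # coordination center: consensus z-update
        u = u + x - z          # dual update
    return z

# The minimizer is the a-weighted mean of b: (1*0 + 2*3 + 1*6) / 4 = 3.0
z = consensus_admm([(1.0, 0.0), (2.0, 3.0), (1.0, 6.0)])
print(round(z, 4))
```

Only `x_i + u_i` crosses the network each round, which mirrors the limited information exchange the semi-distributed method is designed around.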
Expert Recommendation (ER) aims to identify domain experts with high expertise and willingness to provide answers to questions in Community Question Answering (CQA) web services. How to model questions and users in the heterogeneous content network is critical to this task. Most traditional methods focus on modeling questions and users based on the textual content left in the community, while ignoring the structural properties of heterogeneous CQA networks and often suffering from textual data sparsity issues. Recent approaches take advantage of structural proximities between nodes and attempt to fuse the textual content of nodes for modeling. However, they often fail to distinguish the nodes' personalized preferences and consider the textual content of only part of the nodes in network embedding learning, while ignoring the semantic relevance of nodes. In this paper, we propose a novel framework that jointly considers structural proximity relations and textual semantic relevance to model users and questions more comprehensively. Specifically, we learn topology-based embeddings through a hierarchical attentive network learning strategy, in which the proximity information and the personalized preferences of nodes are encoded and preserved. Meanwhile, we utilize each node's textual content and the text correlation between adjacent nodes to build content-based embeddings through a meta-context-aware skip-gram model. In addition, the user's relative answer quality is incorporated to promote ranking performance. Experimental results show that our proposed framework consistently and significantly outperforms state-of-the-art baselines on three real-world datasets by combining deep semantic understanding and structural feature learning. The performance of the proposed work is analyzed in terms of MRR, P@K, and MAP and is shown to be more advanced than the existing methodologies.
The continuous improvement of cyber threat intelligence sharing mechanisms provides new ideas to deal with Advanced Persistent Threats (APTs). Extracting attack behaviors, i.e., Tactics, Techniques, and Procedures (TTPs), from Cyber Threat Intelligence (CTI) can facilitate the profiling of APT actors for an immediate response. However, it is difficult for traditional manual methods to analyze attack behaviors from cyber threat intelligence due to its heterogeneous nature. Based on the Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) description of threat behavior, this paper proposes a threat behavioral knowledge extraction framework that integrates a Heterogeneous Text Network (HTN) and a Graph Convolutional Network (GCN) to solve this issue. It leverages the hierarchical correlation relationships of attack techniques and tactics in ATT&CK to construct a text network of heterogeneous cyber threat intelligence. With the help of the Bidirectional Encoder Representations from Transformers (BERT) pretraining model to analyze the contextual semantics of cyber threat intelligence, the task of threat behavior identification is transformed into a text classification task, which automatically extracts attack behaviors in CTI and then identifies the malware and advanced threat actors. The experimental results show that the F1 scores reach 94.86% and 92.15% for the multi-label classification tasks of tactics and techniques, respectively. The experiment is extended to verify the method's effectiveness in identifying the malware and threat actors in APT attacks. The F1 scores for the malware and advanced-threat-actor identification tasks reach 98.45% and 99.48%, which are better than the benchmark models in the experiment and achieve the state of the art. The model can effectively model threat intelligence text data and acquire knowledge and experience migration by correlating implied features with a priori knowledge to compensate for insufficient sample data, improving the classification performance and recognition ability for threat behavior in text.
In distributed machine learning (DML) based on the parameter server (PS) architecture, an unbalanced communication load distribution across PSs will lead to a significant slowdown of model synchronization in heterogeneous networks due to low utilization of bandwidth. To address this problem, a network-aware adaptive PS load distribution scheme is proposed, which accelerates model synchronization by proactively adjusting the communication load on PSs according to network states. We evaluate the proposed scheme on MXNet, a real-world distributed training platform, and the results show that our scheme achieves up to a 2.68-times speed-up of model training in a dynamic and heterogeneous network environment.
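The core idea, sizing each PS's parameter shard in proportion to its available bandwidth so that all transfers finish at roughly the same time, can be sketched as follows. The bandwidth figures and parameter count are made up for illustration, and the real scheme adapts the shares dynamically as network states change.

```python
def ps_load_shares(total_params, bandwidths):
    """Split model parameters across parameter servers in proportion to each
    server's measured bandwidth, equalizing expected transfer times."""
    total_bw = sum(bandwidths)
    shares = [round(total_params * bw / total_bw) for bw in bandwidths]
    shares[-1] += total_params - sum(shares)  # absorb rounding drift
    return shares

# 10 M parameters over PSs with a 1:2:5 bandwidth ratio (illustrative)
print(ps_load_shares(10_000_000, [100, 200, 500]))
# → [1250000, 2500000, 6250000]
```

With equal shards, the slowest (100-unit) link would dominate synchronization time; proportional shards make each link's transfer time identical.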
Real-world complex networks are inherently heterogeneous; they have different types of nodes, attributes, and relationships. In recent years, various methods have been proposed to automatically learn how to encode the structural and semantic information contained in heterogeneous information networks (HINs) into low-dimensional embeddings; this task is called heterogeneous network embedding (HNE). Efficient HNE techniques can benefit various HIN-based machine learning tasks such as node classification, recommender systems, and information retrieval. Here, we provide a comprehensive survey of key advancements in the area of HNE. First, we define an encoder-decoder-based HNE model taxonomy. Then, we systematically overview, compare, and summarize various state-of-the-art HNE models and analyze the advantages and disadvantages of various model categories to identify more potentially competitive HNE frameworks. We also summarize the application fields, benchmark datasets, open-source tools, and performance evaluation in the HNE area. Finally, we discuss open issues and suggest promising future directions. We anticipate that this survey will provide deep insights into research in the field of HNE.
Heterogeneous small cell networks are one of the most effective solutions to overcome spectrum scarcity for the next generation of mobile networks. Dual connectivity (DC) can improve the throughput of each individual user by allowing concurrent access to two heterogeneous radio networks. In this paper, we propose a joint user association and fair scheduling algorithm (JUAFS) to deal with the resource allocation and load balancing issues in DC heterogeneous small cell networks. Considering the different coverage sizes, numbers of users, and quality-of-experience characteristics of heterogeneous cells, we present proportional fair scheduling for user association among cells and utilize an interference graph to minimize the transmission conflict probability. Simulation results show the performance improvement of the proposed algorithm in spectrum efficiency and fairness compared to existing schemes.
In this paper, we investigate the system performance of a heterogeneous cellular network consisting of a macro cell and a small cell, where each cell has one user and one base station with multiple antennas. The macro base station (MBS) and the small base station (SBS) transmit their confidential messages to the macro user (MU) and the small user (SU), respectively, over their shared spectrum. To enhance the system sum rate (SSR) of the MBS-MU and SBS-SU transmissions, we propose a joint antenna selection combined with optimal power allocation (JAS-OPA) scheme and an independent antenna selection combined with optimal power allocation (IAS-OPA) scheme. The JAS-OPA scheme requires knowledge of the channel state information (CSI) of both the transmission channels and the interference channels, while the IAS-OPA scheme only needs the CSI of the transmission channels. In addition, we analyze conventional round-robin antenna selection combined with optimal power allocation (RR-OPA) as a benchmark scheme. We formulate the SSR maximization problem through power allocation between the MBS and the SBS and propose iterative OPA algorithms for the JAS-OPA, IAS-OPA, and RR-OPA schemes, respectively. The results show that the OPA schemes outperform equal power allocation in terms of SSR. Moreover, we provide closed-form expressions of the system outage probability (SOP) for the IAS and RR schemes, which show that the SOP performance can be significantly improved by our proposed IAS scheme compared with the RR scheme.
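A small Monte Carlo sketch of the independent antenna selection (IAS) idea, which needs only the transmission-channel CSI: pick the transmit antenna with the largest channel gain, then compare the mean selected gain against a round-robin baseline. The Rayleigh-fading model, antenna count, and trial count are illustrative assumptions.

```python
import numpy as np

def select_antenna(h):
    """Independent antenna selection: choose the antenna with the largest
    instantaneous channel gain |h_k|^2."""
    gains = np.abs(h) ** 2
    return int(gains.argmax()), float(gains.max())

rng = np.random.default_rng(3)
n_ant, n_trials = 4, 2000
best, rr = [], []
for t in range(n_trials):
    # Unit-power Rayleigh fading channel vector (illustrative model).
    h = (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)) / np.sqrt(2)
    _, g = select_antenna(h)
    best.append(g)
    rr.append(float(np.abs(h[t % n_ant]) ** 2))  # round-robin baseline
print(np.mean(best) > np.mean(rr))
```

The selection diversity gain over round-robin is what drives the SOP improvement the abstract reports for IAS versus RR.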
In recent years, real-time video streaming has grown in popularity. The growing popularity of the Internet of Things (IoT) and other wireless heterogeneous networks mandates that network resources be carefully apportioned among versatile users in order to achieve the best Quality of Experience (QoE) and performance objectives. Most researchers have focused on Forward Error Correction (FEC) techniques when attempting to strike a balance between QoE and performance. However, as network capacity increases, the performance degrades, impacting the live visual experience. Recently, Deep Learning (DL) algorithms have been successfully integrated with FEC to stream videos across multiple heterogeneous networks. But these algorithms need to be modified to improve the experience without sacrificing packet loss and delay time. To address this challenge, this paper proposes a novel intelligent algorithm that streams video in multi-home heterogeneous networks based on network-centric characteristics. The proposed framework contains modules such as the Intelligent Content Extraction Module (ICEM), the Channel Status Monitor (CSM), and Adaptive FEC (AFEC). This framework adopts a Cognitive Learning-based Scheduling (CLS) module, which works on the deep Reinforced Gated Recurrent Network (RGRN) principle and is embedded along with the FEC to achieve better performance. The complete framework was developed using the Objective Modular Network Testbed in C++ (OMNeT++), the INET framework, and Python 3.10, with Keras as the front end and TensorFlow 2.10 as the back end. With extensive experimentation, the proposed model outperforms other existing intelligent models in terms of improving QoE, minimizing the End-to-End Delay (EED), and maintaining the highest accuracy (98%) and a lower Root Mean Square Error (RMSE) value of 0.001.
In this work, we consider the performance analysis of state-dependent priority traffic and scheduling in device-to-device (D2D) heterogeneous networks. There are two priority transmission types of data in wireless communication, such as video or telephone, where the requirements of high-priority (HP) data transmission are always met first. If there is a large amount of low-priority (LP) data, much of the LP data cannot be sent, which causes excessive LP data delay and a high packet dropping probability. In order to solve this problem, the data transmission processes of the high-priority queue and the low-priority queue are studied. Applying a priority jump strategy to the priority queuing model, the queuing process with two priority data classes is modeled as a two-dimensional Markov chain, and a state-dependent priority jump queuing strategy is proposed, which can improve the discarding performance of low-priority data. The quasi-birth-and-death (QBD) process method and a fixed-point iteration method are used to solve the model, and the steady-state probability distribution is further obtained. Then, performance parameters such as average queue length, average throughput, average delay, and packet dropping probability for both high- and low-priority data can be expressed. The simulation results verify the correctness of the theoretical derivation. Meanwhile, the proposed priority jump queuing strategy can significantly improve the dropping performance of low-priority data.
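The priority-jump strategy can be illustrated with a toy discrete-time server: HP packets are always served first, but an LP packet that has waited longer than a threshold jumps to high priority, bounding its delay. The arrival times, unit service time, and jump threshold below are illustrative simplifications of the paper's Markov-chain model.

```python
import heapq

def serve(packets, jump_wait=5):
    """Serve HP (priority 0) before LP (priority 1); an LP packet that has
    waited longer than jump_wait jumps to high priority (priority jump)."""
    packets = sorted(packets, key=lambda p: p[1])  # sort by arrival time
    queue, order, time, i = [], [], 0, 0
    while i < len(packets) or queue:
        while i < len(packets) and packets[i][1] <= time:
            heapq.heappush(queue, packets[i])      # (priority, arrival, name)
            i += 1
        if not queue:
            time = packets[i][1]                   # idle until next arrival
            continue
        # Priority jump: promote LP packets that waited longer than jump_wait.
        queue = [(0 if p == 1 and time - a > jump_wait else p, a, n)
                 for p, a, n in queue]
        heapq.heapify(queue)
        _, _, name = heapq.heappop(queue)
        order.append(name)
        time += 1                                  # unit service time
    return order

pkts = [(0, t, "hp%d" % t) for t in range(8)] + [(1, 0, "lp")]
print(serve(pkts))  # the LP packet jumps ahead of the last HP arrivals
```

Without the jump, the LP packet would be starved until every HP packet was served; with it, the LP waiting time is capped near `jump_wait`, which is the dropping-performance improvement the model quantifies.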
Automatic text summarization (ATS) plays a significant role in Natural Language Processing (NLP). Abstractive summarization produces summaries by identifying and compressing the most important information in a document. However, only a few comprehensively evaluated abstractive summarization models work well for specific types of reports, due to their unstructured and oral-language text characteristics. In particular, Chinese complaint reports, generated by urban complainants and collected by government employees, describe existing resident problems in daily life, and the reflected problems require a speedy response. Therefore, automatic summarization tasks for these reports have been developed. However, similar to traditional summarization models, the generated summaries still have problems with informativeness and conciseness. To address these issues and generate suitably informative and less redundant summaries, a topic-based abstractive summarization method is proposed to obtain global and local features. Additionally, a heterogeneous graph of the original document is constructed using word-level and topic-level features. Experiments and analyses on public review datasets (Yelp and Amazon) and our constructed dataset (Chinese complaint reports) show that the proposed framework effectively improves the performance of the abstractive summarization model for Chinese complaint reports.
Funding: financial support provided by the Future Energy System at the University of Alberta and NSERC Discovery Grant RGPIN-2023-04084.
Abstract: Geomechanical assessment using coupled reservoir-geomechanical simulation is becoming increasingly important for analyzing the potential geomechanical risks in subsurface geological developments. However, a robust and efficient geomechanical upscaling technique for heterogeneous geological reservoirs is lacking, which limits the application of three-dimensional (3D) reservoir-scale geomechanical simulation that accounts for detailed geological heterogeneities. Here, we develop convolutional neural network (CNN) proxies that reproduce the anisotropic nonlinear geomechanical response caused by lithological heterogeneity, and compute upscaled geomechanical properties from the CNN proxies. The CNN proxies are trained on a large dataset of randomly generated, spatially correlated sand-shale realizations as inputs and simulation results of their macroscopic geomechanical response as outputs. The trained CNN models provide the upscaled shear strength (R^(2)>0.949), stress-strain behavior (R^(2)>0.925), and volumetric strain changes (R^(2)>0.958) in close agreement with the numerical simulation results while saving over two orders of magnitude of computational time. This is a major advantage: the upscaled geomechanical properties are computed directly from geological realizations, without performing local numerical simulations to obtain the geomechanical response. The proposed CNN proxy-based upscaling technique can (1) bridge the gap between fine-scale geocellular models that capture geological uncertainties and the computationally efficient geomechanical models used to assess the geomechanical risks of large-scale subsurface development, and (2) improve the efficiency of numerical upscaling techniques that rely on local numerical simulations, which otherwise incur significantly increased computational time for uncertainty quantification over numerous geological realizations.
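The CNN proxy described above replaces local numerical simulations on each sand-shale realization. As a point of reference for what "upscaling from a realization" means, the sketch below computes the classical Voigt (iso-strain) and Reuss (iso-stress) bounds on the effective Young's modulus of a random binary sand-shale realization; the moduli, sand fraction, and grid size are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fine-scale realization: 1 = sand, 0 = shale, with
# assumed illustrative Young's moduli in GPa (not the paper's values).
E_SAND, E_SHALE = 20.0, 5.0
realization = (rng.random((32, 32)) < 0.6).astype(float)  # ~60% sand

phi = realization.mean()                              # sand volume fraction
E_voigt = phi * E_SAND + (1 - phi) * E_SHALE          # upper bound (iso-strain)
E_reuss = 1.0 / (phi / E_SAND + (1 - phi) / E_SHALE)  # lower bound (iso-stress)

print(f"sand fraction: {phi:.3f}")
print(f"Voigt upper bound: {E_voigt:.2f} GPa, Reuss lower bound: {E_reuss:.2f} GPa")
```

Analytical bounds like these ignore spatial correlation and nonlinearity, which is exactly why the paper trains a CNN on local simulation results instead.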
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 62171401 and 62071411).
Abstract: Research on discrete memristor-based neural networks has received much attention. However, current research mainly focuses on memristor-based discrete homogeneous neuron networks, while memristor-coupled discrete heterogeneous neuron networks are rarely reported. In this study, a new four-stable discrete locally active memristor is proposed, and its nonvolatile and locally active properties are verified by its power-off plot and DC V-I diagram. Based on the two-dimensional (2D) discrete Izhikevich neuron and the 2D discrete Chialvo neuron, a heterogeneous discrete neuron network is constructed by using the proposed discrete memristor as a coupling synapse connecting the two heterogeneous neurons. Taking the coupling strength as the control parameter, chaotic firing, periodic firing, and hyperchaotic firing patterns are revealed. In particular, multiple coexisting firing patterns are observed, induced by different initial values of the memristor. Phase synchronization between the two heterogeneous neurons is discussed, and it is found that they can achieve perfect synchronization at large coupling strengths. Furthermore, the effect of Gaussian white noise on synchronization behaviors is also explored. We demonstrate that the presence of noise not only leads to transitions between firing patterns but also enables phase synchronization between the two heterogeneous neurons at low coupling strengths.
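For readers unfamiliar with the 2D discrete Chialvo neuron mentioned above, the following sketch iterates the standard (uncoupled) Chialvo map with commonly used parameter values; the memristive coupling synapse of the paper is not reproduced here, and the parameters are assumptions chosen only to illustrate the map's bounded firing dynamics.

```python
import numpy as np

def chialvo_step(x, y, a=0.89, b=0.18, c=0.28, k=0.03):
    """One iteration of the 2D discrete Chialvo neuron map:
    x is the activation (membrane-like) variable, y the recovery variable."""
    x_next = x * x * np.exp(y - x) + k
    y_next = a * y - b * x + c
    return x_next, y_next

x, y = 0.1, 0.1
xs = []
for _ in range(500):
    x, y = chialvo_step(x, y)
    xs.append(x)

print(f"last membrane value: {xs[-1]:.4f}")
```

In the paper, two such heterogeneous maps (Izhikevich and Chialvo) would be coupled through a memristor whose state adds a coupling current to each neuron's activation update.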
Funding: supported by the Key Research and Development Program of China (No. 2022YFC3005401), the Key Research and Development Program of Yunnan Province, China (Nos. 202203AA080009 and 202202AF080003), and the Postgraduate Research & Practice Innovation Program of Jiangsu Province (No. KYCX21_0482).
Abstract: In Beyond the Fifth Generation (B5G) heterogeneous edge networks, numerous users are multiplexed on a channel or served on the same frequency resource block, in which case the transmitter applies coding and the receiver uses interference cancellation. Unfortunately, uncoordinated radio resource allocation can reduce system throughput and lead to user inequity. For this reason, in this paper, channel allocation and power allocation problems are formulated to maximize the system sum rate and the minimum user achievable rate. Since the model is non-convex and the decision variables are high-dimensional, a distributed Deep Reinforcement Learning (DRL) framework called distributed Proximal Policy Optimization (PPO) is proposed to allocate resources. Specifically, several simulated agents are trained in a heterogeneous environment to find robust behaviors that perform well in channel assignment and power allocation. However, agents that slow down in the collection stage hinder the learning of other agents. Therefore, a preemption strategy is further proposed to optimize the distributed PPO, forming DP-PPO and successfully mitigating the straggler problem. The experimental results show that the proposed DP-PPO mechanism outperforms other DRL methods.
Abstract: In this paper, we analyze a hybrid Heterogeneous Cellular Network (HCNet) framework that deploys millimeter-wave (mmWave) small cells coexisting with traditional sub-6 GHz macro cells to achieve improved coverage and high data rates. We consider randomly deployed macro base stations throughout the network, whereas mmWave Small Base Stations (SBSs) are deployed in areas with high User Equipment (UE) density. Such user-centric deployment of mmWave SBSs inevitably incurs correlation between UEs and SBSs. We consider a realistic scenario where the UEs are distributed according to a Poisson cluster process and directional beamforming with line-of-sight and non-line-of-sight transmissions is adopted for mmWave communication. Using tools from stochastic geometry, we develop an analytical framework to analyze various performance metrics in the downlink hybrid HCNets under biased received power association. For UE clustering, we consider the Thomas cluster process and derive expressions for the association probability, coverage probability, area spectral efficiency, and energy efficiency. We also provide Monte Carlo simulation results to validate the accuracy of the derived expressions. Furthermore, we analyze the impact of mmWave operating frequency, antenna gain, small cell biasing, and BS density to obtain useful engineering insights into the performance of hybrid mmWave HCNets. Our results show that network performance is significantly improved by deploying mmWave SBSs instead of microwave BSs in hot spots.
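The Thomas cluster process used above for UE clustering has a simple generative definition: Poisson-distributed parent points (cluster centers), each with a Poisson number of daughters displaced by an isotropic Gaussian. A minimal sampler, with illustrative intensities chosen for this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

def thomas_cluster_process(lam_parent, mean_offspring, sigma, area=1.0):
    """Sample a Thomas cluster process on the unit square: Poisson parents,
    Poisson-many daughters per parent, Gaussian displacements of scale sigma."""
    n_parents = rng.poisson(lam_parent * area)
    parents = rng.random((n_parents, 2))
    points = []
    for p in parents:
        n_d = rng.poisson(mean_offspring)
        points.append(p + sigma * rng.standard_normal((n_d, 2)))
    return parents, np.vstack(points) if points else np.empty((0, 2))

parents, ues = thomas_cluster_process(lam_parent=10, mean_offspring=5, sigma=0.02)
print(f"{len(parents)} clusters, {len(ues)} UEs")
```

In the paper's setting, an SBS would be placed at (or near) each cluster center, which is what creates the UE-SBS correlation the analysis must account for.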
Funding: This work is funded in part by the Science and Technology Development Fund, Macao SAR (Grant Nos. 0093/2022/A2, 0076/2022/A2, and 0008/2022/AGJ); in part by the National Natural Science Foundation of China (Grant No. 61872452); in part by the Special Fund for Dongguan's Rural Revitalization Strategy in 2021 (Grant No. 20211800400102); in part by the Dongguan Special Commissioner Project (Grant No. 20211800500182); in part by the Guangdong-Dongguan Joint Fund for Basic and Applied Research of Guangdong Province (Grant No. 2020A1515110162); and in part by the University Special Fund of the Guangdong Provincial Department of Education (Grant No. 2022ZDZX1073).
Abstract: Interference management is one of the most important issues in device-to-device (D2D)-enabled heterogeneous cellular networks (HetCNets) due to the coexistence of massive cellular and D2D devices, in which the D2D devices reuse the cellular spectrum. To alleviate the interference, an efficient interference management approach is to set exclusion zones around the cellular receivers. In this paper, we adopt a stochastic geometry approach to analyze the outage probabilities of cellular and D2D users in D2D-enabled HetCNets. The main difficulties lie in three aspects: 1) how to model the location randomness of base stations, cellular users, and D2D users in practical networks; 2) how to capture the randomness and interrelation of cellular and D2D transmissions due to the existence of random exclusion zones; and 3) how to characterize the different types of interference and their impacts on the outage probabilities of cellular and D2D users. We then run extensive Monte Carlo simulations, which confirm that our theoretical model is very accurate.
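The Monte Carlo validation mentioned above follows a standard recipe in stochastic geometry: drop interferers from a Poisson point process, apply fading and path loss, and count SIR outage events. The sketch below is a generic version of that recipe (Rayleigh fading, power-law path loss, no exclusion zones); all numerical parameters are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

def outage_probability(lam_interferers, sir_threshold_db, alpha=4.0,
                       link_dist=0.05, trials=2000):
    """Monte Carlo SIR outage for a typical receiver at the origin, with
    interferers drawn from a PPP on [-0.5, 0.5]^2 and Rayleigh fading."""
    thr = 10 ** (sir_threshold_db / 10)
    outages = 0
    for _ in range(trials):
        n = rng.poisson(lam_interferers)
        pts = rng.random((n, 2)) - 0.5
        d = np.hypot(pts[:, 0], pts[:, 1])
        interference = np.sum(rng.exponential(size=n) * d ** (-alpha))
        signal = rng.exponential() * link_dist ** (-alpha)
        if signal / (interference + 1e-12) < thr:
            outages += 1
    return outages / trials

p = outage_probability(lam_interferers=20, sir_threshold_db=0)
print(f"outage probability ~ {p:.3f}")
```

The paper's contribution is precisely the parts this sketch omits: random exclusion zones and the resulting correlation between cellular and D2D transmissions.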
Abstract: The structural optimization of wireless sensor networks is a critical issue because it impacts energy consumption and hence the network's lifetime. Many studies have been conducted for homogeneous networks, but few have been performed for heterogeneous wireless sensor networks. This paper utilizes Rao algorithms to optimize the structure of heterogeneous wireless sensor networks according to node locations and their initial energies. The Rao algorithms are free of algorithm-specific parameters and metaphorical connotations; they explore the search space based on the relations of the population with the best, worst, and randomly assigned solutions. The proposed algorithms can be evaluated using any routing protocol; here we have chosen well-known routing protocols from the literature: Low-Energy Adaptive Clustering Hierarchy (LEACH), Power-Efficient Gathering in Sensor Information Systems (PEGASIS), Partitioned-based Energy-efficient LEACH (PE-LEACH), and the recent Power-Efficient Gathering in Sensor Information Systems Neural Network (PEGASIS-NN) routing protocol. We compare our optimized method with the Jaya algorithm, the Particle Swarm Optimization-based Energy-Efficient Clustering (PSO-EEC) protocol, and the hybrid Harmony Search Algorithm and PSO (HSA-PSO) algorithm. The efficiencies of the proposed algorithms are evaluated by conducting experiments in terms of network lifetime (first dead node, half dead nodes, and last dead node), energy consumption, packets to the cluster head, and packets to the base station. The experimental results were compared with those obtained using the Jaya optimization algorithm; the proposed algorithms exhibited the best performance. The proposed approach prolongs the network lifetime by 71% for the PEGASIS protocol, 51% for the LEACH protocol, 10% for the PE-LEACH protocol, and 73% for the PEGASIS-NN protocol; moreover, it improves other criteria such as energy conservation, fitness convergence, packets to the cluster head, and packets to the base station.
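The parameter-free update rule of the simplest Rao variant (Rao-1) moves each candidate toward the current best and away from the current worst: x' = x + r*(x_best - x_worst), with greedy acceptance. The sketch below applies it to the sphere function as a hypothetical stand-in for the network-lifetime objective; population size, iteration count, and bounds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def rao1(fitness, bounds, pop_size=20, iters=100):
    """Rao-1: metaphor-free, parameter-free population update
    x' = x + r * (x_best - x_worst), keeping only greedy improvements."""
    lo, hi = bounds
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([fitness(x) for x in pop])
    for _ in range(iters):
        best = pop[fit.argmin()]
        worst = pop[fit.argmax()]
        for i in range(pop_size):
            cand = np.clip(pop[i] + rng.random(dim) * (best - worst), lo, hi)
            f = fitness(cand)
            if f < fit[i]:            # greedy selection
                pop[i], fit[i] = cand, f
    return pop[fit.argmin()], fit.min()

# Minimize the sphere function as a stand-in objective.
x_best, f_best = rao1(lambda x: float(np.sum(x * x)),
                      (np.full(3, -5.0), np.full(3, 5.0)))
print(f"best fitness: {f_best:.6f}")
```

In the paper, the fitness would instead score a candidate cluster structure by the lifetime and energy metrics listed above, evaluated under the chosen routing protocol.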
Funding: Science and Technology Research Project of the Jiangxi Provincial Department of Education (Project Nos. GJJ211348, GJJ211347, and GJJ2201056).
Abstract: A heterogeneous information network, which is composed of various types of nodes and edges, has a complex structure and rich information content and is widely used in social networks, academic networks, e-commerce, and other fields. Link prediction, as a key task for revealing unobserved relationships in a network, is of great significance in heterogeneous information networks. This paper reviews the application of representation-learning-based methods to link prediction in heterogeneous information networks. It introduces the basic concepts of heterogeneous information networks and the theoretical basis of representation learning, and discusses in detail the specific application of deep learning models to node embedding learning and link prediction. The effectiveness and superiority of these methods are demonstrated by experimental verification on multiple real datasets.
Funding: The authors would like to acknowledge the Institute for Big Data Analytics and Artificial Intelligence (IBDAAI), Universiti Teknologi MARA, and the Ministry of Higher Education, Malaysia for financial support through the Fundamental Research Grant Scheme (FRGS), Grant No. FRGS/1/2021/ICT11/UITM/01/1.
Abstract: Blockchain-enabled cybersecurity systems that ensure and strengthen decentralized digital transactions are gradually gaining popularity in the digital era in areas such as finance, transportation, healthcare, education, and supply chain management. Blockchain interactions in heterogeneous networks have attracted increasing attention due to the authentication of their digital application exchanges. However, the exponential growth of storage requirements across blockchain-based heterogeneous networks has become an important obstacle to blockchain distribution and the extension of blockchain nodes. The biggest challenges are data integrity and scalability, including significant computing complexity and unacceptable latency across regional network diversity, operating system diversity, bandwidth diversity, node diversity, etc., for decision-making on data transactions across blockchain-based heterogeneous networks. Data security and privacy have also become major concerns across heterogeneous networks for building smart IoT ecosystems. To address these issues, researchers have explored the potential of heterogeneous network devices to perform data transactions in systems that enable their integration with blockchain reliably and securely. The key goal of this paper is to conduct a state-of-the-art and comprehensive survey of cybersecurity enhancement using blockchain in heterogeneous networks. This paper proposes a full-fledged taxonomy to identify the main obstacles, research gaps, future research directions, effective solutions, and the most relevant blockchain-enabled cybersecurity systems. In addition, a blockchain-based heterogeneous network framework with cybersecurity is proposed to meet the goal of maintaining optimal performance of data transactions among organizations. Overall, this paper provides an in-depth, critical analysis that addresses the gaps in existing work for future research and presents a potential cybersecurity design with the key requirements of blockchain across a heterogeneous network.
Funding: Research Supporting Project Number (RSP2024R421), King Saud University, Riyadh, Saudi Arabia.
Abstract: The increased adoption of Internet of Medical Things (IoMT) technologies has resulted in the widespread use of Body Area Networks (BANs) in medical and non-medical domains. However, the performance of IEEE 802.15.4-based BANs is impacted by challenges related to heterogeneous data traffic requirements among nodes, including contention during finite backoff periods, association delays, and traffic channel access through clear channel assessment (CCA) algorithms. These challenges lead to increased packet collisions, queuing delays, retransmissions, and the neglect of critical traffic, thereby hindering performance indicators such as throughput, packet delivery ratio, packet drop rate, and packet delay. Therefore, we propose Dynamic Next Backoff Period and Clear Channel Assessment (DNBP-CCA) schemes to address these issues. The DNBP-CCA schemes combine the Dynamic Next Backoff Period (DNBP) scheme and the Dynamic Next Clear Channel Assessment (DNCCA) scheme. The DNBP scheme employs a fuzzy Takagi, Sugeno, and Kang (TSK) model's inference system to quantitatively analyze the backoff exponent, channel clearance, collision ratio, and data rate as input parameters. The DNCCA scheme, on the other hand, dynamically adapts the CCA process based on requested data transmission to the coordinator, considering input parameters such as the buffer status ratio and the acknowledgement ratio. As a result, simulations demonstrate that our proposed schemes outperform some existing representative approaches: they enhance data transmission, reduce node collisions, improve average throughput and packet delivery ratio, and decrease average packet drop rate and packet delay.
Funding: the National Natural Science Foundation of the People's Republic of China (Grant Nos. U1703262 and 62163035), the Special Project for Local Science and Technology Development Guided by the Central Government (Grant No. ZYYD2022A05), and the Xinjiang Key Laboratory of Applied Mathematics (Grant No. XJDX1401).
Abstract: On multilingual online social networks with global information sharing, the wanton spread of rumors has an enormous negative impact on people's lives. Thus, it is essential to explore the rumor-spreading rules in a multilingual environment and formulate corresponding control strategies to reduce the harm caused by rumor propagation. In this paper, considering the multilingual environment and intervention mechanism in the rumor-spreading process, an improved ignorants-spreaders-1-spreaders-2-removers (I2SR) rumor-spreading model with time delay and nonlinear incidence is established in heterogeneous networks. Firstly, based on the mean-field equations corresponding to the model, the basic reproduction number is derived to ensure the existence of the rumor-spreading equilibrium. Secondly, by applying Lyapunov stability theory and graph theory, the global stability of the rumor-spreading equilibrium is analyzed in detail. In particular, aiming at the lowest control cost, an optimal control scheme is designed to optimize the intervention mechanism, and the optimal control conditions are derived using Pontryagin's minimum principle. Finally, some illustrative examples are provided to verify the effectiveness of the theoretical results. The results show that optimizing the intervention mechanism can effectively reduce the densities of spreaders-1 and spreaders-2 within the expected time, which provides guiding insights for public opinion managers seeking to control rumors.
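To make the mean-field idea concrete, the sketch below Euler-integrates a deliberately simplified I2SR system (homogeneous mixing, no time delay, bilinear incidence): ignorants become spreaders of one of the two languages on contact, spreaders-1 may jump to spreaders-2, and spreaders stifle into removers. All rate constants are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def i2sr_step(state, dt, lam1, lam2, delta1, delta2, mu):
    """One Euler step of a simplified (delay-free) mean-field I2SR system.
    state = [I, S1, S2, R]; densities sum to 1 by construction."""
    I, S1, S2, R = state
    dI = -(lam1 * S1 + lam2 * S2) * I        # ignorants infected on contact
    dS1 = lam1 * S1 * I - delta1 * S1 - mu * S1   # mu: jump from S1 to S2
    dS2 = lam2 * S2 * I + mu * S1 - delta2 * S2
    dR = delta1 * S1 + delta2 * S2           # spreaders become removers
    return state + dt * np.array([dI, dS1, dS2, dR])

state = np.array([0.96, 0.02, 0.02, 0.0])
for _ in range(5000):                        # integrate to t = 50
    state = i2sr_step(state, 0.01, lam1=0.8, lam2=0.6,
                      delta1=0.2, delta2=0.2, mu=0.05)
print("final densities I, S1, S2, R:", np.round(state, 4))
```

The paper's heterogeneous-network version replaces these four equations with degree-indexed families of equations, adds the delay term, and optimizes the intervention rates via Pontryagin's minimum principle.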
基金supported in part by the NSF China under Grant(61701365,61801365,62001347)in part by Natural Science Foundation of Shaanxi Province(2020JQ-686)+4 种基金in part by the China Postdoctoral Science Foundation under Grant(2018M643581,2019TQ0210,2019TQ0241,2020M673344)in part by Young Talent fund of University Association for Science and Technology in Shaanxi,China(20200112)in part by Key Research and Development Program in Shaanxi Province of China(2021GY066)in part by Postdoctoral Foundation in Shaanxi Province of China(2018BSHEDZZ47)the Fundamental Research Funds for the Central Universities。
Abstract: This paper studies the coordinated planning of transmission tasks in heterogeneous space networks to enable efficient sharing of ground stations across satellite systems. Specifically, we first formulate the coordinated planning problem as a mixed-integer linear programming (MILP) problem based on a time-expanded graph. Then, the problem is transformed and reformulated into a consensus optimization framework that the satellite systems can solve in parallel. Using the alternating direction method of multipliers (ADMM), a semi-distributed coordinated transmission task planning algorithm is proposed, in which each satellite system plans its own tasks based on local information and limited communication with the coordination center. Simulation results demonstrate that, compared with centralized and fully distributed methods, the proposed semi-distributed coordinated method strikes a better balance among task completion rate, complexity, and the amount of information that must be exchanged.
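The semi-distributed pattern above — local solves plus a lightweight coordination center — is the defining structure of consensus ADMM. A minimal sketch on a toy sharing problem (each "satellite system" i holds a private quadratic cost (x - a_i)^2; the center only averages), which is a stand-in for the paper's MILP subproblems:

```python
import numpy as np

def consensus_admm(a, rho=1.0, iters=100):
    """Consensus ADMM for min_x sum_i (x - a_i)^2: each agent solves its
    local subproblem in closed form; the center merely averages."""
    n = len(a)
    x = np.zeros(n)          # local copies of the plan
    z = 0.0                  # consensus variable held by the center
    u = np.zeros(n)          # scaled dual variables
    for _ in range(iters):
        x = (2 * a + rho * (z - u)) / (2 + rho)   # parallel local updates
        z = np.mean(x + u)                        # center: average
        u = u + x - z                             # dual ascent
    return z

a = np.array([1.0, 2.0, 6.0, 7.0])
z = consensus_admm(a)
print(f"consensus plan: {z:.4f} (analytic optimum: {np.mean(a):.4f})")
```

Only x_i + u_i travels to the center each round, which mirrors the paper's point about limiting the information that systems must exchange.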
Abstract: Expert Recommendation (ER) aims to identify domain experts with high expertise and willingness to provide answers to questions in Community Question Answering (CQA) web services. How to model questions and users in the heterogeneous content network is critical to this task. Most traditional methods focus on modeling questions and users based on the textual content left in the community while ignoring the structural properties of heterogeneous CQA networks, and they often suffer from textual data sparsity. Recent approaches take advantage of structural proximities between nodes and attempt to fuse the textual content of nodes for modeling. However, they often fail to distinguish nodes' personalized preferences and consider the textual content of only a portion of the nodes in network embedding learning, ignoring the semantic relevance of nodes. In this paper, we propose a novel framework that jointly considers structural proximity relations and textual semantic relevance to model users and questions more comprehensively. Specifically, we learn topology-based embeddings through a hierarchical attentive network learning strategy, in which the proximity information and personalized preferences of nodes are encoded and preserved. Meanwhile, we utilize each node's textual content and the text correlation between adjacent nodes to build content-based embeddings through a meta-context-aware skip-gram model. In addition, the user's relative answer quality is incorporated to improve ranking performance. Experimental results show that our proposed framework consistently and significantly outperforms state-of-the-art baselines on three real-world datasets by combining deep semantic understanding with structural feature learning. The performance of the proposed work is analyzed in terms of MRR, P@K, and MAP and is shown to be more advanced than existing methodologies.
Funding: supported by China's National Key R&D Program (No. 2019QY1404), the National Natural Science Foundation of China (Grant Nos. U20A20161 and U1836103), and the Basic Strengthening Program Project (No. 2019-JCJQ-ZD-113).
Abstract: The continuous improvement of the cyber threat intelligence sharing mechanism provides new ideas to deal with Advanced Persistent Threats (APTs). Extracting attack behaviors, i.e., Tactics, Techniques, and Procedures (TTPs), from Cyber Threat Intelligence (CTI) can facilitate the profiling of APT actors for an immediate response. However, it is difficult for traditional manual methods to analyze attack behaviors from cyber threat intelligence due to its heterogeneous nature. Based on the Adversarial Tactics, Techniques and Common Knowledge (ATT&CK) description of threat behavior, this paper proposes a threat behavioral knowledge extraction framework that integrates a Heterogeneous Text Network (HTN) and a Graph Convolutional Network (GCN) to solve this issue. It leverages the hierarchical correlation relationships of attack techniques and tactics in ATT&CK to construct a text network of heterogeneous cyber threat intelligence. With the help of the Bidirectional Encoder Representations from Transformers (BERT) pretraining model to analyze the contextual semantics of cyber threat intelligence, the task of threat behavior identification is transformed into a text classification task, which automatically extracts attack behaviors from CTI and then identifies the malware and advanced threat actors. The experimental results show that F1 scores reach 94.86% and 92.15% for the multi-label classification tasks of tactics and techniques, respectively. Extended experiments verify the method's effectiveness in identifying the malware and threat actors in APT attacks: the F1 scores for malware and advanced threat actor identification reach 98.45% and 99.48%, respectively, which are better than the benchmark models in the experiment and achieve the state of the art. The model can effectively represent threat intelligence text data and acquire knowledge and experience transfer by correlating implied features with a priori knowledge to compensate for insufficient sample data and improve the classification performance and recognition ability for threat behavior in text.
Funding: partially supported by the computing power networks and new communication primitives project under Grant No. HC-CN-2020120001, the National Natural Science Foundation of China under Grant No. 62102066, and Open Research Projects of Zhejiang Lab under Grant No. 2022QA0AB02.
Abstract: In distributed machine learning (DML) based on the parameter server (PS) architecture, an unbalanced communication load distribution across PSs will significantly slow down model synchronization in heterogeneous networks due to low bandwidth utilization. To address this problem, a network-aware adaptive PS load distribution scheme is proposed, which accelerates model synchronization by proactively adjusting the communication load on PSs according to network states. We evaluate the proposed scheme on MXNet, a real-world distributed training platform, and the results show that our scheme achieves up to a 2.68-fold speed-up of model training in a dynamic and heterogeneous network environment.
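The core intuition behind network-aware load distribution is that slicing the parameter partitions in proportion to each PS's measured bandwidth roughly equalizes transfer times. A minimal sketch of that proportional rule (the partition size, PS count, and link speeds are hypothetical; the paper's actual adaptation policy is richer and reacts to changing network states):

```python
def proportional_ps_load(total_params, bandwidths):
    """Assign parameter-partition sizes to PSs proportionally to their
    measured bandwidths, so per-PS transfer times roughly equalize."""
    total_bw = sum(bandwidths)
    loads = [round(total_params * b / total_bw) for b in bandwidths]
    loads[-1] += total_params - sum(loads)   # absorb rounding drift
    return loads

# Hypothetical setup: 1M parameters, three PSs on 10/5/1 Gbps links.
loads = proportional_ps_load(1_000_000, [10, 5, 1])
print(loads)  # → [625000, 312500, 62500]
```

With equal partitions the 1 Gbps server would dominate synchronization time; proportional slicing removes that straggling transfer.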
Funding: supported by the National Key Research and Development Plan of China (2017YFB0503700, 2016YFB0501801); the National Natural Science Foundation of China (61170026, 62173157); the Thirteenth Five-Year Research Planning Project of the National Language Committee (No. YB135-149); and the Fundamental Research Funds for the Central Universities (Nos. CCNU20QN022, CCNU20QN021, and CCNU20ZT012).
Abstract: Real-world complex networks are inherently heterogeneous; they have different types of nodes, attributes, and relationships. In recent years, various methods have been proposed to automatically learn how to encode the structural and semantic information contained in heterogeneous information networks (HINs) into low-dimensional embeddings; this task is called heterogeneous network embedding (HNE). Efficient HNE techniques can benefit various HIN-based machine learning tasks such as node classification, recommender systems, and information retrieval. Here, we provide a comprehensive survey of key advancements in the area of HNE. First, we define an encoder-decoder-based HNE model taxonomy. Then, we systematically overview, compare, and summarize various state-of-the-art HNE models and analyze the advantages and disadvantages of the model categories to identify potentially more competitive HNE frameworks. We also summarize the application fields, benchmark datasets, open-source tools, and performance evaluation in the HNE area. Finally, we discuss open issues and suggest promising future directions. We anticipate that this survey will provide deep insights into research in the field of HNE.
Funding: supported in part by the National Natural Science Foundation of China under Grants 61871433 and 61828103, and in part by the Research Platform of South China Normal University and Foshan.
Abstract: Heterogeneous small cell networks are one of the most effective solutions to overcome spectrum scarcity in the next generation of mobile networks. Dual connectivity (DC) can improve the throughput of each individual user by allowing concurrent access to two heterogeneous radio networks. In this paper, we propose a joint user association and fair scheduling algorithm (JUAFS) to deal with the resource allocation and load balancing issues in DC heterogeneous small cell networks. Considering the different coverage sizes, numbers of users, and quality-of-experience characteristics of heterogeneous cells, we present proportional fair scheduling for user association among cells and utilize an interference graph to minimize the transmission conflict probability. Simulation results show that the proposed algorithm improves spectrum efficiency and fairness compared with existing schemes.
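Proportional fair (PF) scheduling, the building block named above, serves in each slot the user maximizing the ratio of instantaneous rate to its exponentially averaged throughput, trading peak rate for fairness. A minimal single-cell sketch with hypothetical asymmetric channels (the paper's JUAFS additionally handles user association across cells and an interference graph):

```python
import numpy as np

rng = np.random.default_rng(4)

def pf_schedule(inst_rates, avg_tput):
    """Proportional-fair rule: serve the user maximizing r_i / T_i."""
    return int(np.argmax(inst_rates / avg_tput))

avg = np.ones(4)      # EWMA throughput estimates, one per user
alpha = 0.1           # EWMA smoothing factor (illustrative)
for slot in range(1000):
    rates = rng.exponential(scale=[1.0, 2.0, 3.0, 4.0])  # asymmetric channels
    k = pf_schedule(rates, avg)
    served = np.zeros(4)
    served[k] = rates[k]
    avg = (1 - alpha) * avg + alpha * served
print("long-run average throughputs:", np.round(avg, 2))
```

Unlike max-rate scheduling, which would starve the weakest user here, PF keeps every user's average throughput strictly positive while still exploiting good channel states.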
Funding: supported by the National Natural Science Foundation of China (No. 62071253) and the Postgraduate Research and Practice Innovation Program of Jiangsu Province (KYCX210747).
Abstract: In this paper, we investigate the performance of a heterogeneous cellular network consisting of a macro cell and a small cell, where each cell has one user and one base station with multiple antennas. The macro base station (MBS) and the small base station (SBS) transmit their confidential messages to the macro user (MU) and the small user (SU), respectively, over their shared spectrum. To enhance the system sum rate (SSR) of the MBS-MU and SBS-SU transmissions, we propose a joint antenna selection combined with optimal power allocation (JAS-OPA) scheme and an independent antenna selection combined with optimal power allocation (IAS-OPA) scheme. The JAS-OPA scheme requires knowledge of the channel state information (CSI) of both the transmission channels and the interference channels, while the IAS-OPA scheme only needs the CSI of the transmission channels. In addition, we analyze conventional round-robin antenna selection combined with optimal power allocation (RR-OPA) as a benchmark scheme. We formulate the SSR maximization problem through power allocation between the MBS and SBS and propose iterative OPA algorithms for the JAS-OPA, IAS-OPA, and RR-OPA schemes, respectively. The results show that the OPA schemes outperform equal power allocation in terms of SSR. Moreover, we provide closed-form expressions of the system outage probability (SOP) for the IAS and RR schemes, showing that the SOP performance can be significantly improved by our proposed IAS scheme compared with the RR scheme.
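The joint-antenna-selection step can be pictured as an exhaustive search over MBS/SBS antenna pairs for the pair maximizing the sum rate under mutual interference, which is exactly why JAS needs interference-channel CSI while IAS does not. A sketch with hypothetical channel gains and fixed powers (the paper additionally iterates over the power split, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(5)

def joint_antenna_selection(H_m, H_s, G_ms, G_sm, p_m, p_s, noise=1.0):
    """Exhaustive JAS sketch: pick the MBS/SBS antenna pair maximizing the
    sum rate under mutual interference (full CSI assumed known)."""
    best, best_rate = None, -np.inf
    for i in range(len(H_m)):
        for j in range(len(H_s)):
            sinr_mu = p_m * H_m[i] / (noise + p_s * G_sm[j])   # MU's SINR
            sinr_su = p_s * H_s[j] / (noise + p_m * G_ms[i])   # SU's SINR
            rate = np.log2(1 + sinr_mu) + np.log2(1 + sinr_su)
            if rate > best_rate:
                best, best_rate = (i, j), rate
    return best, best_rate

# Hypothetical squared channel gains for 4 MBS and 4 SBS antennas;
# G_* are the (weaker) cross-interference gains.
H_m, H_s = rng.exponential(size=4), rng.exponential(size=4)
G_ms, G_sm = 0.1 * rng.exponential(size=4), 0.1 * rng.exponential(size=4)
(i, j), rate = joint_antenna_selection(H_m, H_s, G_ms, G_sm, p_m=10, p_s=10)
print(f"selected antennas: MBS #{i}, SBS #{j}, sum rate {rate:.2f} bit/s/Hz")
```

IAS would instead pick each transmitter's antenna from its own H array alone, trading some sum rate for a much lighter CSI requirement.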
Abstract: In recent years, real-time video streaming has grown in popularity. The growing popularity of the Internet of Things (IoT) and other wireless heterogeneous networks mandates that network resources be carefully apportioned among versatile users in order to achieve the best Quality of Experience (QoE) and performance objectives. Most researchers have focused on Forward Error Correction (FEC) techniques when attempting to strike a balance between QoE and performance. However, as network capacity increases, performance degrades, impacting the live visual experience. Recently, Deep Learning (DL) algorithms have been successfully integrated with FEC to stream videos across multiple heterogeneous networks, but these algorithms need to be adapted to improve the experience without increasing packet loss and delay. To address this challenge, this paper proposes a novel intelligent algorithm that streams video in multi-homed heterogeneous networks based on network-centric characteristics. The proposed framework contains modules such as the Intelligent Content Extraction Module (ICEM), the Channel Status Monitor (CSM), and Adaptive FEC (AFEC). This framework adopts a Cognitive Learning-based Scheduling (CLS) module, which works on the deep Reinforced Gated Recurrent Network (RGRN) principle and is embedded along with the FEC to achieve better performance. The complete framework was developed using the Objective Modular Network Testbed in C++ (OMNeT++) and INET, together with Python 3.10, with Keras as the front end and TensorFlow 2.10 as the back end. Extensive experimentation shows that the proposed model outperforms other existing intelligent models in terms of improving QoE, minimizing the End-to-End Delay (EED), and maintaining the highest accuracy (98%) and a lower Root Mean Square Error (RMSE) value of 0.001.
Funding: 2020 Major Natural Science Research Project of Jiangsu Province Colleges and Universities: Research on Forensic Modeling and Analysis of the Internet of Things (20KJA520004); 2020 Open Project of the National and Local Joint Engineering Laboratory of Radio Frequency Integration and Micro-assembly Technology: Research on the Security Performance of Radio Frequency Energy Collection Cooperative Communication Networks (KFJJ20200201); 2021 Jiangsu Police Officer Academy Scientific Research Project: Research on D2D Cache Network Resource Optimization Based on Edge Computing Technology (2021SJYZK01); and the High-level Talent Introduction Scientific Research Start-up Fund of Jiangsu Police Institute (JSPI19GKZL407).
Abstract: In this work, we consider the performance analysis of state-dependent priority traffic and scheduling in device-to-device (D2D) heterogeneous networks. Wireless communication carries two priority types of data, such as video or telephony, and the requirements of high-priority (HP) data transmission are always met first. If there is a large amount of low-priority (LP) data, much of it cannot be sent, causing excessive delay and packet dropping for LP data. To solve this problem, the data transmission process of the high-priority queue and the low-priority queue is studied. Applying a priority jump strategy to the priority queuing model, the queuing process with two priorities of data is modeled as a two-dimensional Markov chain. A state-dependent priority jump queuing strategy is proposed, which can improve the dropping performance of low-priority data. The quasi-birth-and-death (QBD) process method and the fixed-point iteration method are used to solve the model, and the steady-state probability distribution is further obtained. Then, performance parameters such as the average queue length, average throughput, average delay, and packet dropping probability for both high- and low-priority data can be expressed. The simulation results verify the correctness of the theoretical derivation. Meanwhile, the proposed priority jump queuing strategy can significantly improve the dropping performance of low-priority data.
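The LP starvation problem motivating the priority jump strategy is easy to reproduce in a toy slotted simulation: strict HP-first service with a finite LP buffer yields a measurable LP drop probability. The sketch below shows the baseline (no jump strategy, no QBD analysis); arrival probabilities and buffer size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def priority_queue_sim(p_hp, p_lp, buf_lp=10, slots=50_000):
    """Slotted simulation of strict HP-over-LP service (one departure per
    slot); returns the LP packet-dropping probability for a finite buffer."""
    hp_q, lp_q, lp_arr, lp_drop = 0, 0, 0, 0
    for _ in range(slots):
        hp_q += rng.random() < p_hp          # Bernoulli HP arrival
        if rng.random() < p_lp:              # Bernoulli LP arrival
            lp_arr += 1
            if lp_q < buf_lp:
                lp_q += 1
            else:
                lp_drop += 1                 # LP buffer overflow
        if hp_q > 0:                         # HP is always served first
            hp_q -= 1
        elif lp_q > 0:
            lp_q -= 1
    return lp_drop / max(lp_arr, 1)

p_drop = priority_queue_sim(p_hp=0.4, p_lp=0.5)
print(f"LP drop probability ~ {p_drop:.3f}")
```

A jump strategy as described above would promote long-waiting LP packets into the HP queue based on the system state, which is what the paper's two-dimensional Markov chain and QBD analysis characterize exactly.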
Funding: supported by the National Natural Science Foundation of China (52274205) and a Project of the Education Department of Liaoning Province (LJKZ0338).
Abstract: Automatic text summarization (ATS) plays a significant role in Natural Language Processing (NLP). Abstractive summarization produces summaries by identifying and compressing the most important information in a document. However, only a few comprehensively evaluated abstractive summarization models work well for specific types of reports, owing to their unstructured and oral-language text characteristics. In particular, Chinese complaint reports, generated by urban complainants and collected by government employees, describe existing resident problems in daily life, and the reported problems require speedy responses. Therefore, automatic summarization tasks for these reports have been developed. However, as with traditional summarization models, the generated summaries still suffer from problems of informativeness and conciseness. To address these issues and generate suitably informative and less redundant summaries, a topic-based abstractive summarization method is proposed to obtain global and local features. Additionally, a heterogeneous graph of the original document is constructed using word-level and topic-level features. Experiments and analyses on public review datasets (Yelp and Amazon) and our constructed dataset (Chinese complaint reports) show that the proposed framework effectively improves the performance of the abstractive summarization model for Chinese complaint reports.