In recent decades, fog computing has played a vital role in executing parallel computational tasks, specifically scientific workflow tasks. In cloud data centers, fog computing takes more time to run workflow applications. Therefore, it is essential to develop effective models for Virtual Machine (VM) allocation and task scheduling in fog computing environments. Effective task scheduling, VM migration, and allocation together optimize the use of computational resources across different fog nodes. This process ensures that tasks are executed with minimal energy consumption, which reduces the chances of resource bottlenecks. In this manuscript, the proposed framework comprises two phases: (i) effective task scheduling using a fractional selectivity approach and (ii) VM allocation using a proposed algorithm named Fitness Sharing Chaotic Particle Swarm Optimization (FSCPSO). The proposed FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing to effectively balance global exploration and local exploitation. This balance enables the exploration of a wide range of solutions, leading to minimal total cost and makespan in comparison to other traditional optimization algorithms. The FSCPSO algorithm's performance is analyzed using six evaluation measures, namely Load Balancing Level (LBL), Average Resource Utilization (ARU), total cost, makespan, energy consumption, and response time. In relation to conventional optimization algorithms, the FSCPSO algorithm achieves a higher LBL of 39.12%, an ARU of 58.15%, a minimal total cost of 1175, and a makespan of 85.87 ms, particularly when evaluated for 50 tasks.
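To make the two named ingredients concrete, the sketch below shows a generic PSO loop in which a logistic chaotic map replaces the uniform random coefficients and a distance-based fitness-sharing penalty discourages crowding. The objective, bounds, sharing radius, and all parameter values are illustrative assumptions; the paper's actual operators and its fractional-selectivity scheduler are not reproduced.

```python
# Hypothetical sketch of the two FSCPSO ingredients named in the abstract:
# a logistic chaotic map replacing uniform random draws, and distance-based
# fitness sharing that penalizes crowded particles. Objective, bounds, and
# parameters are placeholders, not the paper's.
import numpy as np

def fscpso(objective, dim=10, n_particles=30, iters=200,
           w=0.7, c1=1.5, c2=1.5, share_radius=0.5, lo=-5.0, hi=5.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    chaos = rng.uniform(0.1, 0.9, n_particles)            # logistic-map state per particle

    def shared_fitness(pop, raw):
        d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=2)
        niche = np.sum(np.clip(1.0 - d / share_radius, 0.0, None), axis=1)
        return raw * niche                                  # minimization: crowding inflates cost

    raw = np.apply_along_axis(objective, 1, x)
    pbest, pbest_f = x.copy(), shared_fitness(x, raw)
    gbest = pbest[np.argmin(pbest_f)].copy()

    for _ in range(iters):
        chaos = 4.0 * chaos * (1.0 - chaos)                 # logistic map with r = 4
        r1, r2 = chaos[:, None], (1.0 - chaos)[:, None]
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        raw = np.apply_along_axis(objective, 1, x)
        f = shared_fitness(x, raw)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, objective(gbest)

# Example: minimize the sphere function as a stand-in for a cost/makespan objective.
best, best_cost = fscpso(lambda z: float(np.sum(z * z)))
```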
The diversity of data sources has created a need for effective manipulation and dissemination. The challenge that arises from increasing dimensionality has a negative effect on computation performance, efficiency, and stability. One of the most successful optimization algorithms is Particle Swarm Optimization (PSO), which has proved its effectiveness in exploring the most influential features in the search space thanks to its fast convergence and its ability to use a small set of parameters in the search task. This research proposes an effective enhancement of PSO that tackles the challenge of randomness in the search, which directly improves PSO performance. In addition, this research proposes a generic intelligent framework for the early prediction of order delays and the elimination of order backlogs, which can be considered an efficient potential solution for raising supply chain performance. The proposed adapted algorithm was applied to a supply chain dataset and reduced the feature set from twenty-one features to ten significant features. To confirm the results of the proposed algorithm, the updated data were examined by eight well-known classification algorithms, which reached a minimum accuracy of 94.3% for Random Forest and a maximum of 99.0% for Naïve Bayes. Moreover, the proposed adaptation was compared with other PSO adaptations from the literature over different datasets. The proposed PSO adaptation reached a higher accuracy than the literature, ranging from 97.8% to 99.36%, which further demonstrates the advancement of the current research.
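As a rough illustration of the wrapper pattern described here, where a PSO particle encodes a candidate feature subset that is scored by a classifier, the following binary-PSO sketch uses a sigmoid transfer function and a Random Forest scorer on synthetic data. The paper's specific randomness-control enhancement and its supply chain dataset are not reproduced; all parameters are assumptions.

```python
# Minimal binary-PSO feature-selection sketch (generic wrapper approach).
# The paper's randomness-control enhancement is not reproduced; the dataset,
# classifier, and parameters below are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=300, n_features=21, n_informative=8, random_state=1)

def subset_score(mask):
    if mask.sum() == 0:
        return 0.0
    clf = RandomForestClassifier(n_estimators=30, random_state=1)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

n_particles, dim, iters = 15, X.shape[1], 12
vel = np.zeros((n_particles, dim))
bits = (rng.random((n_particles, dim)) > 0.5).astype(int)
pbest, pbest_s = bits.copy(), np.array([subset_score(b) for b in bits])
gbest = pbest[pbest_s.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - bits) + 1.5 * r2 * (gbest - bits)
    prob = 1.0 / (1.0 + np.exp(-vel))            # sigmoid transfer function
    bits = (rng.random(prob.shape) < prob).astype(int)
    scores = np.array([subset_score(b) for b in bits])
    improved = scores > pbest_s
    pbest[improved], pbest_s[improved] = bits[improved], scores[improved]
    gbest = pbest[pbest_s.argmax()].copy()

print("selected features:", np.flatnonzero(gbest), "cv accuracy:", round(pbest_s.max(), 3))
```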
Driven piles are used in many geological environments as a practical and convenient structural component. Hence, the determination of the drivability of piles is of great importance in complex geotechnical applications. Conventional methods of predicting pile drivability often rely on simplified physical models or empirical formulas, which may lack accuracy or applicability in complex geological conditions. Therefore, this study presents a practical machine learning approach, namely a Random Forest (RF) optimized by Bayesian Optimization (BO) and Particle Swarm Optimization (PSO), which not only enhances prediction accuracy but also better adapts to varying geological environments to predict the drivability parameters of piles (i.e., maximum compressive stress, maximum tensile stress, and blows per foot). In addition, support vector regression, extreme gradient boosting, k-nearest neighbor, and decision tree models are applied for comparison purposes. In order to train and test these models, among the 4072 datasets collected with 17 model inputs, 3258 datasets were randomly selected for training, and the remaining 814 datasets were used for model testing. Lastly, the results of these models were compared and evaluated using two performance indices, i.e., the root mean square error (RMSE) and the coefficient of determination (R²). The results indicate that the optimized RF model achieved lower RMSE than the other prediction models in predicting the three parameters, specifically 0.044, 0.438, and 0.146, and higher R² values than the other implemented techniques, specifically 0.966, 0.884, and 0.977. In addition, the sensitivity and uncertainty of the optimized RF model were analyzed using Sobol sensitivity analysis and Monte Carlo (MC) simulation. It can be concluded that the optimized RF model could be used to predict the performance of the pile, and it may provide a useful reference for solving similar problems under comparable engineering conditions.
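The hybrid described here tunes an RF with metaheuristics; the minimal sketch below shows the generic PSO-over-hyperparameters pattern for a RandomForestRegressor on synthetic data. The two-dimensional search space, bounds, data, and scoring are assumptions, not the study's actual configuration, and the BO counterpart is omitted.

```python
# Illustrative PSO search over two RandomForestRegressor hyperparameters
# (n_estimators, max_depth), scored by cross-validated RMSE on synthetic data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=17, noise=10.0, random_state=0)
lo, hi = np.array([10.0, 2.0]), np.array([150.0, 20.0])   # bounds: trees, depth

def neg_rmse(p):
    model = RandomForestRegressor(n_estimators=int(p[0]), max_depth=int(p[1]), random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="neg_root_mean_squared_error").mean()

rng = np.random.default_rng(0)
n, iters = 8, 10
x = rng.uniform(lo, hi, (n, 2)); v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([neg_rmse(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([neg_rmse(p) for p in x])
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("best (n_estimators, max_depth):", gbest.astype(int), "CV RMSE:", round(-pbest_f.max(), 3))
```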
The influence maximization problem aims to select a small set of influential nodes, termed a seed set, to maximize their influence coverage in social networks. Although the methods that are based on a greedy strategy can obtain good accuracy, they come at the cost of enormous computational time, and are therefore not applicable to practical scenarios in large-scale networks. In addition, the centrality heuristic algorithms that are based on network topology can be completed in relatively less time. However, they tend to fail to achieve satisfactory results because of drawbacks such as overlapped influence spread. In this work, we propose a discrete two-stage metaheuristic optimization combining quantum-behaved particle swarm optimization with Lévy flight to identify a set of the most influential spreaders. According to the framework, first, the particles in the population are tasked to conduct an exploration in the global solution space to eventually converge to an acceptable solution through the crossover and replacement operations. Second, the Lévy flight mechanism is used to perform a wandering walk on the optimal candidate solution in the population to exploit the potentially unidentified influential nodes in the network. Experiments on six real-world social networks show that the proposed algorithm achieves more satisfactory results when compared to other well-known algorithms.
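The Lévy-flight exploitation step mentioned here is commonly generated with Mantegna's algorithm; the sketch below shows that generator and its use to perturb a continuous candidate vector. The mapping from such continuous steps to a discrete seed set is paper-specific and omitted, and the scale and stability index are assumptions.

```python
# Lévy-flight step via Mantegna's algorithm (a common construction), used to
# perturb a candidate solution. Beta, scale, and the continuous encoding are
# illustrative; the paper's discrete seed-set mapping is not reproduced.
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=np.random.default_rng(0)):
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)          # heavy-tailed: mostly small, occasionally large

rng = np.random.default_rng(0)
candidate = rng.random(10)                       # stand-in for the best candidate solution
wanderer = candidate + 0.01 * levy_step(10, rng=rng)
print(np.round(wanderer - candidate, 4))         # occasional long jumps help escape local optima
```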
Drone logistics is a novel method of distribution that will become prevalent. The advantageous location of the logistics hub enables quicker customer deliveries and lower fuel consumption, resulting in cost savings for the company's transportation operations. Logistics firms must discern the ideal location for establishing a logistics hub, which is challenging due to the simplicity of existing models and the intricacy of delivery factors. To simulate the drone logistics environment, this study presents a new mathematical model. The model not only retains the aspects of the current models, but also considers the degree of transportation difficulty from the logistics hub to the village, the transportation capacity of drones, and the distribution of logistics hub locations. Moreover, this paper proposes an improved particle swarm optimization (PSO) algorithm, a diversity-based hybrid PSO (DHPSO) algorithm, to solve this model. In DHPSO, the Gaussian random walk enhances global search in the model space, while the bubble-net attacking strategy speeds convergence. Besides, an Archimedes spiral strategy is employed to overcome the local optima trap in the model and improve the exploitation of the algorithm. DHPSO maintains a balance between exploration and exploitation while better defining the distribution of logistics hub locations. Numerical experiments show that the newly proposed model always achieves better locations than the current model. Comparing DHPSO with other state-of-the-art intelligent algorithms, the efficiency of the scheme can be improved by 42.58%. This means that logistics companies can reduce distribution costs and consumers can enjoy a more pleasant shopping experience by using DHPSO's location selection. All the results show that the location of the drone logistics hub is solved effectively by DHPSO.
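To illustrate the kind of continuous location model a PSO variant like DHPSO would search, the toy sketch below encodes a hub as (x, y) coordinates and scores it by demand-weighted distance multiplied by a per-village transport-difficulty factor. The data, the difficulty model, and the capacity handling are made-up assumptions, and DHPSO's Gaussian-walk, bubble-net, and Archimedes-spiral operators are omitted.

```python
# Toy hub-location fitness: demand-weighted distance scaled by a transport-
# difficulty factor per village, minimized with a plain PSO. All data are
# fabricated for illustration only.
import numpy as np

rng = np.random.default_rng(3)
villages = rng.uniform(0, 100, (30, 2))       # village coordinates
demand = rng.integers(1, 20, 30)              # parcels per village
difficulty = rng.uniform(1.0, 2.0, 30)        # terrain/transport difficulty factor

def hub_cost(hub):
    dist = np.linalg.norm(villages - hub, axis=1)
    return float(np.sum(demand * difficulty * dist))

n, iters = 20, 100
x = rng.uniform(0, 100, (n, 2)); v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([hub_cost(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 100)
    f = np.array([hub_cost(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
print("hub location:", np.round(gbest, 2), "weighted cost:", round(pbest_f.min(), 1))
```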
The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results. In this study, metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province, China, by using machine learning algorithms. In total, 133 historical debris flow records and 16 related factors were selected. The support vector machine (SVM) was first used as the base classifier, and then a hybrid model was introduced by a two-step process. First, the particle swarm optimization (PSO) algorithm was employed to select the SVM model hyperparameters. Second, two feature selection algorithms, namely principal component analysis (PCA) and PSO, were integrated into the PSO-based SVM model, which generated the PCA-PSO-SVM and FS-PSO-SVM models, respectively. Three statistical metrics (accuracy, recall, and specificity) and the area under the receiver operating characteristic curve (AUC) were employed to evaluate and validate the performance of the models. The results indicated that the feature selection-based models exhibited the best performance, followed by the PSO-based SVM and SVM models. Moreover, the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model, showing the highest AUC, accuracy, recall, and specificity values in both the training and testing processes. It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results. Moreover, the PSO algorithm was found to be not only an effective tool for hyperparameter optimization, but also a useful feature selection algorithm to improve prediction accuracies of debris flow susceptibility by using machine learning algorithms. The high and very high debris flow susceptibility zone appropriately covers 38.01% of the study area, where debris flow may occur under intensive human activities and heavy rainfall events.
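The validation metrics named here (accuracy, recall, specificity, and AUC) can be computed as in the short sketch below for any fitted classifier; the SVM settings, data, and split are placeholders, and the PSO hyperparameter search and feature-selection stages are not shown.

```python
# Computing the four validation metrics named in the abstract (accuracy,
# recall, specificity, AUC) for a fitted SVM on synthetic data. Data, model
# settings, and split are placeholders, not the study's.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, confusion_matrix, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=16, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(probability=True).fit(Xtr, ytr)        # hyperparameters would come from PSO
pred = clf.predict(Xte)
proba = clf.predict_proba(Xte)[:, 1]

tn, fp, fn, tp = confusion_matrix(yte, pred).ravel()
print("accuracy   :", round(accuracy_score(yte, pred), 3))
print("recall     :", round(recall_score(yte, pred), 3))
print("specificity:", round(tn / (tn + fp), 3))
print("AUC        :", round(roc_auc_score(yte, proba), 3))
```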
Conventional machine learning (CML) methods have been successfully applied for gas reservoir prediction. Their prediction accuracy largely depends on the quality of the sample data; therefore, feature optimization of the input samples is particularly important. Commonly used feature optimization methods increase the interpretability of gas reservoirs; however, their steps are cumbersome, and the selected features cannot sufficiently guide CML models to mine the intrinsic features of sample data efficiently. In contrast to CML methods, deep learning (DL) methods can directly extract the important features of targets from raw data. Therefore, this study proposes a feature optimization and gas-bearing prediction method based on a hybrid fusion model that combines a convolutional neural network (CNN) and an adaptive particle swarm optimization-least squares support vector machine (APSO-LSSVM). This model adopts an end-to-end algorithm structure to directly extract features from sensitive multicomponent seismic attributes, considerably simplifying the feature optimization. A CNN was used for feature optimization to highlight sensitive gas reservoir information. APSO-LSSVM was used to fully learn the relationship between the features extracted by the CNN to obtain the prediction results. The constructed hybrid fusion model improves gas-bearing prediction accuracy through the two processes of feature optimization and intelligent prediction, giving full play to the advantages of DL and CML methods. The prediction results obtained are better than those of a single CNN model or APSO-LSSVM model. In the feature optimization process of multicomponent seismic attribute data, the CNN demonstrated better gas reservoir feature extraction capabilities than commonly used attribute optimization methods. In the prediction process, the APSO-LSSVM model can learn the gas reservoir characteristics better than the LSSVM model and has a higher prediction accuracy. The constructed CNN-APSO-LSSVM model had lower errors and a better fit on the test dataset than the other individual models. This method proves the effectiveness of DL technology for the feature extraction of gas reservoirs and provides a feasible way to combine DL and CML technologies to predict gas reservoirs.
In the process of identifying parameters for a permanent magnet synchronous motor, the particle swarm optimization method is prone to being stuck in local optima in the later stages of iteration, resulting in low parameter accuracy. This work proposes a fuzzy particle swarm optimization approach based on the transformation function and the filled function. This approach addresses the topic of particle swarm optimization in parameter identification from two perspectives. Firstly, the algorithm uses a transformation function to change the form of the fitness function without changing the position of the extreme point of the fitness function, making the extreme point of the fitness function more prominent and improving the algorithm's search ability while reducing the algorithm's computational burden. Secondly, on the basis of the multi-loop fuzzy control system based on multiple membership functions, it is merged with the filled function to improve the algorithm's capacity to skip out of the local optimal solution. This approach can be used to identify the parameters of permanent magnet synchronous motors by sampling only the stator current, voltage, and speed data. The simulation results show that the method can effectively identify the electrical parameters of a permanent magnet synchronous motor, and it has superior global convergence performance and robustness.
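The key property claimed for the transformation function, changing the shape of the fitness landscape without moving its extreme point, holds for any strictly increasing mapping. The toy check below uses an arbitrarily chosen mapping to show that the minimizer's location is unchanged while contrast near the minimum increases; it is only a generic illustration, not the paper's specific transformation or filled function.

```python
# A strictly increasing transformation g changes the shape of a fitness
# surface but not the location of its minimum. The particular g below is an
# arbitrary example, not the paper's transformation or filled function.
import numpy as np

def f(x):                       # original fitness (stand-in for the identification error)
    return (x - 1.3) ** 2 + 0.5 * np.sin(5 * x)

def g(val):                     # strictly increasing; amplifies differences near the minimum
    return 1.0 - np.exp(-3.0 * val)

xs = np.linspace(-2, 4, 200001)
i_orig = np.argmin(f(xs))
i_trans = np.argmin(g(f(xs)))
print("argmin of f   :", round(xs[i_orig], 4))
print("argmin of g(f):", round(xs[i_trans], 4))   # identical: extremum position preserved
```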
Task scheduling plays a key role in effectively managing and allocating computing resources to meet various computing tasks in a cloud computing environment. Short execution time and low load imbalance may be the challenges for some algorithms in resource scheduling scenarios. In this work, the Hierarchical Particle Swarm Optimization-Evolutionary Artificial Bee Colony Algorithm (HPSO-EABC) has been proposed, which hybridizes our presented Evolutionary Artificial Bee Colony (EABC) and Hierarchical Particle Swarm Optimization (HPSO) algorithms. The HPSO-EABC algorithm incorporates the advantages of both the HPSO and the EABC algorithm. Comprehensive testing, including evaluations of algorithm convergence speed, resource execution time, load balancing, and operational costs, has been done. The results indicate that the EABC algorithm exhibits greater parallelism compared to the Artificial Bee Colony algorithm. Compared with the Particle Swarm Optimization algorithm, the HPSO algorithm not only improves the global search capability but also effectively mitigates getting stuck in local optima. As a result, the hybrid HPSO-EABC algorithm demonstrates significant improvements in terms of stability and convergence speed. Moreover, it exhibits enhanced resource scheduling performance in both homogeneous and heterogeneous environments, effectively reducing execution time and cost, which is also verified by the ablation experiments.
Numerous wireless networks have emerged that can be used for short communication ranges where infrastructure-based networks may fail because of their installation and cost. One of them is a sensor network with embedded sensors working as the primary nodes, termed Wireless Sensor Networks (WSNs), in which numerous sensors are connected to at least one Base Station (BS). These sensors gather information from the environment and transmit it to a BS or gathering location. WSNs have several challenges, including throughput, energy usage, and network lifetime concerns. Different strategies have been applied to overcome these restrictions. Clustering may, therefore, be thought of as the best way to solve such issues. Consequently, it is crucial to analyze effective Cluster Head (CH) selection to maximize efficiency and throughput, extend the network lifetime, and minimize energy consumption. This paper proposes an Accelerated Particle Swarm Optimization (APSO) algorithm based on the Low Energy Adaptive Clustering Hierarchy (LEACH), Neighboring Based Energy Efficient Routing (NBEER), Cooperative Energy Efficient Routing (CEER), and Cooperative Relay Neighboring Based Energy Efficient Routing (CR-NBEER) techniques. The application of APSO to the WSN implementation constitutes the main methodology of this article. The simulation findings in this study demonstrated that the suggested approach uses less energy, with respective energy consumption ranges of 0.1441 to 0.013 for 5 CHs, 1.003 to 0.0521 for 10 CHs, and 0.1734 to 0.0911 for 15 CHs. The ratio of sent packets was also raised for all three CH selection scenarios, increasing from 659 to 1730. The number of dead nodes likewise dropped for the given combination, falling between 71 and 66. The network lifetime was deemed to have risen based on the results found. A hybrid with a few valuable parameters can further improve the suggested APSO-based protocol, and the proposed protocol can similarly be used in underwater WSNs. The overall results have been evaluated and compared with existing sensor network approaches.
The escalating deployment of distributed power sources and random loads in DC distribution networks has amplified the potential consequences of faults if left uncontrolled. To expedite the process of achieving an optimal configuration of measurement points, this paper presents an optimal configuration scheme for fault location measurement points in DC distribution networks based on an improved particle swarm optimization algorithm. Initially, a measurement point distribution optimization model is formulated, leveraging compressive sensing. The model aims to achieve the minimum number of measurement points while attaining the best compressive sensing reconstruction effect. It incorporates constraints from the compressive sensing algorithm and network-wide observability. Subsequently, the traditional particle swarm algorithm is enhanced by utilizing the Halton sequence for population initialization, generating uniformly distributed individuals. This enhancement reduces individual search blindness and overlap probability, thereby promoting population diversity. Furthermore, an adaptive t-distribution perturbation strategy is introduced during the particle update process to enhance the global search capability and search speed. The established model for the optimal configuration of measurement points is solved, and the results demonstrate the efficacy and practicality of the proposed method. The optimal configuration reduces the number of measurement points, enhances localization accuracy, and improves the convergence speed of the algorithm. These findings validate the effectiveness and utility of the proposed approach.
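Two of the enhancements named here can be sketched generically: quasi-random Halton initialization (via scipy.stats.qmc) for more uniform swarm coverage, and a t-distribution perturbation of the global best whose degrees of freedom grow with the iteration count, so early perturbations are heavy-tailed and later ones approach Gaussian. The objective and all constants below are assumptions; the compressive-sensing measurement-point model itself is omitted.

```python
# Halton-sequence swarm initialization plus an adaptive t-distribution
# perturbation of the global best. The objective and constants are
# placeholders; the compressive-sensing measurement-point model is omitted.
import numpy as np
from scipy.stats import qmc

def objective(p):                                # stand-in for the reconstruction-error objective
    return float(np.sum((p - 0.3) ** 2))

dim, n, iters = 8, 25, 100
rng = np.random.default_rng(0)
x = qmc.Halton(d=dim, seed=0).random(n)          # low-discrepancy initial positions in [0,1)^dim
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for t in range(1, iters + 1):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 1.0)
    # adaptive t-distribution perturbation: heavy tails early (small df), near-Gaussian late
    trial = np.clip(gbest + 0.1 * rng.standard_t(df=t, size=dim), 0.0, 1.0)
    if objective(trial) < objective(gbest):
        gbest = trial
    f = np.array([objective(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    if pbest_f.min() < objective(gbest):
        gbest = pbest[pbest_f.argmin()].copy()

print("best point:", np.round(gbest, 3), "objective:", round(objective(gbest), 5))
```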
In many Eastern and Western countries, falling birth rates have led to the gradual aging of society. Older adults are often left alone at home or live in a long-term care center, which results in them being susceptible to unsafe events (such as falls) that can have disastrous consequences. However, automatically detecting falls from video data is challenging, and automatic fall detection methods usually require large volumes of training data, which can be difficult to acquire. To address this problem, video kinematic data can be used as training data, thereby avoiding the requirement of creating a large fall data set. This study integrated an improved particle swarm optimization method into a double interactively recurrent fuzzy cerebellar model articulation controller model to develop a cost-effective and accurate fall detection system. First, it obtained an optical flow (OF) trajectory diagram from image sequences by using the OF method, and it solved problems related to focal length and object offset by employing the discrete Fourier transform (DFT) algorithm. Second, this study developed the D-IRFCMAC (Double-Interactively Recurrent Fuzzy Cerebellar Model Articulation Controller) model, which combines spatial and temporal (recurrent) information. Third, it designed an IPSO (Improved Particle Swarm Optimization) algorithm that effectively strengthens the exploratory capabilities of the proposed D-IRFCMAC model in the global search space. The proposed approach outperforms existing state-of-the-art methods in terms of action recognition accuracy on the UR-Fall, UP-Fall, and PRECIS HAR data sets. The UCF11 dataset had an average accuracy of 93.13%, whereas the UCF101 dataset had an average accuracy of 92.19%. The UR-Fall dataset had an accuracy of 100%, the UP-Fall dataset had an accuracy of 99.25%, and the PRECIS HAR dataset had an accuracy of 99.07%.
This paper introduces a novel variant of particle swarm optimization that leverages local displacements through attractors for addressing multiobjective optimization problems. The method incorporates a square root distance mechanism into the external archives to enhance the diversity. We evaluate the performance of the proposed approach on a set of constrained and unconstrained multiobjective test functions, establishing a benchmark for comparison. In order to gauge its effectiveness relative to established techniques, we conduct a comprehensive comparison with well-known approaches such as SMPSO, NSGA2 and SPEA2. The numerical results demonstrate that our method not only achieves efficiency but also exhibits competitiveness when compared to evolutionary algorithms. Particularly noteworthy is its superior performance in terms of convergence and diversification, surpassing the capabilities of its predecessors.
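A generic sketch of the external-archive bookkeeping that multiobjective PSO variants rely on: new solutions enter the archive only if they are not Pareto-dominated, and dominated members are pruned. The square-root-distance diversity mechanism of the proposed method is paper-specific, so a plain nearest-neighbor crowding distance is used below purely as a placeholder.

```python
# Generic external-archive update for multiobjective PSO: keep only mutually
# non-dominated solutions and, when the archive overflows, drop the most
# crowded member (nearest-neighbor distance stands in for the paper's
# square-root-distance mechanism).
import numpy as np

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)       # minimization of all objectives

def update_archive(archive, new_obj, capacity=50):
    if any(dominates(a, new_obj) for a in archive):
        return archive                             # new point is dominated: reject
    archive = [a for a in archive if not dominates(new_obj, a)] + [np.asarray(new_obj)]
    if len(archive) > capacity:                    # prune the most crowded member
        pts = np.array(archive)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        np.fill_diagonal(d, np.inf)
        archive.pop(int(d.min(axis=1).argmin()))
    return archive

# Example on a toy biobjective front f(x) = (x^2, (x - 2)^2):
archive = []
for x in np.linspace(-1.0, 3.0, 200):
    archive = update_archive(archive, np.array([x ** 2, (x - 2) ** 2]), capacity=20)
print(len(archive), "non-dominated points kept")
```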
Analyzing big data, especially medical data, helps to provide good health care to patients and face the risks of death. The COVID-19 pandemic has had a significant impact on public health worldwide, emphasizing the need for effective risk prediction models. Machine learning (ML) techniques have shown promise in analyzing complex data patterns and predicting disease outcomes. The accuracy of these techniques is greatly affected by changing their parameters. Hyperparameter optimization plays a crucial role in improving model performance. In this work, the Particle Swarm Optimization (PSO) algorithm was used to effectively search the hyperparameter space and improve the predictive power of the machine learning models by identifying the optimal hyperparameters that can provide the highest accuracy. A dataset with a variety of clinical and epidemiological characteristics linked to COVID-19 cases was used in this study. Various machine learning models, including Random Forests, Decision Trees, Support Vector Machines, and Neural Networks, were utilized to capture the complex relationships present in the data. To evaluate the predictive performance of the models, the accuracy metric was employed. The experimental findings showed that the suggested method of estimating COVID-19 risk is effective. When compared to baseline models, the optimized machine learning models performed better and produced better results.
At present, Bayesian Networks (BN) are being used widely for demonstrating uncertain knowledge in many disciplines, including biology, computer science, risk analysis, service quality analysis, and business. But they suffer from the problem that when the nodes and edges increase, the structure learning difficulty increases and algorithms become inefficient. To solve this problem, heuristic optimization algorithms are used, which tend to find a near-optimal answer rather than an exact one, with particle swarm optimization (PSO) being one of them. PSO is a swarm intelligence-based algorithm taking its basic inspiration from flocks of birds (how they search for food). PSO is employed widely because it is easier to code, converges quickly, and can be parallelized easily. We use a recently proposed version of PSO called generalized particle swarm optimization (GEPSO) to learn Bayesian network structure. We construct an initial directed acyclic graph (DAG) by using the max-min parents and children (MMPC) algorithm and cross relative average entropy. This DAG is used to create a population for the GEPSO optimization procedure. Moreover, we propose a velocity update procedure to increase the efficiency of the algorithmic search process. Results of the experiments show that as the complexity of the dataset increases, our algorithm, Bayesian network generalized particle swarm optimization (BN-GEPSO), outperforms the PSO algorithm in terms of the Bayesian information criterion (BIC) score.
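The BIC score used to compare candidate structures decomposes over nodes; the sketch below computes the family score BIC(X | Pa) = LL - (log N / 2) * (number of free parameters) for one discrete node from a data matrix. The synthetic binary data and variable cardinalities are illustrative; the MMPC construction and the GEPSO search itself are not shown.

```python
# Family BIC score of one discrete node given a candidate parent set:
# BIC = log-likelihood - (log N / 2) * (#free parameters). The synthetic
# binary data are illustrative; MMPC and the GEPSO search are not shown.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
N = 1000
data = rng.integers(0, 2, size=(N, 4))           # columns: X0, X1, X2, X3 (binary)

def family_bic(data, child, parents, cardinality=2):
    counts = Counter()                            # counts of (parent configuration, child value)
    parent_counts = Counter()
    for row in data:
        cfg = tuple(row[p] for p in parents)
        counts[(cfg, row[child])] += 1
        parent_counts[cfg] += 1
    ll = sum(n * np.log(n / parent_counts[cfg]) for (cfg, _), n in counts.items())
    n_params = (cardinality - 1) * (cardinality ** len(parents))
    return ll - 0.5 * np.log(len(data)) * n_params

# Compare two candidate parent sets for node X3:
print("BIC(X3 | {X0})    :", round(family_bic(data, 3, [0]), 1))
print("BIC(X3 | {X0, X1}):", round(family_bic(data, 3, [0, 1]), 1))
```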
Background: To solve the cluster analysis problem better, we propose a new method based on the chaotic particle swarm optimization (CPSO) algorithm. Methods: In order to enhance the performance in clustering, we propose a novel method based on CPSO. We first evaluate the clustering performance of this model using the variance ratio criterion (VRC) as the evaluation metric. The effectiveness of the CPSO algorithm is compared with that of the traditional particle swarm optimization (PSO) algorithm. The CPSO aims to improve the VRC value while avoiding local optimal solutions. The simulated dataset is set at three levels of overlapping: non-overlapping, partial overlapping, and severe overlapping. Finally, we compare CPSO with two other methods. Results: By observing the comparative results, our proposed CPSO method performs outstandingly. In the conditions of non-overlapping, partial overlapping, and severe overlapping, our method has the best VRC values of 1683.2, 620.5, and 275.6, respectively. The mean VRC values in these three cases are 1683.2, 617.8, and 222.6. Conclusion: The CPSO performed better than other methods for cluster analysis problems. CPSO is effective for cluster analysis.
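The variance ratio criterion used as the clustering objective here is the Calinski-Harabasz index; the sketch below encodes k centroids as one particle and lets a PSO, with a logistic chaotic map standing in for the chaotic component, maximize that index on synthetic blobs. The data, the choice of k, and all parameters are assumptions, not the paper's setup.

```python
# Clustering by maximizing the variance ratio criterion (Calinski-Harabasz):
# each particle encodes k centroids, and a logistic map supplies the "chaotic"
# coefficients. Data, k, and all constants are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics import calinski_harabasz_score

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
k, dim = 3, X.shape[1]
lo, hi = X.min(axis=0), X.max(axis=0)

def vrc(flat):
    centers = flat.reshape(k, dim)
    labels = np.argmin(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
    if len(np.unique(labels)) < 2:
        return 0.0
    return calinski_harabasz_score(X, labels)

rng = np.random.default_rng(0)
n, iters = 20, 60
x = rng.uniform(np.tile(lo, k), np.tile(hi, k), (n, k * dim))
v = np.zeros_like(x)
chaos = rng.uniform(0.1, 0.9, n)
pbest, pbest_f = x.copy(), np.array([vrc(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()
for _ in range(iters):
    chaos = 4.0 * chaos * (1.0 - chaos)                    # logistic chaotic map
    r1, r2 = chaos[:, None], (1.0 - chaos)[:, None]
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, np.tile(lo, k), np.tile(hi, k))
    f = np.array([vrc(p) for p in x])
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()
print("best VRC:", round(pbest_f.max(), 1))
```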