Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 10674016 and 10875013, and by the Specialized Research Foundation for the Doctoral Program of Higher Education under Grant No. 20080027005.
Abstract: We introduce the potential-decomposition strategy (PDS), which can be used in Markov chain Monte Carlo sampling algorithms. PDS can be designed to make particles move in a modified potential that favors diffusion in phase space; then, by rejecting some trial samples, the target distribution can be sampled in an unbiased manner. Furthermore, if the accepted trial samples are insufficient, they can be recycled as initial states to form more unbiased samples. This strategy can greatly improve efficiency when the original potential has multiple metastable states separated by large barriers. We apply PDS to the 2D Ising model and a double-well potential model with a large barrier, demonstrating in these two representative examples that convergence is accelerated by orders of magnitude.
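The abstract gives no implementation details; the following is only a minimal sketch of the general idea, in the spirit of delayed-acceptance Metropolis: trial moves are generated in a barrier-lowered potential, and an extra accept/reject step restores detailed balance with respect to the original potential, illustrated on a 1D double-well. The bias term, step size, and inverse temperature are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0                      # inverse temperature (illustrative)

def u(x):                       # original double-well potential with a large barrier
    return (x**2 - 1.0)**2 * 10.0

def v_bias(x):                  # bias subtracted from u() to flatten the barrier (illustrative)
    return 8.0 * np.exp(-4.0 * x**2)

def u_mod(x):                   # modified potential that favors diffusion
    return u(x) - v_bias(x)

def sample(n_steps=200_000, step=0.5):
    x, samples = -1.0, []
    for _ in range(n_steps):
        # Stage 1: ordinary Metropolis move with respect to the *modified* potential
        x_try = x + rng.uniform(-step, step)
        if np.log(rng.random()) < -beta * (u_mod(x_try) - u_mod(x)):
            # Stage 2: correction step; accepting with this probability restores
            # detailed balance with respect to the *original* potential u()
            du_corr = (u(x_try) - u_mod(x_try)) - (u(x) - u_mod(x))
            if np.log(rng.random()) < -beta * du_corr:
                x = x_try
        samples.append(x)
    return np.array(samples)

s = sample()
print("fraction of samples in the right-hand well:", np.mean(s > 0))
```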
Abstract: AIM: To detect blood withdrawal for patients with arterial blood pressure monitoring to increase patient safety and provide better sample dating. METHODS: Blood pressure information obtained from a patient monitor was fed as a real-time data stream to an experimental medical framework. This framework was connected to an analytical application which observes changes in systolic, diastolic and mean pressure to determine anomalies in the continuous data stream. Detection was based on an increased mean blood pressure caused by the closing of the withdrawal three-way tap and an absence of systolic and diastolic measurements during this manipulation. For evaluation of the proposed algorithm, measured data from animal studies in healthy pigs were used. RESULTS: Using this novel approach for processing real-time measurement data of arterial pressure monitoring, the exact time of blood withdrawal could be successfully detected retrospectively and in real-time. The algorithm was able to detect 422 of 434 (97%) blood withdrawals for blood gas analysis in the retrospective analysis of 7 study trials. Additionally, 64 sampling events for other procedures like laboratory and activated clotting time analyses were detected. The proposed algorithm achieved a sensitivity of 0.97, a precision of 0.96 and an F1 score of 0.97. CONCLUSION: Arterial blood pressure monitoring data can be used to perform an accurate identification of individual blood samplings in order to reduce sample mix-ups and thereby increase patient safety.
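The detection rule described above, flagging an interval in which systolic and diastolic values disappear while the mean pressure rises as the three-way tap is closed, can be sketched as a simple stream filter. The thresholds and window length below are illustrative placeholders, not values from the study.

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class BPSample:
    t: float                      # time in seconds
    systolic: Optional[float]     # None when no pulsatile waveform is detected
    diastolic: Optional[float]
    mean: float

def detect_withdrawals(stream: List[BPSample],
                       baseline_window: int = 30,
                       mean_rise_mmhg: float = 15.0,
                       min_duration_s: float = 5.0) -> List[float]:
    """Return start times of suspected blood withdrawals (illustrative thresholds)."""
    events, start, baseline, means = [], None, None, []
    for s in stream:
        if s.systolic is not None and s.diastolic is not None:
            # pulsatile signal present: update rolling baseline of mean pressure
            means.append(s.mean)
            means = means[-baseline_window:]
            baseline = sum(means) / len(means)
            # waveform has returned: close out a sufficiently long anomalous interval
            if start is not None and s.t - start >= min_duration_s:
                events.append(start)
            start = None
        else:
            # no systolic/diastolic readings: candidate manipulation of the three-way tap
            if baseline is not None and s.mean > baseline + mean_rise_mmhg:
                if start is None:
                    start = s.t
            else:
                start = None
    return events
```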
Abstract: We present a method to improve the execution time used to build the roadmap in probabilistic roadmap planners. Our method intelligently deactivates some of the configurations during the learning phase and allows the planner to concentrate on those configurations that are most likely going to be useful when building the roadmap. The method can be used with many of the existing sampling algorithms. We ran tests with four simulated robot problems typical in the robotics literature. The sampling methods applied were purely random sampling, Halton numbers, a Gaussian distribution, and the bridge test technique. In our tests, the deactivation method clearly improved the execution times. Compared with pure random selection, the deactivation method also significantly decreased the size of the roadmap, which is a useful property for simplifying roadmap planning tasks.
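As a rough illustration of deactivating configurations while the roadmap is learned, the sketch below builds a toy 2D probabilistic roadmap and skips (deactivates) sampled configurations that land next to an already well-connected node. The deactivation rule, obstacle layout, and parameters are hypothetical and much simpler than the method in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
OBSTACLES = [((0.5, 0.5), 0.2)]            # (center, radius) discs in the unit square

def collision_free(p):
    return all(np.linalg.norm(p - np.array(c)) > r for c, r in OBSTACLES)

def edge_free(p, q, steps=20):
    return all(collision_free(p + t * (q - p)) for t in np.linspace(0.0, 1.0, steps))

def build_roadmap(n_samples=500, connect_radius=0.15,
                  deactivate_radius=0.05, well_connected=4):
    nodes, degree, edges = [], [], []
    for _ in range(n_samples):
        p = rng.random(2)
        if not collision_free(p):
            continue
        # hypothetical deactivation rule: ignore samples that fall next to a node
        # that already has enough neighbours, so effort goes to sparse regions
        skip = any(np.linalg.norm(p - q) < deactivate_radius and degree[i] >= well_connected
                   for i, q in enumerate(nodes))
        if skip:
            continue
        nodes.append(p)
        degree.append(0)
        j = len(nodes) - 1
        for i, q in enumerate(nodes[:-1]):
            if np.linalg.norm(p - q) < connect_radius and edge_free(p, q):
                edges.append((i, j))
                degree[i] += 1
                degree[j] += 1
    return nodes, edges

nodes, edges = build_roadmap()
print(len(nodes), "nodes,", len(edges), "edges")
```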
Funding: Funded by the National Natural Science Foundation of China (42071014).
Abstract: Gobi spans a large area of China, surpassing the combined expanse of mobile dunes and semi-fixed dunes. Its presence significantly influences the movement of sand and dust. However, the complex origins and diverse materials constituting the Gobi result in notable differences in saltation processes across various Gobi surfaces. It is challenging to describe these processes according to a uniform morphology. Therefore, it becomes imperative to articulate surface characteristics through parameters such as the three-dimensional (3D) size and shape of gravel. Collecting morphology information for Gobi gravels is essential for studying its genesis and sand saltation. To enhance the efficiency and information yield of gravel parameter measurements, this study conducted field experiments in the Gobi region across Dunhuang City, Guazhou County, and Yumen City (administered by Jiuquan City), Gansu Province, China in March 2023. A research framework and methodology for measuring 3D parameters of gravel using point clouds were developed, alongside improved calculation formulas for 3D parameters including gravel grain size, volume, flatness, roundness, sphericity, and equivalent grain size. Leveraging multi-view geometry technology for 3D reconstruction allowed for establishing an optimal data acquisition scheme characterized by high point cloud reconstruction efficiency and clear quality. Additionally, the proposed methodology incorporated point cloud clustering, segmentation, and filtering techniques to isolate individual gravel point clouds. Advanced point cloud algorithms, including the Oriented Bounding Box (OBB), point cloud slicing method, and point cloud triangulation, were then deployed to calculate the 3D parameters of individual gravels. These systematic processes allow precise and detailed characterization of individual gravels. For gravel grain size and volume, the correlation coefficients between point cloud and manual measurements all exceeded 0.9000, confirming the feasibility of the proposed methodology for measuring 3D parameters of individual gravels. The proposed workflow yields accurate calculations of relevant parameters for Gobi gravels, providing essential data support for subsequent studies on Gobi environments.
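For a single segmented gravel point cloud, several of the 3D parameters named above can be approximated from a PCA-based oriented bounding box and the convex hull. The sketch below uses common textbook definitions of flatness, Krumbein intercept sphericity, and equivalent grain size, which may differ from the improved formulas derived in the paper.

```python
import numpy as np
from scipy.spatial import ConvexHull

def gravel_parameters(points: np.ndarray) -> dict:
    """points: (N, 3) array for one segmented gravel. Returns illustrative 3D parameters."""
    centered = points - points.mean(axis=0)
    # PCA gives the axes of an oriented bounding box aligned with the gravel
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    extents = centered @ vt.T
    a, b, c = sorted(extents.max(axis=0) - extents.min(axis=0), reverse=True)  # long >= mid >= short

    hull = ConvexHull(points)
    volume = hull.volume
    return {
        "grain_size_abc": (a, b, c),
        "volume": volume,
        "flatness": c / b,                              # one common definition
        "sphericity": (b * c / a**2) ** (1.0 / 3.0),    # Krumbein intercept sphericity
        "equivalent_grain_size": (6.0 * volume / np.pi) ** (1.0 / 3.0),
    }

# toy example: an ellipsoid-like point cloud standing in for a gravel
rng = np.random.default_rng(0)
u = rng.normal(size=(2000, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
cloud = u * np.array([3.0, 2.0, 1.0])      # semi-axes 3, 2, 1
print(gravel_parameters(cloud))
```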
Funding: This project is supported by the National Natural Science Foundation of China (No. 10572117), the Aerospace Science Foundation of China (No. N3CH0502, No. N5CH0001), and the Provincial Natural Science Foundation of Shanxi, China (No. N3CS0501).
Abstract: An efficient importance sampling algorithm is presented to analyze the reliability of a complex structural system with multiple failure modes and fuzzy-random uncertainties in the basic variables and failure modes. In order to improve the sampling efficiency, the simulated annealing algorithm is adopted to optimize the density center of the importance sampling function for each failure mode, so that points contributing more to the fuzzy failure probability are sampled with higher probability. For a system with multiple fuzzy failure modes, a weighted and mixed importance sampling function is constructed. The contribution of each fuzzy failure mode to the system failure probability is represented by appropriate weighting factors, which further improves the sampling efficiency. The variances and the coefficients of variation are derived for the failure probability estimates. Two examples are introduced to illustrate the rationality of the present method. Compared with the direct Monte Carlo method, the improved efficiency and precision of the method are verified by the examples.
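A weighted mixture of importance densities, one per failure mode and each centered near that mode's most likely failure point, can be written as h(x) = Σ_k w_k h_k(x); the failure probability is then estimated as the sample mean of I{failure}·f(x)/h(x). The sketch below shows this estimator for two crisp (non-fuzzy) limit-state functions, with hand-picked centers standing in for the simulated-annealing search used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# two limit-state functions; the system fails when either is <= 0
g1 = lambda x: 4.0 - x[:, 0] - x[:, 1]
g2 = lambda x: 4.5 - x[:, 0] + x[:, 1]

f = stats.multivariate_normal(mean=[0, 0], cov=np.eye(2))          # original density
centers = np.array([[2.0, 2.0], [2.25, -2.25]])                    # stand-ins for SA-optimized density centers
weights = np.array([0.5, 0.5])                                      # contribution factors of each mode

def sample_mixture(n):
    k = rng.choice(len(weights), size=n, p=weights)
    return centers[k] + rng.standard_normal((n, 2))

def mixture_pdf(x):
    comps = [stats.multivariate_normal(mean=c, cov=np.eye(2)).pdf(x) for c in centers]
    return np.sum(weights[:, None] * np.array(comps), axis=0)

n = 20_000
x = sample_mixture(n)
indicator = (np.minimum(g1(x), g2(x)) <= 0).astype(float)
w = f.pdf(x) / mixture_pdf(x)
est = np.mean(indicator * w)
cov_est = np.std(indicator * w, ddof=1) / (np.sqrt(n) * est)        # coefficient of variation of the estimate
print(f"failure probability ~ {est:.3e}, c.o.v. ~ {cov_est:.3f}")
```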
Funding: Innovation Team Development Program of the Ministry of Education of China (No. IRT0763) and National Natural Science Foundation of China (No. 50205028).
Abstract: A new optimization method for the stacking of composite glass fiber laminates is developed. The fiber orientation and angles of the layers of the cylindrical shells are sought considering the buckling load. The proposed optimization algorithm applies both finite element analysis and the mode-pursuing sampling (MPS) method. The algorithm suggests the optimal stacking sequence for achieving the maximal buckling load. The procedure is implemented by integrating ANSYS and MATLAB. The stacking sequence design for symmetric angle-ply three-layered and five-layered composite cylindrical shells is presented to illustrate the optimization process. Compared with genetic algorithms, the proposed optimization method is much faster and more efficient for composite stacking sequence planning.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 10871226) and the Natural Science Foundation of Zhejiang Province (Grant No. Y6100096).
Abstract: In the present paper, we provide an error bound for the learning rates of the regularized Shannon sampling learning scheme when the hypothesis space is a reproducing kernel Hilbert space (RKHS) derived from a Mercer kernel and a determined net. We show that if the sample is taken according to the determined set, then the sample error can be bounded by the Mercer matrix with respect to the samples and the determined net. The regularization error may be bounded by the approximation order of the reproducing kernel Hilbert space interpolation operator. The paper is an investigation of a remark made by Smale and Zhou.
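As a reminder of the standard setting behind such bounds (not the paper's exact statement), the regularized sampling scheme and the usual error decomposition can be written as follows, where K is the Mercer kernel, H_K the induced RKHS, {x_i} the determined net, and f_lambda the regularizing function.

```latex
f_{\lambda,\mathbf{z}}
  = \arg\min_{f\in\mathcal{H}_K}\;
    \frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^2+\lambda\|f\|_{K}^{2},
\qquad
\|f_{\lambda,\mathbf{z}}-f_\rho\|
  \;\le\;
  \underbrace{\|f_{\lambda,\mathbf{z}}-f_\lambda\|}_{\text{sample error}}
  \;+\;
  \underbrace{\|f_\lambda-f_\rho\|}_{\text{regularization error}}.
```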
Funding: Youth Science and Technology Fund Project of Gansu Province (No. 18JR3RA011), Major Projects in Gansu Province (No. 17ZD2GA010), Science and Technology Projects Funding of State Grid Corporation (No. 522727160001), and Science and Technology Projects of State Grid Gansu Electric Power Company (No. 52272716000K).
Abstract: To optimize peaking operation when a high proportion of new energy is connected to the power grid, evaluation indexes are proposed that simultaneously consider wind-solar complementation and source-load coupling. A typical wind-solar power output scenario model based on peaking demand, which has an anti-peaking characteristic, is established. This model uses balancing scenarios and key scenarios with probability distributions, based on an improved Latin hypercube sampling (LHS) algorithm and scenario reduction technology, to illustrate the influence of wind and solar power on peaking demand. On this basis, a peak shaving operation optimization model for power generation with a high proportion of new energy is established. The various operating indexes after optimization in multi-scenario peaking are calculated, and the power grid peaking operation capability is compared with that obtained when considering wind-solar complementation and source-load coupling. Finally, a case with a high proportion of new energy verifies the feasibility and validity of the proposed operation strategy.
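Basic Latin hypercube sampling, the building block behind the improved algorithm mentioned above, stratifies each input dimension into N equal-probability intervals, draws one value per interval, and permutes the strata independently per dimension. The marginal distributions below, used to map the strata to normalized wind and solar outputs, are illustrative assumptions only.

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples: int, n_dims: int, rng) -> np.ndarray:
    """Plain LHS: one uniform draw per stratum, independently permuted per dimension."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u                                     # values in (0, 1)

rng = np.random.default_rng(0)
u = latin_hypercube(n_samples=100, n_dims=2, rng=rng)

# map uniform strata to illustrative marginals via inverse CDFs
wind_pu = stats.beta(2, 5).ppf(u[:, 0])          # normalized wind output (assumed marginal)
solar_pu = stats.beta(4, 2).ppf(u[:, 1])         # normalized solar output (assumed marginal)
scenarios = np.column_stack([wind_pu, solar_pu])
print(scenarios[:5])
```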
Funding: Supported by the National Natural Science Foundation of China (81273184) and the National Natural Science Foundation of China Grant for Young Scientists (81302512).
Abstract: Sample size re-estimation is essential in oncology studies. However, the use of blinded sample size reassessment for survival data has rarely been reported. Based on the density function of the exponential distribution, an expectation-maximization (EM) algorithm for the hazard ratio was derived, and several simulation studies were used to verify its applications. The method showed obvious variation in the hazard ratio estimates and overestimation for relatively small hazard ratios. Our studies showed that the stability of the EM estimation results correlated directly with the sample size, the convergence of the EM algorithm was impacted by the initial values, and a balanced design produced the best estimates. No reliable blinded sample size re-estimation inference could be made in our studies, but the results provide useful information to steer practitioners in this field away from repeating the same endeavor.
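With blinded (pooled) exponential survival times from two equally sized arms, the treatment label can be treated as a latent variable and the two hazard rates recovered with a textbook two-component exponential-mixture EM; the hazard ratio is then the ratio of the fitted rates. The sketch below ignores censoring and fixes a 1:1 mixing proportion, which is a simplification relative to the paper.

```python
import numpy as np

def em_hazard_ratio(t, n_iter=200, lam_init=(0.5, 2.0)):
    """EM for pooled times t from a 50/50 mixture of Exp(lam0) and Exp(lam1); no censoring."""
    lam = np.array(lam_init, dtype=float)
    for _ in range(n_iter):
        # E-step: responsibility of arm 1 for each blinded observation
        d0 = 0.5 * lam[0] * np.exp(-lam[0] * t)
        d1 = 0.5 * lam[1] * np.exp(-lam[1] * t)
        r1 = d1 / (d0 + d1)
        # M-step: weighted exponential-rate MLEs
        lam[0] = np.sum(1.0 - r1) / np.sum((1.0 - r1) * t)
        lam[1] = np.sum(r1) / np.sum(r1 * t)
    return lam[1] / lam[0]          # estimated hazard ratio

rng = np.random.default_rng(0)
n_per_arm = 300
t = np.concatenate([rng.exponential(1.0, n_per_arm),         # control arm, rate 1.0
                    rng.exponential(1.0 / 1.5, n_per_arm)])   # treatment arm, rate 1.5 (true HR = 1.5)
print("estimated hazard ratio:", em_hazard_ratio(t))
```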
Funding: Supported by the National Science and Technology Major Project (J2017-Ⅱ-0009-0023, J2019-Ⅴ-0017-0112) and the Zhengzhou Aerotropolis Institute of Artificial Intelligence.
Abstract: Based on the sample entropy algorithm from nonlinear dynamics, an improved sample entropy method is proposed for identifying aerodynamic system instability and detecting stall precursors in an axial compressor, using a nonlinear feature extraction algorithm. The sample entropy algorithm is a refinement of the approximate entropy algorithm and quantifies the regularity and predictability of time-series data. Combined with the spatial modes that represent rotating stall in the circumferential direction, the sample entropy shows good recognition capability for detecting stall inception. The signatures of rotating waves are extracted by circumferential analysis of the modal wave energy. A significant rise in the amplitude of the spatial mode is a pronounced feature well before the imminence of stall. Processing the data with the spatial mode avoids the inaccurate identification that results from relying on the pressure at a single measuring point. Two kinds of sample entropy are obtained from different choices of the similarity tolerance. As the identification model develops, it shows obvious mutation phenomena at the boundary of instability, which reveal an inherent characteristic of the aerodynamic system. The dynamic difference quotient is then computed according to the difference quotient criterion, after smoothing with a discrete wavelet transform. A rapid increase of the difference quotient can be regarded as a significant indication that the system is approaching flow instability. It is shown that, based on the principle of the sample entropy algorithm, the nonlinear characteristics of rotating stall can be well described, and stall inception can be indicated about 12-68 revolutions before stall arrives. The prediction method accounts for the nonlinearity of the complex flow during stall and acts as a data-fusion scheme over pressure signals for spatial mode tracking.
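Sample entropy itself is compact enough to state in code: count pairs of length-m templates that match within tolerance r (excluding self-matches), count how many of those matches still agree at length m+1, and take the negative log of the ratio. The sketch below is a straightforward quadratic-time reference implementation with typical defaults (m = 2, r = 0.2·std), not the improved variant developed in the paper.

```python
import numpy as np

def sample_entropy(x: np.ndarray, m: int = 2, r_factor: float = 0.2) -> float:
    """SampEn(m, r) of a 1-D series x, with r = r_factor * std(x). O(N^2) reference version."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(mm: int) -> int:
        n_templates = n - m          # same number of templates for m and m+1 (standard convention)
        templates = np.array([x[i:i + mm] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            # Chebyshev distance between template i and all later templates (no self-matches)
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = count_matches(m)        # matching pairs of length m
    a = count_matches(m + 1)    # matching pairs of length m + 1
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))
noisy = rng.standard_normal(1000)
print("SampEn(sine)  =", sample_entropy(regular))
print("SampEn(noise) =", sample_entropy(noisy))
```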
Abstract: DNA barcodes, short and unique DNA sequences, play a crucial role in sample identification when processing many samples simultaneously, which helps reduce experimental costs. Nevertheless, the low quality of long-read sequencing makes it difficult to identify barcodes accurately, which poses significant challenges for the design of barcodes for large numbers of samples in a single sequencing run. Here, we present a comprehensive study of the generation of barcodes and develop a tool, PRO, that can be used for selecting optimal barcode sets and demultiplexing. We formulate the barcode design problem as a combinatorial problem and prove that finding the optimal largest barcode set in a given DNA sequence space in which all sequences have the same length is theoretically NP-complete. For practical applications, we developed the novel method PRO by introducing the probability divergence between two DNA sequences to expand the capacity of barcode kits while ensuring demultiplexing accuracy. Specifically, the maximum size of the barcode kits designed by PRO is 2,292, which keeps the length of barcodes the same as that of the official ones used by Oxford Nanopore Technologies (ONT). We validated the performance of PRO on a simulated nanopore dataset with high error rates. The demultiplexing accuracy of PRO reached 98.29% for a barcode kit of size 2,922, 4.31% higher than that of Guppy, the official demultiplexing tool. When the size of the barcode kit generated by PRO is the same as the official size provided by ONT, both tools show superior and comparable demultiplexing accuracy.
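Selecting a barcode set so that any two barcodes stay far apart under sequencing errors is what makes demultiplexing robust; the greedy filter below illustrates that idea with plain Levenshtein distance, which is a simpler selection criterion than the probability divergence introduced for PRO, and uses a toy barcode length rather than the 24 nt ONT barcodes.

```python
import random

def levenshtein(a: str, b: str) -> int:
    """Standard dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def greedy_barcode_set(candidates, min_dist: int):
    """Keep a candidate only if it stays at least min_dist away from every kept barcode."""
    kept = []
    for cand in candidates:
        if all(levenshtein(cand, k) >= min_dist for k in kept):
            kept.append(cand)
    return kept

random.seed(0)
length = 8                                             # toy length for the sketch
candidates = {''.join(random.choices("ACGT", k=length)) for _ in range(2000)}
barcodes = greedy_barcode_set(sorted(candidates), min_dist=4)
print(len(barcodes), "barcodes kept, e.g.", barcodes[:3])
```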
Funding: Supported by the National Key Research and Development Program under Grant 2017YFB0902900 and Grant 2017YFB0902902.
Abstract: Line parameters play an important role in the control and management of distribution systems. Currently, phasor measurement unit (PMU) systems and supervisory control and data acquisition (SCADA) systems coexist in distribution systems. Unfortunately, SCADA and PMU measurements usually do not match each other, resulting in inaccurate detection and identification of line parameters based on measurements. To solve this problem, a data-driven method is proposed. SCADA measurements are taken as samples and PMU measurements as the population. A probability parameter identification index (PPII) is derived to detect the whole line parameter based on the probability density function (PDF) parameters of the measurements. For parameter identification, a power-loss PDF with the PMU time stamps and a power-loss chronological PDF are derived via kernel density estimation (KDE) and a conditional PDF. Then, the power-loss samples with the PMU time stamps and chronological correlations are generated by the two PDFs of the power loss via the Metropolis-Hastings (MH) algorithm. Finally, using the power-loss samples and PMU current measurements, the line parameters are identified using the total least squares (TLS) algorithm. Hardware simulations demonstrate the effectiveness of the proposed method for distribution network line parameter detection and identification.
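Two of the steps named above, estimating a power-loss PDF by kernel density estimation and then drawing samples from it with the Metropolis-Hastings algorithm, can be illustrated generically. The synthetic power-loss data and the random-walk proposal width below are placeholders, not the paper's setup.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# stand-in for measured power-loss samples aligned with SCADA time stamps (kW)
observed_loss = np.concatenate([rng.normal(12.0, 1.5, 400), rng.normal(18.0, 2.0, 200)])

kde = gaussian_kde(observed_loss)            # kernel density estimate of the power-loss PDF

def metropolis_hastings(pdf, x0, n_samples, step=1.0, burn_in=500):
    """Random-walk MH targeting an (unnormalized) 1-D density given by `pdf`."""
    x, out = x0, []
    p_x = pdf(x)
    for i in range(n_samples + burn_in):
        x_try = x + rng.normal(0.0, step)
        p_try = pdf(x_try)
        if rng.random() < p_try / p_x:       # accept with probability min(1, p_try / p_x)
            x, p_x = x_try, p_try
        if i >= burn_in:
            out.append(x)
    return np.array(out)

samples = metropolis_hastings(lambda v: kde(v)[0],
                              x0=float(np.median(observed_loss)),
                              n_samples=5000)
print("sample mean / data mean:", samples.mean(), observed_loss.mean())
```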
Funding: Supported by Su Yan Yuan ("Development and industrialization of intelligent multi-degree-of-freedom arm based on perceptual fusion and collaborative control", Su Yan Yuan [2019] No. 107) and Shanghai DianJi University ("Research on flexible joint and adaptive control technology for new upper limb prosthesis", scientific research start-up fund project of Shanghai DianJi University; and "Research on robot intelligent grasping technology based on visual touch fusion in unstructured environment", Science and Technology [2020] No. 79 of Shanghai DianJi University).
Abstract: Bionic hands are promising devices for assisting individuals with hand disabilities in rehabilitation robotics. Controlled primarily by bioelectrical signals such as myoelectricity and EEG, these hands can compensate for lost hand functions. However, developing model-based controllers for bionic hands is challenging and time-consuming due to varying control parameters and unknown application environments. To address these challenges, we propose a model-free approach using reinforcement learning (RL) for designing bionic hand controllers. Our method involves mimicking real human hand motion with the bionic hand and employing a human hand motion decomposition technique to learn complex motions from simpler ones. This approach significantly reduces the training time required. By utilizing real human hand motion data, we design a multidimensional sampling proximal policy optimization (PPO) algorithm that enables efficient motion control of the bionic hand. To validate the effectiveness of our approach, we compare it against advanced baseline methods. The results demonstrate the quick learning capabilities and high control success rate of our method.
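The core of any PPO variant, including the multidimensional-sampling one described above, is the clipped surrogate objective: the probability ratio between the new and old policies is clipped to [1-eps, 1+eps] before being multiplied by the advantage. A minimal numpy expression of that loss is sketched below; the multidimensional sampling scheme and the hand-motion decomposition from the paper are not reproduced here.

```python
import numpy as np

def ppo_clip_loss(logp_new: np.ndarray, logp_old: np.ndarray,
                  advantages: np.ndarray, clip_eps: float = 0.2) -> float:
    """Clipped surrogate policy loss (to be minimized) over a batch of actions."""
    ratio = np.exp(logp_new - logp_old)                 # pi_new(a|s) / pi_old(a|s)
    clipped = np.clip(ratio, 1.0 - clip_eps, 1.0 + clip_eps)
    # pessimistic bound: take the smaller of the unclipped and clipped objectives
    surrogate = np.minimum(ratio * advantages, clipped * advantages)
    return -float(np.mean(surrogate))

# toy batch: log-probabilities under the old and updated policies plus advantages
rng = np.random.default_rng(0)
logp_old = rng.normal(-1.0, 0.3, size=64)
logp_new = logp_old + rng.normal(0.0, 0.1, size=64)
adv = rng.normal(0.0, 1.0, size=64)
print("PPO clip loss:", ppo_clip_loss(logp_new, logp_old, adv))
```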
Funding: This work was supported in part by the Natural Science Foundation of Jiangsu Province, China (No. BK20171433), and in part by the Science and Technology Project of State Grid Jiangsu Electric Power Corporation, China (No. J2018066).
Abstract: Urban electricity and heat networks (UEHN) consist of the coupling and interactions between electric power systems and district heating systems, in which the geographical and functional features of integrated energy systems are demonstrated. UEHN are expected to provide an effective way to accommodate intermittent and unpredictable renewable energy sources, for which the application of stochastic optimization approaches to UEHN analysis is highly desired. In this paper, we propose a chance-constrained coordinated optimization approach for UEHN considering the uncertainties in electricity loads, heat loads, and photovoltaic outputs, as well as the correlations between these uncertain sources. A solution strategy, which combines the Latin Hypercube Sampling Monte Carlo Simulation (LHSMCS) approach and a heuristic algorithm, is specifically designed to deal with the proposed chance-constrained coordinated optimization. Finally, test results on a UEHN comprised of a modified IEEE 33-bus system and a 32-node district heating system at Barry Island have verified the feasibility and effectiveness of the proposed framework.
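Handling the correlation between uncertain electricity load, heat load, and photovoltaic output is commonly done by sampling correlated uniforms through a Gaussian copula before applying the marginal inverse CDFs, and a chance constraint can then be checked empirically against the sampled scenarios. The correlation matrix, marginals, and the toy constraint below are illustrative assumptions, not the paper's data or model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5000

# assumed correlation structure between electric load, heat load, and PV output
corr = np.array([[1.0, 0.6, -0.3],
                 [0.6, 1.0, -0.2],
                 [-0.3, -0.2, 1.0]])
z = rng.standard_normal((n, 3)) @ np.linalg.cholesky(corr).T   # correlated standard normals
u = stats.norm.cdf(z)                                          # Gaussian copula -> correlated uniforms

elec_load = stats.norm(10.0, 1.0).ppf(u[:, 0])     # MW, illustrative marginal
heat_load = stats.norm(8.0, 1.2).ppf(u[:, 1])      # MW-th, illustrative marginal
pv_output = stats.beta(2, 3).ppf(u[:, 2]) * 4.0    # MW, illustrative marginal

# toy chance constraint: dispatched capacity must cover net electric load with probability >= 0.95
dispatch = 12.0
violation_rate = np.mean(elec_load - pv_output > dispatch)
print(f"empirical violation probability: {violation_rate:.3f} (target <= 0.05)")
```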