Funding: Supported by the National MCF Energy R&D Program of China (Nos. 2018YFE0301105, 2022YFE03010002 and 2018YFE0302100), the National Key R&D Program of China (Nos. 2022YFE03070004 and 2022YFE03070000), and the National Natural Science Foundation of China (Nos. 12205195, 12075155 and 11975277).
Abstract: An accurate plasma current profile is invaluable for steady-state plasma operation. In this study, plasma current tomography based on Bayesian inference is applied to the HL-2A device and used to reconstruct the plasma current profile. Two different Bayesian priors are compared, namely the Conditional Auto-Regressive (CAR) prior and the Advanced Squared Exponential (ASE) kernel prior. Compared with the CAR prior, the ASE kernel prior adopts nonstationary hyperparameters and introduces the current profile of a reference discharge into the hyperparameters, which makes the shape of the reconstructed current profile more flexible in space. The results indicate that the ASE prior couples more information, reduces the probability of unreasonable solutions, and achieves higher reconstruction accuracy.
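As a hedged illustration of the kernel-prior idea, the sketch below solves a toy linear-Gaussian tomography problem with a squared-exponential prior covariance; the response matrix, grid, and hyperparameters are invented and do not correspond to the HL-2A diagnostic geometry or the paper's ASE construction.

```python
import numpy as np

# Toy linear tomography: measurements d = G @ j + noise, with a squared-exponential
# (SE) Gaussian prior on the current density profile j. All geometry here is invented.
rng = np.random.default_rng(0)

n_grid, n_meas = 50, 12
x = np.linspace(0.0, 1.0, n_grid)                 # normalized radial grid
G = rng.normal(size=(n_meas, n_grid)) / n_grid    # stand-in response matrix (not a real probe geometry)

true_j = np.exp(-((x - 0.4) / 0.2) ** 2)          # synthetic "true" current profile
sigma_d = 0.01
d = G @ true_j + sigma_d * rng.normal(size=n_meas)

# Squared-exponential prior covariance K[i, k] = s^2 * exp(-(x_i - x_k)^2 / (2 l^2)).
s, ell = 1.0, 0.1
K = s**2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)

# Gaussian posterior for a linear-Gaussian model (zero prior mean):
#   cov = (G^T G / sigma^2 + K^{-1})^{-1},  mean = cov @ G^T d / sigma^2
K_inv = np.linalg.inv(K + 1e-8 * np.eye(n_grid))  # jitter for numerical stability
post_cov = np.linalg.inv(G.T @ G / sigma_d**2 + K_inv)
post_mean = post_cov @ (G.T @ d) / sigma_d**2

print("reconstruction RMS error:", np.sqrt(np.mean((post_mean - true_j) ** 2)))
```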
Funding: Supported by the Shanxi Provincial Foundation for Returned Overseas Scholars (No. 20220037), the Natural Science Foundation of Shanxi Province (No. 20210302123085), and the Discipline Construction Project of Yuncheng University.
Abstract: In this work, we perform a Bayesian inference of the crust-core transition density ρ_t of neutron stars based on neutron-star radius and neutron-skin thickness data using a thermodynamical method. Uniform and Gaussian distributions were adopted as priors for ρ_t in the Bayesian approach. When the uniform prior and the neutron-star radius data were used, ρ_t had a sizable probability of taking values above 0.1 fm^-3; this behavior was found to be controlled by the curvature K_sym of the nuclear symmetry energy and did not occur when K_sym was not extremely negative, namely K_sym > -200 MeV. The value of ρ_t obtained was 0.075 (+0.005/-0.01) fm^-3 at a confidence level of 68% when both the neutron-star radius and neutron-skin thickness data were considered. Strong anti-correlations were observed between ρ_t, the slope L, and the curvature of the nuclear symmetry energy. The dependence of the three L-K_sym correlations predicted in the literature on the crust-core transition density and pressure was quantitatively investigated. The most probable value of 0.08 fm^-3 for ρ_t was obtained from the L-K_sym relationship proposed by Holt et al., while larger values were preferred for the other two relationships.
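A minimal sketch of the prior-sensitivity step, assuming a made-up one-parameter likelihood: it contrasts a uniform and a Gaussian prior on ρ_t on a grid and reads off a 68% equal-tail credible interval. The numbers are illustrative and are not the paper's thermodynamical model.

```python
import numpy as np

# Grid-based 1-D Bayesian update contrasting a uniform and a Gaussian prior,
# with a 68% credible interval read off the posterior CDF.
rho = np.linspace(0.02, 0.14, 2001)            # candidate transition densities (fm^-3)
drho = rho[1] - rho[0]

def likelihood(r):
    # invented stand-in: data favoring rho_t near 0.075 fm^-3 with width 0.01 fm^-3
    return np.exp(-0.5 * ((r - 0.075) / 0.01) ** 2)

priors = {
    "uniform":  np.ones_like(rho),
    "gaussian": np.exp(-0.5 * ((rho - 0.08) / 0.02) ** 2),
}

for name, prior in priors.items():
    post = prior * likelihood(rho)
    post /= post.sum() * drho                  # normalize to a density on the grid
    cdf = np.cumsum(post) * drho
    lo, hi = np.interp([0.16, 0.84], cdf, rho) # equal-tail 68% credible interval
    mode = rho[np.argmax(post)]
    print(f"{name:8s} prior: mode = {mode:.3f} fm^-3, 68% interval = [{lo:.3f}, {hi:.3f}] fm^-3")
```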
Funding: The Young Potential Program of Shanghai Institute of Applied Physics, Chinese Academy of Sciences (No. E0553101).
Abstract: Knowledge graph technology has distinct advantages for fault diagnosis. In this study, the control rod drive mechanism (CRDM) of the liquid-fuel thorium molten salt reactor (TMSR-LF1) was taken as the research object, and a fault diagnosis system based on a knowledge graph was proposed. Subject-relation-object triples are defined based on CRDM unstructured data, including the design specification, operation and maintenance manual, alarm list, and other forms of expert experience. A fault event ontology model was constructed to label the entities and relationships involved in the corpus of CRDM fault events. A three-layer robustly optimized bidirectional encoder representation from transformers (RBT3) pre-training approach combined with a text convolutional neural network (TextCNN) was introduced to facilitate the application of the constructed CRDM fault diagnosis graph database for fault queries. The RBT3-TextCNN model, together with the Jieba tool, is proposed for extracting entities and recognizing the fault-query intent simultaneously. Experiments on a dataset collected from TMSR-LF1 CRDM fault diagnosis unstructured data demonstrate that this model has the potential to improve intent recognition and entity extraction. Additionally, a fault alarm monitoring module was developed based on the WebSocket protocol to automatically deliver detailed information about the fault that has occurred to the operator. Furthermore, the Bayesian inference method combined with the variable elimination algorithm was adopted to enable a relatively intelligent and reliable fault diagnosis system. Finally, a CRDM fault diagnosis Web interface integrated with graph data visualization was constructed, making the CRDM fault diagnosis process intuitive and effective.
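A minimal sketch of the Bayesian fault-query step: a two-node fault/alarm network whose hidden cause is inferred by direct enumeration (the trivial case of variable elimination). The fault states, alarms, and probabilities are hypothetical and are not taken from the TMSR-LF1 CRDM knowledge base.

```python
# Minimal fault-diagnosis Bayes net, P(fault) -> P(alarm | fault), queried by
# summing out the hidden cause. The numbers are invented for illustration only.
p_fault = {"stuck_rod": 0.02, "sensor_drift": 0.05, "none": 0.93}

# Conditional probability of observing a given alarm for each fault state.
p_alarm_given_fault = {
    "stuck_rod":    {"position_alarm": 0.90, "no_alarm": 0.10},
    "sensor_drift": {"position_alarm": 0.40, "no_alarm": 0.60},
    "none":         {"position_alarm": 0.01, "no_alarm": 0.99},
}

def posterior(alarm):
    """Return P(fault | alarm) for this single-parent network."""
    joint = {f: p_fault[f] * p_alarm_given_fault[f][alarm] for f in p_fault}
    z = sum(joint.values())                  # evidence P(alarm)
    return {f: p / z for f, p in joint.items()}

print(posterior("position_alarm"))
```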
Funding: Supported by the National Natural Science Foundation of China (No. 11922514).
Abstract: In these proceedings, some highlighted results on constraints of the nuclear matter equation of state (EOS) from data on nucleus resonances and neutron-skin thickness, obtained with the Bayesian approach based on the Skyrme-Hartree-Fock model and its extension, are presented. Typically, the anti-correlation and the positive correlation between the slope parameter and the value of the symmetry energy at saturation density, under the constraints of the neutron-skin thickness and the isovector giant dipole resonance respectively, are discussed. It is shown that the Bayesian analysis can help to find a compromise for the "PREX-II puzzle" and the "soft Tin puzzle". The possible modifications of the constraints on lower-order EOS parameters, as well as the relevant correlations when higher-order EOS parameters are incorporated as independent variables, are further illustrated. For a given model and parameter space, the Bayesian approach serves as a good analysis tool suitable for multiple messengers versus multiple variables and is helpful for quantitatively constraining the model parameters as well as their correlations.
Abstract: To improve the accuracy and speed of cycle-accurate power estimation, this paper uses multidimensional coefficients to build a Bayesian inference dynamic power model. By analyzing the power distribution and internal node states, we identify the deficiency of using only port information. We then define a gate-level-number computation method and the concept of a slice, and propose using slice analysis to extract switching density as coefficients for a specific circuit stage, which participate in Bayesian inference together with the port information. Experiments show that this method reduces the power-per-cycle estimation error by 21.9% and the root-mean-square error by 25.0% compared with the original model, while maintaining a speedup of more than 700 times over the existing gate-level power analysis technique.
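A hedged sketch of the regression step, assuming per-cycle power is a linear function of port toggles and slice switching densities: conjugate Bayesian linear regression with a Gaussian prior on the coefficients. The feature set and data are synthetic stand-ins, not the paper's gate-level model.

```python
import numpy as np

# Bayesian linear regression: per-cycle power modeled as a linear function of
# toggle/switching-density features, with a Gaussian prior on the coefficients.
rng = np.random.default_rng(1)

n_cycles, n_feat = 500, 6
X = rng.poisson(lam=8.0, size=(n_cycles, n_feat)).astype(float)   # toggle/switching counts
w_true = np.array([0.8, 0.5, 0.3, 1.2, 0.1, 0.6])
noise_sd = 2.0
y = X @ w_true + noise_sd * rng.normal(size=n_cycles)             # per-cycle power (arbitrary units)

# Conjugate update with prior w ~ N(0, alpha^-1 I) and known noise precision beta:
alpha, beta = 1e-2, 1.0 / noise_sd**2
S_inv = alpha * np.eye(n_feat) + beta * X.T @ X    # posterior precision
S = np.linalg.inv(S_inv)
m = beta * S @ X.T @ y                             # posterior mean of coefficients

x_new = rng.poisson(lam=8.0, size=n_feat).astype(float)
pred_mean = x_new @ m
pred_var = 1.0 / beta + x_new @ S @ x_new          # predictive variance
print("predicted power: %.2f +/- %.2f" % (pred_mean, np.sqrt(pred_var)))
```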
Funding: Cultivation Fund of the Key Scientific and Technical Innovation Project of the Ministry of Education of China (No. 3104001014).
Abstract: A nonparametric Bayesian method is presented to classify MPSK (M-ary phase shift keying) signals. The MPSK signals with unknown signal-to-noise ratios (SNRs) are modeled as a Gaussian mixture with unknown means and covariances in the constellation plane, and a clustering method is proposed to estimate the probability density of the MPSK signals. The method is based on nonparametric Bayesian inference, which introduces the Dirichlet process as the prior of the mixture coefficients and applies a normal-inverse-Wishart (NIW) distribution as the prior of the unknown means and covariances. According to the received signals, the parameters are then adjusted by a Markov chain Monte Carlo (MCMC) random sampling algorithm; through iteration, the density of the MPSK signals can be estimated. Simulation results show that the correct recognition ratio for 2/4/8PSK is greater than 95% when SNR > 5 dB and 1600 symbols are used.
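A rough stand-in for the nonparametric clustering idea: a Dirichlet-process Gaussian mixture fitted to a simulated noisy 8PSK constellation. Note that scikit-learn's BayesianGaussianMixture uses variational inference rather than the paper's MCMC sampler, so this only mimics the modeling assumptions (a DP prior over mixture weights and full covariances).

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Dirichlet-process Gaussian mixture on a noisy 8PSK constellation (variational
# inference, not MCMC). The SNR and symbol count roughly match the abstract's setup.
rng = np.random.default_rng(2)

M, n_symbols, snr_db = 8, 1600, 8.0
symbols = rng.integers(0, M, size=n_symbols)
phases = 2 * np.pi * symbols / M
clean = np.column_stack([np.cos(phases), np.sin(phases)])   # unit-energy constellation
noise_sd = np.sqrt(10 ** (-snr_db / 10) / 2)                # per-dimension noise std
X = clean + noise_sd * rng.normal(size=clean.shape)

dpgmm = BayesianGaussianMixture(
    n_components=16,                                 # truncation level, larger than expected M
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
)
dpgmm.fit(X)
n_active = int(np.sum(dpgmm.weights_ > 0.01))        # clusters with non-negligible weight
print("estimated constellation order:", n_active)    # ideally 8 for 8PSK
```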
Funding: Supported by the National Natural Science Foundation of China (No. 61601505), the Aeronautical Science Foundation of China (No. 20155196022), and the Shaanxi Natural Science Foundation of China (No. 2016JQ6050).
Abstract: To reach a higher level of autonomy for unmanned combat aerial vehicles (UCAVs) in air combat games, this paper builds an autonomous maneuver decision system. In this system, the air combat game is regarded as a Markov process, so that the air combat situation can be effectively assessed via Bayesian inference. According to the situation assessment result, the system adaptively adjusts the weights of the maneuver decision factors, which makes the objective function more reasonable and helps the UCAV maintain a superior situation. Because the air combat game is highly dynamic and involves significant uncertainty, fuzzy logic is used to build the functions of the four maneuver decision factors to enhance the robustness and effectiveness of the maneuver decision results. Accurate prediction of the opponent aircraft is also essential for making good decisions; therefore, a prediction model of the opponent aircraft is designed based on the elementary maneuver method. Finally, a moving-horizon optimization strategy is used to model the whole air combat maneuver decision process. Simulations on a typical scenario test and a close-in dogfight demonstrate the superiority of the designed maneuver decision method.
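A minimal sketch of Bayesian situation assessment on a Markov chain, assuming three invented situation states and an ad-hoc geometry-based observation likelihood; the real system's decision factors, fuzzy memberships, and weights are not modeled here.

```python
import numpy as np

# Recursive Bayesian situation assessment: a discrete Markov chain over three invented
# combat situations, filtered with a simple observation likelihood. Numbers are illustrative.
states = ["advantage", "neutral", "disadvantage"]

T = np.array([[0.80, 0.15, 0.05],   # P(next state | current state)
              [0.20, 0.60, 0.20],
              [0.05, 0.15, 0.80]])

def obs_likelihood(angle_off_deg, dist_km):
    """P(observation | state): favorable geometry supports 'advantage'."""
    adv = np.exp(-angle_off_deg / 30.0) * np.exp(-dist_km / 5.0)
    dis = (1.0 - np.exp(-angle_off_deg / 60.0)) * np.exp(-(10.0 - dist_km) / 10.0)
    neu = 0.5
    lik = np.array([adv, neu, dis])
    return lik / lik.sum()

belief = np.array([1 / 3, 1 / 3, 1 / 3])                 # uniform initial belief
observations = [(20.0, 3.0), (35.0, 4.0), (70.0, 6.0)]   # (angle-off deg, range km) per step

for angle, dist_km in observations:
    belief = T.T @ belief                                # Markov prediction step
    belief *= obs_likelihood(angle, dist_km)             # Bayesian update step
    belief /= belief.sum()
    print({s: round(p, 3) for s, p in zip(states, belief)})
```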
Funding: Army Scientific Research (Nos. KYSZJWJK1744, 012016012600B11403).
Abstract: To address the problem that consumption data for new ammunition are scarce and demand is difficult to predict, and drawing on the law of ammunition consumption under different damage grades, a Bayesian inference method for ammunition demand based on the Gompertz distribution is proposed. A Bayesian inference model based on the Gompertz distribution is constructed, and the system contribution degree is introduced to determine the weights of the multi-source information. In the case where the prior distribution is known and the distribution of the field data is unknown, a consistency test is performed on the prior information, with the consistency test problem transformed into a goodness-of-fit test problem. The Bayesian inference is then solved by the Markov chain Monte Carlo (MCMC) method, and the ammunition demand under different damage grades is obtained. An example verifies the accuracy of the method and shows that it addresses ammunition demand prediction when samples are insufficient.
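A hedged sketch of the sampling step: random-walk Metropolis-Hastings for the Gompertz shape parameter given simulated consumption data. The prior, step size, and data are illustrative; the paper's multi-source prior weighting and consistency test are not reproduced.

```python
import numpy as np
from scipy import stats

# Random-walk Metropolis-Hastings for the shape parameter of a Gompertz consumption model.
rng = np.random.default_rng(3)

true_c, scale = 1.5, 10.0
data = stats.gompertz.rvs(true_c, scale=scale, size=30, random_state=rng)  # toy consumption data

def log_post(c):
    if c <= 0:
        return -np.inf
    log_lik = stats.gompertz.logpdf(data, c, scale=scale).sum()
    log_prior = stats.lognorm.logpdf(c, s=1.0)          # weakly informative prior on c
    return log_lik + log_prior

n_iter, step = 5000, 0.2
c_cur, lp_cur = 1.0, log_post(1.0)
samples = []
for _ in range(n_iter):
    c_prop = c_cur + step * rng.normal()
    lp_prop = log_post(c_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:        # MH accept/reject
        c_cur, lp_cur = c_prop, lp_prop
    samples.append(c_cur)

samples = np.array(samples[1000:])                      # drop burn-in
print("posterior mean of shape c: %.2f (true %.2f)" % (samples.mean(), true_c))
```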
Funding: The National Natural Science Foundation of China (No. 61502528).
Abstract: Mining the penetration testing semantic knowledge hidden in vast amounts of raw penetration testing data is of vital importance for automated penetration testing. Association rule mining, a data mining technique, has been studied and explored for a long time; however, few studies have focused on knowledge discovery in the penetration testing area. Experimental results reveal that the long-tail distribution of penetration testing data undermines the effectiveness of association rule mining algorithms based on frequent patterns. To address this problem, a Bayesian-inference-based penetration semantic knowledge mining algorithm is proposed. First, a directed bipartite graph model, a kind of Bayesian network, is constructed to formalize penetration testing data. Then, maximum likelihood estimation is adopted to optimize the model parameters, and the large Bayesian network is decomposed into smaller networks based on the conditional independence of variables for improved solution efficiency. Finally, irrelevant-variable elimination is adopted to extract penetration semantic knowledge from the conditional probability distribution of the model. The experimental results show that the proposed method can discover penetration semantic knowledge from raw penetration testing data effectively and efficiently.
Funding: Supported by the Open Research Fund of the State Key Laboratory of Geomechanics and Geotechnical Engineering, Institute of Rock and Soil Mechanics, Chinese Academy of Sciences (Grant No. Z020006) and the National Natural Science Foundation of China (Grant Nos. U1765206 and 51874119).
Abstract: Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it usually does not consider uncertainty. This problem is addressed here by developing a system of Bayesian inference for updating the mechanical parameters and their statistical properties using monitored field data, integrating the monitored data, prior knowledge of the geotechnical parameters, and a mechanical model of the rock tunnel through Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and was then applied to an experimental tunnel at the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the parameters updated by Bayesian inference agreed closely with the monitored displacements, indicating that Bayesian inference incorporates the monitored data into the tunnel model to update its parameters dynamically. Further study indicated that the performance of the Bayesian inference improves greatly when field monitoring data are supplemented regularly. Bayesian inference is thus a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.
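A minimal back-analysis sketch under strong assumptions: a single rock-mass parameter (the shear modulus) is updated by Metropolis-Hastings from monitored wall displacements, using the elastic closed-form solution for a circular tunnel as the forward model. Values are invented and the real multi-parameter model is not reproduced.

```python
import numpy as np

# Back analysis sketch: update the rock-mass shear modulus G from monitored tunnel
# wall displacements, forward model u = p0 * r0 / (2 G) for a circular tunnel.
rng = np.random.default_rng(4)

p0, r0 = 10.0e6, 4.0                 # in-situ stress (Pa) and tunnel radius (m)
G_true = 2.0e9                       # "true" shear modulus (Pa)
obs_sd = 0.5e-3                      # 0.5 mm measurement noise
u_obs = p0 * r0 / (2 * G_true) + obs_sd * rng.normal(size=20)   # monitored displacements (m)

def log_post(logG):
    G = np.exp(logG)
    u_model = p0 * r0 / (2 * G)
    log_lik = -0.5 * np.sum((u_obs - u_model) ** 2) / obs_sd**2
    log_prior = -0.5 * ((logG - np.log(1.0e9)) / 1.0) ** 2       # lognormal prior on G
    return log_lik + log_prior

x, lp = np.log(1.0e9), log_post(np.log(1.0e9))
chain = []
for _ in range(20000):
    x_prop = x + 0.05 * rng.normal()
    lp_prop = log_post(x_prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # MH accept/reject
        x, lp = x_prop, lp_prop
    chain.append(x)

G_post = np.exp(np.array(chain[5000:]))            # drop burn-in
print("posterior mean G = %.2f GPa (true %.2f GPa)" % (G_post.mean() / 1e9, G_true / 1e9))
```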
Funding: This work was supported in part by the National Natural Science Foundation of China under Grants 62103167 and 61833007, and in part by the Natural Science Foundation of Jiangsu Province under Grant BK20210451.
Abstract: This paper focuses on the state estimation problem for nonlinear systems with unknown measurement noise statistics. Based on the cubature Kalman filter, we propose a new nonlinear filtering algorithm that employs a skew t distribution to characterize the asymmetry of the measurement noise. The system states and the statistics of the skew t noise distribution, including the shape matrix, the scale matrix, and the degrees of freedom (DOF), are estimated jointly using variational Bayesian (VB) inference. The proposed method is validated in a target tracking example. Simulation results indicate that the proposed nonlinear filter performs satisfactorily in the presence of unknown measurement noise statistics and outperforms existing state-of-the-art nonlinear filters.
Funding: Supported by the National Natural Science Foundation of China (Nos. 71571144, 71401134, 71171164, 11701406) and the International Cooperation and Exchanges in Science and Technology Program of Shaanxi Province (No. 2016KW-033).
Abstract: In this paper, we construct a Bayesian framework combining a Type-I progressively hybrid censoring scheme and competing risks that are independently distributed as exponentiated Weibull distributions with one scale parameter and two shape parameters. Since unknown hyper-parameters exist in the prior density functions of the shape parameters, we consider hierarchical priors to obtain the individual marginal posterior density functions, Bayesian estimates, and highest posterior density credible intervals. As explicit expressions for the estimates cannot be obtained, the componentwise updating algorithm of the Metropolis-Hastings method is employed to compute the numerical results. Finally, it is concluded that the Bayesian estimates perform well.
Funding: Supported by the National Natural Science Foundation of China (No. 61901415).
Abstract: Efficient estimation of line spectra from quantized samples is of significant importance in information theory and signal processing, e.g., in channel estimation for energy-efficient massive MIMO systems and in direction-of-arrival estimation. The goal of this paper is to recover the line spectrum as well as its corresponding parameters, including the model order, frequencies, and amplitudes, from heavily quantized samples. To this end, we propose an efficient gridless Bayesian algorithm named VALSE-EP, a combination of the high-resolution, low-complexity gridless variational line spectral estimation (VALSE) and expectation propagation (EP). The basic idea of VALSE-EP is to iteratively approximate the challenging quantized model of line spectral estimation as a sequence of simple pseudo-unquantized models to which VALSE is applied. Moreover, to obtain a benchmark for the performance of the proposed algorithm, the Cramér-Rao bound (CRB) is derived. Finally, numerical experiments on both synthetic and real data are performed, demonstrating the near-CRB performance of the proposed VALSE-EP for line spectral estimation from quantized samples.
Funding: Supported by the National Natural Science Foundation of China (No. 61976080), the Science and Technology Key Project of the Science and Technology Department of Henan Province (No. 212102310298), the Academic Degrees & Graduate Education Reform Project of Henan Province (No. 2021SJGLX195Y), and the Innovation and Quality Improvement Project for Graduate Education of Henan University (No. SYL20010101).
Abstract: A novel variational Bayesian inference based adaptive cubature Kalman filter (VBACKF) algorithm is proposed for the problem of state estimation in a target tracking system with time-varying measurement noise and random measurement losses. First, the inverse-Wishart (IW) distribution is chosen to model the covariance matrix of the time-varying measurement noise in the cubature Kalman filter framework. Second, a Bernoulli random variable is introduced as the indicator of measurement losses, and the Beta distribution is selected as the conjugate prior of the measurement-loss probability to ensure that the posterior and prior distributions have the same functional form. Finally, the joint posterior probability density function of the estimated variables is approximately decoupled by variational Bayesian inference, and a fixed-point iteration approach is used to update the estimated variables. Simulation results show that the proposed VBACKF algorithm accounts for the combined effects of system nonlinearity, time-varying measurement noise, and unknown measurement-loss probability, and effectively improves the accuracy of target state estimation in complex scenes.
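One piece of this construction, the conjugate Beta-Bernoulli update for the measurement-loss probability, can be shown compactly; the loss indicators below are simulated rather than inferred inside a running filter.

```python
import numpy as np

# Conjugate Beta-Bernoulli update for the measurement-loss probability.
rng = np.random.default_rng(5)

true_loss_prob = 0.2
losses = rng.uniform(size=200) < true_loss_prob      # True = measurement lost at this step

a, b = 1.0, 1.0                                      # Beta(1, 1) prior (uniform)
for lost in losses:
    a += int(lost)                                   # count of lost measurements
    b += int(not lost)                               # count of received measurements

post_mean = a / (a + b)
post_sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
print("posterior loss probability: %.3f +/- %.3f (true %.2f)" % (post_mean, post_sd, true_loss_prob))
```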
Funding: Supported by the National Natural Science Foundation of China (No. 61976080), the Science and Technology Key Project of the Science and Technology Department of Henan Province (No. 212102310298), and the Innovation and Quality Improvement Project for Graduate Education of Henan University (No. SYL20010101).
Abstract: To address the degradation of filtering precision caused by random outliers in the process and measurement noise of multi-target tracking (MTT) systems, a new robust Gaussian-Student's t mixture probability hypothesis density (PHD) filtering algorithm based on variational Bayesian inference (GST-vbPHD) is proposed. First, because it accurately describes the heavy-tailed characteristics of noise with outliers, the Gaussian-Student's t mixture distribution is employed to model the process noise and the measurement noise. A Bernoulli random variable is then introduced to correct the likelihood distribution of the mixture probability, making the hierarchical Gaussian distribution constructed from the Gaussian-Student's t mixture suitable for modeling non-stationary noise. Finally, approximate solutions for the target weights, measurement noise covariance, and state estimation error covariance are obtained via the variational Bayesian inference approach. Simulation results show that, in heavy-tailed noise environments, the proposed algorithm yields strong improvements over the traditional PHD filter and the Student's t distribution PHD filter.
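A small illustration of the noise-modeling premise, with an assumed 90/10 mixing weight and 3 degrees of freedom: sampling a Gaussian-Student's t mixture and comparing its extreme quantiles with a pure Gaussian of the same nominal scale.

```python
import numpy as np
from scipy import stats

# Why a Gaussian-Student's t mixture captures outlier-contaminated noise: its tails are
# much heavier than a pure Gaussian of the same nominal scale.
rng = np.random.default_rng(6)

n, mix_w, dof, scale = 200_000, 0.9, 3.0, 1.0
is_gauss = rng.uniform(size=n) < mix_w
noise = np.where(is_gauss,
                 rng.normal(scale=scale, size=n),                          # nominal Gaussian noise
                 stats.t.rvs(dof, scale=scale, size=n, random_state=rng))  # heavy-tailed outliers

for q in (0.99, 0.999):
    print(f"q={q}: Gaussian {stats.norm.ppf(q, scale=scale):.2f}  "
          f"mixture {np.quantile(noise, q):.2f}")
# The mixture's extreme quantiles are noticeably larger, matching outlier-prone sensors.
```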
Abstract: We devise an approach to Bayesian statistics and its applications in the analysis of the Monty Hall problem. We combine knowledge gained through applications of the Maximum Entropy Principle and Nash equilibrium strategies to provide results concerning the use of Bayesian approaches unique to the Monty Hall problem. We use a model to describe Monty's decision process and clarify that Bayesian inference results in an "irrelevant, therefore invariant" hypothesis. We discuss the advantages of Bayesian inference over frequentist inference in tackling the uneven-prior-probability Monty Hall variant. We demonstrate that the use of Bayesian statistics conforms to the Maximum Entropy Principle in information theory and that the Bayesian approach successfully resolves the dilemmas in the uneven-probability Monty Hall variant. Our findings have applications in decision making, information theory, bioinformatics, quantum game theory, and beyond.
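A compact Bayesian computation for the uneven-prior variant discussed above; the specific prior below is an example, and the host is assumed to open a non-car, non-picked door uniformly at random when there is a choice.

```python
from fractions import Fraction

# Bayesian Monty Hall with (possibly uneven) prior probabilities for the car's location.
def posterior(priors, picked, opened):
    """P(car behind each door | player picked `picked`, host opened `opened`)."""
    doors = list(priors)
    post = {}
    for car in doors:
        if car == opened:
            like = Fraction(0)                    # the host never reveals the car
        elif car == picked:
            like = Fraction(1, len(doors) - 1)    # host chooses uniformly among the other doors
        else:
            like = Fraction(1)                    # host is forced to open `opened`
        post[car] = priors[car] * like
    z = sum(post.values())                        # evidence
    return {d: p / z for d, p in post.items()}

uneven = {"A": Fraction(1, 2), "B": Fraction(1, 3), "C": Fraction(1, 6)}
print(posterior(uneven, picked="A", opened="C"))
# With this prior, sticking with A wins with probability 3/7 while switching to B wins with 4/7.
```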
Abstract: Traditional global sensitivity analysis (GSA) neglects the epistemic uncertainties associated with the probabilistic characteristics (i.e., the distribution type and its parameters) of input rock properties, which arise from the small size of datasets, while mapping the relative importance of properties to the model response. This paper proposes an augmented Bayesian multi-model inference (BMMI) coupled with GSA methodology (BMMI-GSA) to address this issue by estimating the imprecision in the moment-independent sensitivity indices of rock structures arising from the small size of input data. The methodology employs BMMI to quantify the epistemic uncertainties associated with the model type and parameters of the input properties. The estimated uncertainties are propagated into the imprecision of the moment-independent Borgonovo indices by employing a reweighting approach on candidate probabilistic models. The proposed methodology is showcased for a rock slope prone to stress-controlled failure in the Himalayan region of India. It proved superior to conventional GSA (which neglects all epistemic uncertainties) and Bayesian-coupled GSA (B-GSA) (which neglects model uncertainty) owing to its capability to incorporate the uncertainties in both the model type and the parameters of the properties. The imprecise Borgonovo indices estimated via the proposed methodology provide confidence intervals for the sensitivity indices instead of fixed-point estimates, which better informs data collection efforts. Analyses performed with varying sample sizes suggested that the uncertainties in the sensitivity indices decrease significantly with increasing sample size, and accurate importance ranking of the properties was only possible with large samples. Further, the impact of prior knowledge, in terms of prior ranges and distributions, was significant; hence, any related assumption should be made carefully.
Abstract: This study introduces a method to predict the remaining useful life (RUL) of plain bearings operating under stationary, wear-critical conditions. In this method, transient wear data from a coupled elastohydrodynamic lubrication (mixed-EHL) and wear simulation approach are used to parametrize a statistical linear degradation model. The method incorporates Bayesian inference to update the linear degradation model throughout the runtime and thereby considers the transient, system-dependent wear progression within the RUL prediction. A case study is used to show the suitability of the proposed method. The results show that the method can be applied to three distinct types of post-wearing-in behavior: wearing-in with subsequent hydrodynamic operation, stationary wear, and progressive wear. While hydrodynamic operation leads to an infinite lifetime, the method is successfully applied to predict the RUL in cases with stationary and progressive wear.
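A hedged sketch of the Bayesian updating step, assuming a known intercept and measurement noise: a Gaussian prior on the wear rate is updated from simulated wear measurements, and the RUL is read off as the time to an assumed wear limit. All values are illustrative and not from the mixed-EHL simulation.

```python
import numpy as np

# Bayesian update of a linear degradation model h(t) = h0 + k * t with a Gaussian prior
# on the wear rate k, followed by a sample-based RUL estimate at an assumed wear limit.
rng = np.random.default_rng(7)

h0, k_true, noise_sd, h_limit = 2.0, 0.04, 0.1, 8.0    # um, um/h, um, um (invented)
t = np.arange(0.0, 60.0, 5.0)                          # operating hours so far
h_meas = h0 + k_true * t + noise_sd * rng.normal(size=t.size)

# Prior k ~ N(mu0, sd0^2); likelihood (h_meas - h0) = k * t + noise with known noise_sd.
mu0, sd0 = 0.05, 0.05
prec_post = 1.0 / sd0**2 + np.sum(t**2) / noise_sd**2
mu_post = (mu0 / sd0**2 + np.sum(t * (h_meas - h0)) / noise_sd**2) / prec_post
sd_post = np.sqrt(1.0 / prec_post)

# RUL distribution from posterior samples of k: time left until h_limit from t_now.
t_now = t[-1]
k_samples = rng.normal(mu_post, sd_post, size=10_000)
k_samples = k_samples[k_samples > 0]                   # negative rates would imply no wear
rul = (h_limit - (h0 + k_samples * t_now)) / k_samples
print("RUL estimate: median %.0f h, 90%% interval [%.0f, %.0f] h"
      % (np.median(rul), *np.percentile(rul, [5, 95])))
```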
Funding: Supported by the National Key Research & Development Program of China (No. 2021YFC2101100) and the National Natural Science Foundation of China (Nos. 62322309, 61973119).
Abstract: Modern industrial processes are typically characterized by large scale and intricate internal relationships; distributed modeling is therefore an effective approach to process monitoring. A novel distributed monitoring scheme utilizing the Kantorovich distance-multiblock variational autoencoder (KD-MBVAE) is introduced. First, given the high consistency of related variables within each sub-block during process changes, variables exhibiting analogous statistical features are grouped into the same segment according to optimal quality transfer theory. Subsequently, a variational autoencoder (VAE) model is established for each block and the corresponding T² statistics are calculated. To further improve fault sensitivity, a novel statistic derived from the Kantorovich distance is introduced by analyzing model residuals from the perspective of probability distributions. The thresholds of both statistics are determined by kernel density estimation. Finally, the monitoring results for both types of statistics across all blocks are amalgamated using Bayesian inference. Additionally, a novel approach for fault diagnosis is introduced. The feasibility and efficiency of the introduced scheme are verified through two case studies.
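The final fusion step can be sketched with a Bayesian combination rule that is common in the multiblock monitoring literature; this is a generic stand-in, not necessarily the exact weighting used by KD-MBVAE, and the statistics, control limits, and significance level below are invented.

```python
import numpy as np

# Generic Bayesian fusion of block-level monitoring statistics into one plant-wide
# fault index: each block's statistic is converted to a fault probability given its
# control limit, then the blocks are combined with fault-likelihood weights.
def bayesian_fusion(stats, limits, alpha=0.01):
    stats, limits = np.asarray(stats, float), np.asarray(limits, float)
    p_s_given_normal = np.exp(-stats / limits)        # likelihood under normal operation
    p_s_given_fault = np.exp(-limits / stats)         # likelihood under fault
    evid = p_s_given_normal * (1 - alpha) + p_s_given_fault * alpha
    p_fault_given_s = p_s_given_fault * alpha / evid  # per-block posterior fault probability
    weights = p_s_given_fault / p_s_given_fault.sum() # weight blocks by fault likelihood
    return float(np.sum(weights * p_fault_given_s))   # fused index; compare against alpha

# Example: three blocks, the second one clearly above its control limit.
print(bayesian_fusion(stats=[3.1, 18.4, 5.0], limits=[9.5, 9.5, 9.5]))
```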
Abstract: The critical slip distance in the rate-and-state model of fault friction, used in the study of potential earthquakes, can vary widely from micrometers to a few meters depending on the length scale of the critically stressed fault. This makes it important to construct an inversion framework that provides good estimates of the critical slip distance based purely on the acceleration observed at the seismogram. As a first step toward a framework that takes noisy seismogram acceleration data as input and produces robust estimates of the critical slip distance as output, we present the performance of the framework on synthetic data. The framework is based on Bayesian inference and Markov chain Monte Carlo methods. The synthetic data are generated by adding noise to the acceleration output of a spring-slider-damper idealization of the rate-and-state model, which serves as the forward model.