In this work, we perform a Bayesian inference of the crust-core transition density ρ_(t) of neutron stars based on neutron-star radius and neutron-skin thickness data using a thermodynamical method. Uniform and Gaussian distributions for the ρ_(t) prior were adopted in the Bayesian approach. When the uniform prior and neutron-star radius data were used, ρ_(t) had a larger probability of taking values higher than 0.1 fm^(−3). This was found to be controlled by the curvature K_(sym) of the nuclear symmetry energy: the phenomenon did not occur if K_(sym) was not extremely negative, namely, K_(sym) > −200 MeV. The value of ρ_(t) obtained was 0.075_(−0.01)^(+0.005) fm^(−3) at a confidence level of 68% when both the neutron-star radius and neutron-skin thickness data were considered. Strong anti-correlations were observed between ρ_(t), the slope L, and the curvature of the nuclear symmetry energy. The dependence of the three L-K_(sym) correlations predicted in the literature on the crust-core transition density and pressure was quantitatively investigated. The most probable value of 0.08 fm^(−3) for ρ_(t) was obtained from the L-K_(sym) relationship proposed by Holt et al., while larger values were preferred for the other two relationships.
Knowledge graph technology has distinct advantages in fault diagnosis. In this study, the control rod drive mechanism (CRDM) of the liquid fuel thorium molten salt reactor (TMSR-LF1) was taken as the research object, and a fault diagnosis system based on a knowledge graph was proposed. Subject-relation-object triples were defined based on CRDM unstructured data, including the design specification, operation and maintenance manual, alarm list, and other forms of expert experience. We constructed a fault event ontology model to label the entities and relationships involved in the corpus of CRDM fault events. A three-layer robustly optimized bidirectional encoder representation from transformers (RBT3) pre-training approach combined with a text convolutional neural network (TextCNN) was introduced to facilitate the application of the constructed CRDM fault diagnosis graph database for fault queries. The RBT3-TextCNN model, along with the Jieba tool, is proposed for extracting entities and recognizing the fault query intent simultaneously. Experiments on a dataset collected from TMSR-LF1 CRDM fault diagnosis unstructured data demonstrate that this model can improve intent recognition and entity extraction. Additionally, a fault alarm monitoring module was developed based on the WebSocket protocol to automatically deliver detailed information about a detected fault to the operator. Furthermore, the Bayesian inference method combined with the variable elimination algorithm was proposed to enable the development of a relatively intelligent and reliable fault diagnosis system. Finally, a CRDM fault diagnosis Web interface integrated with graph data visualization was constructed, making the CRDM fault diagnosis process intuitive and effective.
In this proceeding, some highlight results on constraints of the nuclear matter equation of state (EOS) from data on nucleus resonances and neutron-skin thickness, obtained using the Bayesian approach based on the Skyrme-Hartree-Fock model and its extension, are presented. Typically, the anti-correlation and positive correlation between the slope parameter and the value of the symmetry energy at saturation density, under the constraints of the neutron-skin thickness and the isovector giant dipole resonance respectively, are discussed. It is shown that the Bayesian analysis can help to find a compromise for the "PREX-II puzzle" and the "soft Tin puzzle". The possible modifications of the constraints on lower-order EOS parameters, as well as the relevant correlations when higher-order EOS parameters are incorporated as independent variables, are further illustrated. For a given model and parameter space, the Bayesian approach serves as a good analysis tool suited to multiple messengers versus multiple variables and is helpful for quantitatively constraining the model parameters and their correlations.
Aiming at the problem that consumption data for new ammunition are scarce and demand is difficult to predict, and combining the laws of ammunition consumption under different damage grades, a Bayesian inference method for ammunition demand based on the Gompertz distribution is proposed. A Bayesian inference model based on the Gompertz distribution is constructed, and the system contribution degree is introduced to determine the weights of the multi-source information. In the case where the prior distribution is known and the distribution of the field data is unknown, a consistency test is performed on the prior information, and the consistency test problem is transformed into a goodness-of-fit test problem. The Bayesian inference is then solved by the Markov chain Monte Carlo (MCMC) method, and the ammunition demand under different damage grades is obtained. An example verifies the accuracy of this method, which addresses ammunition demand prediction when samples are insufficient.
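The abstract above does not give the paper's implementation details; as a minimal sketch of the MCMC step it describes, the following samples the two parameters of a Gompertz distribution (here a common shape/scale parameterization, not necessarily the paper's) from synthetic "field data" with a random-walk Metropolis-Hastings chain under flat positive priors. All variable names and numbers are illustrative.

```python
import math
import random

def gompertz_loglik(data, b, eta):
    """Log-likelihood under the Gompertz pdf f(x) = b*eta*exp(b*x)*exp(-eta*(exp(b*x)-1))."""
    if b <= 0 or eta <= 0:
        return float("-inf")  # flat priors restricted to positive parameters
    return sum(math.log(b * eta) + b * x - eta * (math.exp(b * x) - 1.0) for x in data)

def metropolis_gompertz(data, n_iter=5000, step=0.05, seed=1):
    """Random-walk Metropolis-Hastings over (b, eta)."""
    rng = random.Random(seed)
    b, eta = 1.0, 1.0
    logp = gompertz_loglik(data, b, eta)
    samples = []
    for _ in range(n_iter):
        b_new = b + rng.gauss(0.0, step)
        eta_new = eta + rng.gauss(0.0, step)
        logp_new = gompertz_loglik(data, b_new, eta_new)
        # accept with probability min(1, exp(logp_new - logp))
        if math.log(rng.random() + 1e-300) < logp_new - logp:
            b, eta, logp = b_new, eta_new, logp_new
        samples.append((b, eta))
    return samples

# Synthetic consumption data by inverse-CDF sampling: x = ln(1 - ln(U)/eta) / b
rng = random.Random(0)
b_true, eta_true = 1.5, 0.8
data = [math.log(1.0 - math.log(rng.random()) / eta_true) / b_true for _ in range(200)]
post = metropolis_gompertz(data)
b_mean = sum(s[0] for s in post[1000:]) / len(post[1000:])
```

Discarding the first 1000 draws as burn-in, the posterior mean of the shape parameter lands near the value used to generate the data; the paper's multi-source weighting would enter through the prior rather than the flat one assumed here.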
Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it does not consider uncertainty. This problem is addressed here by developing a system of Bayesian inferences for updating mechanical parameters and their statistical properties using monitored field data, integrating the monitored data, prior knowledge of geotechnical parameters, and a mechanical model of a rock tunnel via Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and was then applied to an experimental tunnel at the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the aid of the parameters updated by Bayesian inference agreed closely with the monitored displacements, indicating that Bayesian inference combines the monitored data into the tunnel model to update its parameters dynamically. Further study indicated that the performance of the Bayesian inferences improves greatly when field monitoring data are supplemented regularly. Bayesian inference is a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.
This paper focuses on the state estimation problem for nonlinear systems with unknown statistics of the measurement noise. Based on the cubature Kalman filter, we propose a new nonlinear filtering algorithm that employs a skew-t distribution to characterize the asymmetry of the measurement noise. The system states and the statistics of the skew-t noise distribution, including the shape matrix, the scale matrix, and the degrees of freedom (DOF), are estimated jointly by employing variational Bayesian (VB) inference. The proposed method is validated in a target tracking example. Simulation results indicate that the proposed nonlinear filter performs satisfactorily in the presence of unknown measurement noise statistics and outperforms existing state-of-the-art nonlinear filters.
Efficient estimation of line spectra from quantized samples is of significant importance in information theory and signal processing, e.g., for channel estimation in energy-efficient massive MIMO systems and for direction-of-arrival estimation. The goal of this paper is to recover the line spectrum as well as its corresponding parameters, including the model order, frequencies, and amplitudes, from heavily quantized samples. To this end, we propose an efficient gridless Bayesian algorithm named VALSE-EP, which combines the high-resolution, low-complexity gridless variational line spectral estimation (VALSE) with expectation propagation (EP). The basic idea of VALSE-EP is to iteratively approximate the challenging quantized model of line spectral estimation as a sequence of simple pseudo-unquantized models to which VALSE is applied. Moreover, to obtain a benchmark for the performance of the proposed algorithm, the Cramér-Rao bound (CRB) is derived. Finally, numerical experiments on both synthetic and real data are performed, demonstrating the near-CRB performance of the proposed VALSE-EP for line spectral estimation from quantized samples.
Mining the penetration testing semantic knowledge hidden in vast amounts of raw penetration testing data is of vital importance for automated penetration testing. Associative rule mining, a data mining technique, has been studied and explored for a long time; however, few studies have focused on knowledge discovery in the penetration testing area. Our experiments reveal that the long-tail distribution of penetration testing data nullifies the effectiveness of associative rule mining algorithms based on frequent patterns. To address this problem, a Bayesian inference based penetration semantic knowledge mining algorithm is proposed. First, a directed bipartite graph model, a kind of Bayesian network, is constructed to formalize penetration testing data. Then, we adopt the maximum likelihood estimation method to optimize the model parameters and decompose the large Bayesian network into smaller networks based on the conditional independence of variables for improved solution efficiency. Finally, irrelevant variable elimination is adopted to extract penetration semantic knowledge from the conditional probability distribution of the model. The experimental results show that the proposed method can discover penetration semantic knowledge from raw penetration testing data effectively and efficiently.
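The variable-elimination idea behind the above, summing out variables that are irrelevant to a query, can be sketched on a toy chain network A → B → C. The structure and all conditional probability table values below are illustrative, not taken from the paper's penetration testing model.

```python
# Chain Bayesian network A -> B -> C with illustrative CPTs (all variables binary).
p_a = {0: 0.7, 1: 0.3}
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}   # key: (b, a)
p_c_given_b = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.25, (1, 1): 0.75}  # key: (c, b)

def eliminate_b(a, c):
    """Sum out B: P(c | a) = sum_b P(c | b) P(b | a)."""
    return sum(p_c_given_b[(c, b)] * p_b_given_a[(b, a)] for b in (0, 1))

# Marginal of C after also eliminating A: P(c) = sum_a P(c | a) P(a)
p_c = {c: sum(eliminate_b(a, c) * p_a[a] for a in (0, 1)) for c in (0, 1)}
```

Each elimination keeps the intermediate factors small, which is the same efficiency argument the paper makes for decomposing a large network along conditional independencies.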
A novel variational Bayesian inference based adaptive cubature Kalman filter (VBACKF) algorithm is proposed for state estimation in a target tracking system with time-varying measurement noise and random measurement losses. First, the inverse-Wishart (IW) distribution is chosen to model the covariance matrix of the time-varying measurement noise in the cubature Kalman filter framework. Second, a Bernoulli random variable is introduced as the judgment factor for measurement losses, and the Beta distribution is selected as the conjugate prior distribution of the measurement loss probability to ensure that the posterior and prior distributions have the same functional form. Finally, the joint posterior probability density function of the estimated variables is approximately decoupled by variational Bayesian inference, and a fixed-point iteration approach is used to update the estimated variables. Simulation results show that the proposed VBACKF algorithm accounts for the combined effects of system nonlinearity, time-varying measurement noise, and unknown measurement loss probability, and effectively improves the accuracy of target state estimation in complex scenes.
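The conjugacy that makes the inverse-Wishart choice convenient is easiest to see in the scalar case, where the inverse-Wishart reduces to an inverse-gamma prior on an unknown Gaussian noise variance. The update below is a textbook conjugate step, not the paper's full variational filter; the hyper-parameter names are illustrative.

```python
def inverse_gamma_update(alpha0, beta0, residuals):
    """Conjugate update of an unknown Gaussian noise variance sigma^2.
    Prior: sigma^2 ~ InvGamma(alpha0, beta0); likelihood: r_i ~ N(0, sigma^2).
    Returns the posterior hyper-parameters and the posterior mean of sigma^2."""
    n = len(residuals)
    alpha_n = alpha0 + n / 2.0
    beta_n = beta0 + 0.5 * sum(r * r for r in residuals)
    post_mean = beta_n / (alpha_n - 1.0)  # exists for alpha_n > 1
    return alpha_n, beta_n, post_mean

# Three measurement residuals update the prior InvGamma(2, 1):
alpha_n, beta_n, mean = inverse_gamma_update(2.0, 1.0, [1.0, -1.0, 2.0])
```

In the matrix-valued filter this becomes an inverse-Wishart update of the full noise covariance, iterated jointly with the state estimate inside the variational loop.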
Aiming at the problem of filtering precision degradation caused by random outliers in the process noise and measurement noise of a multi-target tracking (MTT) system, a new Gaussian-Student's t mixture distribution probability hypothesis density (PHD) robust filtering algorithm based on variational Bayesian inference (GST-vbPHD) is proposed. First, because it can accurately describe the heavy-tailed characteristics of noise with outliers, the Gaussian-Student's t mixture distribution is employed to model the process noise and measurement noise, respectively. Then, a Bernoulli random variable is introduced to correct the likelihood distribution of the mixture probability, making the hierarchical Gaussian distribution constructed from the Gaussian-Student's t mixture suitable for modeling non-stationary noise. Finally, the approximate solutions, including the target weights, measurement noise covariance, and state estimation error covariance, are obtained via the variational Bayesian inference approach. Simulation results show that, in heavy-tailed noise environments, the proposed algorithm yields strong improvements over the traditional PHD filter and the Student's t distribution PHD filter.
In this paper, we construct a Bayesian framework combining a Type-I progressively hybrid censoring scheme and competing risks that are independently distributed according to an exponentiated Weibull distribution with one scale parameter and two shape parameters. Since there exist unknown hyper-parameters in the prior density functions of the shape parameters, we consider hierarchical priors to obtain the individual marginal posterior density functions, Bayesian estimates, and highest posterior density credible intervals. As explicit expressions for the estimates cannot be obtained, the componentwise updating algorithm of the Metropolis-Hastings method is employed to compute the numerical results. We conclude that the Bayesian estimates perform well.
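A componentwise Metropolis-Hastings update, as used above, proposes a move for one parameter at a time and accepts or rejects each coordinate separately. The sketch below runs the scheme on a standard bivariate normal target as a stand-in, since the paper's exponentiated Weibull posterior is not reproduced in the abstract; all step sizes and names are illustrative.

```python
import math
import random

def componentwise_mh(logpost, x0, n_iter=20000, step=0.8, seed=7):
    """Componentwise Metropolis-Hastings: perturb one coordinate at a time and
    accept each move with probability min(1, exp(delta log-posterior))."""
    rng = random.Random(seed)
    x = list(x0)
    lp = logpost(x)
    chain = []
    for _ in range(n_iter):
        for i in range(len(x)):
            old = x[i]
            x[i] = old + rng.gauss(0.0, step)
            lp_new = logpost(x)
            if math.log(rng.random() + 1e-300) < lp_new - lp:
                lp = lp_new   # accept the coordinate move
            else:
                x[i] = old    # reject: restore the coordinate
        chain.append(tuple(x))
    return chain

# Illustrative target: standard bivariate normal, started far from its mode.
chain = componentwise_mh(lambda v: -0.5 * (v[0] ** 2 + v[1] ** 2), [3.0, -3.0])
mean0 = sum(p[0] for p in chain[2000:]) / len(chain[2000:])
```

Updating coordinates one at a time keeps each acceptance test one-dimensional, which is why it suits posteriors, like the hierarchical one above, whose full conditionals are awkward to sample jointly.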
We devise an approach to Bayesian statistics and its applications in the analysis of the Monty Hall problem. We combine knowledge gained through applications of the Maximum Entropy Principle and Nash equilibrium strategies to provide results concerning the use of Bayesian approaches unique to the Monty Hall problem. We use a model to describe Monty's decision process and clarify that Bayesian inference results in an "irrelevant, therefore invariant" hypothesis. We discuss the advantages of Bayesian inference over frequentist inference in tackling the uneven prior probability Monty Hall variant. We demonstrate that the use of Bayesian statistics conforms to the Maximum Entropy Principle in information theory, and that the Bayesian approach successfully resolves dilemmas in the uneven probability Monty Hall variant. Our findings have applications in decision making, information theory, bioinformatics, quantum game theory, and beyond.
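The Bayesian computation at the heart of both variants is a one-line application of Bayes' rule. The sketch below fixes the standard convention (player picks door 0, Monty opens door 2, Monty never opens the player's or the car's door) and takes an arbitrary prior, so it covers the uneven-prior variant discussed above; the specific uneven prior shown is an illustrative choice, not one from the paper.

```python
def monty_posterior(prior):
    """Posterior over the car's location given: player picks door 0, Monty opens
    door 2. Monty never opens the player's door or the car door; if the car is
    behind door 0 he opens door 1 or door 2 with equal probability."""
    likelihood = {0: 0.5, 1: 1.0, 2: 0.0}  # P(Monty opens door 2 | car at i)
    unnorm = {i: prior[i] * likelihood[i] for i in range(3)}
    z = sum(unnorm.values())
    return {i: p / z for i, p in unnorm.items()}

uniform = monty_posterior({0: 1/3, 1: 1/3, 2: 1/3})   # classic problem
uneven = monty_posterior({0: 0.5, 1: 0.3, 2: 0.2})    # uneven-prior variant
```

With a uniform prior the posterior for switching is 2/3; with the illustrative uneven prior it is 0.3/0.55 ≈ 0.545, so switching remains favorable but less decisively, which is the kind of quantitative comparison the Bayesian treatment supports.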
Traditional global sensitivity analysis (GSA) neglects the epistemic uncertainties associated with the probabilistic characteristics (i.e., the type of distribution and its parameters) of input rock properties that arise from small datasets while mapping the relative importance of properties to the model response. This paper proposes an augmented Bayesian multi-model inference (BMMI) coupled with GSA methodology (BMMI-GSA) to address this issue by estimating the imprecision in the moment-independent sensitivity indices of rock structures arising from the small size of the input data. The methodology employs BMMI to quantify the epistemic uncertainties associated with the model type and the parameters of the input properties. The estimated uncertainties are propagated into the imprecision in the moment-independent Borgonovo's indices by employing a reweighting approach on candidate probabilistic models. The proposed methodology is showcased for a rock slope prone to stress-controlled failure in the Himalayan region of India. It was superior to conventional GSA (which neglects all epistemic uncertainties) and Bayesian coupled GSA (B-GSA) (which neglects model uncertainty) owing to its capability to incorporate the uncertainties in both the model type and the parameters of the properties. The imprecise Borgonovo's indices estimated via the proposed methodology provide confidence intervals for the sensitivity indices instead of fixed-point estimates, which better informs data collection efforts. Analyses performed with varying sample sizes suggested that the uncertainties in the sensitivity indices reduce significantly with increasing sample size, and accurate importance ranking of the properties was only possible with large samples. Further, the impact of prior knowledge, in terms of prior ranges and distributions, was significant; hence, any related assumption should be made carefully.
The estimation of model parameters is an important subject in engineering. In this area of work, the prevailing approach is to estimate or calculate these as deterministic parameters. In this study, we consider the model parameters from the perspective of random variables and describe the general form of the parameter distribution inference problem. Under this framework, we propose an ensemble Bayesian method by introducing Bayesian inference and the Markov chain Monte Carlo (MCMC) method. Experiments on a finite cylindrical reactor and a 2D IAEA benchmark problem show that the proposed method converges quickly and can estimate parameters effectively, even for several correlated parameters simultaneously. Our experiments include cases involving engineering software calls, demonstrating that the method can be applied in engineering settings such as nuclear reactor engineering.
For building heating, ventilation and air-conditioning (HVAC) systems, sensor faults significantly affect operation and control. Sensors with accurate and reliable measurements are critical for meeting indoor thermal demand precisely. Owing to its high calibration accuracy and in-situ effectiveness, a virtual sensor (VS)-assisted Bayesian inference (VS-BI) sensor calibration strategy has been applied to HVAC systems. However, the feasibility of this strategy for wider ranges of sensor types (within-control-loop and out-of-control-loop) with various sensor bias fault amplitudes, and the factors that affect its practical in-situ calibration performance, remain to be explored. Hence, to further validate its in-situ calibration performance and analyze the influencing factors, this study applied the VS-BI strategy in an HVAC system comprising a chiller plant with an air handling unit (AHU) terminal. Three target sensors, measuring the supply air temperature (SAT), chilled water supply temperature (CHS), and cooling water return temperature (CWR), were investigated using introduced sensor bias faults with eight different amplitudes in [−2 °C, +2 °C] at a 0.5 °C interval. Calibration performance was evaluated by considering three influencing factors: (1) the performance of different data-driven VSs, (2) the influence of the prior standard deviation σ on in-situ sensor calibration, and (3) the influence of data quality on in-situ sensor calibration from the perspective of energy conservation and data volume. After comparison, a long short-term memory (LSTM) network was adopted for VS construction, with a coefficient of determination (R-squared) of 0.984. Results indicate that σ has almost no impact on the calibration accuracy of CHS and only scant impact on that of SAT and CWR. The potential of using the prior standard deviation σ to improve the calibration accuracy is limited, only 8.61% on average. For within-control-loop sensors such as SAT and CHS, VS-BI achieves relatively high in-situ sensor calibration accuracy if the data quality is relatively high.
For dense time delay estimation (TDE), when multiple time delays are located within one grid interval, it is difficult for existing sparse Bayesian learning/inference (SBL/SBI) methods to obtain estimation accuracy high enough to meet application requirements. To solve this problem, this paper proposes a method named off-grid sparse Bayesian inference-biased total grid (OGSBI-BTG), in which a mesh evolution process moves the total grid iteratively based on the position of the off-grid point between two grid points. The proposed method updates the off-grid dictionary matrix by reconstructing an optimal mesh and offsetting the off-grid vector. Experimental results demonstrate that the proposed approach outperforms other state-of-the-art SBI methods and multiple signal classification, even when the grid interval is larger than the gap between the true time delays. Both the time domain and frequency domain models of TDE are studied in this paper.
We present an efficient numerical strategy for the Bayesian solution of inverse problems. Stochastic collocation methods, based on generalized polynomial chaos (gPC), are used to construct a polynomial approximation of the forward solution over the support of the prior distribution. This approximation then defines a surrogate posterior probability density that can be evaluated repeatedly at minimal computational cost. The ability to simulate a large number of samples from the posterior distribution results in very accurate estimates of the inverse solution and its associated uncertainty. Combined with the high accuracy of the gPC-based forward solver, the new algorithm can provide great efficiency in practical applications. A rigorous error analysis of the algorithm is conducted, in which we establish convergence of the approximate posterior to the true posterior and obtain an estimate of the convergence rate. It is proved that fast (exponential) convergence of the gPC forward solution yields similarly fast (exponential) convergence of the posterior. The numerical strategy and the predicted convergence rates are demonstrated on nonlinear inverse problems of varying smoothness and dimension.
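The surrogate-posterior idea above can be sketched in one dimension: evaluate the (notionally expensive) forward model only at a few collocation points, build a polynomial interpolant, and then evaluate the posterior through the cheap interpolant instead. Plain Lagrange interpolation on Chebyshev nodes stands in here for the gPC machinery, and the forward model, data value, and noise level are all illustrative assumptions.

```python
import math

def chebyshev_nodes(n):
    """n Chebyshev nodes on [-1, 1] -- the collocation points."""
    return [math.cos((2 * k + 1) * math.pi / (2 * n)) for k in range(n)]

def lagrange_interpolant(xs, ys):
    """Polynomial interpolant through (xs, ys); a stand-in for a gPC expansion."""
    def p(x):
        total = 0.0
        for j, yj in enumerate(ys):
            w = 1.0
            for m, xm in enumerate(xs):
                if m != j:
                    w *= (x - xm) / (xs[j] - xm)
            total += yj * w
        return total
    return p

forward = math.sin                       # illustrative "expensive" forward model
nodes = chebyshev_nodes(8)               # only 8 forward-model evaluations
surrogate = lagrange_interpolant(nodes, [forward(x) for x in nodes])

def posterior(g, data=0.4, sigma=0.1, grid_n=201):
    """Gaussian-likelihood posterior on a uniform prior over [-1, 1],
    normalized on a grid; g is either the true model or its surrogate."""
    grid = [-1.0 + 2.0 * i / (grid_n - 1) for i in range(grid_n)]
    dens = [math.exp(-0.5 * ((data - g(m)) / sigma) ** 2) for m in grid]
    z = sum(dens)
    return [d / z for d in dens]

err = max(abs(a - b) for a, b in zip(posterior(forward), posterior(surrogate)))
```

Because the interpolation error for a smooth forward model decays exponentially in the number of nodes, the surrogate posterior is nearly indistinguishable from the exact one here, which is the mechanism behind the exponential posterior-convergence result stated above.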
An accurate understanding of pipe condition is important for maintaining acceptable levels of service and for devising appropriate maintenance and rehabilitation strategies in water supply systems. Many factors contribute to pipe deterioration. To consolidate information on these factors for assessing the condition of water pipes, this study employed a new approach based on Bayesian configuration against pipe condition to generate factor weights. Ten pipe factors from three pipe materials (cast iron, ductile cast iron, and steel) were used in this study. The factors included size, age, inner coating, outer coating, soil condition, bedding condition, trench depth, electrical recharge, number of road lanes, material, and operational pressure. To address identification problems that arise when switching from pipe factor information to actual pipe condition, informative prior factor weight distributions based on the literature and previous knowledge of water pipe assessment were used. The influence of each factor on the pipe assessment results was estimated. The results suggested that factors with smaller weight values, or with weights having relatively stable posterior means and narrow uncertainty bounds, have less influence on pipe condition. The model was most sensitive to variations in pipe age. Using numerical experiments with different factor combinations, a simplified model excluding trench depth, electrical recharge, and the number of road lanes is provided. The proposed Bayesian inference approach provides a more reliable assessment of pipe deterioration.
Data analysis for tokamak plasmas is mainly based on various diagnostic systems, which are usually modularized and independent of each other. This leaves a large amount of data not fully and effectively exploited, which is not conducive to revealing the underlying physical mechanisms. In this work, Bayesian probability inference combined with machine learning methods has been applied to the electron cyclotron emission and Thomson scattering diagnostic systems on HL-2A/2M, and the effect of integrated data analysis (IDA) with Bayesian probability inference on the HL-2A electron temperature is demonstrated. A program was developed to infer the whole electron temperature profile with a confidence interval, and it can be applied in online analysis. The IDA results show that the full electron temperature profile can be obtained and that the diagnostic information is more comprehensive and abundant with IDA. The inference models for electron temperature analysis have been established, and the developed programs will serve as an experimental data analysis tool for HL-2A/2M in the near future.
Various works in the literature have aimed at accelerating Bayesian inference in inverse problems. Stochastic spectral methods have recently been proposed as surrogate approximations of the forward uncertainty propagation model over the support of the prior distribution. These representations are efficient because they allow affordable simulation of a large number of samples from the posterior distribution. Unfortunately, they do not perform well when the forward model exhibits strongly nonlinear behavior with respect to its input. In this work, we first relate the fast (exponential) L2-convergence of the forward approximation to the fast (exponential) convergence, in terms of Kullback-Leibler divergence, of the approximate posterior. In particular, we prove that when the prior distribution is uniform, the posterior converges at least twice as fast as the forward model in those norms. The Bayesian inference strategy is developed in the framework of a stochastic spectral projection method, and the predicted convergence rates are demonstrated on simple nonlinear inverse problems of varying smoothness. We then propose an efficient numerical approach for the Bayesian solution of inverse problems with strongly nonlinear or discontinuous system responses. This comes with an improved forward model that is adaptively approximated by an iterative generalized polynomial chaos based representation. The numerical approximations and predicted convergence rates of the former approach are compared to the new iterative numerical method on nonlinear time-dependent test cases of varying dimension and complexity, which are relevant to our hydrodynamics motivations and therefore to hyperbolic conservation laws and the appearance of discontinuities in finite time.
Funding: Supported by the Shanxi Provincial Foundation for Returned Overseas Scholars (No. 20220037), the Natural Science Foundation of Shanxi Province (No. 20210302123085), and the Discipline Construction Project of Yuncheng University.
Funding: The Young Potential Program of Shanghai Institute of Applied Physics, Chinese Academy of Sciences (No. E0553101).
Funding: Supported by the National Natural Science Foundation of China (11922514).
Abstract: In these proceedings, some highlighted results on constraints of the nuclear matter equation of state (EOS) from nucleus resonance and neutron-skin thickness data, obtained with a Bayesian approach based on the Skyrme-Hartree-Fock model and its extensions, are presented. In particular, the anti-correlation and positive correlation between the slope parameter and the value of the symmetry energy at saturation density, under the constraints of the neutron-skin thickness and the isovector giant dipole resonance respectively, are discussed. It is shown that the Bayesian analysis can help find a compromise for the “PREXII puzzle” and the “soft Tin puzzle”. The possible modifications of the constraints on lower-order EOS parameters, as well as the relevant correlations when higher-order EOS parameters are incorporated as independent variables, are further illustrated. For a given model and parameter space, the Bayesian approach serves as a good analysis tool suited to multiple messengers versus multiple variables, and it is helpful for quantitatively constraining the model parameters and their correlations.
Funding: Supported by Army Scientific Research (KYSZJWJK1744, 012016012600B11403).
Abstract: Aiming at the problem that consumption data for new ammunition are scarce and demand is difficult to predict, and drawing on the law of ammunition consumption under different damage grades, a Bayesian inference method for ammunition demand based on the Gompertz distribution is proposed. The Bayesian inference model based on the Gompertz distribution is constructed, and the system contribution degree is introduced to determine the weights of the multi-source information. In the case where the prior distribution is known but the distribution of the field data is unknown, a consistency test is performed on the prior information, and the consistency test problem is transformed into a goodness-of-fit test problem. The Bayesian inference is then solved by the Markov chain Monte Carlo (MCMC) method, and the ammunition demand under different damage grades is obtained. An example verifies the accuracy of this method and shows that it solves the problem of ammunition demand prediction when samples are insufficient.
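The MCMC step described above can be sketched with a random-walk Metropolis sampler targeting a Gompertz density. This is a minimal illustration, not the paper's implementation: the parameter values (η = 1, b = 0.5), proposal step size, and chain length are illustrative assumptions.

```python
import math
import random

def gompertz_logpdf(x, eta=1.0, b=0.5):
    """Log-density of a Gompertz(eta, b) distribution, supported on x >= 0."""
    if x < 0:
        return float("-inf")
    return math.log(b * eta) + eta + b * x - eta * math.exp(b * x)

def metropolis_hastings(logpdf, x0=1.0, n=5000, step=0.5, seed=42):
    """Random-walk Metropolis sampler: propose, then accept with
    probability min(1, pi(prop) / pi(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        if math.log(rng.random() + 1e-300) < logpdf(prop) - logpdf(x):
            x = prop  # accept; otherwise keep the current state
        samples.append(x)
    return samples

samples = metropolis_hastings(gompertz_logpdf)
```

Posterior summaries such as the demand estimate per damage grade would then be computed from the retained chain (after discarding burn-in).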
基金support from the Open Research Fund of State Key Laboratory of Geomechanics and Geotechnical Engineering,Institute of Rock and Soil Mechanics,Chinese Academy of Sciences(Grant No.Z020006)the National Natural Science Foundation of China(Grant Nos.U1765206 and 51874119).
Abstract: Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it does not usually consider uncertainty. This problem is addressed here by developing a system of Bayesian inferences for updating mechanical parameters and their statistical properties from monitored field data, integrating the monitored data, prior knowledge of the geotechnical parameters, and a mechanical model of a rock tunnel using Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and then applied to an experimental tunnel at the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the parameters updated by Bayesian inference agreed closely with the monitored displacements, indicating that the Bayesian inferences incorporate the monitored data into the tunnel model to update its parameters dynamically. Further study indicated that the performance of the Bayesian inferences improves greatly when field monitoring data are supplemented regularly. Bayesian inference is a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.
基金This work was supported in part by National Natural Science Foundation of China under Grants 62103167 and 61833007in part by the Natural Science Foundation of Jiangsu Province under Grant BK20210451.
Abstract: This paper is focused on the state estimation problem for nonlinear systems with unknown measurement-noise statistics. Based on the cubature Kalman filter, we propose a new nonlinear filtering algorithm that employs a skew t distribution to characterize the asymmetry of the measurement noise. The system states and the statistics of the skew t noise distribution, including the shape matrix, the scale matrix, and the degrees of freedom (DOF), are estimated jointly by employing variational Bayesian (VB) inference. The proposed method is validated in a target tracking example. Simulation results indicate that the proposed nonlinear filter performs satisfactorily in the presence of unknown measurement-noise statistics and outperforms existing state-of-the-art nonlinear filters.
基金supported by National Natural Science Foundation of China(No.61901415)。
Abstract: Efficient estimation of line spectra from quantized samples is of significant importance in information theory and signal processing, e.g., for channel estimation in energy-efficient massive MIMO systems and for direction-of-arrival estimation. The goal of this paper is to recover the line spectrum, as well as its corresponding parameters including the model order, frequencies, and amplitudes, from heavily quantized samples. To this end, we propose an efficient gridless Bayesian algorithm named VALSE-EP, which combines the high-resolution, low-complexity gridless variational line spectral estimation (VALSE) with expectation propagation (EP). The basic idea of VALSE-EP is to iteratively approximate the challenging quantized model of line spectral estimation as a sequence of simple pseudo-unquantized models, to which VALSE is applied. Moreover, to obtain a benchmark for the performance of the proposed algorithm, the Cramér–Rao bound (CRB) is derived. Finally, numerical experiments on both synthetic and real data demonstrate the near-CRB performance of the proposed VALSE-EP for line spectral estimation from quantized samples.
基金the National Natural Science Foundation of China No.61502528.
Abstract: Mining the penetration testing semantic knowledge hidden in vast amounts of raw penetration testing data is of vital importance for automated penetration testing. Associative rule mining, a data mining technique, has been studied and explored for a long time; however, few studies have focused on knowledge discovery in the penetration testing area. Our experiments reveal that the long-tail distribution of penetration testing data nullifies the effectiveness of associative rule mining algorithms based on frequent patterns. To address this problem, a Bayesian inference based penetration semantic knowledge mining algorithm is proposed. First, a directed bipartite graph model, a kind of Bayesian network, is constructed to formalize penetration testing data. Then, we adopt maximum likelihood estimation to optimize the model parameters and decompose the large Bayesian network into smaller networks based on the conditional independence of variables for improved solution efficiency. Finally, irrelevant-variable elimination is adopted to extract penetration semantic knowledge from the conditional probability distribution of the model. Experimental results show that the proposed method discovers penetration semantic knowledge from raw penetration testing data effectively and efficiently.
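The Bayes-rule update over a directed bipartite graph can be sketched on a toy two-layer model. The attack and alert names, priors, and conditional probabilities below are hypothetical placeholders, not from the paper's dataset; conditional independence lets each alert's evidence be folded in separately, which is the decomposition the abstract exploits.

```python
# Toy directed bipartite model: hidden "attack" nodes -> observed "alert" nodes.
priors = {"sqli": 0.2, "bruteforce": 0.1}            # P(attack present)
p_alert = {                                          # P(alert fires | attack state)
    ("sqli", True): 0.9, ("sqli", False): 0.05,
    ("bruteforce", True): 0.7, ("bruteforce", False): 0.02,
}

def posterior(attack, alert_observed=True):
    """P(attack | its alert) by direct enumeration (Bayes' rule).
    Conditional independence makes each attack-alert pair a separate
    two-node network, so no joint over all variables is needed."""
    p = priors[attack]
    like_t = p_alert[(attack, True)] if alert_observed else 1 - p_alert[(attack, True)]
    like_f = p_alert[(attack, False)] if alert_observed else 1 - p_alert[(attack, False)]
    evidence = like_t * p + like_f * (1 - p)
    return like_t * p / evidence

post = posterior("sqli")  # probability of SQL injection given its alert fired
```

An observed alert raises the posterior above the prior (here 0.2 → about 0.82), while an absent alert lowers it, which is the semantic-knowledge extraction step in miniature.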
基金Supported by the National Natural Science Foundation of China(No.61976080)the Science and Technology Key Project of Science and TechnologyDepartment of Henan Province(No.212102310298)+1 种基金the Academic Degrees&Graduate Education Reform Project of Henan Province(No.2021SJGLX195Y)the Innovation and Quality Improvement Project for Graduate Education of Henan University(No.SYL20010101)。
Abstract: A novel variational Bayesian inference based adaptive cubature Kalman filter (VBACKF) algorithm is proposed for state estimation in a target tracking system with time-varying measurement noise and random measurement losses. First, the inverse-Wishart (IW) distribution is chosen to model the covariance matrix of the time-varying measurement noise in the cubature Kalman filter framework. Second, a Bernoulli random variable is introduced as the indicator of measurement loss, and the Beta distribution is selected as the conjugate prior of the measurement loss probability to ensure that the posterior and prior distributions have the same functional form. Finally, the joint posterior probability density function of the estimated variables is approximately decoupled by variational Bayesian inference, and a fixed-point iteration approach is used to update the estimated variables. Simulation results show that the proposed VBACKF algorithm accounts for the combined effects of system nonlinearity, time-varying measurement noise, and unknown measurement loss probability, and effectively improves the accuracy of target state estimation in complex scenes.
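The spherical-radial cubature rule at the core of the cubature Kalman filter propagates 2n deterministic points through the nonlinear model. A minimal sketch of the point generation step, assuming the state covariance is supplied through a square-root factor S with P = S Sᵀ (the example values are illustrative):

```python
import math

def cubature_points(mean, sqrt_cov):
    """Generate the 2n spherical-radial cubature points
    m +/- sqrt(n) * S_i, where S_i is the i-th column of the
    square root S of the covariance (P = S S^T)."""
    n = len(mean)
    pts = []
    for i in range(n):
        for sign in (+1.0, -1.0):
            pts.append([mean[j] + sign * math.sqrt(n) * sqrt_cov[j][i]
                        for j in range(n)])
    return pts

# 2-D example with zero mean and identity square-root covariance
pts = cubature_points([0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

The equally weighted points reproduce the mean and covariance exactly; the filter then pushes each point through the dynamics or measurement function and re-averages.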
基金Supported by the National Natural Science Foundation of China(No.61976080)the Science and Technology Key Project of Science and Technology Department of Henan Province(No.212102310298)the Innovation and Quality Improvement Project for Graduate Education of Henan University(No.SYL20010101)。
Abstract: Aiming at the filtering precision degradation caused by random outliers in the process noise and measurement noise of a multi-target tracking (MTT) system, a new Gaussian-Student's t mixture distribution probability hypothesis density (PHD) robust filtering algorithm based on variational Bayesian inference (GST-vbPHD) is proposed. First, because it can accurately describe the heavy-tailed characteristics of noise with outliers, the Gaussian-Student's t mixture distribution is employed to model the process noise and measurement noise, respectively. Then a Bernoulli random variable is introduced to correct the likelihood distribution of the mixture probability, making the hierarchical Gaussian distribution constructed from the Gaussian-Student's t mixture suitable for modeling non-stationary noise. Finally, approximate solutions for the target weights, the measurement noise covariance, and the state estimation error covariance are obtained via the variational Bayesian inference approach. Simulation results show that, in heavy-tailed noise environments, the proposed algorithm yields strong improvements over the traditional PHD filter and the Student's t distribution PHD filter.
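The hierarchical Gaussian form that makes the Student's t component tractable for variational inference can be illustrated by sampling: a Student's t variate is a Gaussian whose precision is Gamma-distributed. The degrees of freedom and sample size below are illustrative choices, not values from the paper.

```python
import random

def student_t_hierarchical(nu=5.0, n=50000, seed=7):
    """Draw Student's t samples via the scale-mixture representation:
    lambda ~ Gamma(nu/2, rate=nu/2), then x | lambda ~ N(0, 1/lambda).
    gammavariate takes (shape, scale), so scale = 2/nu."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        lam = rng.gammavariate(nu / 2.0, 2.0 / nu)
        out.append(rng.gauss(0.0, (1.0 / lam) ** 0.5))
    return out

xs = student_t_hierarchical()
```

Conditioning on the latent precision λ reduces every update to a Gaussian one, which is why the hierarchical form fits naturally into the variational PHD recursion.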
基金Supported by the National Natural Science Foundation of China(71571144,71401134,71171164,11701406) Supported by the International Cooperation and Exchanges in Science and Technology Program of Shaanxi Province(2016KW-033)
Abstract: In this paper, we construct a Bayesian framework combining a Type-I progressively hybrid censoring scheme and competing risks that are independently distributed as an exponentiated Weibull distribution with one scale parameter and two shape parameters. Since there are unknown hyper-parameters in the prior density functions of the shape parameters, we consider hierarchical priors to obtain the individual marginal posterior density functions, Bayesian estimates, and highest posterior density credible intervals. As explicit expressions for the estimates cannot be obtained, the componentwise updating algorithm of the Metropolis-Hastings method is employed to compute the numerical results. We conclude that the Bayesian estimates perform well.
Abstract: We devise an approach to Bayesian statistics and its applications in the analysis of the Monty Hall problem. We combine knowledge gained through applications of the Maximum Entropy Principle and Nash equilibrium strategies to provide results concerning the use of Bayesian approaches unique to the Monty Hall problem. We use a model to describe Monty's decision process and clarify that Bayesian inference results in an "irrelevant, therefore invariant" hypothesis. We discuss the advantages of Bayesian inference over frequentist inference in tackling the uneven-prior-probability Monty Hall variant. We demonstrate that the use of Bayesian statistics conforms to the Maximum Entropy Principle in information theory, and that the Bayesian approach successfully resolves dilemmas in the uneven-probability Monty Hall variant. Our findings have applications in decision making, information theory, bioinformatics, quantum game theory, and beyond.
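The Bayesian update for the classic problem (and, with a non-uniform prior, the uneven variant) can be worked out exactly. A minimal sketch, assuming the contestant picks door 0 and Monty, who knows the car's location and never opens it, opens door 2:

```python
from fractions import Fraction

def monty_posterior(prior=None):
    """Posterior over the car's door after the contestant picks door 0
    and Monty opens door 2. Defaults to the classic uniform prior."""
    prior = prior or [Fraction(1, 3)] * 3
    # Likelihood that Monty opens door 2, given the car's position:
    #   car at 0 -> Monty picks door 1 or 2 at random: 1/2
    #   car at 1 -> Monty is forced to open door 2:    1
    #   car at 2 -> Monty never reveals the car:       0
    like = [Fraction(1, 2), Fraction(1), Fraction(0)]
    joint = [p * l for p, l in zip(prior, like)]
    z = sum(joint)  # probability of the observed reveal
    return [j / z for j in joint]

post = monty_posterior()  # [1/3, 2/3, 0]: switching to door 1 doubles the odds
```

Passing an uneven prior (e.g. `[1/2, 1/4, 1/4]`) reproduces the variant the abstract discusses, where the switch advantage can shrink or vanish.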
Abstract: Traditional global sensitivity analysis (GSA) neglects the epistemic uncertainties associated with the probabilistic characteristics (i.e., distribution type and distribution parameters) of input rock properties that arise from small datasets while mapping the relative importance of properties to the model response. This paper proposes an augmented Bayesian multi-model inference (BMMI) coupled with GSA methodology (BMMI-GSA) to address this issue by estimating the imprecision in the moment-independent sensitivity indices of rock structures arising from the small size of input data. The methodology employs BMMI to quantify the epistemic uncertainties associated with the model type and parameters of the input properties. The estimated uncertainties are propagated into the moment-independent Borgonovo indices by employing a reweighting approach on candidate probabilistic models. The proposed methodology is showcased for a rock slope prone to stress-controlled failure in the Himalayan region of India. It was superior to conventional GSA (which neglects all epistemic uncertainties) and Bayesian-coupled GSA (B-GSA) (which neglects model uncertainty) owing to its capability to incorporate the uncertainties in both the model type and the parameters of the properties. The imprecise Borgonovo indices estimated via the proposed methodology provide confidence intervals for the sensitivity indices instead of fixed-point estimates, which better informs data collection efforts. Analyses performed with varying sample sizes suggested that the uncertainties in the sensitivity indices reduce significantly with increasing sample size, and that accurate importance ranking of the properties is only possible with large samples. Further, the impact of prior knowledge, in terms of prior ranges and distributions, is significant; hence, any related assumptions should be made carefully.
基金partially sponsored by the Natural Science Foundation of Shanghai(No.23ZR1429300)the Innovation Fund of CNNC(Lingchuang Fund)。
Abstract: The estimation of model parameters is an important subject in engineering. In this area of work, the prevailing approach is to estimate or calculate them as deterministic parameters. In this study, we instead treat the model parameters as random variables and describe the general form of the parameter distribution inference problem. Under this framework, we propose an ensemble Bayesian method by introducing Bayesian inference and the Markov chain Monte Carlo (MCMC) method. Experiments on a finite cylindrical reactor and a 2D IAEA benchmark problem show that the proposed method converges quickly and can estimate parameters effectively, even for several correlated parameters simultaneously. Our experiments include cases involving calls to engineering software, demonstrating that the method can be applied in engineering fields such as nuclear reactor engineering.
基金supported by the National Natural Science Foundation of China (51906181)the 2021 Construction Technology Plan Project of Hubei Province (No.2021-83)the Excellent Young and Middle-aged Talent in Universities of Hubei Province,China (Q20181110).
Abstract: For building heating, ventilation and air-conditioning (HVAC) systems, sensor faults significantly affect operation and control. Sensors with accurate and reliable measurements are critical for meeting the precise indoor thermal demand. Owing to its high calibration accuracy and in-situ effectiveness, a virtual sensor (VS)-assisted Bayesian inference (VS-BI) sensor calibration strategy has been applied to HVAC systems. However, the feasibility of applying this strategy to a wider range of sensor types (within-control-loop and out-of-control-loop) with various sensor bias fault amplitudes, and the factors that influence its practical in-situ calibration performance, remain to be explored. Hence, to further validate its in-situ calibration performance and analyze the influencing factors, this study applied the VS-BI strategy in an HVAC system comprising a chiller plant with an air handling unit (AHU) terminal. Three target sensors, for the supply air temperature (SAT), chilled water supply temperature (CHS), and cooling water return temperature (CWR), were investigated by introducing sensor bias faults with eight different amplitudes in [−2 °C, +2 °C] at a 0.5 °C interval. Calibration performance was evaluated with respect to three influencing factors: (1) the performance of different data-driven VSs, (2) the influence of the prior standard deviation σ on in-situ sensor calibration, and (3) the influence of data quality on in-situ sensor calibration from the perspective of energy conservation and data volume. After comparison, a long short-term memory (LSTM) network was adopted for VS construction, with a coefficient of determination (R-squared) of 0.984. Results indicate that σ has almost no impact on the calibration accuracy of CHS and only a slight impact on that of SAT and CWR; the potential of using the prior standard deviation σ to improve calibration accuracy is limited, only 8.61% on average. For within-control-loop sensors such as SAT and CHS, VS-BI achieves relatively high in-situ sensor calibration accuracy provided the data quality is relatively high.
基金the National Natural Science Foundation of China(No.61401145)the Natural Science Foundation of Shanghai(No.19ZR1437600)。
Abstract: For dense time delay estimation (TDE), when multiple time delays are located within one grid interval, it is difficult for existing sparse Bayesian learning/inference (SBL/SBI) methods to achieve estimation accuracy high enough to meet application requirements. To solve this problem, this paper proposes a method named off-grid sparse Bayesian inference-biased total grid (OGSBI-BTG), in which a mesh evolution process iteratively moves the total grid based on the position of the off-grid point between two grid points. The proposed method updates the off-grid dictionary matrix by reconstructing an optimal mesh and offsetting the off-grid vector. Both the time-domain and frequency-domain models of TDE are studied. Experimental results demonstrate that the proposed approach outperforms other state-of-the-art SBI methods and multiple signal classification, even when the grid interval is larger than the gap between true time delays.
基金The work of Y.Marzouk is supported in part by the DOE Office of Advanced Scientific Computing Research(ASCR)by Sandia Corporation(a wholly owned subsidiary of Lockheed Martin Corporation)as operator of Sandia National Laboratories under US Department of Energy contract number DE-AC04-94AL85000+1 种基金The work of D.Xiu is supported in part by AFOSR FA9550-08-1-0353,NSF CAREER Award DMS-0645035the DOE/NNSA PSAAP center at Purdue(PRISM)under contract number DE-FC52-08NA28617.
Abstract: We present an efficient numerical strategy for the Bayesian solution of inverse problems. Stochastic collocation methods, based on generalized polynomial chaos (gPC), are used to construct a polynomial approximation of the forward solution over the support of the prior distribution. This approximation then defines a surrogate posterior probability density that can be evaluated repeatedly at minimal computational cost. The ability to simulate a large number of samples from the posterior distribution results in very accurate estimates of the inverse solution and its associated uncertainty. Combined with the high accuracy of the gPC-based forward solver, the new algorithm can provide great efficiency in practical applications. A rigorous error analysis of the algorithm is conducted, in which we establish convergence of the approximate posterior to the true posterior and obtain an estimate of the convergence rate. It is proved that fast (exponential) convergence of the gPC forward solution yields similarly fast (exponential) convergence of the posterior. The numerical strategy and the predicted convergence rates are then demonstrated on nonlinear inverse problems of varying smoothness and dimension.
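The surrogate-posterior idea can be sketched in a few lines: interpolate the forward model at a handful of collocation nodes, then evaluate the cheap surrogate inside the likelihood on a dense grid. The forward model, nodes, data value, and noise level below are illustrative stand-ins, and simple Lagrange interpolation replaces a genuine gPC expansion.

```python
import math

def forward(theta):
    """Stand-in for an expensive forward model."""
    return math.sin(theta) + 0.5 * theta

# Build a polynomial surrogate by Lagrange interpolation at collocation nodes;
# the expensive model is evaluated only at these nodes.
nodes = [-1.0, 0.0, 1.0]
values = [forward(t) for t in nodes]

def surrogate(theta):
    s = 0.0
    for i, xi in enumerate(nodes):
        w = 1.0
        for j, xj in enumerate(nodes):
            if i != j:
                w *= (theta - xj) / (xi - xj)
        s += values[i] * w
    return s

def surrogate_posterior(grid, data=0.4, sigma=0.1):
    """Gaussian-likelihood posterior with a uniform prior, evaluated on a
    grid through the cheap surrogate, then normalized."""
    un = [math.exp(-0.5 * ((data - surrogate(t)) / sigma) ** 2) for t in grid]
    z = sum(un)
    return [u / z for u in un]

grid = [i / 50.0 - 1.0 for i in range(101)]   # uniform grid on [-1, 1]
post = surrogate_posterior(grid)
```

Every posterior evaluation now costs a polynomial evaluation instead of a forward solve, which is what makes dense sampling of the posterior affordable.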
基金Project supported by the National Construction of High-Quality University Projects of Graduate (No. 2008102915)the National Specially Major Fund of Water Pollution Control and Management (No. 2008ZX07314-005)the Tianjin Science and Technology Support Program (No. 09ZCGYSF00600), China
Abstract: An accurate understanding of the condition of a pipe is important for maintaining acceptable levels of service and devising appropriate maintenance and rehabilitation strategies in water supply systems. Many factors contribute to pipe deterioration. To consolidate information on these factors for assessing the condition of water pipes, this study employed a new approach based on Bayesian calibration against pipe condition to generate factor weights. Ten pipe factors from three pipe materials (cast iron, ductile cast iron, and steel) were used in this study. The factors included size, age, inner coating, outer coating, soil condition, bedding condition, trench depth, electrical recharge, number of road lanes, material, and operational pressure. To address identification problems that arise when mapping pipe factor information to actual pipe condition, informative prior factor-weight distributions based on the literature and previous knowledge of water pipe assessment were used. The influence of each factor on the pipe assessment results was estimated. Results suggested that factors with smaller weight values, or whose weights have relatively stable posterior means and narrow uncertainty bounds, have less influence on pipe condition. The model was most sensitive to variations in pipe age. Using numerical experiments with different factor combinations, a simplified model excluding trench depth, electrical recharge, and the number of road lanes is provided. The proposed Bayesian inference approach provides a more reliable assessment of pipe deterioration.
基金supported by the National Magnetic Confinement Fusion Energy Research and Development Program of China(Nos.2019YFE03090100,2019YFE03040004)the National Science Foundation for Young Scientists of China(No.12005052)。
Abstract: Data analysis for tokamak plasmas is mainly based on various diagnostic systems, which are usually modularized and independent of each other. As a result, a large amount of data is not fully and effectively exploited, which is not conducive to revealing the underlying physical mechanisms. In this work, Bayesian probability inference combined with machine learning methods has been applied to the electron cyclotron emission and Thomson scattering diagnostic systems on HL-2A/2M, and the effect of integrated data analysis (IDA) with Bayesian probability inference on the HL-2A electron temperature is demonstrated. A program was developed to infer the whole electron temperature profile with a confidence interval, and it can be applied in online analysis. The IDA results show that the full electron temperature profile can be obtained and that the diagnostic information is more comprehensive and abundant with IDA. The inference models for electron temperature analysis are established, and the developed programs will serve as an experimental data analysis tool for HL-2A/2M in the near future.
Abstract: Various works in the literature have aimed at accelerating Bayesian inference in inverse problems. Stochastic spectral methods have recently been proposed as surrogate approximations of the forward uncertainty propagation model over the support of the prior distribution. These representations are efficient because they allow affordable simulation of a large number of samples from the posterior distribution. Unfortunately, they do not perform well when the forward model exhibits strongly nonlinear behavior with respect to its input. In this work, we first relate the fast (exponential) L2-convergence of the forward approximation to the fast (exponential) convergence (in terms of Kullback-Leibler divergence) of the approximate posterior. In particular, we prove that when the prior distribution is uniform, the posterior converges at least twice as fast as the forward model in these norms. The Bayesian inference strategy is developed in the framework of a stochastic spectral projection method, and the predicted convergence rates are demonstrated on simple nonlinear inverse problems of varying smoothness. We then propose an efficient numerical approach for the Bayesian solution of inverse problems with strongly nonlinear or discontinuous system responses. It improves on the forward model by adaptively approximating it with an iterative generalized polynomial chaos based representation. The numerical approximations and predicted convergence rates of the former approach are compared with the new iterative numerical method on nonlinear time-dependent test cases of varying dimension and complexity, which are relevant to our hydrodynamics motivations and therefore to hyperbolic conservation laws and the appearance of discontinuities in finite time.