With the improvement of equipment reliability, human factors have become the most uncertain part of the system. The Standardized Plant Analysis of Risk-Human Reliability Analysis (SPAR-H) method is a reliable method in the field of human reliability analysis (HRA) for evaluating human reliability and assessing risk in large complex systems. However, the classical SPAR-H method does not consider the dependencies among performance shaping factors (PSFs), which may cause overestimation or underestimation of the risk of the actual situation. To address this issue, this paper proposes a new method to deal with the dependencies among PSFs in SPAR-H based on the Pearson correlation coefficient. First, the dependence between every pair of PSFs is measured by the Pearson correlation coefficient. Second, the weights of the PSFs are obtained by considering the total dependence degree. Finally, the PSFs' multipliers are modified based on the weights of the corresponding PSFs and then used in the calculation of human error probability (HEP). A case study illustrates the procedure and effectiveness of the proposed method.
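The dependency-adjusted calculation described above can be sketched as follows. The specific weighting scheme (down-weighting highly dependent PSFs and pulling their multipliers toward 1) and all the numbers are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def dependence_weights(pcc_matrix):
    """Weight each PSF by how independent it is of the others.

    The total dependence degree of PSF i is taken here as the sum of its
    absolute Pearson correlations with the other PSFs; the weights are
    the normalised complements, so highly dependent PSFs are
    down-weighted. (An assumed scheme, standing in for the paper's.)
    """
    pcc = np.asarray(pcc_matrix, dtype=float)
    n = pcc.shape[0]
    total_dep = np.abs(pcc).sum(axis=1) - 1.0   # drop the self-correlation
    independence = (n - 1) - total_dep          # more dependence -> lower weight
    return independence / independence.sum()

def adjusted_hep(nhep, multipliers, weights):
    """Modify each PSF multiplier by its weight before the SPAR-H product."""
    m = np.asarray(multipliers, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Weighted geometric damping: a fully independent PSF keeps its
    # multiplier; dependent ones are pulled toward 1 (illustrative choice).
    modified = m ** (w * len(m))
    return nhep * modified.prod()

# Hypothetical pairwise Pearson correlations among three PSFs:
pcc = np.array([[1.0, 0.6, 0.1],
                [0.6, 1.0, 0.2],
                [0.1, 0.2, 1.0]])
w = dependence_weights(pcc)
hep = adjusted_hep(0.01, [2.0, 2.0, 5.0], w)   # nominal HEP 0.01
```

With a strong correlation between the first two PSFs, their multipliers are damped relative to the third, so the product no longer double-counts the shared influence.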
Prediction of reservoir fractures is the key to exploring fracture-type reservoirs. When a shear wave propagates in anisotropic media containing fractures, it splits into two polarized shear waves: a fast shear wave and a slow shear wave. The polarization and time delay of the fast and slow shear waves can be used to predict the azimuth and density of fractures. The current identification method for fracture azimuth and fracture density is the cross-correlation method, which assumes that the fast and slow shear waves are symmetrical wavelets after complete separation and uses the similarity of the wavelets to identify fracture azimuth and density; in experiments, however, its identification accuracy is poor. The Pearson correlation coefficient method is one of the methods for separating the fast and slow waves; it is faster in calculation and better in noise immunity and resolution than the traditional cross-correlation method. Optimizing the Pearson correlation coefficient is a non-linear problem, and particle swarm optimization (PSO) is a good non-linear global optimization method that converges fast and is easy to implement. In this study, PSO is combined with the Pearson correlation coefficient method to identify fracture properties and improve computational efficiency.
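A minimal illustration of the time-delay estimation behind this approach: scan candidate lags and keep the one that maximizes the Pearson coefficient between the fast wave and the shifted slow wave. The brute-force lag scan stands in for the PSO search, and the synthetic Gaussian wavelet is an assumption.

```python
import numpy as np

def pearson(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return (xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym))

def best_delay(fast, slow, max_lag):
    """Estimate the slow-wave delay as the lag maximising the Pearson
    coefficient between the fast wave and the shifted slow wave."""
    best, best_r = 0, -2.0
    n = len(fast)
    for lag in range(max_lag + 1):
        r = pearson(fast[:n - lag], slow[lag:])
        if r > best_r:
            best, best_r = lag, r
    return best, best_r

t = np.linspace(0.0, 1.0, 200)
wavelet = np.exp(-((t - 0.3) ** 2) / 0.002)   # synthetic symmetric wavelet
fast = wavelet
slow = np.roll(wavelet, 15)                   # slow wave delayed by 15 samples
lag, r = best_delay(fast, slow, 40)
```

In the real method the search space also includes the polarization rotation angle, which is where a global optimizer such as PSO pays off over exhaustive scanning.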
We propose two simple regression models of the Pearson correlation coefficient of two normal responses or binary responses to assess the effect of covariates of interest. Likelihood-based inference is established to estimate the regression coefficients, upon which a bootstrap-based method is used to test the significance of the covariates of interest. Simulation studies show the effectiveness of the method in terms of type-I error control, power performance at moderate sample sizes, and robustness with respect to model mis-specification. We illustrate the application of the proposed method with real data concerning health measurements.
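The bootstrap-based significance idea can be sketched for the simplest case, a binary covariate: resample within each group and form a percentile interval for the difference in Pearson correlations. This illustrates the testing step only, not the paper's likelihood-based regression model; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

def pearson(x, y):
    xm, ym = x - x.mean(), y - y.mean()
    return (xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym))

def bootstrap_corr_diff(x0, y0, x1, y1, n_boot=2000):
    """Bootstrap the difference in Pearson correlation between two groups
    defined by a binary covariate; return the 95% percentile interval."""
    diffs = np.empty(n_boot)
    n0, n1 = len(x0), len(x1)
    for b in range(n_boot):
        i0 = rng.integers(0, n0, n0)   # resample within each group
        i1 = rng.integers(0, n1, n1)
        diffs[b] = pearson(x0[i0], y0[i0]) - pearson(x1[i1], y1[i1])
    return np.percentile(diffs, [2.5, 97.5])

# Group 0 strongly correlated, group 1 weakly correlated:
x0 = rng.normal(size=300); y0 = 0.9 * x0 + 0.3 * rng.normal(size=300)
x1 = rng.normal(size=300); y1 = 0.2 * x1 + 1.0 * rng.normal(size=300)
lo, hi = bootstrap_corr_diff(x0, y0, x1, y1)
# If the interval excludes 0, the covariate affects the correlation.
```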
In view of the difficulty of predicting the cost data of power transmission and transformation projects at present, a method based on the Pearson correlation coefficient, improved particle swarm optimization (IPSO), and the extreme learning machine (ELM) is proposed. The Pearson correlation coefficient is used to screen out the main influencing factors as the input-independent variables of the ELM algorithm, and IPSO based on a ladder-structure coding method is used to optimize the number of hidden-layer nodes, input weights, and bias values of the ELM. A prediction model for the cost data of power transmission and transformation projects based on the Pearson correlation coefficient-IPSO-ELM algorithm is thus constructed. Analysis of calculation examples shows that the prediction accuracy of the proposed method is higher than that of other algorithms, which verifies the effectiveness of the model.
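The Pearson screening step might look like the following sketch; the 0.5 threshold and the synthetic data are illustrative assumptions, since the abstract does not state a cut-off.

```python
import numpy as np

def screen_features(X, y, threshold=0.5):
    """Keep columns of X whose absolute Pearson correlation with y exceeds
    the threshold -- the screening step ahead of the IPSO-ELM model.
    (The threshold value is an assumption.)"""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc.T @ yc) / (np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt(yc @ yc))
    keep = np.abs(r) >= threshold
    return X[:, keep], r

rng = np.random.default_rng(1)
n = 200
strong = rng.normal(size=n)          # a factor that drives the cost
weak = rng.normal(size=n)            # a factor with negligible influence
noise = rng.normal(size=n)           # an irrelevant factor
y = 2.0 * strong + 0.05 * weak + 0.1 * rng.normal(size=n)
X = np.column_stack([strong, weak, noise])
X_sel, r = screen_features(X, y)     # only the strong factor survives
```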
Hesitancy fuzzy graphs (HFGs), an extension of fuzzy graphs, are useful tools for dealing with ambiguity and uncertainty in decision-making (DM) problems. This research implements a correlation coefficient measure (CCM) to assess the strength of the association between HFGs, since CCMs have a high capacity to process and interpret data. The proposed CCM between HFGs has better qualities than existing ones: it loosens the restrictions on the length of hesitant fuzzy elements and can establish whether HFGs are related negatively or positively. Additionally, a CCM-based attribute DM approach is built in a hesitant fuzzy environment. This article also proposes weighted correlation coefficient measures (WCCMs), based on the CCM concept, to quantify the correlation between two HFGs. The decision-making problems of hesitancy fuzzy preference relations (HFPRs) are considered, and a new technique is proposed for assessing the relative weights of experts based on the uncertainty of the HFPRs and the correlation coefficient degree of each HFPR. The ranking order of all alternatives, and the best one, is determined using the CCMs between each option and the ideal choice. An example demonstrates the viability of the new strategies.
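For HFGs given as sequences of hesitant fuzzy elements (one list of membership values per vertex), a centred correlation measure that permits negative values and tolerates unequal element lengths can be sketched as below. The pad-with-maximum convention for unequal lengths is an assumed convention, not necessarily the paper's construction.

```python
import numpy as np

def hfe_corr(A, B):
    """Centred correlation between two hesitancy fuzzy element sequences.

    Each of A and B is a list of HFEs (lists of membership values in
    [0, 1]), one per vertex. Within each pair, the shorter HFE is padded
    with its own maximum so the lengths match (assumed convention). The
    result lies in [-1, 1], so negative association is visible, as the
    proposed CCM requires.
    """
    def pad(h, L):
        h = sorted(h, reverse=True)
        return h + [max(h)] * (L - len(h))
    vals_a, vals_b = [], []
    for ha, hb in zip(A, B):
        L = max(len(ha), len(hb))
        vals_a += pad(ha, L)
        vals_b += pad(hb, L)
    a = np.array(vals_a) - np.mean(vals_a)
    b = np.array(vals_b) - np.mean(vals_b)
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

# Two small HFGs over three vertices, with unequal hesitant lengths:
A = [[0.8, 0.6], [0.4], [0.7, 0.5, 0.3]]
B = [[0.7, 0.5], [0.3, 0.2], [0.6, 0.4]]
rho = hfe_corr(A, B)
```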
Aim To study the reason for the insensitivity of the Pearson product-moment correlation coefficient as a similarity measure and a method to improve its sensitivity. Methods Experimental and simulated data sets were used. Results The distribution range of the data sets influences the sensitivity of the Pearson product-moment correlation coefficient; the weighted Pearson product-moment correlation coefficient is more sensitive when the range of the data set is large. Conclusion The weighted Pearson product-moment correlation coefficient is necessary when the range of the data set is large.
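A weighted Pearson product-moment coefficient, with the means, variances, and covariance all taken under the weights, can be written as below. With uniform weights it reduces to the ordinary coefficient; the choice of non-uniform weights (here, down-weighting large values so an extreme point does not dominate) is up to the analyst and is not taken from the paper.

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Weighted Pearson product-moment correlation coefficient, with the
    weighted means used for centring."""
    x, y, w = (np.asarray(v, dtype=float) for v in (x, y, w))
    w = w / w.sum()
    xm = x - (w * x).sum()
    ym = y - (w * y).sum()
    return (w * xm * ym).sum() / np.sqrt((w * xm**2).sum() * (w * ym**2).sum())

# A data set with a large range: one extreme point dominates the plain r.
x = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
y = np.array([1.1, 1.9, 3.2, 3.9, 98.0])
r_plain = weighted_pearson(x, y, np.ones_like(x))   # equals the ordinary r
r_down = weighted_pearson(x, y, 1.0 / x)            # de-emphasise the extreme point
```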
The district cooling system (DCS) with ice storage can reduce the peak electricity demand of the business district buildings it serves, improve system efficiency, and lower operational costs. This study utilizes a monitoring and control platform for a DCS with ice storage to analyze historical parameter values related to system operation and executed operations. We assess the distribution of cooling loads among the various devices within the DCS, identify operational characteristics of the system through correlation analysis and principal component analysis (PCA), and subsequently determine the key parameters affecting changes in cooling loads. Accurate forecasting of cooling loads is crucial for determining optimal control strategies. The research process can be summarized as follows: data preprocessing, parameter analysis, parameter selection, and validation of load forecasting performance. The study reveals that while individual devices in the system perform well, there is considerable room for improving overall system efficiency. Six principal components were identified as input parameters for the cooling load forecasting model, each with an eigenvalue greater than 1 and together accounting for an accumulated variance of 87.26%; during the dimensionality reduction we also obtained a confidence ellipse at the 95% confidence level. Regarding cooling load forecasting, the Relative Absolute Error (RAE) of the light gradient boosting machine (LightGBM) algorithm is 3.62%, the Relative Root Mean Square Error (RRMSE) is 42.75%, and the R-squared value (R²) is 92.96%, indicating superior forecasting performance compared with other commonly used cooling load forecasting algorithms.
This research provides valuable insights and auxiliary guidance for data analysis and optimizing operations in practical engineering applications.
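The eigenvalue-greater-than-1 selection used in the study can be sketched with plain PCA on standardised data. The synthetic data here (two latent drivers observed through six noisy sensors, plus two pure-noise channels) is an assumption standing in for the plant measurements.

```python
import numpy as np

def kaiser_components(X):
    """PCA on standardised data: keep components with eigenvalue > 1 (the
    criterion behind the study's six retained components) and report the
    cumulative explained variance of the kept set."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigval)[::-1]                 # descending eigenvalues
    eigval, eigvec = eigval[order], eigvec[:, order]
    k = int((eigval > 1.0).sum())
    explained = eigval[:k].sum() / eigval.sum()
    return Z @ eigvec[:, :k], eigval, explained

rng = np.random.default_rng(2)
base = rng.normal(size=(500, 2))                       # two latent drivers
sensors = np.repeat(base, 3, axis=1) + 0.1 * rng.normal(size=(500, 6))
X = np.column_stack([sensors, rng.normal(size=(500, 2))])
scores, eigval, explained = kaiser_components(X)       # scores feed the forecaster
```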
The running correlation coefficient (RCC) is useful for capturing temporal variations in correlations between two time series. The local running correlation coefficient (LRCC) is a widely used algorithm that directly applies the Pearson correlation to a time window. A new algorithm called the synthetic running correlation coefficient (SRCC) was proposed in 2018 and shown to be reasonable and usable; however, it lacked a theoretical demonstration. In this paper, the SRCC is proven theoretically. An RCC is only meaningful when its values at different times can be compared. First, the global means are proven to be the unique standard quantities for comparison, and the SRCC is the only RCC that satisfies this comparability criterion. The relationship between the LRCC and SRCC is derived using statistical methods, and the SRCC is obtained by adding a constraint condition to the LRCC algorithm. Dividing the temporal fluctuations into high- and low-frequency signals reveals that the LRCC reflects only the correlation of the high-frequency signals; by contrast, the SRCC reflects the correlations of the high- and low-frequency signals simultaneously. Therefore, the SRCC is the appropriate method for calculating RCCs.
A theoretical model to correlate and predict liquid diffusion coefficients in binary systems has been developed. Based on this model, the diffusion coefficients of 73 binary systems have been correlated, with an overall average deviation of 0.009. For binary systems, the diffusion coefficients have been predicted from vapor-liquid equilibrium (VLE) data and vice versa.
Sensory properties and physico-chemical parameters of the 10 most popular brands of commercial set-type Turkish yoghurts were evaluated, and the correlation coefficients between the two sets of indices were investigated. The results indicated that increases in volatile compounds (acetaldehyde, 2-butanone, 2-nonanone, ethyl acetate), titratable acidity, ash, and fat contents correlated inversely with the overall acceptability score of the yoghurt, whereas diacetyl, C4 to C12 free fatty acids, pH, whiteness index, and texture correlated positively with overall acceptability. It was concluded that the acceptability of Turkish set-type yoghurts is mainly governed by the fifteen volatile compounds as well as the physico-chemical properties determined. Thus, the overall acceptability of the yoghurts was not influenced by a single characteristic but rather was complex in nature.
In order to study the temporal variations of correlations between two time series, a running correlation coefficient (RCC) can be used. An RCC is calculated for a given time window, and the window is then moved sequentially through time. The current calculation method for RCCs is based on the general definition of the Pearson product-moment correlation coefficient, calculated with the data within the time window; we call this the local running correlation coefficient (LRCC). The LRCC is calculated from the two anomalies corresponding to the two local means, while the local means themselves also vary. It is clarified that the LRCC reflects only the correlation between the two anomalies within the time window and fails to exhibit the contributions of the two varying means. To address this problem, two unchanged means obtained from all available data are adopted to calculate an RCC, which we call the synthetic running correlation coefficient (SRCC). When the anomaly variations are dominant, the two RCCs are similar; however, when the variations of the means are dominant, the difference between them becomes obvious. The SRCC reflects the correlations of both the anomaly variations and the variations of the means, so SRCC values from different time points are intercomparable. A criterion for the superiority of an RCC algorithm is that the average value of the RCC should be close to the global correlation coefficient calculated using all data. The SRCC always meets this criterion, while the LRCC sometimes fails. Therefore, the SRCC is better than the LRCC for running correlations, and we suggest using the SRCC to calculate RCCs.
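The two definitions differ only in which means the anomalies are taken about, which a short implementation makes concrete. The synthetic example (a shared slow trend plus independent fast wiggles) is illustrative.

```python
import numpy as np

def running_corr(x, y, win, synthetic=True):
    """Running correlation over a sliding window.

    synthetic=False: LRCC, ordinary Pearson with the *local* window means.
    synthetic=True : SRCC, the same windowed sums but with anomalies taken
    about the *global* means, so values at different times are comparable.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    gx, gy = x.mean(), y.mean()
    out = []
    for i in range(len(x) - win + 1):
        xs, ys = x[i:i + win], y[i:i + win]
        if synthetic:
            xa, ya = xs - gx, ys - gy
        else:
            xa, ya = xs - xs.mean(), ys - ys.mean()
        out.append((xa @ ya) / np.sqrt((xa @ xa) * (ya @ ya)))
    return np.array(out)

# Two series sharing a slow trend but with independent fast wiggles:
rng = np.random.default_rng(3)
t = np.arange(400)
trend = np.sin(2 * np.pi * t / 400)
x = trend + 0.3 * rng.normal(size=t.size)
y = trend + 0.3 * rng.normal(size=t.size)
lrcc = running_corr(x, y, 50, synthetic=False)
srcc = running_corr(x, y, 50, synthetic=True)
# SRCC keeps the trend's contribution, so its average sits near the global
# coefficient; LRCC mostly sees the uncorrelated wiggles inside each window.
```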
Due to the induced polarization (IP) effect, sign reversals often occur in time-domain airborne electromagnetic (AEM) data, and inversions that do not consider the IP effect cannot recover the true underground electrical structures. Time-domain airborne IP data involve many parameters whose sensitivities differ greatly, which challenges the stability and accuracy of the inversion. In this paper, we propose an inversion method for time-domain AEM data with the IP effect based on Pearson correlation constraints. The method uses the Pearson correlation coefficient from statistics to characterize the correlation between resistivity and chargeability and constructs Pearson correlation constraints in the inversion objective function to reduce the non-uniqueness of the inversion. To verify the effectiveness of this method, we performed both Occam's inversion and Pearson correlation constrained inversion on synthetic data. The experiments show that the Pearson correlation constrained inversion is more accurate and stable than Occam's inversion. Finally, we applied the inversion to a survey dataset with and without the IP effect. The results show that the data misfit and the continuity of the inverted section are greatly improved when the IP effect is considered.
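One plausible form of such a constrained objective is a data misfit plus a penalty 1 - |r| that grows as the depth profiles of log-resistivity and chargeability decorrelate. The combination and the weight beta are illustrative assumptions, not the paper's exact functional.

```python
import numpy as np

def pearson(a, b):
    am, bm = a - a.mean(), b - b.mean()
    return (am @ bm) / np.sqrt((am @ am) * (bm @ bm))

def constrained_objective(d_obs, d_pred, log_res, charge, beta=1.0):
    """Data misfit plus a Pearson correlation constraint coupling the
    log-resistivity and chargeability depth profiles (a sketch of the
    idea; beta and the penalty form are assumptions)."""
    misfit = np.sum((d_obs - d_pred) ** 2)
    penalty = 1.0 - abs(pearson(log_res, charge))
    return misfit + beta * penalty

depth_profile = np.linspace(0.0, 2.0, 30)   # hypothetical log-resistivity profile
correlated = 0.5 * depth_profile + 0.1      # chargeability tracking the resistivity
alternating = np.tile([0.1, 0.6], 15)       # chargeability ignoring it
obs = pred = np.zeros(10)                   # equal (zero) data misfit for both
obj_corr = constrained_objective(obs, pred, depth_profile, correlated, beta=5.0)
obj_unc = constrained_objective(obs, pred, depth_profile, alternating, beta=5.0)
# At equal data misfit, the constraint favours the correlated model pair.
```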
Fluidization of non-spherical particles is very common in petroleum engineering, and understanding the complex phenomena of non-spherical particle flow is of great significance. In this paper, a drag coefficient correlation based on an artificial neural network, coupled with a two-fluid model, was applied in simulations of a bubbling fluidized bed filled with non-spherical particles. The simulation results were compared with experimental data from the literature, and the good agreement between them reveals that the modified drag model can accurately capture the interaction between the gas phase and the solid phase. Several cases with different particles, including tetrahedra, cubes, and spheres, together with the nylon beads used in the model validation, were then simulated to study the effect of particle shape on the flow behavior in the bubbling fluidized bed. Particle shape affects the hydrodynamics of non-spherical particles mainly on the microscale. This work can serve as a basis and reference for the use of artificial neural networks in the investigation of drag coefficient correlations in dense gas-solid two-phase flow. Moreover, the proposed drag coefficient correlation provides one more option when investigating the hydrodynamics of non-spherical particles in gas-solid fluidized beds.
The correlation coefficients of the random variables of mechanical structures are generally chosen by experience or even ignored, which cannot actually reflect the effects of parameter uncertainties on reliability. To discuss the selection of correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established based on the results of reliability sensitivity analysis, and a criterion of correlation among random variables is given. The values of the correlation coefficients are obtained according to the proposed principle, and the reliability sensitivity problem is discussed. Numerical studies have shown the following results: (1) If the sensitivity value of a correlation coefficient ρ is small (at the magnitude of 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error. (2) When the difference between ρs, the value most sensitive to the reliability, and ρR, the value giving the smallest reliability, is less than 0.001, ρs is suggested for modeling the dependency of the random variables; this ensures the robustness of the system without loss of the safety requirement. (3) When |Eabs|ρ and |Erel|ρ both exceed 0.001, ρR should be employed to quantify the correlation among random variables in order to ensure the accuracy of the reliability analysis. The proposed approach provides a practical routine for mechanical design and manufacture to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.
Funding (SPAR-H dependency study): Shanghai Rising-Star Program (Grant No. 21QA1403400), Shanghai Sailing Program (Grant No. 20YF1414800), and Shanghai Key Laboratory of Power Station Automation Technology (Grant No. 13DZ2273800).
Funding (hesitancy fuzzy graph study): supported and funded by the Vellore Institute of Technology.
Funding (SRCC theoretical study): the National Natural Science Foundation of China (Nos. 41976022, 41941012) and the Major Scientific and Technological Innovation Projects of Shandong Province (No. 2018SDKJ0104-1).
Funding (SRCC running-correlation study): the Key Program of the National Natural Science Foundation of China (No. 41330960) and the Global Change Research Program of China (No. 2015CB953900).
基金This paper was fi nancially supported by the National Natural Science Foundation of China(Nos.42030806,41774125,41904104,41804098)the Pioneer Project of Chinese Academy of Sciences(No.XDA14020102).
Abstract: Due to the induced polarization (IP) effect, sign reversals often occur in time-domain airborne electromagnetic (AEM) data. Inversions that do not consider the IP effect cannot recover the true underground electrical structures. Time-domain airborne IP data involve many parameters, and the large differences in sensitivity among them challenge the stability and accuracy of the inversion. In this paper, we propose an inversion method for time-domain AEM data with the IP effect based on Pearson correlation constraints. This method uses the Pearson correlation coefficient from statistics to characterize the correlation between resistivity and chargeability, and constructs Pearson correlation constraints in the inversion objective function to reduce the non-uniqueness of the inversion. To verify the effectiveness of this method, we perform both Occam's inversion and Pearson-correlation-constrained inversion on synthetic data. The experiments show that the Pearson-correlation-constrained inversion is more accurate and stable than Occam's inversion. Finally, we apply the inversion to a survey dataset, both with and without the IP effect. The results show that the data misfit and the continuity of the inverted section are greatly improved when the IP effect is considered.
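As a sketch of the kind of regularizer the abstract describes, one can append a term to the inversion objective that penalizes weak Pearson correlation between the resistivity and chargeability model vectors. The penalty form 1 - |r|, the use of log-resistivity, and the weight lam_p are assumptions for illustration, not the authors' exact formulation:

```python
# Hedged sketch of a Pearson-correlation regularizer: the penalty is zero
# when the two model vectors are perfectly (anti-)correlated and grows as
# the correlation weakens, steering the inversion toward coupled models.
import numpy as np

def pearson_penalty(log_rho, eta):
    """1 - |r|: zero for perfectly correlated resistivity/chargeability."""
    r = np.corrcoef(log_rho, eta)[0, 1]
    return 1.0 - abs(r)

def objective(d_obs, d_pred, log_rho, eta, lam_p=1.0):
    """Data misfit plus the Pearson correlation constraint (lam_p assumed)."""
    misfit = np.sum((d_obs - d_pred) ** 2)
    return misfit + lam_p * pearson_penalty(log_rho, eta)

# Perfectly coupled model vectors incur no penalty
print(pearson_penalty(np.array([1.0, 2.0, 3.0, 4.0]),
                      np.array([2.0, 4.0, 6.0, 8.0])))
```

Minimizing such an objective trades data fit against model coupling, which is one way a correlation constraint can reduce the non-uniqueness the abstract mentions.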
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51706055).
Abstract: Fluidization of non-spherical particles is very common in petroleum engineering, and understanding the complex phenomenon of non-spherical particle flow is of great significance. In this paper, a drag coefficient correlation based on an artificial neural network, coupled with a two-fluid model, was applied in simulations of a bubbling fluidized bed filled with non-spherical particles. The simulation results were compared with experimental data from the literature. Good agreement between the experimental data and the simulation results shows that the modified drag model can accurately capture the interaction between the gas phase and the solid phase. Then, several cases with different particles, including tetrahedra, cubes and spheres, together with the nylon beads used in the model validation, were simulated to study the effect of particle shape on flow behavior in the bubbling fluidized bed. Particle shape affects the hydrodynamics of non-spherical particles mainly on the microscale. This work can serve as a basis and reference for the use of artificial neural networks in developing drag coefficient correlations for dense gas-solid two-phase flow. Moreover, the proposed drag coefficient correlation provides an additional option when investigating the hydrodynamics of non-spherical particles in gas-solid fluidized beds.
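A minimal sketch of the kind of network such a drag correlation could use: a small feed-forward net mapping Reynolds number and sphericity to a drag coefficient. The architecture, the choice of inputs, and the untrained random weights are all assumptions for illustration; the actual model in the study and its training data are not specified here:

```python
# Hypothetical feed-forward drag-coefficient net (untrained weights).
# A real correlation of this kind would be fitted to resolved simulation
# or experimental drag data before use in a two-fluid model.
import numpy as np

rng = np.random.default_rng(42)

class DragNet:
    def __init__(self, n_hidden=8):
        self.W1 = rng.normal(0, 0.5, (2, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def __call__(self, re, sphericity):
        x = np.array([np.log10(re), sphericity])  # log-scale Re input
        h = np.tanh(x @ self.W1 + self.b1)
        # softplus output keeps the predicted drag coefficient positive
        return float(np.log1p(np.exp(h @ self.W2 + self.b2)))

net = DragNet()
cd = net(re=100.0, sphericity=0.7)   # e.g. a tetrahedron-like particle
print(cd)
```

Inside a two-fluid solver, such a function would be evaluated per cell to close the gas-solid momentum exchange term.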
Funding: Supported by the Changjiang Scholars and Innovative Research Team in University Program of China (Grant No. IRT0816), the Key National Science & Technology Special Project on "High-Grade CNC Machine Tools and Basic Manufacturing Equipments" of China (Grant No. 2010ZX04014-014), the National Natural Science Foundation of China (Grant No. 50875039), and the Key Projects in the National Science & Technology Pillar Program during the 11th Five-Year Plan Period of China (Grant No. 2009BAG12A02-A07-2).
Abstract: The correlation coefficients of the random variables of mechanical structures are generally chosen by experience or even ignored, which cannot actually reflect the effects of parameter uncertainties on reliability. To address the problem of selecting correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established based on the reliability sensitivity results, and a criterion for correlation among random variables is given. The values of the correlation coefficients are obtained according to the proposed principle, and the reliability sensitivity problem is discussed. Numerical studies have shown the following results: (1) If the sensitivity value of the correlation coefficient ρ is sufficiently small (on the order of 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error. (2) If the difference in reliability between ρs, the coefficient to which the reliability is most sensitive, and ρR, the coefficient giving the smallest reliability, is less than 0.001, ρs is suggested for modeling the dependency of the random variables; this ensures the robustness of the system without loss of the safety requirement. (3) When both the absolute error |Eabs|ρ and the relative error |Erel|ρ exceed 0.001, ρR should be employed to quantify the correlation among the random variables in order to ensure the accuracy of the reliability analysis. The proposed approach provides a practical routine for mechanical design and manufacturing to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.
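The three numerical findings amount to a decision rule for picking a correlation coefficient. A sketch of that rule, assuming the thresholds 1e-5 and 0.001 as stated; the function and parameter names are hypothetical:

```python
# Hypothetical encoding of the selection rule in findings (1)-(3).

def choose_rho(sens_rho, rho_s, rho_R, rel_diff):
    """Pick a correlation coefficient for the reliability model.

    sens_rho: reliability sensitivity with respect to rho
    rho_s:    rho to which the reliability is most sensitive
    rho_R:    rho giving the smallest (most conservative) reliability
    rel_diff: reliability difference between using rho_s and rho_R
    """
    if abs(sens_rho) < 1e-5:
        return None        # (1) correlation negligible: ignore it
    if rel_diff < 0.001:
        return rho_s       # (2) robust choice, safety requirement kept
    return rho_R           # (3) conservative choice preserves accuracy

print(choose_rho(sens_rho=1e-7, rho_s=0.3, rho_R=0.5, rel_diff=0.01))  # None
```

In practice the sensitivities and reliability differences would come from the reliability-sensitivity analysis the abstract describes; the rule then selects which coefficient enters the final reliability model.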