In this paper, we study spatial cross-sectional data models in the form of matrix exponential spatial specification (MESS), where MESS appears in both the dependent and error terms. Empirical likelihood (EL) ratio statistics are established for the parameters of the MESS model. It is shown that the limiting distributions of the EL ratio statistics follow chi-square distributions, which are used to construct confidence regions for the model parameters. Simulation experiments are conducted to compare the performance of confidence regions based on the EL method and the normal approximation method.
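The chi-square calibration described above is the Wilks-type property of empirical likelihood. As a hedged illustration only (for a simple population mean, not the MESS model itself), the EL ratio statistic and its chi-square confidence-region test can be sketched as:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_ratio_stat(x, mu):
    """-2 log EL ratio for H0: E[X] = mu; asymptotically chi-square(1)."""
    a = x - mu
    if a.max() <= 0 or a.min() >= 0:
        return np.inf  # mu lies outside the convex hull of the data
    # Solve sum_i a_i / (1 + lam * a_i) = 0 for the Lagrange multiplier lam;
    # the bracket keeps all weights 1 + lam * a_i strictly positive.
    lo = -1.0 / a.max() + 1e-10
    hi = -1.0 / a.min() - 1e-10
    lam = brentq(lambda l: np.sum(a / (1.0 + l * a)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * a))

def el_confidence_interval_covers(x, mu, level=0.95):
    """True if mu lies inside the level-EL confidence region."""
    return el_ratio_stat(x, mu) <= chi2.ppf(level, df=1)
```

The confidence region is simply the set of candidate values whose statistic falls below the chi-square quantile, exactly as in the normal-approximation comparison the abstract describes.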
Noise from finite element simulation often causes the model to fall into a local optimum and to overfit during generator optimization. This paper therefore proposes a Gaussian Process Regression (GPR) model based on Conditional Likelihood Lower Bound Search (CLLBS) to optimize the design of the generator; it can filter the noise in the data and search for the global optimum by incorporating the CLLBS method. The efficiency optimization of a 15 kW Permanent Magnet Synchronous Motor is taken as an example. First, elementary effect analysis is used to choose the sensitive variables, combined with an evolutionary algorithm to design the Latin hypercube sampling plan; the generator-converter system is then simulated on a co-simulation platform to obtain data. A Gaussian process regression model combined with the conditional likelihood lower bound search is established, and a chi-square test is used to optimize the accuracy of the model globally. Second, after the model reaches the required accuracy, the Pareto frontier is obtained through the NSGA-II algorithm with the maximum output torque as a constraint. Finally, the constrained optimization is transformed into an unconstrained problem by introducing the constrained expected improvement (CEI) optimization method based on the re-interpolation model, which cross-validates the optimization results of the Gaussian process regression model. The method increases the efficiency of the generator by 0.76% and 0.5%, respectively, and can be used for rapid modeling and multi-objective optimization of generator systems.
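The surrogate at the core of the approach above is ordinary GP regression; the CLLBS noise-filtering step is specific to the paper and not reproduced here. A minimal sketch of a GP posterior mean with an RBF kernel (the length-scale and noise values are hypothetical placeholders) is:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var=1e-2, length_scale=1.0):
    """Posterior mean of a zero-mean GP: k(x*, X) (K + s^2 I)^{-1} y."""
    K = rbf_kernel(x_train, x_train, length_scale) + noise_var * np.eye(len(x_train))
    return rbf_kernel(x_test, x_train, length_scale) @ np.linalg.solve(K, y_train)
```

In a design-optimization loop such as the one described, a surrogate like this replaces expensive finite-element evaluations, and the optimizer (e.g. NSGA-II) queries the surrogate instead of the simulator.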
BACKGROUND: Adolescent major depressive disorder (MDD) is a significant mental health concern that often leads to recurrent depression in adulthood. Resting-state functional magnetic resonance imaging (rs-fMRI) offers unique insights into the neural mechanisms underlying this condition. However, despite previous research, the specific vulnerable brain regions affected in adolescent MDD patients have not been fully elucidated. AIM: To identify consistent vulnerable brain regions in adolescent MDD patients using rs-fMRI and activation likelihood estimation (ALE) meta-analysis. METHODS: We performed a comprehensive literature search through July 12, 2023, for studies investigating brain functional changes in adolescent MDD patients. We utilized regional homogeneity (ReHo), amplitude of low-frequency fluctuations (ALFF) and fractional ALFF (fALFF) analyses. We compared the regions of aberrant spontaneous neural activity in adolescents with MDD vs healthy controls (HCs) using ALE. RESULTS: Ten studies (369 adolescent MDD patients and 313 HCs) were included. Combining the ReHo and ALFF/fALFF data, the results revealed that activity in the right cuneus and left precuneus was lower in the adolescent MDD patients than in the HCs (voxel size: 648 mm³, P < 0.05), and no brain region exhibited increased activity. Based on the ALFF data, we found decreased activity in the right cuneus and left precuneus in adolescent MDD patients (voxel size: 736 mm³, P < 0.05), with no regions exhibiting increased activity. CONCLUSION: Through ALE meta-analysis, we consistently identified the right cuneus and left precuneus as vulnerable brain regions in adolescent MDD patients, increasing our understanding of the neuropathology of affected adolescents.
Neutron spectrum unfolding by the Bonner sphere spectrometer (BSS) is a complex multidimensional problem, requiring sophisticated mathematical methods to solve a Fredholm integral equation of the first kind. The maximum likelihood expectation maximization (MLEM) algorithm is prone to the pitfalls of local optima, while the particle swarm optimization (PSO) algorithm tends to produce unreasonable flight directions and step lengths for particles, leading to invalid iterations that affect efficiency and accuracy. To address these problems, an improved PSO-MLEM algorithm, combining PSO and MLEM, is proposed for neutron spectrum unfolding. A dynamic acceleration factor is used to balance global and local search and improves the convergence speed and accuracy of the algorithm. First, the Monte Carlo method was used to simulate the BSS to obtain its response function and count rates. In the simulation of count rates, four reference spectra from IAEA Technical Report Series No. 403 were used as input parameters of the Monte Carlo method. The PSO-MLEM algorithm was used to unfold the neutron spectrum of the simulated data and was verified by the difference between the unfolded spectrum and the reference spectrum. Finally, a 252Cf neutron source was measured by the BSS, and the PSO-MLEM algorithm was used to unfold the experimental neutron spectrum. Compared with maximum entropy deconvolution (MAXED), PSO, and MLEM, the PSO-MLEM algorithm has fewer parameters and automatically adjusts the dynamic acceleration factor to avoid local optima. The convergence speed of PSO-MLEM is 1.4 times and 3.1 times that of the MLEM and PSO algorithms, respectively. Compared with PSO, MLEM and MAXED, the correlation coefficients of the PSO-MLEM algorithm are increased by 33.1%, 33.5% and 1.9%, and the relative mean errors are decreased by 98.2%, 97.8% and 67.4%.
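The MLEM update at the heart of the hybrid algorithm above is standard and can be sketched as follows; the synthetic response matrix is a toy placeholder, and the paper's PSO coupling and dynamic acceleration factor are not reproduced:

```python
import numpy as np

def mlem_unfold(response, counts, n_iter=2000):
    """Classic MLEM iteration for spectrum unfolding:
    phi_j <- phi_j * [sum_i R_ij * c_i / (R phi)_i] / sum_i R_ij."""
    n_bins = response.shape[1]
    phi = np.full(n_bins, counts.sum() / n_bins)   # flat starting spectrum
    sensitivity = response.sum(axis=0)             # column sums of the response
    for _ in range(n_iter):
        predicted = response @ phi                 # forward-folded count rates
        phi = phi * (response.T @ (counts / predicted)) / sensitivity
    return phi
```

Because the update is multiplicative, a non-negative starting spectrum stays non-negative throughout, which is one reason MLEM is popular for unfolding problems like this one.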
Maximum likelihood estimation (MLE) is an effective method for localizing radioactive sources in a given area. However, it requires an exhaustive search for parameter estimation, which is time-consuming. In this study, heuristic techniques were employed to search for the radiation source parameters that maximize the likelihood over a network of sensors, thereby effectively reducing the time consumption of MLE. First, the radiation source was detected using the k-sigma method. Subsequently, MLE was applied for parameter estimation using the readings and positions of the detectors that had detected the source. A comparative study was performed in which the estimation accuracy and time consumption of MLE were evaluated for traditional methods and heuristic techniques. Traditional MLE was performed via a grid search using fixed and multiple resolutions. Additionally, four commonly used heuristic algorithms were applied: the firefly algorithm (FFA), particle swarm optimization (PSO), ant colony optimization (ACO), and artificial bee colony (ABC). The experiment was conducted using real data collected by the Low Scatter Irradiator facility at the Savannah River National Laboratory as part of the Intelligent Radiation Sensing System program. The comparative study showed that the estimation time was 3.27 s using fixed-resolution MLE and 0.59 s using multi-resolution MLE. The time consumption for the heuristic-based MLE was 0.75, 0.03, 0.02, and 0.059 s for FFA, PSO, ACO, and ABC, respectively. The location estimation error was approximately 0.4 m using either the grid search-based MLE or the heuristic-based MLE. Hence, heuristic-based MLE can provide comparable estimation accuracy through a less time-consuming process than traditional MLE.
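The grid-search MLE baseline described above can be sketched with a Poisson count model and inverse-square attenuation; the detector layout, activity, and background values below are hypothetical, not from the paper:

```python
import numpy as np

def log_likelihood(src, detectors, counts, activity, background=0.1):
    """Poisson log-likelihood (up to a constant) of detector counts for a source at src."""
    d2 = np.sum((detectors - src) ** 2, axis=1) + 1e-9  # epsilon avoids zero division
    mu = activity / d2 + background                      # inverse-square mean counts
    return np.sum(counts * np.log(mu) - mu)

def grid_search_mle(detectors, counts, activity, lo=0.0, hi=10.0, res=0.5):
    """Exhaustive fixed-resolution grid search over candidate source positions."""
    best_xy, best_ll = None, -np.inf
    for x in np.arange(lo, hi + res, res):
        for y in np.arange(lo, hi + res, res):
            ll = log_likelihood(np.array([x, y]), detectors, counts, activity)
            if ll > best_ll:
                best_xy, best_ll = (x, y), ll
    return np.array(best_xy)
```

The heuristic variants in the abstract (FFA, PSO, ACO, ABC) replace the exhaustive double loop with a guided search over the same likelihood surface, which is where the reported speedups come from.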
The conformal array can make full use of the aperture, save space, meet aerodynamic requirements, and is sensitive to polarization information. It has broad application prospects in the military, aerospace, and communication fields. Joint polarization and direction-of-arrival (DOA) estimation based on the conformal array, together with theoretical analysis of its parameter estimation performance, is key to promoting the engineering application of the conformal array. To solve these problems, this paper establishes the wave field signal model of the conformal array. Then, for the case of a single target, the cost function of the maximum likelihood (ML) estimator is rewritten, via the Rayleigh quotient, from maximizing a ratio of quadratic forms into minimizing a quadratic form. On this basis, rapid parameter estimation is achieved with the idea of manifold separation technology (MST). Compared with the modified variable projection (MVP) algorithm, it reduces the computational complexity and improves the parameter estimation performance. Meanwhile, the MST is used to compute the partial derivative of the steering vector. Then, the theoretical performance of the ML estimator, the multiple signal classification (MUSIC) estimator, and the Cramer-Rao bound (CRB) based on the conformal array are derived, respectively, which provides a theoretical foundation for the engineering application of the conformal array. Finally, simulation experiments verify the effectiveness of the proposed method.
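The Rayleigh-quotient step above turns maximizing a ratio of quadratic forms into a generalized eigenvalue problem. A minimal sketch, with random symmetric positive (semi-)definite matrices standing in for the array model:

```python
import numpy as np
from scipy.linalg import eigh

def max_rayleigh_quotient(A, B):
    """Maximize x^T A x / x^T B x (B positive definite): the maximum equals the
    largest generalized eigenvalue of (A, B), attained at its eigenvector."""
    vals, vecs = eigh(A, B)          # generalized problem, eigenvalues ascending
    return vals[-1], vecs[:, -1]
```

Any other direction yields a strictly smaller quotient, which is why the ML cost function in the abstract can be evaluated without an explicit search over the ratio.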
The paper discusses the statistical inference problem of the compound Poisson vector process (CPVP) in the domain of attraction of the normal law but with infinite covariance matrix. An empirical likelihood (EL) method is proposed to construct confidence regions for the mean vector. It generalizes the case of finite second-order moments to that of infinite second-order moments in the domain of attraction of the normal law. The log-empirical likelihood ratio statistic for the average number of the CPVP converges in distribution to an F distribution when the population is in the domain of attraction of the normal law but has infinite covariance matrix. Simulation results are presented to illustrate the method.
In this paper, a weighted maximum likelihood technique (WMLT) for the logistic regression model is presented. The method depends on a weight function that adapts continuously using Mahalanobis distances of the predictor variables. Under the model, the asymptotic consistency of the suggested estimator is demonstrated, and finite-sample properties are investigated via simulation. In simulation studies and real data sets, the newly proposed technique shows the best performance among all estimators compared.
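The paper's exact weight function is not reproduced here; as a hedged illustration of the construction, a simple truncation weight based on Mahalanobis distance (the cutoff value is an arbitrary choice for the sketch) can be plugged into a weighted Bernoulli log-likelihood:

```python
import numpy as np
from scipy.optimize import minimize

def mahalanobis_weights(X, cutoff=6.0):
    """Downweight rows of X with large Mahalanobis distance (cutoff is illustrative)."""
    centered = X - X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
    return np.minimum(1.0, cutoff / (d2 + 1e-12))

def weighted_logistic_mle(X, y, weights):
    """Maximize the weighted Bernoulli log-likelihood; returns [intercept, slopes...]."""
    Xd = np.column_stack([np.ones(len(X)), X])

    def negloglik(beta):
        eta = Xd @ beta
        # logaddexp(0, eta) = log(1 + exp(eta)), numerically stable
        return -np.sum(weights * (y * eta - np.logaddexp(0.0, eta)))

    return minimize(negloglik, np.zeros(Xd.shape[1]), method="BFGS").x
```

Downweighting high-leverage predictor points in this way is what gives such estimators robustness against outliers in the design space.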
In this paper, three smoothed empirical log-likelihood ratio functions for the parameters of nonlinear models with missing response are suggested. Under some regularity conditions, the corresponding Wilks phenomena are obtained, and confidence regions for the parameters can be constructed easily.
In longitudinal data analysis, our primary interest is in the estimation of regression parameters for the marginal expectations of the longitudinal responses, and the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly due to correlated responses. Marginal models, such as generalized estimating equations (GEEs), have received much attention; they rest on assumptions about the first two moments of the data and a working correlation structure. Confidence regions and hypothesis tests are constructed based on asymptotic normality. This approach is sensitive to misspecification of the variance function and the working correlation structure, which may yield inefficient and inconsistent estimates and lead to wrong conclusions. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of its confidence region. We have applied the proposed method in two case examples.
Fisher [1] proposed a simple method to combine p-values from independent investigations without using detailed information from the original data. In recent years, likelihood-based asymptotic methods have been developed to produce highly accurate p-values. These methods generally require the likelihood function and the standardized maximum likelihood estimate departure calculated on the canonical parameter scale. In this paper, a method is proposed to obtain a p-value by combining the likelihood functions and the standardized maximum likelihood estimate departures of independent investigations for testing a scalar parameter of interest. Examples are presented to illustrate the application of the proposed method, and simulation studies are performed to compare its accuracy with that of Fisher's method.
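Fisher's combination statistic referenced above is simple enough to state directly: under the null hypothesis, X = -2 Σ ln p_i follows a chi-square distribution with 2k degrees of freedom for k independent p-values.

```python
import math
from scipy.stats import chi2

def fisher_combine(pvalues):
    """Fisher's method: combine independent p-values into one overall p-value."""
    stat = -2.0 * sum(math.log(p) for p in pvalues)  # chi-square(2k) under H0
    df = 2 * len(pvalues)
    return stat, chi2.sf(stat, df)
```

For example, two independent studies each reporting p = 0.05 combine to an overall p of roughly 0.0175, illustrating how moderately significant results reinforce each other.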
In this paper, asymptotic expansions of the distribution of the likelihood ratio statistic for testing sphericity in a growth curve model are derived in the null and nonnull cases when the alternatives are close to the null hypothesis. These expansions are given as series of beta distributions.
This paper presents the calibration of Omori's aftershock occurrence rate model for Turkey and the resulting likelihoods. Aftershock occurrence rate models are used for estimating the probability of an aftershock that exceeds a specific magnitude threshold within a time interval after the mainshock. Critical decisions on the post-earthquake safety of structures directly depend on the aftershock hazard estimated using the occurrence model. It is customary to calibrate models in a region-specific manner. These models depend on rate parameters (a, b, c and p) related to the seismicity characteristics of the investigated region. In this study, the available well-recorded aftershock sequences for a set of Mw ≥ 5.9 mainshock events that were observed in Turkey until 2012 are considered to develop the aftershock occurrence model. Mean estimates of the model parameters identified for Turkey are a = -1.90, b = 1.11, c = 0.05 and p = 1.20. Based on the developed model, aftershock likelihoods are computed for a range of different time intervals and mainshock magnitudes. Also, the sensitivity of aftershock probabilities to the model parameters is investigated. Aftershock occurrence probabilities estimated using the model are expected to be useful for post-earthquake safety evaluations in Turkey.
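Using the mean parameter estimates reported above (a = -1.90, b = 1.11, c = 0.05, p = 1.20), the aftershock exceedance probability can be sketched in the usual Reasenberg-Jones form; the paper's exact formulation may differ, so treat this as an illustration with time measured in days:

```python
import math

def expected_aftershocks(m_main, m_thresh, t1, t2,
                         a=-1.90, b=1.11, c=0.05, p=1.20):
    """Expected number of aftershocks with M >= m_thresh in [t1, t2] days after
    the mainshock, from the modified Omori rate
    lambda(t) = 10**(a + b*(m_main - m_thresh)) / (t + c)**p."""
    k = 10.0 ** (a + b * (m_main - m_thresh))
    if abs(p - 1.0) < 1e-12:
        integral = math.log((t2 + c) / (t1 + c))
    else:
        integral = ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)
    return k * integral

def aftershock_probability(m_main, m_thresh, t1, t2, **params):
    """P(at least one such aftershock), assuming Poisson occurrences."""
    return 1.0 - math.exp(-expected_aftershocks(m_main, m_thresh, t1, t2, **params))
```

This is the kind of computation behind the post-earthquake safety decisions the abstract mentions: the probability grows with the window length and shrinks as the magnitude threshold rises.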
By taking a subsequence out of the input-output sequence of a system polluted by white noise, an independent observation sequence and its probability density are obtained, and then a maximum likelihood estimation of the identification parameters is given. In order to decrease the asymptotic error, a corrector of maximum likelihood (CML) estimation with its recursive algorithm is given. It has been proved that the corrector has smaller asymptotic error than the least squares methods. A simulation example shows that the corrector of maximum likelihood estimation approximates the true parameters with higher precision than the least squares methods.
The Exponentiated Generalized Weibull distribution is a probability distribution that generalizes the Weibull distribution by introducing two more shape parameters to best adjust non-monotonic shapes. The parameters of the new probability distribution function are estimated by the maximum likelihood method under progressive type II censored data via the expectation maximization algorithm.
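The exponentiated generalized Weibull and the progressive type-II censoring scheme are specific to the paper; as a simpler, hedged illustration of censored likelihood fitting, maximum likelihood for a plain two-parameter Weibull under right-censoring looks like this:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_censored_mle(times, observed):
    """MLE of Weibull shape k and scale s from right-censored data.
    observed[i] is True for a failure time, False for a censoring time."""
    t = np.asarray(times, dtype=float)
    d = np.asarray(observed, dtype=bool)

    def negloglik(log_params):
        k, s = np.exp(log_params)          # log-parametrization keeps k, s > 0
        z = (t / s) ** k
        # failures contribute log-density, censored points log-survival
        ll_fail = np.sum(np.log(k / s) + (k - 1.0) * np.log(t[d] / s) - z[d])
        ll_cens = -np.sum(z[~d])
        return -(ll_fail + ll_cens)

    res = minimize(negloglik, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(res.x)  # (shape, scale)
```

An EM algorithm, as used in the paper, reaches the same maximizer by treating the censored lifetimes as missing data instead of optimizing the censored likelihood directly.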
Funding: Supported by the National Natural Science Foundation of China (12061017, 12161009) and the Research Fund of Guangxi Key Lab of Multi-source Information Mining & Security (22-A-01-01).
Funding: Supported in part by the National Key Research and Development Program of China (2019YFB1503700) and the Hunan Natural Science Foundation - Science and Education Joint Project (2019JJ70063).
Funding: Supported by the 2024 Guizhou Provincial Health Commission Science and Technology Fund Project (No. gzwkj2024-475) and the 2022 Provincial Clinical Key Specialty Construction Project.
Funding: Supported by the National Natural Science Foundation of China (No. 42127807), the Sichuan Science and Technology Program (No. 2020YJ0334), and the Sichuan Science and Technology Breeding Program (No. 2022041).
Funding: Supported by the National Natural Science Foundation of China (62071144, 61971159, 61871149).
Funding: Characteristic Innovation Projects of Ordinary Universities of Guangdong Province, China (No. 2022KTSCX150); Zhaoqing Education Development Institute Project, China (No. ZQJYY2021144); Zhaoqing College Quality Project and Teaching Reform Project, China (Nos. zlgc202003 and zlgc202112).
Funding: Supported by the Scientific and Technological Research Council of Turkey (TUBITAK), Grant No. 213M454.