The delayed S-shaped software reliability growth model (SRGM) is one of the non-homogeneous Poisson process (NHPP) models proposed for software reliability assessment. The model is distinctive because its mean value function reflects the delay in failure reporting: there is a delay between failure detection and reporting time. The model captures the error detection, isolation, and removal processes, and is thus appropriate for software reliability analysis. Predictive analysis in software testing is useful in modifying, debugging, and determining when to terminate software development testing processes. However, Bayesian predictive analyses of the delayed S-shaped model have not been extensively explored. This paper uses the delayed S-shaped SRGM to address four issues in one-sample prediction associated with the software development testing process. A Bayesian approach based on non-informative priors was used to derive explicit solutions for the four issues, and the developed methodologies were illustrated using real data.
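For reference, the delayed S-shaped model is commonly specified through its mean value function and the corresponding failure intensity; the standard parameterization (with a the expected total number of faults and b the fault detection rate) is:

```latex
% Delayed S-shaped SRGM: m(t) is the expected cumulative number of
% failures by time t; a = expected total faults, b > 0 = detection rate.
m(t) = a\left[1 - (1 + bt)\,e^{-bt}\right],
\qquad
\lambda(t) = \frac{dm(t)}{dt} = a b^{2} t\, e^{-bt}
```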
In this paper, a novel Magnetic Resonance (MR) reconstruction framework which combines image-wise and patch-wise sparse priors is proposed. First, a truncated beta-Bernoulli process is employed to enforce sparsity on overlapping image patches, emphasizing local structures. Due to its properties, the beta-Bernoulli process can adaptively infer the sparsity (number of non-zero coefficients) of each patch, an appropriate dictionary, and the noise variance simultaneously, which are prerequisites for iterative image reconstruction. Second, a Generalized Gaussian Distribution (GGD) prior is introduced to impose image-wise sparsity on the wavelet coefficients, which can then be estimated by a threshold denoising algorithm. Finally, the MR image is reconstructed from the patch-wise estimate, the image-wise estimate, and the under-sampled k-space data with least squares data fitting. Experimental results demonstrate that the proposed approach exhibits excellent reconstruction performance. Moreover, if the image is full of similar low-dimensional structures, the proposed algorithm improves Peak Signal to Noise Ratio (PSNR) by 7-9 dB compared with other state-of-the-art compressive sampling methods.
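As a rough illustration of the image-wise step only, the sketch below soft-thresholds wavelet detail coefficients of a noisy image. It is a minimal stand-in, assuming PyWavelets and a fixed threshold; it does not implement the paper's GGD-based estimator or the beta-Bernoulli patch model.

```python
# Minimal sketch of image-wise wavelet threshold denoising
# (a stand-in for the GGD-based step; not the paper's exact estimator).
import numpy as np
import pywt

def wavelet_soft_denoise(img, wavelet="db4", level=3, thresh=0.05):
    # Decompose the image into an approximation band and detail bands.
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    out = [coeffs[0]]  # keep the approximation band untouched
    for details in coeffs[1:]:
        # Soft-threshold each detail sub-band (horizontal, vertical, diagonal).
        out.append(tuple(pywt.threshold(d, thresh, mode="soft") for d in details))
    return pywt.waverec2(out, wavelet)

noisy = np.random.rand(64, 64)       # placeholder image
clean = wavelet_soft_denoise(noisy)  # image-wise sparse estimate
```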
The likelihood function plays a central role in statistical analysis in relation to information, from both frequentist and Bayesian perspectives. Several new large-sample properties of the likelihood in relation to information are developed here. The Arrow-Pratt absolute risk aversion measure is shown to be related to the Cramér-Rao information bound. The derivative of the log-likelihood function is seen to provide a measure of information-related stability for the Bayesian posterior density. As well, information-similar prior densities can be defined, reflecting the central role of likelihood in the Bayes learning paradigm.
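For context, the two quantities the abstract connects are standard: the Arrow-Pratt absolute risk aversion of a utility function u, and the Cramér-Rao lower bound expressed through Fisher information:

```latex
% Arrow-Pratt absolute risk aversion of a utility function u
A(x) = -\frac{u''(x)}{u'(x)}
% Fisher information and the Cramer-Rao bound for an unbiased estimator
% based on n i.i.d. observations with density f(x; theta)
I(\theta) = \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}
            \log f(X;\theta)\right)^{2}\right],
\qquad
\operatorname{Var}_\theta(\hat{\theta}) \;\ge\; \frac{1}{n\, I(\theta)}
```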
Ordering-based search methods have advantages over graph-based search methods for structure learning of Bayesian networks in terms of efficiency. With the aim of further increasing the accuracy of ordering-based search methods, we first propose to enlarge the search space, which can facilitate escaping from local optima. We present our search operators with majorizations, which are easy to implement. Experiments show that the proposed algorithm can obtain significantly more accurate results. To counter the decrease in efficiency caused by the larger search space, we then propose to add path priors as constraints into the swap process. We analyze the coefficient which may influence the performance of the proposed algorithm; the experiments show that the constraints can enhance efficiency greatly while having little effect on accuracy. The final experiments show that, compared to other competitive methods, the proposed algorithm can find better solutions while maintaining high efficiency on both synthetic and real data sets.
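As a schematic of ordering-based search (not the authors' exact operators or path-prior constraints), the sketch below hill-climbs over variable orderings using adjacent swaps; the toy scoring function is a placeholder for a decomposable BN score such as BDeu evaluated on the best parent sets consistent with the ordering.

```python
# Schematic ordering-based search: hill-climbing with adjacent swaps.
import random

def score_ordering(order):
    # Toy stand-in: rewards orderings close to 0, 1, ..., n-1.
    # In practice, use a decomposable BN score (e.g. BDeu) over the
    # best parent sets consistent with the ordering.
    return -sum(abs(pos - var) for pos, var in enumerate(order))

def swap_search(n, iters=2000, seed=0):
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)
    best = score_ordering(order)
    for _ in range(iters):
        i = rng.randrange(n - 1)
        order[i], order[i + 1] = order[i + 1], order[i]      # adjacent swap
        s = score_ordering(order)
        if s > best:
            best = s                                          # keep improvement
        else:
            order[i], order[i + 1] = order[i + 1], order[i]   # undo swap
    return order, best

print(swap_search(8))  # converges toward the identity ordering
```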
Based on the concept of admissibility in statistics, a definition of the generalized admissibility of Bayes estimates with inaccurate priors is first given for application in surveying adjustment. According to this definition, the generalized admissibility of the normal linear Bayes estimate with inaccurate prior information that contains deviations or model errors is then studied, along with how to eliminate the effect of the model error on the Bayes estimate in surveying adjustment. The results show that if the prior information is not accurate, that is, it contains model error, generalized admissibility can determine whether the Bayes estimate should be accepted or not. For the linear normal Bayes estimate, the estimate can be made generalized admissible by assigning a smaller prior weight when the prior information is inaccurate. Finally, an example is given.
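One concrete way to read "a less prior weight" (a sketch for the scalar normal case, not the paper's exact adjustment model): with the sample mean x̄ of n observations from N(θ, σ²) and prior θ ~ N(μ, ν²), the posterior mean is a precision-weighted average, and the prior precision can be down-weighted by a factor w ∈ (0, 1]:

```latex
% Posterior mean for a normal mean with a (possibly down-weighted) normal prior
\hat{\theta}_w =
\frac{\dfrac{n}{\sigma^{2}}\,\bar{x} \;+\; w\,\dfrac{1}{\nu^{2}}\,\mu}
     {\dfrac{n}{\sigma^{2}} \;+\; w\,\dfrac{1}{\nu^{2}}},
\qquad w \in (0,1]
% w = 1 recovers the usual Bayes estimate; w -> 0 discards the (possibly
% biased) prior and returns the least squares estimate \bar{x}.
```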
The blurred image restoration method can dramatically highlight image details and enhance the global contrast, which benefits the visual effect in practical applications. This paper is based on the dark channel prior principle and addresses the degradation of blurred images for which prior information is absent. Several improvements are made to the estimation of the transmission map of blurred images. Since the dark channel prior principle can effectively restore a blurred image only at the cost of a large amount of computation, total variation (TV) and image morphology transforms (specifically the top-hat and bottom-hat transforms) are introduced into the improved method. Compared with original transmission map estimation methods, the proposed method features both simplicity and accuracy. The estimated transmission map, together with the other estimated model elements, can restore the image. Simulation results show that this method can inhibit the ill-posed problem during image restoration, while greatly improving image quality and definition.
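For orientation, the baseline dark-channel transmission estimate that such methods build on (He et al.'s formulation, not the paper's improved TV/morphology variant) can be sketched as follows; the image and atmospheric light values are placeholders:

```python
# Baseline dark-channel transmission estimate (He et al. style);
# the paper improves on this with TV and top-/bottom-hat transforms.
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    # Per-pixel minimum over color channels, then a local minimum filter.
    return minimum_filter(img.min(axis=2), size=patch)

def estimate_transmission(img, atmos, omega=0.95, patch=15):
    # t(x) = 1 - omega * dark_channel(I / A)
    normalized = img / atmos[None, None, :]
    return 1.0 - omega * dark_channel(normalized, patch)

img = np.random.rand(120, 160, 3)    # placeholder degraded image
atmos = np.array([0.9, 0.9, 0.9])    # placeholder atmospheric light
t = np.clip(estimate_transmission(img, atmos), 0.1, 1.0)
restored = (img - atmos) / t[..., None] + atmos  # invert degradation model
```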
In many practical applications of image segmentation, employing prior information can greatly improve segmentation results. This paper continues the study of one kind of prior information, called the prior distribution. In this research there is no exact template of the object; instead, only several samples are given. The proposed method, called the parametric distribution prior model, extends our previous model by adding a training procedure to learn the prior distribution of the objects. The paper then establishes the energy function of the active contour model (ACM) with consideration of this parametric form of the prior distribution. Therefore, during segmentation, the template can update itself while the contour evolves. Experiments are performed on an airplane data set. Experimental results demonstrate the potential of the proposed method: with prior distribution information, both the segmentation quality and speed can be improved efficaciously.
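Schematically (a generic form, not the paper's exact functional), shape-prior active contour models augment a data-driven energy with a prior term weighted by a balancing coefficient:

```latex
% Generic shape-prior active contour energy (schematic form)
E(C) = E_{\text{image}}(C) \;+\; \beta\, E_{\text{prior}}(C)
% E_image: data term (e.g. region- or edge-based, as in Chan-Vese);
% E_prior: discrepancy between the evolving contour C and the learned
%          prior distribution over shapes; beta balances the two terms.
```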
The smoothness prior approach for spectral smoothing is investigated using Fourier frequency filter analysis. We show that the regularization parameter in penalized least squares can continuously control the bandwidth of the low-pass filter. Besides, owing to its property of interpolating missing values automatically and smoothly, a spectral baseline correction algorithm based on the approach is proposed. The algorithm comprises spectral peak detection and baseline estimation. First, the spectral peak regions are detected and identified according to the second derivatives. Then, the generalized smoothness prior approach, combined with the identification information, estimates the baseline in the peak regions. Results on both simulated and real spectra show accurate baseline-corrected signals with this method.
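The penalized least squares smoother in question is the classic Whittaker-Eilers form, z = (I + λDᵀD)⁻¹y with D a difference matrix; a minimal sparse-matrix sketch (with an illustrative λ and signal) is:

```python
# Whittaker-Eilers smoother: minimize ||y - z||^2 + lam * ||D z||^2,
# solved by (I + lam * D'D) z = y. Larger lam -> narrower low-pass band.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def whittaker_smooth(y, lam=1e4, order=2):
    n = len(y)
    D = sp.eye(n, format="csr")
    for _ in range(order):       # build the order-th difference matrix
        D = D[1:] - D[:-1]
    A = (sp.eye(n) + lam * (D.T @ D)).tocsc()
    return spsolve(A, y)

x = np.linspace(0, 4 * np.pi, 400)
noisy = np.sin(x) + 0.1 * np.random.randn(400)
smooth = whittaker_smooth(noisy, lam=1e3)
```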
The present study was conducted to examine the effect of feed restriction prior to slaughter on the carcass weight of male broiler chicks from 32 to 40 days of age. A total of 180 (pure line) male broiler chicks were taken randomly, labeled, and divided into six groups. At 32 days of age, the experimental groups were put under the experimental feeding program. Group A was fed ad libitum (control), while groups B and C were fed 120 and 60 gm/bird/day, respectively, for eight days. Groups D and E were fed 120 and 60 gm/bird/day, respectively, for four days, followed by zero feeding for an extra 4 days. Group F was deprived of food during the whole experimental period (8 days). The experimental diet was formulated to be approximately isocaloric and isonitrogenous, containing sorghum, groundnut cake, broiler concentrate, calcium, salt, lysine, methionine, and premix. The parameters measured were live body weight, feed intake, mortality, and carcass and non-carcass values. The effect of the feed restriction program on male broiler chicks was not significant during the period from 32 to 34 days of age for final live body weight, carcass weight, and dressing percentage, but net weight (gain or loss) was affected by the feed restriction program and showed a significant difference (P < 0.01) between experimental groups. From 32 to 36 days of age, male broilers subjected to feed restriction regimes showed the lowest readings for final live body weight, net weight (gain or loss), and carcass weight, and the differences were significant (P < 0.05) between experimental groups for dressing percentage during the period from 32 to 36 days of age. For the periods from 32 to 38 days and from 32 to 40 days of age, all parameters were significantly affected by the feed restriction program. It was concluded that the carcass weight of broiler chickens can be controlled using different feed restriction programs according to the needs of the market and the producer's situation, with special consideration to the economic return.
AIM: To develop a framework to incorporate background domain knowledge into classification rule learning for knowledge discovery in biomedicine. METHODS: Bayesian rule learning (BRL) is a rule-based classifier that uses a greedy best-first search over a space of Bayesian belief networks (BN) to find the optimal BN to explain the input dataset, and then infers classification rules from this BN. BRL uses a Bayesian score to evaluate the quality of BNs. In this paper, we extended the Bayesian score to include informative structure priors, which encode our prior domain knowledge about the dataset. We call this extension of BRL BRL_p. The structure prior has a λ hyperparameter that allows the user to tune the degree to which prior knowledge is incorporated in the model learning process. We studied the effect of λ on model learning using a simulated dataset and a real-world lung cancer prognostic biomarker dataset, by measuring the degree of incorporation of our specified prior knowledge. We also monitored its effect on the model's predictive performance. Finally, we compared BRL_p to other state-of-the-art classifiers commonly used in biomedicine. RESULTS: We evaluated the degree of incorporation of prior knowledge into BRL_p with simulated data by measuring the Graph Edit Distance between the true data-generating model and the model learned by BRL_p. We specified the true model using informative structure priors. We observed that by increasing the value of λ we were able to increase the influence of the specified structure priors on model learning. A large value of λ caused BRL_p to return the true model. This also led to a gain in predictive performance measured by the area under the receiver operating characteristic curve (AUC). We then obtained a publicly available real-world lung cancer prognostic biomarker dataset and specified a known biomarker from the literature [the epidermal growth factor receptor (EGFR) gene]. We again observed that larger values of λ led to an increased incorporation of EGFR into the final BRL_p model. This relevant background knowledge also led to a gain in AUC. CONCLUSION: BRL_p enables tunable structure priors to be incorporated during Bayesian classification rule learning, integrating data and knowledge as demonstrated using lung cancer biomarker data.
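A common form for such a tunable structure prior (a sketch of the general idea, not necessarily the paper's exact definition) penalizes disagreement with a prior network, with λ controlling the penalty:

```latex
% Tunable informative structure prior (a generic form). S_0 is the prior
% network, d(S, S_0) a structural disagreement measure (e.g. differing edges):
P(S) \propto \exp\{-\lambda\, d(S, S_0)\}
% The Bayesian network score then becomes
\mathrm{score}(S) = \log P(D \mid S) + \log P(S)
% Larger lambda pulls the learned structure toward the prior network.
```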
Objectives: Since studies demonstrate that neonates born to mothers who were cared for premature labor may suffer from congenital neonatal sepsis, we aimed to evaluate the prevalence and main risk factors of neonatal infection among mothers who experienced a prior premature labor. Methods: This was a cross-sectional study carried out from January 1st through December 31st, 2013 at the university clinics of Kinshasa. It concerned all women delivered at term who had been cared for premature labor prior to giving birth to a live newborn. The maternal variables of interest were: parity, gestation, age, intrapartum fever, malaria, urogenital infection during the last 2 weeks before delivery (UGI), premature rupture of membranes (PROM), cervical cerclage, meconium-stained amniotic fluid (MSAF), and mode of delivery. For neonates, attention was paid to gestational age, birth weight, admission to the neonatal intensive care unit (ANICU), and infection as stated within three days after birth. The t-test and Chi-square test were used where appropriate. Logistic analysis was used to determine the risk of maternal variables inducing neonatal infection (OR and CI), with significance stated at p < 0.05. Results: Fifty-two mother-infant couples were recruited. Of these, 19 neonates were infected (prevalence of 36.5%). Mean age, gestational age, and birth weight were 30.19 ± 5.32 years, 37.2 ± 2 weeks, and 2638 ± 588 g, respectively. Infected neonates had significantly lower gestational age and birth weight, while their proportion of ANICU admission was higher than that of the non-infected. Prematurity, PROM, UGI, prior cerclage, and MSAF were significantly more frequent in couples with neonatal infection. Prematurity, birth weight <2500 g, and UGI were found to enhance the risk by 3 to 4 times. Conclusion: The prevalence of neonatal infection was very high. Prematurity, birth weight <2500 g, and maternal UGI were found to enhance the risk by 3 to 4 times.
Yin [1] has developed a new Bayesian measure of evidence for testing a point null hypothesis which agrees with the frequentist p-value, thereby solving Lindley's paradox. Yin and Li [2] extended the methodology of Yin [1] to the case of the Behrens-Fisher problem by assigning Jeffreys' independent prior to the nuisance parameters. In this paper, we show, both analytically and through results from simulation studies, that the methodology of Yin [1] simultaneously solves the Behrens-Fisher problem and Lindley's paradox when a Gamma prior is assigned to the nuisance parameters.
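For reference, the Behrens-Fisher problem concerns comparing two normal means when the variances are unknown and not assumed equal; the usual test statistic is:

```latex
% Behrens-Fisher setting: X_i ~ N(mu_1, sigma_1^2), i = 1..m, and
% Y_j ~ N(mu_2, sigma_2^2), j = 1..n, variances unknown and unequal;
% test H_0: mu_1 = mu_2 using
T = \frac{\bar{X} - \bar{Y}}
         {\sqrt{\dfrac{s_1^{2}}{m} + \dfrac{s_2^{2}}{n}}}
% whose null distribution depends on the unknown variance ratio;
% this dependence is the source of the difficulty.
```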
We ask whether a formula for entropy, with a usual value ascribed to the initial entropy at the onset of inflation, can allow an order-of-magnitude resolution of the question of whether a graviton could survive from a prior universe to the present one, using typical Planckian peak temperature values. We obtain values consistent with up to 10^38 gravitons contributing to the relic energy value if we assume a relic energy contribution based upon each graviton initially exhibiting a frequency spike of 10^10 Hz. The value is picked from looking at the aftermath of what happens if there exists a quantum bounce with a peak density value [1] in a regime of LQG bounce radii of the order of magnitude of meters. The author, in making estimates, specifically avoids using that formula, by setting the chemical potential for ultra high temperatures, for reasons which will be brought up in the conclusion.
We examine whether there are grounds to entertain the Penrose suggestion that black holes from a prior cycle of creation appear in the present cosmos. There are two cases to consider: a singular start to the Universe, or, as Karen Freeze and others have modeled, a non-singular start. The two cases are different and touch upon the limits of validity of the Penrose singularity theorem. We first state the two cases, singular and nonsingular, and afterwards briefly allude to the Penrose singularity theorem. The plausibility of the singular cosmological expansion start point, with a case analysis of black holes from a prior universe, is discussed first, followed by a synopsis of the Penrose singularity theorem. After that, the nonsingular case of a starting point of the expansion of the Universe is entertained and described. Since the nonsingular start to the expansion of the Universe is not so well known, a considerable amount of space is spent on what I view as the mathematical constructions allowing for its analysis. About the only way to ascertain these cases will be by GW astronomy; hence the details of GW production from the early Universe are covered in excruciating detail. The methodology for that section is simple: use a construction for a minimal time-step, and from there obtain emergent space-time conditions for a bridge from a nonsingular start to the universe to potential quantum gravity conditions. Our methodology is to construct using a "trivial" solution for massive gravitons and a nonsingular start for the expansion of the universe. Our methodology has many unintended consequences, not the least of which is a relationship between a small time-step, which is called t, and the minimum scale factor and even the tension or property values of the initial space-time wall, all of which are consequences of a "trivial" solution taking into account "massive" gravitons. In future articles we will postulate conditions for experimental detectors for subsequent data sets, so as to obtain falsifiable data sets. Finally, the outlines of the way to ascertain data sets that either falsify or confirm the Penrose suggestion form the concluding part of the manuscript.
The isothermal transformation (TTT) behavior of low carbon steels with two Si contents (0.50 wt pct and 1.35 wt pct) was investigated with and without prior deformation. The results show that Si and prior deformation of the austenite have significant effects on the transformation of ferrite and bainite. The addition of Si refines the ferrite grains and accelerates the polygonal ferrite transformation and the formation of M/A constituents, leading to improved strength. The ferrite grains formed under prior deformation of the austenite become more homogeneous and refined. However, the influence of deformation on the tensile strength of both steels depends on the isothermal temperature. Thermodynamic calculation indicates that Si and prior deformation reduce the incubation time of both the ferrite and bainite transformations, but the effect is weakened as the isothermal temperature decreases.
Focusing on the degradation of foggy images, a restoration approach from a single image based on the spatial correlation of the dark channel prior is proposed. First, the transmission of each pixel is estimated from the spatial correlation of the dark channel prior. Second, a degradation model is utilized to restore the foggy image. Third, the final recovered image, with enhanced contrast, is obtained by performing a post-processing technique based on the just-noticeable difference. Experimental results demonstrate that the information in a foggy image can be recovered perfectly by the proposed method, even in scenes with abrupt depth changes.
Due to limited volume, weight, and power consumption, micro-satellites have to reduce data transmission and storage requirements through image compression when performing earth observation missions. However, the quality of the images may be unsatisfactory. This paper considers the problem of recovering sparse signals by exploiting their unknown sparsity pattern. To model structured sparsity, the prior correlation of the support is encoded by imposing a transformed Gaussian process on the spike and slab probabilities. Then, an efficient approximate message-passing algorithm with a structured spike and slab prior is derived for posterior inference, which, combined with a fast direct method, reduces the computational complexity significantly. Further, a unified scheme is developed to learn the hyperparameters using expectation maximization (EM) and Bethe free energy optimization. Simulation results on both synthetic and real data demonstrate the superiority of the proposed algorithm.
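For reference, a structured spike and slab prior of this kind (written generically; the paper's transform and kernel details may differ) takes the form:

```latex
% Structured spike and slab prior (generic form)
x_i \mid \gamma_i \;\sim\; (1-\gamma_i)\,\delta_0(x_i)
                          + \gamma_i\,\mathcal{N}(x_i;\, 0, \tau^{2}),
\qquad
\gamma_i \;\sim\; \mathrm{Bernoulli}\!\left(\Phi(g_i)\right),
\qquad
g \;\sim\; \mathcal{GP}(\mu, K)
% The Gaussian process covariance K encodes the prior correlation of the
% support; the link Phi maps the latent field to spike/slab probabilities.
```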
A low carbon steel was used to determine the critical strain εc for completion of deformation enhanced ferrite transformation (DEFT) through a series of hot compression tests. In addition, the influence of prior austenite grain size (PAGS) on the critical strain was systematically investigated. Experimental results showed that the critical strain is affected by the PAGS: when the γ→α transformation completes, the smaller the PAGS, the smaller the critical strain. The ferrite grains obtained through DEFT can be refined to about 3 μm when the DEFT is completed.