Aim: To improve the efficiency of fatigue material tests and the associated statistical treatment of test data. Methods: A least-squares approach and other special treatments were used. Results and Conclusion: The concepts of each phase in fatigue testing and statistical treatment are clarified. The proposed method has three important properties. The reduced number of specimens lowers test expenditure, and the whole test procedure is more flexible because there is no need to run many tests at the same stress level, as in the traditional approach.
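As a rough illustration of the least-squares step mentioned above, the sketch below fits a Basquin-type S-N relation, log10(N) = a + b·log10(S), to fatigue-life data; the stress levels and cycle counts are made-up placeholders, not the paper's test data.

```python
import numpy as np

# Minimal sketch: least-squares fit of a Basquin-type S-N relation
#   log10(N) = a + b * log10(S)
# to fatigue-life data. The stress levels and cycle counts below are
# made-up placeholders, not data from the paper.
S = np.array([400.0, 350.0, 300.0, 260.0, 230.0])   # stress amplitude, MPa
N = np.array([2.1e4, 5.5e4, 1.8e5, 6.0e5, 1.9e6])   # cycles to failure

x, y = np.log10(S), np.log10(N)
b, a = np.polyfit(x, y, 1)          # slope and intercept of the regression line
y_hat = a + b * x
residual_std = np.sqrt(np.sum((y - y_hat) ** 2) / (len(y) - 2))

print(f"log10(N) = {a:.3f} + {b:.3f} * log10(S), s = {residual_std:.3f}")
# Predicted median life at an intermediate stress level:
print(f"N(320 MPa) ~ {10 ** (a + b * np.log10(320.0)):.3e} cycles")
```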
A new algorithm based on the projection method with an implicit finite difference technique was established to calculate the velocity fields and pressure. The calculation region can be divided into different regions according to the Reynolds number. In the far-wall region, the thermal melt flow was treated as a Newtonian flow; in the near-wall region, it was treated as a non-Newtonian flow. The correctness of the new algorithm was verified by a nonparametric statistical method and by experiment. The simulation results show that the new algorithm computes more quickly than the solution algorithm-volume of fluid method, which uses an explicit difference scheme.
Funding: Project (50975263) supported by the National Natural Science Foundation of China; Project (2010081015) supported by the International Cooperation Project of Shanxi Province, China; Project (2010-78) supported by the Scholarship Council of Shanxi Province, China; Project (2010420120005) supported by the Doctoral Fund of the Ministry of Education of China.
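For context, the following minimal sketch shows a single pressure-projection step of the kind such an algorithm builds on, on a periodic 2D grid with an FFT Poisson solve; it omits the implicit time discretization and the Newtonian/non-Newtonian split near the wall, and all grid values are assumptions.

```python
import numpy as np

# Minimal sketch of one pressure-projection step on a periodic, square 2D grid
# (illustrative only; the paper couples the projection with an implicit scheme
# and a Newtonian/non-Newtonian split near the wall, which is not shown here).

def project(u, v, dx, dt, rho=1.0):
    """Make the intermediate velocity field (u, v) divergence-free."""
    n = u.shape[0]
    # Divergence of the intermediate velocity (central differences, periodic).
    div = ((np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) +
           (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0))) / (2 * dx)
    # Solve the pressure Poisson equation  lap(p) = rho/dt * div  with FFTs.
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k)
    denom = -(kx ** 2 + ky ** 2)
    denom[0, 0] = 1.0                      # avoid division by zero (mean mode)
    p_hat = np.fft.fft2(rho / dt * div) / denom
    p_hat[0, 0] = 0.0                      # pressure is defined up to a constant
    p = np.real(np.fft.ifft2(p_hat))
    # Correct the velocity with the pressure gradient.
    dpdx = (np.roll(p, -1, axis=1) - np.roll(p, 1, axis=1)) / (2 * dx)
    dpdy = (np.roll(p, -1, axis=0) - np.roll(p, 1, axis=0)) / (2 * dx)
    return u - dt / rho * dpdx, v - dt / rho * dpdy, p

# Usage with an arbitrary intermediate velocity field:
rng = np.random.default_rng(0)
u_star, v_star = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
u_new, v_new, p = project(u_star, v_star, dx=0.1, dt=0.01)
```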
Demand for large vibrating screens is huge in the mineral processing industry. Because bending and random vibration are not considered in the traditional design method for the beam structures of a large vibrating screen, fatigue damage occurs frequently and affects the screening performance. This work conducts a systematic mechanics analysis of the beam structures and improves the design method. The total motion of a beam structure during screening can be decomposed into the traditional followed rigid translation (FRT), bending vibration (BV) and axial linearly distributed random rigid translation (ALRRT) excited by the side plates. When the beam is treated analytically as a generalized single-degree-of-freedom (SDOF) elastic system, the BV can be solved by Rayleigh's method. A stochastic analysis of the random process is conducted for the detailed ALRRT calculation. Expressions for the mechanics properties, namely the shearing force and bending moment with respect to BV and ALRRT, are derived. Experimental and numerical investigations demonstrate that the largest BV occurs at the beam center and can be nearly ignored in comparison with the FRT during simplified engineering design. With the BV and FRT considered, the computed mechanics properties accord well with the practical situation, with a maximum error of 6.33%, which is less than that obtained by the traditional method.
Funding: Project (51221462) supported by the National Natural Science Foundation of China; Project (20120095110001) supported by the PhD Programs Foundation of the Ministry of Education of China.
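To make the Rayleigh's-method step concrete, here is a minimal sketch that estimates the fundamental bending frequency of a uniform, simply supported beam from the Rayleigh quotient with an assumed half-sine mode shape; the length, stiffness and mass values are placeholders, not the screen-beam parameters from the paper.

```python
import numpy as np

# Minimal sketch of Rayleigh's method for the fundamental bending frequency
# of a uniform, simply supported beam treated as a generalized SDOF system.
L = 3.0          # beam length, m (placeholder)
EI = 2.0e6       # bending stiffness, N*m^2 (placeholder)
m = 50.0         # mass per unit length, kg/m (placeholder)

x = np.linspace(0.0, L, 2001)
dx = x[1] - x[0]
phi = np.sin(np.pi * x / L)                            # assumed mode shape
phi_xx = -(np.pi / L) ** 2 * np.sin(np.pi * x / L)     # its second derivative

# Rayleigh quotient: omega^2 = int(EI * phi''^2 dx) / int(m * phi^2 dx)
num = np.sum(EI * phi_xx ** 2) * dx
den = np.sum(m * phi ** 2) * dx
omega = np.sqrt(num / den)                             # rad/s

exact = (np.pi / L) ** 2 * np.sqrt(EI / m)             # closed form for this shape
print(f"Rayleigh estimate: {omega:.2f} rad/s ({omega / (2 * np.pi):.2f} Hz)")
print(f"Closed-form check:  {exact:.2f} rad/s")
```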
Monte Carlo simulation was applied to assembly success rate (ASR) analysis. The ASR of two peg-in-hole robot assemblies was used as an example, taking the component parts' sizes, manufacturing tolerances and robot repeatability into account. A statistical expression was proposed and derived in this paper, which offers an alternative way of estimating the accuracy of the ASR without having to repeat the simulations. This statistical method also helps in choosing a suitable sample size if error reduction is desired. Monte Carlo simulation results demonstrated the feasibility of the method.
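A minimal sketch of the kind of Monte Carlo ASR estimate described above, with made-up tolerances and repeatability; the closing binomial standard-error line stands in for the idea of judging the estimate's accuracy analytically instead of repeating the simulation (it is not the paper's exact expression).

```python
import numpy as np

# Minimal sketch of a Monte Carlo estimate of assembly success rate (ASR)
# for a peg-in-hole task. Tolerances and repeatability below are
# illustrative assumptions, not the values used in the paper.
rng = np.random.default_rng(0)
n = 100_000

peg_d  = rng.normal(9.980, 0.005, n)        # peg diameter, mm
hole_d = rng.normal(10.020, 0.005, n)       # hole diameter, mm
offset = np.abs(rng.normal(0.0, 0.010, n))  # robot positioning error, mm

# Insertion succeeds when the radial clearance exceeds the position error.
clearance = (hole_d - peg_d) / 2.0
success = clearance > offset
asr = success.mean()

# Binomial standard error: an analytical accuracy estimate that avoids
# repeating the whole simulation (in the spirit of the paper's expression).
se = np.sqrt(asr * (1.0 - asr) / n)
print(f"ASR = {asr:.4f} +/- {1.96 * se:.4f} (95% CI half-width)")
```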
To understand the characteristics of macrobenthic community structure and the relationship between the environment and benthic assemblages during jellyfish blooms, we studied the macrobenthos and related environmental factors in the coastal waters of the Yellow Sea and East China Sea. Data were collected during two seasonal cruises in April and August of 2011 and analyzed with multivariate statistical methods. A total of 306 macrobenthic species were recorded from the research areas, including 115 species of Polychaeta, 78 of Crustacea, 61 of Mollusca, 30 of Echinodermata, and 22 of other groups. Nine polychaete species occurred at frequencies higher than 25% of the sampling stations: Lumbrineris longifolia, Notomastus latericeus, Ninoe palmata, Ophelina acuminata, Nephtys oligobranchia, Onuphis geophiliformis, Glycera chirori, Terebellides stroemii, and Aricidea fragilis. Both the average biomass and the abundance of macrobenthos were higher in August (23.8 g/m^2 and 237.7 ind./m^2) than in April (11.3 g/m^2 and 128 ind./m^2); the dissimilarity of macrobenthic structures among stations was as high as 70%. Based on the dissimilarity values, we divided the stations into four clusters in spring and eight in summer. The ABC curve shows that the macrofauna communities in areas of high jellyfish abundance were not changed. Canonical correspondence analysis showed that depth, temperature, median grain size, total organic carbon in sediment and total nitrogen in sediment were important factors affecting the macrozoobenthic community in the study area.
Funding: Supported by the National Basic Research Program of China (973 Program) (No. 2011CB403605) and the National Natural Science Foundation of China (No. 41176133).
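As a small illustration of the multivariate treatment mentioned above (station dissimilarities and clustering), the sketch below computes Bray-Curtis dissimilarities on a synthetic abundance matrix and cuts a group-average dendrogram at 70% dissimilarity; the data and the fourth-root pre-treatment are assumptions, not the survey's actual protocol.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder abundance matrix (10 stations x 30 taxa); the fourth-root
# pre-treatment and the 70% cut level follow common practice, not
# necessarily the exact protocol of the survey.
rng = np.random.default_rng(1)
abundance = rng.poisson(lam=5.0, size=(10, 30))

d = pdist(abundance ** 0.25, metric="braycurtis")       # Bray-Curtis dissimilarities
tree = linkage(d, method="average")                      # group-average clustering
groups = fcluster(tree, t=0.70, criterion="distance")    # cut at 70% dissimilarity
print("station clusters:", groups)
print("mean Bray-Curtis dissimilarity:", round(float(d.mean()), 3))
```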
An approach to the limit state equation for surrounding rock was put forward based on a deformation criterion. The symmetrical sampling of basic random variables adopted by the classical response surface method was modified, and the peak value and skewness of the distribution curves of the basic random variables were taken into account in the modified sampling method. A way of calculating probability moments, based on a modified Rosenblueth method and suitable for non-explicit performance functions, was put forward. The first, second, third and fourth order moments of the performance function value were calculated by the modified Rosenblueth method from the first, second, third and fourth order moments of the basic random variables. A probability density function (PDF) of the performance function was deduced from its first four moments; in the new method this PDF takes the place of the quadratic polynomial used to approximate the real performance function, and the reliability probability is calculated by integrating the PDF over the performance function value. The results show that the improved response surface method can accommodate various statistical distribution types of basic random variables, and its calculation process is clear and requires no iterative loop. In addition, the stability probability of the surrounding rock of a tunnel was calculated by the improved method; its workload is only 30% of that of the classical method and its accuracy is comparable.
Funding: Project (50378036) supported by the National Natural Science Foundation of China; Project (200503) supported by the Foundation of the Communications Department of Hunan Province, China.
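For illustration, the sketch below applies Rosenblueth's classical 2^K point-estimate method to a toy performance function of three independent, symmetric variables and recovers its first four moments; the limit-state function and the variable statistics are invented, and the paper's modified sampling and PDF reconstruction are not reproduced here.

```python
import numpy as np
from itertools import product

# Minimal sketch of Rosenblueth's two-point estimate method for the moments
# of a performance function g(X) of independent, symmetric random variables.
# The performance function and the variable statistics are placeholders,
# not the tunnel problem solved in the paper.
def g(E, c, q):
    """Toy limit-state function: allowable deformation minus computed deformation."""
    return 0.03 - q / (E * c)

mu    = np.array([2000.0, 1.2, 60.0])    # means of E, c, q
sigma = np.array([300.0, 0.15, 9.0])     # standard deviations of E, c, q

# Evaluate g at the 2^3 corner points mu_i +/- sigma_i, each with weight 1/8.
vals = np.array([g(*(mu + np.array(s) * sigma))
                 for s in product([-1.0, 1.0], repeat=3)])
w = 1.0 / len(vals)

m1 = np.sum(w * vals)                                      # mean of g
central = [np.sum(w * (vals - m1) ** k) for k in (2, 3, 4)] # 2nd-4th central moments
print(f"mean={m1:.4e}, var={central[0]:.3e}, "
      f"skew={central[1] / central[0] ** 1.5:.3f}, "
      f"kurt={central[2] / central[0] ** 2:.3f}")
```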
At the time of writing, coronavirus disease 2019 (COVID-19) is seriously threatening human lives and health throughout the world. Many epidemic models have been developed to provide references for decision-making by governments and the World Health Organization. To capture and understand the characteristics of the epidemic trend, parameter optimization algorithms are needed to obtain model parameters. In this study, the authors propose using the Levenberg-Marquardt algorithm (LMA) to identify epidemic models. This algorithm combines the advantages of the Gauss-Newton method and the gradient descent method and improves the stability of the parameter estimates. The authors selected four countries with relatively high numbers of confirmed cases to verify the advantages of the Levenberg-Marquardt algorithm over the traditional epidemiological model method. The results show that the Statistical-SIR (Statistical-Susceptible-Infected-Recovered) model using the LMA can fit the actual epidemic curve well, whereas the epidemic simulated by the traditional model evolves too fast and its peak value is too high to reflect the real situation.
Funding: This work was jointly supported by the National Natural Science Foundation of China (grant number 41521004) and the Gansu Provincial Special Fund Project for Guiding Scientific and Technological Innovation and Development (grant number 2019ZX-06).
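A minimal sketch of Levenberg-Marquardt identification of an SIR model, using scipy's least_squares with method="lm" on synthetic daily infected counts; the population size, initial values and "true" parameters are assumptions, not the study's national case data or its Statistical-SIR formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Fit SIR parameters (beta, gamma) to noisy infected counts with the
# Levenberg-Marquardt algorithm. All numbers below are synthetic placeholders.
N_pop = 1.0e6
t_obs = np.arange(0, 60)
true = (0.35, 0.10)                       # beta, gamma used to fabricate the data

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i / N_pop, beta * s * i / N_pop - gamma * i, gamma * i]

def infected(params, t):
    beta, gamma = params
    sol = solve_ivp(sir, (t[0], t[-1]), [N_pop - 50, 50, 0],
                    t_eval=t, args=(beta, gamma), rtol=1e-8)
    return sol.y[1]

rng = np.random.default_rng(2)
obs = infected(true, t_obs) * rng.normal(1.0, 0.05, t_obs.size)

def residuals(params):
    return infected(params, t_obs) - obs

fit = least_squares(residuals, x0=[0.5, 0.2], method="lm")
print("estimated beta, gamma:", np.round(fit.x, 4))
```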
In this paper, a low-complexity ESPRIT algorithm based on the power method and orthogonal-triangular (QR) decomposition is presented for direction finding; it requires neither a priori knowledge of the source number nor a predetermined threshold to separate the signal and noise eigenvalues. Firstly, based on the estimate of the noise subspace obtained by the power method, a novel source-number detection method without eigendecomposition is proposed based on QR decomposition. Furthermore, the eigenvectors of the signal subspace can be determined from the Q matrix, and the directions of the signals can then be computed by the ESPRIT algorithm. To determine the source number and subspace, the computational complexity of the proposed algorithm is approximately (2log_2 n + 2.67)M^3, where n is the power of the covariance matrix and M is the number of array elements. Compared with the singular value decomposition (SVD) based algorithm, it offers substantial computational savings with comparable performance. The simulation results demonstrate its effectiveness and robustness.
Funding: Supported by the National Natural Science Foundation of China (No. 60102005).
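For context, here is a standard eigendecomposition-based ESPRIT sketch on a simulated uniform linear array; it assumes the source number is known and does not implement the paper's low-complexity power-method/QR variant. Array size, spacing and angles are made-up values.

```python
import numpy as np

# Standard ESPRIT on a uniform linear array (illustrative only; the paper's
# contribution is a lower-complexity variant that avoids eigendecomposition).
rng = np.random.default_rng(8)
M, K, snapshots = 8, 2, 400          # sensors, sources, snapshots (assumed)
d_over_lambda = 0.5                  # element spacing in wavelengths
angles_true = np.deg2rad([-12.0, 25.0])

# Simulated snapshots: X = A s + noise, with steering matrix A.
m = np.arange(M)[:, None]
A = np.exp(2j * np.pi * d_over_lambda * m * np.sin(angles_true)[None, :])
S = rng.normal(size=(K, snapshots)) + 1j * rng.normal(size=(K, snapshots))
noise = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = A @ S + noise

R = X @ X.conj().T / snapshots                     # sample covariance matrix
eigval, eigvec = np.linalg.eigh(R)
Es = eigvec[:, np.argsort(eigval)[-K:]]            # signal subspace (M x K)

# Rotational invariance between the two overlapping subarrays.
Psi = np.linalg.pinv(Es[:-1]) @ Es[1:]
phase = np.angle(np.linalg.eigvals(Psi))
theta = np.rad2deg(np.arcsin(phase / (2 * np.pi * d_over_lambda)))
print("estimated DOAs (deg):", np.round(np.sort(theta), 2))
```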
The performance of six statistical approaches, which can be used to select the best model for describing the growth of individual fish, was analyzed using simulated and real length-at-age data. The six approaches are the coefficient of determination (R2), the adjusted coefficient of determination (adj.-R2), the root mean squared error (RMSE), Akaike's information criterion (AIC), the bias-corrected AIC (AICc) and the Bayesian information criterion (BIC). The simulation data were generated by five growth models with different numbers of parameters. Four sets of real data were taken from the literature. The parameters in each of the five growth models were estimated by the maximum likelihood method under the assumption of an additive error structure for the data. The model best supported by the data was identified using each of the six approaches. The results show that R2 and RMSE have the same properties and perform worst. The sample size affects the performance of adj.-R2, AIC, AICc and BIC. Adj.-R2 does better in small samples than in large samples. AIC is not suitable for small samples and tends to select a more complex model as the sample size becomes large. AICc and BIC perform best in the small-sample and large-sample cases, respectively. The use of AICc or BIC is recommended for the selection of fish growth models, according to the size of the length-at-age data.
Funding: Supported by the High Technology Research and Development Program of China (863 Program, No. 2006AA100301).
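The sketch below illustrates the model-selection step: two candidate growth curves (von Bertalanffy and Gompertz) are fitted by least squares to synthetic length-at-age data and compared with AIC, AICc and BIC (lower is better); the data and starting values are placeholders, not the paper's simulations or literature datasets.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit two growth models by least squares (maximum likelihood under additive
# normal errors) and compare them with AIC, AICc and BIC. Data are synthetic.
def von_bert(t, Linf, K, t0):
    return Linf * (1.0 - np.exp(-K * (t - t0)))

def gompertz(t, Linf, K, t0):
    return Linf * np.exp(-np.exp(-K * (t - t0)))

rng = np.random.default_rng(3)
age = np.repeat(np.arange(1, 11), 5).astype(float)
length = von_bert(age, 80.0, 0.3, -0.5) + rng.normal(0.0, 3.0, age.size)

def criteria(model, p0):
    popt, _ = curve_fit(model, age, length, p0=p0, maxfev=10000)
    rss = np.sum((length - model(age, *popt)) ** 2)
    n, k = length.size, len(popt) + 1            # +1 for the error variance
    logL = -0.5 * n * (np.log(2 * np.pi) + np.log(rss / n) + 1.0)
    aic = 2 * k - 2 * logL
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = k * np.log(n) - 2 * logL
    return aic, aicc, bic                        # lower values are better supported

for name, model in [("von Bertalanffy", von_bert), ("Gompertz", gompertz)]:
    print(name, np.round(criteria(model, p0=[70.0, 0.2, 0.0]), 2))
```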
A total of 126 bacterial strains were isolated from soil samples. Among them, 11 isolates were found to be positive for amylase production. Strain YL produced the largest zone of clearance in the plate assay. Isolate YL was identified as Bacillus sp. based on morphological and physiochemical characterization. According to 16S rRNA gene sequencing data, the closest phylogenetic neighbor of strain YL was Bacillus amyloliquefaciens (99.54%). Subsequently, the culture conditions were optimized to improve α-amylase production. Response surface methodology (RSM) was applied to evaluate the effect of medium components including wheat bran, cottonseed extract, yeast extract, starch, NaCl and CaCl2. Three variables (wheat bran, cottonseed extract, and starch), identified by a Plackett-Burman design as significantly affecting amylase production, were further optimized using a Box-Behnken design (BBD). The optimal concentrations estimated for the maximum amylase activity (86 kU/mL) were 10.80 g/L wheat bran, 9.90 g/L cottonseed extract, 0.5 g/L starch, 2.0 g/L yeast extract, 5.00 g/L NaCl and 2.00 g/L CaCl2. Fermentation using the optimized culture medium gave a significant (3-fold) increase in amylase production. The improvement in α-amylase production after the optimization process can be considered adequate for large-scale applications.
Funding: Project (31000350) supported by the National Natural Science Foundation of China; Project (2010CB630902) supported by the National Basic Research Program of China.
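As a rough illustration of the response-surface step, the sketch below fits a full second-order model to coded three-factor data and solves for the stationary point; the design points and responses are synthetic stand-ins, not the Box-Behnken results reported above.

```python
import numpy as np

# Fit a full second-order response-surface model
#   y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
# to coded design data and locate its stationary point. Points and responses
# below are synthetic placeholders, not the BBD results from the paper.
rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(15, 3))            # three coded factors in [-1, 1]
y = (80 - 5 * X[:, 0] ** 2 - 4 * X[:, 1] ** 2 - 3 * X[:, 2] ** 2
     + 2 * X[:, 0] + rng.normal(0, 1, 15))

def design_matrix(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 ** 2, x2 ** 2, x3 ** 2,
                            x1 * x2, x1 * x3, x2 * x3])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Stationary point: solve grad(y) = b + 2*B*x = 0 for the coded factors.
b = beta[1:4]
B = np.array([[beta[4],     beta[7] / 2, beta[8] / 2],
              [beta[7] / 2, beta[5],     beta[9] / 2],
              [beta[8] / 2, beta[9] / 2, beta[6]]])
x_stat = np.linalg.solve(-2.0 * B, b)
y_pred = (design_matrix(x_stat[None, :]) @ beta)[0]
print("coded optimum:", np.round(x_stat, 3), " predicted response:", round(y_pred, 2))
```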
Objective: To retrospectively evaluate the diagnosis and treatment of Mirizzi syndrome (MS). Methods: Patients who underwent elective or emergency cholecystectomy in our center over a 23-year period were retrospectively evaluated. The data reviewed included demographics, clinical presentation, diagnostic methods, surgical procedures, postoperative complications, and follow-up.
Estimating the global position of a road vehicle without using GPS is a challenge that many scientists look forward to solving in the near future. Normally, inertial and odometry sensors are used to complement GPS measurements in an attempt to provide a means of maintaining vehicle odometry during GPS outages. Nonetheless, recent experiments have demonstrated that computer vision can also serve as a valuable source of what can be termed visual odometry. For this purpose, vehicle motion can be estimated using a non-linear, photogrammetric approach based on RAndom SAmple Consensus (RANSAC). The results prove that the detection and selection of relevant feature points is a crucial factor in the overall performance of the visual odometry algorithm. The key issues for further improvement are discussed in this letter.
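A minimal RANSAC sketch for the motion-estimation idea: a 2D rigid transform is fitted robustly to matched feature points containing gross outliers. The correspondences are synthetic and the two-point minimal model is a simplification of the full non-linear photogrammetric ego-motion estimation.

```python
import numpy as np

# RANSAC for a 2D rigid transform (rotation + translation) between matched
# feature points; a simplified stand-in for full visual-odometry estimation.
rng = np.random.default_rng(5)

def fit_rigid(p, q):
    """Least-squares R, t such that q ~ R @ p + t (Kabsch algorithm)."""
    pc, qc = p - p.mean(0), q - q.mean(0)
    U, _, Vt = np.linalg.svd(pc.T @ qc)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:           # keep a proper rotation
        Vt[-1] *= -1
        R = (U @ Vt).T
    return R, q.mean(0) - R @ p.mean(0)

# Synthetic correspondences: 80% follow a true motion, 20% are outliers.
theta, t_true = 0.1, np.array([1.5, -0.3])
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
p = rng.uniform(-10, 10, (200, 2))
q = p @ R_true.T + t_true + rng.normal(0, 0.02, (200, 2))
q[:40] = rng.uniform(-10, 10, (40, 2))

best_inliers = np.zeros(len(p), dtype=bool)
for _ in range(200):                                   # RANSAC iterations
    idx = rng.choice(len(p), size=2, replace=False)    # minimal sample
    R, t = fit_rigid(p[idx], q[idx])
    err = np.linalg.norm(q - (p @ R.T + t), axis=1)
    inliers = err < 0.1
    if inliers.sum() > best_inliers.sum():
        best_inliers = inliers

R, t = fit_rigid(p[best_inliers], q[best_inliers])     # refit on all inliers
print("inliers:", best_inliers.sum(), " estimated t:", np.round(t, 3))
```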
A dynamical-statistical post-processing approach is applied to seasonal precipitation forecasts in China during the summer. The data are ensemble-mean seasonal forecasts for summer (June-August) from four atmospheric general circulation models (GCMs) in the second phase of the Canadian Historical Forecasting Project (HFP2) from 1969 to 2001. The dynamical-statistical approach is designed based on the relationship between the 500-hPa geopotential height (Z500) forecast and the observed sea surface temperature (SST) to calibrate the precipitation forecasts. The results show that the post-processing can improve summer precipitation forecasts for many areas of China. Further examination shows that this post-processing approach is very effective in reducing the model-dependent part of the errors, which is associated with the GCMs. The possible mechanisms behind the forecast improvements are investigated.
Funding: Funded by the National Natural Science Foundation of China (Grant No. 40805018).
The rise of big data has led to new demands for machine learning (ML) systems to learn complex models, with millions to billions of parameters, that promise adequate capacity to digest massive datasets and offer powerful predictive analytics (such as high-dimensional latent features, intermediate representations, and decision functions) thereupon. In order to run ML algorithms at such scales, on a distributed cluster with tens to thousands of machines, it is often the case that significant engineering efforts are required, and one might fairly ask whether such engineering truly falls within the domain of ML research. Taking the view that "big" ML systems can benefit greatly from ML-rooted statistical and algorithmic insights, and that ML researchers should therefore not shy away from such systems design, we discuss a series of principles and strategies distilled from our recent efforts on industrial-scale ML solutions. These principles and strategies span a continuum from application, to engineering, and to theoretical research and development of big ML systems and architectures, with the goal of understanding how to make them efficient, generally applicable, and supported with convergence and scaling guarantees. They concern four key questions that traditionally receive little attention in ML research: How can an ML program be distributed over a cluster? How can ML computation be bridged with inter-machine communication? How can such communication be performed? What should be communicated between machines? By exposing underlying statistical and algorithmic characteristics unique to ML programs but not typically seen in traditional computer programs, and by dissecting successful cases to reveal how we have harnessed these principles to design and develop both high-performance distributed ML software as well as general-purpose ML frameworks, we present opportunities for ML researchers and practitioners to further shape and enlarge the area that lies between ML and systems.
The conventional process monitoring method based on fast independent component analysis (Fast ICA) cannot take the ubiquitous measurement noise into account and may exhibit degraded monitoring performance under its adverse effects. In this paper, a new process monitoring approach based on noisy time structure ICA (Noisy TSICA) is proposed to solve this problem. A Noisy TSICA algorithm that considers the measurement noise explicitly is first developed to estimate the mixing matrix and extract the independent components (ICs). Subsequently, a monitoring statistic is built to detect process faults on the basis of recursive kurtosis estimates of the dominant ICs. Lastly, a contribution plot for the monitoring statistic is constructed to identify the faulty variables based on sensitivity analysis. Simulation studies on the continuous stirred tank reactor system demonstrate that the proposed Noisy TSICA-based monitoring method outperforms the conventional Fast ICA-based monitoring method.
Funding: Supported by the National Natural Science Foundation of China (61273160), the Natural Science Foundation of Shandong Province (ZR2011FM014), the Fundamental Research Funds for the Central Universities (12CX06071A), and the Postgraduate Innovation Funds of China University of Petroleum (CX2013060).
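To illustrate the monitoring idea only (not the Noisy TSICA algorithm itself), the sketch below uses scikit-learn's FastICA as a stand-in to extract ICs and tracks an exponentially weighted, recursive kurtosis estimate as the fault statistic against a percentile control limit; the process data, the forgetting factor and the simulated fault are all synthetic assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

# FastICA stands in for the paper's Noisy TSICA; data and fault are synthetic.
rng = np.random.default_rng(6)
n_train, n_test, lam = 500, 300, 0.05

X_train = rng.laplace(size=(n_train, 4)) @ rng.normal(size=(4, 6))
ica = FastICA(n_components=4, random_state=0).fit(X_train)

def recursive_kurtosis(S, lam):
    """Exponentially weighted (recursive) kurtosis estimate for each IC."""
    m2 = np.ones(S.shape[1])
    m4 = 3.0 * np.ones(S.shape[1])
    kurt = np.empty_like(S)
    for t, s in enumerate(S):
        m2 = (1.0 - lam) * m2 + lam * s ** 2
        m4 = (1.0 - lam) * m4 + lam * s ** 4
        kurt[t] = m4 / m2 ** 2 - 3.0
    return kurt

# Nominal kurtosis per IC and a 99th-percentile control limit from training data.
k_train = recursive_kurtosis(ica.transform(X_train), lam)
kurt0 = k_train[n_train // 2:].mean(axis=0)
limit = np.percentile(np.sum((k_train - kurt0) ** 2, axis=1), 99)

# Test data with a simulated fault entering at sample 150.
X_test = rng.laplace(size=(n_test, 4)) @ rng.normal(size=(4, 6))
X_test[150:] += rng.normal(3.0, 1.0, size=(n_test - 150, 6))
k_test = recursive_kurtosis(ica.transform(X_test), lam)
stat = np.sum((k_test - kurt0) ** 2, axis=1)
print("alarms after the fault:", int((stat[150:] > limit).sum()), "of", n_test - 150)
```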
Soil organic carbon (SOC) has great impacts on global warming, land degradation and food security. Classic statistical and geostatistical methods were used to characterize and compare the spatial heterogeneity of SOC and related factors, such as topography, soil type and land use, in the Liudaogou watershed on the Loess Plateau of North China. SOC concentrations followed a log-normal distribution with arithmetic and geometric means of 23.4 and 21.3 g kg^-1, respectively, were moderately variable (CV = 75.9%), and demonstrated moderate spatial dependence according to the nugget ratio (34.7%). The experimental variogram of SOC was best fitted by a spherical model after the spatial outliers had been detected and eliminated. Lower SOC concentrations were associated with higher elevations. Warp soils and farmland had the highest SOC concentrations, while aeolian sandy soil and shrubland had the lowest SOC values. The geostatistical characteristics of SOC differed among the soil and land use types. These patterns were closely related to the spatial structure of topography, soil type and land use.
Funding: Project supported by the National Key Basic Research Program (973 Program) of China (No. 2007CB106803), the National Programs for Science and Technology Development of China (No. 2006BAD09B06), and the Scientific Research Innovation Team Support Program of the Northwest A&F University, China.
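A minimal sketch of the variogram step: an experimental semivariogram is computed from a simulated spatially correlated field and a spherical model (nugget, partial sill, range) is fitted to it, reporting the nugget ratio; the field, distance bins and starting values are assumptions, not the watershed survey data.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.optimize import curve_fit

# Build an experimental semivariogram and fit a spherical model
# (nugget c0, partial sill c, range a). The SOC-like field is simulated.
rng = np.random.default_rng(7)
xy = rng.uniform(0, 1000, size=(200, 2))                    # sample locations, m
D = squareform(pdist(xy))
cov = 0.30 * np.exp(-D / 150.0) + 0.05 * np.eye(len(xy))    # correlated field + nugget
z = rng.multivariate_normal(np.full(len(xy), 3.0), cov)     # e.g. log(SOC) values

h = pdist(xy)                                                # pair separation distances
gamma_pair = 0.5 * pdist(z[:, None], metric="sqeuclidean")   # semivariance per pair

bins = np.linspace(0, 500, 11)                               # distance classes
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(h, bins) - 1
gamma_exp = np.array([gamma_pair[idx == i].mean() for i in range(len(centers))])

def spherical(h, c0, c, a):
    return np.where(h < a, c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3), c0 + c)

(c0, c, a), _ = curve_fit(spherical, centers, gamma_exp,
                          p0=[0.05, 0.3, 200.0], bounds=(1e-6, np.inf))
print(f"nugget={c0:.3f}, sill={c0 + c:.3f}, range={a:.0f} m, "
      f"nugget ratio={100 * c0 / (c0 + c):.1f}%")
```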