We present a statistical investigation of the degree of influence that assumptions made about the mechanical parameters of a pylon have on its ground-induced vibrations. The study uses as its key kinematic variable the displacement at the top of a reference, stand-alone pylon with a uniform cross-section and a fixed base. Next, statistics are produced using a dimensionless displacement ratio defined between the 'parental' and the 'subsidiary' cases, the latter defined for the pylon (a) resting on compliant soil, (b) carrying an attached top mass, and (c) being non-uniform with height. Furthermore, two materials are examined, namely steel and reinforced concrete (R/C). This displacement ratio is independent of the excitation and plays the role of a transfer function between the base and the top of the pylon. Both horizontal and vertical motions are considered, and the equations of motion are solved in the frequency domain. The ensuing statistical analysis is conducted for the following parameter combinations: (a) pylon founded on soft, intermediate, and stiff soil; (b) low, intermediate, and high ratios of the attached mass to the pylon's mass; and (c) constant and quadratic degrees of pylon tapering with height. Spearman correlation coefficients are calculated for all the above combinations to arrive at statistical results that establish validity bounds and quantify the degree of influence of each assumption on the pylon's response.
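The abstract names the Spearman coefficient as the statistic behind the sensitivity ranking but gives no code. As an illustration only (the function names are our own, not the authors'), a minimal rank-based implementation could look like this:

```python
def rank(values):
    # Fractional (average) ranks, 1-based, with ties sharing their mean rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's rho is the Pearson correlation computed on the ranks,
    # so it captures any monotonic (not just linear) dependence.
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Because rho depends only on ranks, a perfectly monotonic relation (e.g. displacement ratio growing with mass ratio) scores exactly 1 even when the relation is nonlinear.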
Background: Patients with colon cancer who receive chemotherapy usually experience various gastrointestinal adverse reactions, including nausea, vomiting, and diarrhea, which make it challenging for them to adhere to treatment. As an effective traditional Chinese medicine, the Jianpi Bushen formula has been widely used to alleviate the side effects of chemotherapy. Objective: To evaluate the efficacy and safety of the Jianpi Bushen formula for patients who undergo chemotherapy. This statistical analysis plan (SAP) is intended to enhance the transparency and research quality of our randomized controlled trial. Methods: Our study is a multicenter, double-blind, randomized controlled clinical trial comparing the completion rate of chemotherapy in colon cancer patients who use and do not use the Jianpi Bushen formula. To attenuate possible selection bias in the final report, we declare the overall trial design, outcome measures, subgroup analyses, and safety measures, and we describe the data management and statistical analysis methods in detail. Conclusion: The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Further post-hoc analyses can be performed by referring to the SAP, and possible selection bias can be attenuated.
Rejected specimens from the Emergency Department of the Center of Clinical Laboratory from January 1, 2022 to January 1, 2023 were analyzed in order to reduce specimen rejection rates and improve the quality of inspection. The results showed 1488 rejected specimens, for a non-conforming rate of 0.58%. The departments involved were mainly the Emergency Department, the Hematology Department, the Cardiology Department, the Intensive Care Department, and the Brain Surgery Department. Among the reasons for rejection, blood hemolysis accounted for 43.15%, blood coagulation for 26.61%, and insufficient specimen volume for 17.14%. The rejection rate was highest for arterial blood gas specimens, at 1.74%, followed by coagulation test specimens at 1.18%. These results indicate that rejected specimens are mainly caused by failure to follow the standard operating procedure. Specimen rejection can largely be avoided if the standards for specimen collection are strictly followed.
Based on the Tropical Cyclone (hereafter TC) Yearbook 1980-2009, this paper first analyzes the number and intensity change of the TCs that passed directly over or beside Poyang Lake (TC center within 1° longitude or 1° latitude of the Lake) among all landfalling TCs in China during the past 30 years. Two cases are examined in detail. One is severe typhoon Rananim, with a moving speed of 3.26 m/s and an intensity change of 1 hPa while passing the Lake. The other is super typhoon Saomai, with a faster moving speed of 6.50 m/s and a larger intensity change of 6 hPa. Through numerical simulation experiments, the paper analyzes how the change of underlying surface from water to land contributed to the differences in intensity, speed, and mesoscale convection of the two TCs as they passed the Lake. Results show that the moisture and dynamic conditions above the Lake were favorable for maintaining intensity while Rananim was passing over Poyang Lake, even though the moisture supply from the ocean had been cut off. As a result, strong convection developed around the lake, producing rainfall that rotated counter-clockwise under the influence of the TC's movement. In contrast, little impact was seen in the Saomai case. These results indicate that for TCs coming ashore near Poyang Lake at a slow speed, the large water body helps sustain intensity, strengthen convection around the TC center, and produce the subsequent heavy rainfall. Conversely, a fast-moving TC is less likely to be influenced by the underlying surface in terms of intensity and speed.
With the advances of intelligent transportation systems (ITSs), traffic forecasting has gained significant interest, as robust traffic prediction plays an important part in many ITS applications such as traffic signal control, navigation, and route mapping. A traffic prediction model aims to forecast traffic conditions from past traffic data. For more accurate traffic prediction, this study proposes an optimal deep-learning-enabled statistical analysis model: an optimal convolutional neural network with attention-based long short-term memory (OCNN-ALSTM). The OCNN-ALSTM technique first preprocesses the traffic data using min-max normalization, and is then executed to classify and predict traffic data in real-time scenarios. To enhance its predictive outcomes, the bird swarm algorithm (BSA) is employed, improving the overall efficacy of the network. The design of BSA for optimal hyperparameter tuning of the CNN-ALSTM model constitutes the novelty of the work. The experimental validation of the OCNN-ALSTM technique is performed on benchmark datasets, and the results are examined from several aspects. The simulation results show that the OCNN-ALSTM model outperforms recent methods along several dimensions.
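The min-max preprocessing step mentioned above is a standard rescaling of each feature to a fixed interval before it is fed to the network. A minimal sketch (the function name and default bounds are illustrative, not from the paper):

```python
def min_max_normalize(series, lo=0.0, hi=1.0):
    # Linearly rescale a sequence so its minimum maps to `lo`
    # and its maximum maps to `hi`.
    s_min, s_max = min(series), max(series)
    if s_max == s_min:
        # A constant series carries no scale information;
        # map it to the lower bound.
        return [lo] * len(series)
    scale = (hi - lo) / (s_max - s_min)
    return [lo + (v - s_min) * scale for v in series]
```

Note that the minimum and maximum should be estimated on the training split only and reused for test data, otherwise information leaks from the test set into the preprocessing.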
The establishment of effective null models can provide reference networks that accurately describe the statistical properties of real-life signed networks. At present, the two classical null models for signed networks (the sign-randomized and full-edge-randomized models) shuffle positive and negative topologies at the same time, so it is difficult to distinguish the effects on network topology of positive edges, negative edges, and the correlation between them. In this study, we construct three refined edge-randomized null models that randomize only the link relationships, without changing the positive and negative degree distributions. The results for nontrivial statistical indicators of signed networks, such as average degree connectivity and clustering coefficient, show that the position of positive edges has a stronger effect on positive-edge topology, while the signs of negative edges have a greater influence on negative-edge topology. For some specific statistics (e.g., embeddedness), the results indicate that the proposed null models describe real-life networks more accurately than the two existing ones, and can be selected to facilitate a better understanding of the complex structures, functions, and dynamical behaviors of signed networks.
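For readers unfamiliar with signed-network null models, the classical sign-randomized baseline that the refined models are compared against can be sketched in a few lines. This is only the baseline, not the authors' refined models; the representation of edges as (u, v, sign) triples is our own choice:

```python
import random

def sign_shuffled_null(edges, seed=0):
    # Classical sign-randomized null model: keep the topology (which
    # node pairs are linked) fixed, and permute the +/- signs among
    # the edges, preserving the overall counts of positive and
    # negative links but destroying any sign-topology correlation.
    rng = random.Random(seed)
    pairs = [(u, v) for u, v, _ in edges]
    signs = [s for _, _, s in edges]
    rng.shuffle(signs)
    return [(u, v, s) for (u, v), s in zip(pairs, signs)]
```

Comparing a statistic (e.g. clustering coefficient) on the real network against its distribution over many such randomizations reveals which part of the statistic is attributable to the sign arrangement rather than to the underlying topology.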
Security is a vital parameter for conserving energy in wireless sensor networks (WSNs). Trust management in a WSN is a crucial process, as trust is utilized when collaboration is needed to accomplish trustworthy data transmission. However, available routing techniques do not incorporate security into their design. This study develops a novel statistical analysis with dingo-optimizer-enabled reliable routing scheme (SADO-RRS) for WSNs. The proposed SADO-RRS technique aims to detect attacks and find optimal routes in a WSN. The technique derives a new statistics-based linear discriminant analysis (LDA) for attack detection. Moreover, a trust-based dingo optimizer (TBDO) algorithm is applied for optimal route selection, accomplishing secure data transmission in the WSN. The TBDO algorithm involves the derivation of a fitness function over different input variables of the WSN. To demonstrate the enhanced outcomes of the SADO-RRS technique, a wide range of simulations was carried out, and the results confirmed its improved performance.
Geomechanical data are never sufficient in quantity, nor adequately precise and accurate, for design purposes in mining and civil engineering. The objective of this paper is to show the variability of rock properties at sampled points in a roadway's roof, and then how the statistical processing of the available geomechanical data can affect the results of numerical modelling of the roadway's stability. Four cases were applied in the numerical analysis, using the average values (the most common choice in geomechanical data analysis), the average minus one standard deviation, the median, and the average minus the statistical error. The study shows that different treatments of the same geomechanical data set can change the modelling results considerably. The average-minus-standard-deviation case is the most conservative and least risky: it gives displacements and a yielded-elements zone four times broader than the average-values scenario, which is the least conservative option. The two other cases need further study; their results lie between the most favorable and most adverse values. Taking the average values corrected by the statistical error for the numerical analysis seems to be the best solution. Moreover, the confidence level can be adjusted depending on the importance of the structure and the assumed risk level.
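The four input scenarios compared in the paper are simple descriptive statistics of the same sample set. A minimal sketch of how they could be computed (the function and key names are our own; the paper does not specify its software):

```python
import statistics

def design_inputs(samples):
    # The four scenarios compared in the stability modelling:
    # average, average minus one standard deviation, median,
    # and average minus the standard error of the mean.
    n = len(samples)
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)   # sample standard deviation
    sem = sd / n ** 0.5              # standard error of the mean
    return {
        "average": mean,
        "average_minus_sd": mean - sd,
        "median": statistics.median(samples),
        "average_minus_sem": mean - sem,
    }
```

Since the standard error shrinks with sample size while the standard deviation does not, the average-minus-SEM input always lies between the plain average and the average-minus-SD input, which matches the paper's ordering of the cases from least to most conservative.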
The ultimate pit may affect other aspects of the life of a mine, including economic, technical, environmental, and social aspects. What makes the choice even more complex is that there are often many pits that are economically minable, which calls for a heuristic approach to determine which of them is the ultimate pit. This study presents a means of selecting an ultimate pit during design of the Hebei Limestone mine. During optimization of the mine, many pit shells were created using Whittle Software, which normally assigns a revenue factor of 1 to the ultimate pit. Unfortunately, given the parameters used, the pit shells created did not satisfy the selection criterion at a revenue factor of 1. As a result, statistical analysis was applied to better understand the relationship, variability, and correlation of the pit shells created (the data). Correlation analysis, K-means++ clustering, principal component analysis, and generalized linear models were used in the analysis. The results reveal a salient relationship among the optimization variables. In addition, the proposed method was tested on Whittle sample projects that do satisfy ultimate pit selection at a revenue factor of 1. The results show that the proposed model produced almost the same results as the Whittle model at a revenue factor of 1, and was also able to determine the ultimate pit in cases that did not satisfy the Whittle selection criterion.
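Of the methods listed, K-means++ differs from plain K-means only in how the initial cluster centres are seeded. A minimal one-dimensional sketch of that seeding step (illustrative only; the study ran its clustering on multivariate pit-shell attributes):

```python
import random

def kmeanspp_init(points, k, seed=0):
    # k-means++ seeding: pick the first centre uniformly at random,
    # then pick each further centre with probability proportional to
    # the squared distance to the nearest centre chosen so far.
    rng = random.Random(seed)
    centres = [rng.choice(points)]
    while len(centres) < k:
        d2 = [min((p - c) ** 2 for c in centres) for p in points]
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centres.append(p)
                break
    return centres
```

The squared-distance weighting makes the seeds spread across well-separated groups of pit shells, which is why K-means++ typically converges to a better clustering than uniformly random initialization.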
Qasab basin is one of the most promising areas for sustainable development on the Eastern Desert fringes of the Nile Valley, Egypt. The integration of statistical analysis, stable isotopes, and geochemical modeling tools delineated the geochemical processes affecting groundwater quality and detected the main recharge source in the Qasab basin. Most of the groundwater samples are brackish (88%), while a minority (12%) are fresh. The electrical conductivity of the groundwater ranged from 1135 to 10,030 μS/cm. The statistical analysis and hydrochemical diagrams suggest that groundwater quality is mainly controlled by several intermixed processes (rock weathering and agricultural activities). The mineralization of the Pleistocene groundwater is regulated by rock weathering, evaporation processes, and reverse cation exchange. The isotopic signatures (δ²H and δ¹⁸O) delineate two groundwater groups. The first group is enriched in δ¹⁸O, which ranges from 0.9‰ to 5.5‰; this group is mostly affected by recent meteoric recharge from surface water leakage. The second group is relatively depleted in δ¹⁸O, reflecting a palaeo-recharge source from a colder climate; its δ¹⁸O varies from -10.1‰ to -6.4‰, indicating upward leakage from the Nubian sandstone aquifer through deep-seated faults. The inverse geochemical model indicates that the salinity of the groundwater samples is due to leaching and dissolution of carbonate, sulphate, and chloride minerals from the aquifer matrix. This study can serve as a hydrochemical assessment guide to support sustainable development in the Qasab basin, ensuring that adequate groundwater management can help reduce poverty and support socioeconomic development.
Jinhongtang is a traditional Chinese medicine formula composed of Rheum palmatum L. stem, Sargentodoxa cuneata stem, and Taraxacum mongolicum, and is used for the treatment of sepsis. However, a quality assessment method for Jinhongtang has not been available. In the present study, we developed a UFLC-MS/MS method to determine 16 analytes in 20 batches of home-made and commercial Jinhongtang. Multivariate statistical analysis revealed significant differences in quality between home-made and commercial Jinhongtang, with the differences among home-made samples being the more pronounced. The integrated strategy based on UFLC-MS/MS and multivariate statistical analysis provides a new basis for the overall quality assessment of Jinhongtang.
Biology is a challenging and complicated mess. Understanding this complexity is the realm of the biological sciences: trying to make sense of massive, messy data by discovering patterns and revealing underlying general rules. Among the most powerful mathematical tools for organizing and structuring complex, heterogeneous, and noisy data are those provided by multivariate statistical analysis (MSA). These eigenvector/eigenvalue data-compression approaches were first introduced to electron microscopy (EM) in 1980 to help sort out different views of macromolecules in a micrograph. After 35 years of continuous use and development, new MSA applications are still being proposed regularly. The speed of computing has increased dramatically in the decades since their first use in electron microscopy; however, the size and complexity of the EM data sets to be studied have grown possibly even faster. MSA computations had thus become a serious bottleneck limiting their general use. The parallelization of our programs, speeding up the process by orders of magnitude, has opened whole new avenues of research. The speed of automatic classification in the compressed eigenvector space had also become a bottleneck that needed to be removed. In this paper we explain the basic principles of multivariate statistical eigenvector/eigenvalue data compression; we provide practical tips and application examples for those working in structural biology; and we provide the more experienced researcher, in this and other fields, with the formulas associated with these powerful MSA approaches.
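The core eigenvector/eigenvalue compression the abstract describes is, in its simplest form, a projection of mean-centred data onto the leading eigenvectors of its covariance matrix. A minimal sketch under that assumption (function name is illustrative; production EM codes use far more elaborate, parallelized pipelines):

```python
import numpy as np

def msa_compress(data, n_components):
    # Eigenvector/eigenvalue data compression: project mean-centred
    # observations (e.g. flattened particle images, one per row) onto
    # the leading eigenvectors of their covariance matrix.
    X = data - data.mean(axis=0)
    cov = X.T @ X / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues ascending
    top = eigvecs[:, ::-1][:, :n_components]  # leading eigenvectors
    return X @ top                            # compressed coordinates
```

Classification is then run on these few coordinates instead of the raw pixels, which is what makes clustering tens of thousands of noisy images tractable.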
Chinese traditional shadow play was selected for the List of Intangible Cultural Heritage in 2011. Yet, despite reflecting abundant national cultural values, this traditional art form is degenerating and fading from public view. As one of the earliest statistical analysis packages, the Statistical Package for the Social Sciences (SPSS) offers comprehensive tools for analyzing and managing statistical data. This study explores the application of SPSS in minimizing the workload of researchers while raising the validity of data, in support of analyzing survey data on the inheritance and development of Chinese traditional shadow play in schools.
Social computing and online groups have ushered in a new age of the network, in which information, networking, and communication technologies enable organized human effort in fundamentally new ways. Communities working in the various social network domains face different hurdles, including new research questions and challenges in social computing. Researchers should try to expand the scope of the field and establish new ideas and methods, even drawing on other disciplines, to address these challenges. The field has diverse academic associations, social links, and technical characteristics, and thus offers an excellent opportunity for researchers to identify issues in social computing and provide innovative solutions for conveying information between online social groups. In this paper we investigate several issues in social media: users' privacy and security, network reliability, data availability, users' awareness of social networks, and problems faced by academic domains. A huge number of users operate social networks to retrieve and disseminate real-time and offline information, transmitted over local or global networks. The main concerns of users on social media are secure and fast communication channels; both Facebook and YouTube claim efficient security mechanisms and fast communication channels for multimedia data. In this research a survey was conducted in the most populated cities, where large numbers of Facebook and YouTube users were found. During the survey, several regular users indicated that certain issues continuously occur on these sites' interfaces, for example unwanted advertisements, fake IDs, uncensored videos, and unknown friend requests, which contribute to poor channel communication speed, poor uploading and downloading speeds, channel interference, and problems with data security, user privacy, and the integrity and reliability of user communication. The major issues faced by active users of Facebook and YouTube are highlighted in this research.
This paper analyzes the application value of big-data statistical analysis methods in economic management from both macro and micro perspectives, and examines their specific application in three areas: economic trend analysis, industrial operations, and marketing strategies.
Since Ko Kung-chen published History of Chinese Journalism with the Shanghai Commercial Press in 1927, the work has been issued by different publishing houses and reprinted many times, amounting to 18 editions. The work contains many errors of historical narrative and fact concerning the foreign press in China, and to date these errors have not been revised. This essay aims to systematically revise the remaining errors concerning the foreign press in History of Chinese Journalism, beyond those already corrected by other works and essays, and to carry out a statistical analysis of the errors in order to open a new approach to the study of foreign periodicals and the foreign press in China.
This study analyzed rainfall variability in the Southeast region of Nigeria using graphical models, and used a statistical approach to investigate any significant relationship between the global North Atlantic Oscillation (NAO) index and regional rainfall variability. The study covered three states of Southeastern Nigeria, namely Abia, Ebonyi, and Imo, lying between latitudes 4°40′ and 8°50′N and longitudes 6°20′ and 8°50′E. Data for the study comprised a 30-year (1988-2017) archival time series of monthly rainfall values for the three study states, acquired from the Nigerian Meteorological Agency (NIMET) offices in the states, and standardized NAO index values for the same period, collected from the NOAA Data Center, USA. In the analysis, graphs were first used to illustrate mean annual rainfall values over the thirty years, and the coefficient of variability was employed to evaluate the degree of deviation of values from the mean. The second analysis used correlation models to ascertain any relationship between the NAO index and rainfall in Southeast Nigeria. The results showed significant variability of mean monthly rainfall in the region from January to December within the study period. A correlation coefficient of -0.7525 was obtained, indicating that the global NAO index and rainfall variability deviate in opposite directions. The coefficient of multiple determination (CMD) subsequently gave a value of 0.031%, the proportion of rainfall variation attributable to the global teleconnection, which the study interprets as the NAO index having essentially no influence on rainfall variability in the Southeast region of Nigeria.
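The two statistics the study leans on, the coefficient of variability and the correlation coefficient, are straightforward to compute. A minimal sketch (function names are ours; the study does not state its software):

```python
def coefficient_of_variability(values):
    # CV = standard deviation / mean; a dimensionless measure of how
    # strongly values scatter around their mean, often quoted as a %.
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return (var ** 0.5) / mean

def pearson(x, y):
    # Pearson product-moment correlation between two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den
```

Squaring the correlation coefficient gives the coefficient of determination, i.e. the share of one series' variance linearly explained by the other.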
This study comprises a climatology of the spatial variability of precipitation over the São Francisco River Basin (SFRB), a region characterized by its geographic heterogeneity. The different rainfall regimes in the region were examined through statistical and spectral analyses, using measured precipitation data, the Pacific Decadal Oscillation index, ENSO, the Atlantic Multidecadal Oscillation, the North Atlantic Oscillation, the Atlantic dipole, and the sunspot cycle over 65 years. The rainfall data were filtered and gap-filled using the regional weighting method. The spatial and temporal variability of precipitation along the SFRB is remarkable, and a pattern was observed along the precipitation time series. Cluster analysis identified four homogeneous regions in the SFRB and explained 87.4% of the total variance of the average monthly rainfall of the 199 rain gauges. Cross-wavelet analysis identified the relationship between the precipitation series and the climatic indexes analyzed in this work.
Overlapping latent fingermarks constitute a serious challenge to database-related recognition and matching algorithms in biometry, forensics, and crime scene investigation. Mass spectrometry imaging (MSI) is a powerful tool for deciphering and analyzing overlapping fingermarks based on the individual chemical information of each deposit. In practice, fingermark MSI still requires the subjective judgment of an MSI expert; rapid analysis, automation, standardization, and quantitative evaluation of the complete detection and separation process for overlapped fingermarks from MSI data sets are therefore the ultimate goal, and will be necessary for MSI to become an accepted process in criminal investigations and law enforcement. Here we investigated the feasibility and efficiency of different statistical approaches for the separation of overlapped latent fingermarks from MSI data. Entropy analysis of the generated m/z-images was used to evaluate the results obtained from the statistical analysis. Furthermore, we demonstrate and discuss the possibility of reconstituting and separating overlapping fingermarks by discrete scanning at selected x,y-positions defined from a prior image analysis, using a simpler scheme based on visible, optically distinguishable, overlapped ink-based fingermarks. The overlapped latent fingermarks were developed by rapid gold sputter coating and analyzed by laser-based MSI, without (organic) matrix preparation. Latent fingermarks can be transferred from the substrate/surface to, and conserved on, a soft gold-sputtered membrane at low temperatures.
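The entropy analysis used to score the generated m/z-images is, at its core, a Shannon entropy computed over an intensity histogram: a well-separated image concentrates intensity in few levels and scores low. A minimal sketch under that assumption (the binning scheme and function name are ours, not the paper's):

```python
import math
from collections import Counter

def shannon_entropy(pixels, bins=16):
    # Histogram-based Shannon entropy (in bits) of a flat list of
    # pixel intensities; lower entropy suggests a cleaner separation.
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / bins or 1.0   # guard against a constant image
    counts = Counter(min(int((p - lo) / width), bins - 1) for p in pixels)
    n = len(pixels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A constant image scores 0 bits, while an image spread evenly over k intensity levels scores log2(k) bits, the maximum for that number of levels.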
Funding: Supported by the German Research Foundation (DFG) through Grant SM 281/20-1, and by the Hellenic Foundation for Research and Innovation (HFRI) under the 3rd Call for PhD Fellowships (Fellowship Number: 6522).
Funding: Funded by the Key R&D Project of the Ministry of Science and Technology, China (2017YFC1700604).
Abstract: Specimens rejected by the Emergency Department of the Center of Clinical Laboratory from January 1, 2022 to January 1, 2023 were analyzed in order to reduce specimen rejection rates and improve the quality of testing. The results showed 1488 rejected specimens, for a non-conforming rate of 0.58%. The departments involved were mainly the Emergency Department, the Hematology Department, the Cardiology Department, the Intensive Care Department, and the Brain Surgery Department. Among the reasons for rejection, hemolysis accounted for 43.15%, coagulation for 26.61%, and insufficient specimen volume for 17.14%. The rejection rate was highest for arterial blood gas specimens, at 1.74%, followed by coagulation-test specimens, at 1.18%. These results indicate that rejected specimens arise mainly from failure to follow standard operating procedures; specimen rejection can largely be avoided if the standards for specimen collection are strictly followed.
Funding: China National Science Foundation (40730948, 41075037, 41175063); Special Project of the Chinese Academy of Meteorological Sciences (2007Y006).
Abstract: Based on the Tropical Cyclone (hereafter TC) Yearbook 1980-2009, this paper first analyzes the number and intensity change of the TCs that passed directly over or beside Poyang Lake (TC center within 1° longitude or 1° latitude of the Lake) among all TCs making landfall in China during the past 30 years. Two cases are examined in detail. One is severe typhoon Rananim, with a speed of 3.26 m/s and an intensity change of 1 hPa while passing the Lake; the other is super typhoon Saomai, with a faster moving speed of 6.50 m/s and a larger intensity change of 6 hPa. Through numerical simulation experiments, the paper analyzes how the change of underlying surface from water to land contributes to the differences in intensity, speed, and mesoscale convection of the two TCs as they passed the Lake. Results show that the moisture and dynamic conditions above the Lake favored the maintenance of intensity while Rananim was passing over Poyang Lake, even though the moisture supply from the ocean had been cut off. As a result, strong convection developed around the lake, producing rainfall that spun counter-clockwise under the influence of the TC's movement. In the Saomai case, however, little impact was seen. These results indicate that for TCs coming ashore and crossing Poyang Lake slowly, the large water body helps sustain intensity and strengthens the convection around the TC center and the subsequent heavy rainfall; by contrast, a fast-moving TC is less likely to be influenced by the underlying surface in terms of intensity and speed.
Funding: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493).
Abstract: With the advance of intelligent transportation systems (ITSs), traffic forecasting has gained significant interest, as robust traffic prediction plays an important part in ITS applications such as traffic signal control, navigation, and route mapping. A traffic prediction model aims to forecast traffic conditions from past traffic data. For more accurate prediction, this study proposes an optimal deep-learning-enabled statistical analysis model: an optimal convolutional neural network with attention long short-term memory (OCNN-ALSTM). The OCNN-ALSTM technique first preprocesses the traffic data using min-max normalization; it is then used to classify and predict traffic data in real-time settings. To enhance its predictive outcomes, the bird swarm algorithm (BSA) is applied for hyperparameter tuning, improving the overall efficacy of the network; this BSA-based optimal hyperparameter tuning of the CNN-ALSTM model constitutes the novelty of the work. The OCNN-ALSTM technique is validated experimentally on benchmark datasets, and the results, examined from several aspects, show improved outcomes over recent methods across several dimensions.
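Min-max normalization as named in the abstract is a standard preprocessing step; a sketch under the assumption of a simple 1-D traffic series (the vehicle counts below are hypothetical) could look like:

```python
def min_max_normalize(series, lo=0.0, hi=1.0):
    """Scale a 1-D series into [lo, hi] via min-max normalization."""
    s_min, s_max = min(series), max(series)
    span = s_max - s_min
    if span == 0:                     # constant series: map to lower bound
        return [lo] * len(series)
    return [lo + (hi - lo) * (v - s_min) / span for v in series]

# Hypothetical hourly vehicle counts
counts = [120, 300, 480, 240, 600]
print(min_max_normalize(counts))  # [0.0, 0.375, 0.75, 0.25, 1.0]
```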
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61773091 and 61603073), the LiaoNing Revitalization Talents Program (Grant No. XLYC1807106), and the Natural Science Foundation of Liaoning Province, China (Grant No. 2020-MZLH-22).
Abstract: The establishment of effective null models can provide reference networks that accurately describe the statistical properties of real-life signed networks. At present, the two classical null models of signed networks (i.e., the sign- and full-edge-randomized models) shuffle positive and negative topologies at the same time, so it is difficult to distinguish the effects on network topology of positive edges, negative edges, and the correlation between them. In this study, we construct three refined edge-randomized null models that randomize only the link relationships, without changing the positive and negative degree distributions. The results for nontrivial statistical indicators of signed networks, such as average degree connectivity and clustering coefficient, show that the positions of positive edges have a stronger effect on positive-edge topology, while the signs of negative edges have a greater influence on negative-edge topology. For some specific statistics (e.g., embeddedness), the results indicate that the proposed null models describe real-life networks more accurately than the two existing ones and can be selected to facilitate a better understanding of the complex structures, functions, and dynamical behaviors of signed networks.
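For contrast with the refined models proposed in the abstract, the classic sign-randomized null model it mentions can be sketched in a few lines: the topology stays fixed and only the +/- signs are permuted among the edges. The toy edge list is hypothetical.

```python
import random

def sign_shuffled_null(edges, seed=0):
    """Classic sign-randomized null model: keep the edge topology fixed
    and permute the +/- signs among the edges.  (The refined models in
    the abstract additionally preserve the signed degree distributions;
    this is the simpler baseline they are compared against.)"""
    rng = random.Random(seed)
    signs = [s for _, _, s in edges]
    rng.shuffle(signs)
    return [(u, v, s) for (u, v, _), s in zip(edges, signs)]

# Hypothetical signed edge list: (node, node, sign)
g = [(0, 1, 1), (1, 2, -1), (2, 3, 1), (3, 0, -1), (0, 2, 1)]
g0 = sign_shuffled_null(g)
# Topology and the multiset of signs are both preserved:
assert [(u, v) for u, v, _ in g0] == [(u, v) for u, v, _ in g]
assert sorted(s for *_, s in g0) == sorted(s for *_, s in g)
```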
Funding: This project was funded by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, Saudi Arabia, under Grant No. KEP-81-130-42. The authors therefore acknowledge with thanks DSR technical and financial support.
Abstract: Security is a vital parameter for conserving energy in wireless sensor networks (WSNs). Trust management in a WSN is a crucial process, as trust is utilized when collaboration is important for accomplishing trustworthy data transmission; however, available routing techniques do not incorporate security into their design. This study develops a novel statistical analysis with dingo optimizer enabled reliable routing scheme (SADO-RRS) for WSNs. The proposed SADO-RRS technique aims to detect the existence of attacks and find optimal routes in a WSN. The technique derives a new statistics-based linear discriminant analysis (LDA) for attack detection. Moreover, a trust-based dingo optimizer (TBDO) algorithm is applied for optimal route selection, accomplishing secure data transmission in the WSN; the TBDO algorithm involves deriving a fitness function over different WSN input variables. To demonstrate the improvements offered by the SADO-RRS technique, a wide range of simulations was carried out, and the outcomes confirmed its enhanced performance.
Abstract: Geomechanical data are never sufficient in quantity, nor adequately precise and accurate, for design purposes in mining and civil engineering. The objective of this paper is to show the variability of rock properties at a sampled point in the roadway's roof, and then how the statistical processing of the available geomechanical data can affect the results of numerical modelling of the roadway's stability. Four cases were applied in the numerical analysis, using the average value (the most common choice in geomechanical data analysis), the average minus the standard deviation, the median, and the average minus the statistical error. The study shows that different treatments of the same geomechanical data set can change the modelling results considerably. The average-minus-standard-deviation case is the most conservative and least risky: it gives a displacement and yielded-element zone four times broader than the average-values scenario, which is the least conservative option. The two other cases need further study; the results obtained from them fall between the most favorable and most adverse values. Taking the average values corrected by the statistical error for the numerical analysis seems to be the best solution; moreover, the confidence level can be adjusted depending on the importance of the structure and the assumed risk level.
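The four data-condensation cases compared in the abstract can be sketched directly. The UCS sample values are hypothetical, and the "statistical error" is taken here as the standard error of the mean scaled by a normal z-score, which is an assumption; the paper may define it differently.

```python
import statistics as st

def design_values(samples, confidence_z=1.96):
    """Four ways of condensing a geomechanical sample set into a single
    design value, mirroring the cases compared in the study."""
    n = len(samples)
    mean = st.mean(samples)
    sd = st.stdev(samples)            # sample standard deviation
    se = sd / n ** 0.5                # standard error of the mean
    return {
        "average": mean,                               # least conservative
        "average_minus_sd": mean - sd,                 # most conservative
        "median": st.median(samples),
        "average_minus_error": mean - confidence_z * se,
    }

# Hypothetical strength measurements (MPa) from the roadway roof
ucs = [42.0, 55.0, 48.0, 39.0, 61.0, 50.0]
for case, value in design_values(ucs).items():
    print(f"{case}: {value:.2f} MPa")
```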
Abstract: The ultimate pit may affect other aspects of the life of a mine, including economic, technical, environmental, and social aspects. What makes the choice even more complex is that there are often many pits that are economically minable, which calls for a heuristic approach to determine which of them is the ultimate pit. This study presents a means of selecting an ultimate pit during the design of the Hebei Limestone mine. During the optimization process, many pit shells were created using Whittle Software, which normally assigns a revenue factor of 1 to the ultimate pit. Unfortunately, the pit shells created did not satisfy the criteria with a revenue factor of 1 based on the parameters used. As a result, statistical analysis was implemented to further understand the relationships, variability, and correlations among the pit shells created. Correlation analysis, K-means++ clustering, principal component analysis, and generalized linear models were used to analyze the pit shells, and the results reveal a salient relationship among the optimization variables. In addition, the proposed method was tested on Whittle sample projects that do satisfy ultimate pit selection with a revenue factor of 1. The results show that the proposed model produced almost the same results as the Whittle model with a revenue factor of 1, and that it was also able to determine the ultimate pit in cases that did not satisfy the Whittle selection criteria.
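Of the methods listed, K-means++ is the most algorithmic; its seeding step can be sketched as below. The pit-shell "feature vectors" are invented for illustration and are not the study's data.

```python
import random

def kmeans_pp_seeds(points, k, seed=0):
    """k-means++ seeding: choose each new centre with probability
    proportional to its squared distance from the nearest centre
    already chosen."""
    rng = random.Random(seed)
    centres = [list(rng.choice(points))]
    while len(centres) < k:
        # Squared distance of every point to its nearest current centre
        d2 = [min(sum((p - c) ** 2 for p, c in zip(pt, ctr))
                  for ctr in centres) for pt in points]
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for pt, w in zip(points, d2):
            acc += w
            if acc >= r:
                centres.append(list(pt))
                break
    return centres

# Hypothetical pit-shell features: (tonnage index, revenue factor)
shells = [(1.0, 0.6), (1.1, 0.62), (5.0, 0.95), (5.2, 1.0), (9.0, 1.3)]
seeds = kmeans_pp_seeds(shells, k=2)
assert len(seeds) == 2
```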
Abstract: The Qasab basin is one of the most promising areas for sustainable development on the Eastern Desert fringes of the Nile Valley, Egypt. The integration of statistical analysis, stable isotopes, and geochemical modeling tools delineated the geochemical processes affecting groundwater quality and detected the main recharge source in the Qasab basin. Most of the groundwater samples are brackish (88%), while a minority (12%) are fresh. The electrical conductivity of the groundwater ranged from 1135 to 10,030 μS/cm. Statistical analysis and hydrochemical diagrams suggest that the groundwater quality is mainly controlled by several intermixed processes (rock weathering and agricultural activities). The mineralization of the Pleistocene groundwater is regulated by rock weathering, evaporation processes, and reverse cation exchange. The isotopic signatures (δ<sup>2</sup>H and δ<sup>18</sup>O) represent two groundwater groups. The first group is enriched in δ<sup>18</sup>O, which ranges from 0.9‰ to 5.5‰; this group is mostly affected by recent meteoric recharge from surface-water leakage. The second group is relatively depleted in δ<sup>18</sup>O, reflecting a palaeo-recharge source under a colder climate; its δ<sup>18</sup>O varies from -10.1‰ to -6.4‰, indicating upward leakage from the Nubian sandstone aquifer through deep-seated faults. The inverse geochemical model shows that the salinity of the groundwater samples is due to leaching and dissolution of carbonate, sulphate, and chloride minerals from the aquifer matrix.
This study demonstrates a hydrochemical assessment approach to support sustainable development in the Qasab basin, so that sound groundwater management can help reduce poverty and support socioeconomic development.
Funding: The authors thank the National Key Research and Development Program of China (2018YFC1705900), the National Natural Science Foundation of China (No. 81903706), the Distinguished Professor program of Liaoning Province (XLYC2002008), and the Science Foundation of the Department of Education of Liaoning Province (LZ2020054) for financial support.
Abstract: Jinhongtang is a traditional Chinese medicine formula composed of Rheum palmatum L. stem, Sargentodoxa cuneata stem, and Taraxacum mongolicum, and is used for the treatment of sepsis. However, no quality assessment method for Jinhongtang has been available. In the present study, we developed a UFLC-MS/MS method to determine 16 analytes in 20 batches of home-made and commercial Jinhongtang. Multivariate statistical analysis revealed significant differences in quality between home-made and commercial Jinhongtang, with the variation among home-made samples being the more pronounced. The integrated strategy based on UFLC-MS/MS and multivariate statistical analysis provides a new basis for the overall quality assessment of Jinhongtang.
Abstract: Biology is a challenging and complicated mess, and making sense of this complexity is the realm of the biological sciences: discovering patterns in massive, messy data and revealing the general rules underlying them. Among the most powerful mathematical tools for organizing and structuring complex, heterogeneous, and noisy data are those of multivariate statistical analysis (MSA). These eigenvector/eigenvalue data-compression approaches were first introduced to electron microscopy (EM) in 1980 to help sort out different views of macromolecules in a micrograph. After 35 years of continuous use and development, new MSA applications are still being proposed regularly. The speed of computing has increased dramatically in the decades since their first use in EM, but we have also seen a possibly even more rapid increase in the size and complexity of the EM data sets to be studied, and MSA computations had thus become a serious bottleneck limiting their general use. The parallelization of our programs, speeding up the processing by orders of magnitude, has opened whole new avenues of research; the automatic classification in the compressed eigenvector space had likewise become a bottleneck that needed to be removed. In this paper we explain the basic principles of multivariate statistical eigenvector/eigenvalue data compression, provide practical tips and application examples for those working in structural biology, and provide the more experienced researcher in this and other fields with the formulas associated with these powerful MSA approaches.
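The core idea of eigenvector/eigenvalue data compression can be illustrated with a minimal principal-component sketch: mean-centred image vectors are projected onto the top eigenvectors of their covariance matrix (computed here via SVD for numerical stability). The toy data are synthetic, not EM images, and this sketch omits the normalizations real MSA implementations use.

```python
import numpy as np

def msa_compress(images, n_components):
    """Project mean-centred image vectors onto the top covariance
    eigenvectors, returning compressed coordinates plus the basis."""
    X = np.asarray(images, dtype=float)    # rows = flattened images
    mean = X.mean(axis=0)
    Xc = X - mean
    # Right singular vectors of Xc are the covariance eigenvectors
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:n_components]              # top eigenvectors
    coords = Xc @ basis.T                  # compressed coordinates
    return coords, basis, mean

def msa_reconstruct(coords, basis, mean):
    return coords @ basis + mean

# Toy data: 6 "images" of 4 pixels lying in a 2-D subspace
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2)) @ rng.normal(size=(2, 4))
coords, basis, mean = msa_compress(X, n_components=2)
X_hat = msa_reconstruct(coords, basis, mean)
print(np.allclose(X, X_hat))  # rank-2 data is recovered exactly -> True
```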
Abstract: Chinese traditional shadow play was inscribed on the List of Intangible Cultural Heritage in 2011. Yet, despite embodying rich national cultural values, this traditional art form is degenerating and fading from view. As one of the earliest statistical analysis software packages, the Statistical Package for the Social Sciences (SPSS) offers comprehensive tools for analyzing and managing statistical data. This study explores the application of SPSS in minimizing the workload of researchers while raising the validity of the data, in support of the analysis of survey data reflecting the inheritance and development of Chinese traditional shadow play in schools.
Abstract: Social computing and online groups have ushered in a new age of the network, in which information, networking, and communication technologies enable systematized human effort in fundamentally innovative ways. The social network communities working in the various social network domains face different hurdles, including many new research questions and challenges in social computing. Researchers should try to expand the scope and establish new ideas and methods, drawing even on other disciplines, to address these challenges. The topic has diverse academic associations, social links, and technical characteristics, and thus offers an excellent opportunity for researchers to identify issues in social computing and provide innovative solutions for conveying information between social online groups over computer networks. In this paper we investigate various issues in social media, such as users' privacy and security, network reliability, the availability of desired data, users' awareness of social networks, and problems faced in academic domains. A huge number of users rely on social networks to retrieve and disseminate their real-time and offline information to various places; this information may be transmitted over local or global networks. Users' main concerns on social media are secure and fast communication channels, and both Facebook and YouTube claim efficient security mechanisms and fast communication channels for multimedia data. For this research, a survey was conducted in the most populated cities, where large numbers of Facebook and YouTube users were found.
During the survey, several regular users pointed to certain potential issues continually occurring in these social websites' interfaces, such as unwanted advertisements, fake IDs, uncensored videos, and unknown friend requests, which cause poor channel communication speed, poor uploading and downloading speeds, channel interference, and problems with data security, user privacy, and the integrity and reliability of user communication on these social sites. The major issues faced by active users of Facebook and YouTube are highlighted in this research.
Abstract: This paper analyzes the value of applying big-data statistical analysis methods to economic management from both macro and micro perspectives, and examines their specific application in three areas: economic trend analysis, industrial operations, and marketing strategy.
Abstract: Since Ko Kung-chen published his History of Chinese Journalism with the Shanghai Commercial Press in 1927, the work has been issued by different publishing houses and reprinted many times, amounting to 18 editions. The work contains many errors of historical narrative and fact concerning the alien (foreign) press in China, and to date these errors have not been revised. This essay aims to systematically revise the remaining errors concerning the alien press in History of Chinese Journalism, beyond those already corrected by other works or essays, and to carry out a statistical analysis of the errors in order to probe a new approach to the study of alien periodicals and the alien press in China.
Abstract: This study analyzed rainfall variability in the Southeast region of Nigeria using graphical models, together with a statistical approach to investigate any significant relationship between the global North Atlantic Oscillation (NAO) index and regional rainfall variability. The study was conducted in three states of Southeastern Nigeria, namely Abia, Ebonyi, and Imo States, which lie between latitudes 4°40′ and 8°50′ N and longitudes 6°20′ and 8°50′ E. Data for the study comprised 30 years (1988-2017) of archival monthly rainfall time series for the three study states, acquired from the Nigerian Meteorological Agency (NIMET) offices in the states, and standardized NAO index values for the same period, collected from the NOAA Data Center, USA. In the analyses, graphs were first used to illustrate mean annual rainfall values over the thirty years, and the coefficient of variability was employed to evaluate the degree of variability of values about the mean. Correlation models were then used to ascertain any relationship between the NAO index and rainfall in Southeast Nigeria. The results showed significant variability of rainfall in the region from January to December (monthly means) within the study period. A negative correlation of -0.7525 was obtained, showing that the global NAO index and rainfall variability deviate in opposite directions. The coefficient of multiple determination (CMD) subsequently showed a value of 0.031%, being the variation in rainfall attributable to the global teleconnection, which means that the NAO index has essentially no influence on rainfall variability in the Southeast region of Nigeria.
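The correlation-plus-determination-coefficient workflow described above can be sketched in a few lines; the NAO index and rainfall anomaly values below are hypothetical, not the study's data.

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical standardized NAO index vs. annual rainfall anomaly (mm)
nao = [0.8, -0.3, 1.1, -0.9, 0.2, -1.2]
rain = [-120.0, 40.0, -150.0, 130.0, -20.0, 160.0]
r = pearson(nao, rain)          # negative: series deviate in opposite directions
r_squared = r * r               # share of rainfall variance "explained"
print(round(r, 3), round(r_squared, 3))
```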
Abstract: This study comprises a climatology of the spatial variability of precipitation over the São Francisco River Basin (SFRB), an area characterized by its geographic heterogeneity. The different rainfall regimes of the region were examined through statistical and spectral analyses, using measured precipitation data, Pacific Decadal indexes, ENSO, the Atlantic Multidecadal Oscillation, the North Atlantic Oscillation, the Atlantic dipole, and the sunspot cycle over 65 years. The rainfall data were filtered and gaps were filled using the regional weighting method. The spatial and temporal variability of precipitation along the SFRB is remarkable, and a pattern was observed along the precipitation time series. Cluster analysis identified four homogeneous regions in the SFRB, explaining 87.4% of the total variance of the average monthly rainfall of the 199 rain gauges, and cross-wavelet analysis identified the relationships between the precipitation series and the climatic indexes analyzed in this work.
Abstract: Overlapping latent fingermarks constitute a serious challenge to database-related recognition and matching algorithms in biometry, forensics, and crime scene investigation. Mass spectrometry imaging (MSI) is a powerful tool for deciphering and analyzing overlapping fingermarks based on the individual chemical information of each deposit. In practice, fingermark MSI still requires the subjective judgment of an MSI expert; rapid analysis, automation, standardization, and quantitative evaluation of the complete detection and separation process of overlapped fingermarks from MSI data sets is the ultimate goal, and will be necessary for the method to become an accepted process in criminal investigations and law enforcement. Here we investigated the feasibility and efficiency of different statistical approaches to the separation of overlapped latent fingermarks based on MSI data. Entropy analysis of the generated m/z-images was used to evaluate the results obtained from the statistical analysis. Furthermore, we demonstrate and discuss the possibility of reconstituting and separating overlapping fingermarks by discrete scanning at selected x,y-positions defined from a previous image analysis, using a simpler scheme based on visible, and therefore optically distinguishable, overlapped ink-based fingermarks. The overlapped latent fingermarks were developed by rapid gold sputter coating and analyzed by laser-based MSI, without (organic) matrix preparation. Latent fingermarks can be transferred from the substrate/surface to, and conserved on, a gold-sputtered soft membrane at low temperatures.
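The entropy evaluation mentioned in the abstract can be sketched as the Shannon entropy of an image's intensity histogram. Treating lower entropy as a proxy for a cleaner, better-separated m/z-image is an assumption for this sketch, and the 4x4 intensity patches are synthetic.

```python
import math

def shannon_entropy(image, bins=16):
    """Shannon entropy (bits) of the intensity histogram of a 2-D image."""
    flat = [p for row in image for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:
        return 0.0                       # constant image carries no information
    hist = [0] * bins
    for p in flat:
        idx = min(int((p - lo) / (hi - lo) * bins), bins - 1)
        hist[idx] += 1
    n = len(flat)
    return -sum(c / n * math.log2(c / n) for c in hist if c)

# Synthetic 4x4 intensity patches
uniform_noise = [[i * 7 % 16 for i in range(r, r + 4)] for r in range(0, 16, 4)]
flat_patch = [[5, 5, 5, 5]] * 4
print(shannon_entropy(flat_patch))       # constant image -> 0.0
assert shannon_entropy(uniform_noise) > shannon_entropy(flat_patch)
```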