Journal Articles
164,651 articles found
1. Use of machine learning models for the prognostication of liver transplantation: A systematic review (Cited: 1)
Authors: Gidion Chongo, Jonathan Soldera. World Journal of Transplantation, 2024, No. 1, pp. 164-188 (25 pages)
BACKGROUND: Liver transplantation (LT) is a life-saving intervention for patients with end-stage liver disease. However, the equitable allocation of scarce donor organs remains a formidable challenge. Prognostic tools are pivotal in identifying the most suitable transplant candidates. Traditionally, scoring systems like the model for end-stage liver disease have been instrumental in this process. Nevertheless, the landscape of prognostication is undergoing a transformation with the integration of machine learning (ML) and artificial intelligence models. AIM: To assess the utility of ML models in prognostication for LT, comparing their performance and reliability to established traditional scoring systems. METHODS: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a thorough and standardized literature search using the PubMed/MEDLINE database. Our search imposed no restrictions on publication year, age, or gender. Exclusion criteria encompassed non-English studies, review articles, case reports, conference papers, studies with missing data, or those exhibiting evident methodological flaws. RESULTS: Our search yielded a total of 64 articles, with 23 meeting the inclusion criteria. Among the selected studies, 60.8% originated from the United States and China combined. Only one pediatric study met the criteria. Notably, 91% of the studies were published within the past five years. ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values (ranging from 0.6 to 1) across all studies, surpassing the performance of traditional scoring systems. Random forest exhibited superior predictive capabilities for 90-d mortality following LT, sepsis, and acute kidney injury (AKI). In contrast, gradient boosting excelled in predicting the risk of graft-versus-host disease, pneumonia, and AKI. CONCLUSION: This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT, marking a significant evolution in the field of prognostication.
Keywords: Liver transplantation; Machine learning models; Prognostication; Allograft allocation; Artificial intelligence
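The review's central comparison, tree ensembles versus scalar clinical scores on AUROC, can be sketched on synthetic data. The features, the outcome, and the "MELD-like" single-feature score below are all invented for illustration and do not come from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical pre-transplant features (e.g. labs, age), purely synthetic
X = rng.normal(size=(n, 4))
# Synthetic 90-day mortality driven by a nonlinear mix of the features
logit = X[:, 0] + 0.5 * X[:, 1] ** 2 - X[:, 2] * X[:, 3]
y = (logit + rng.normal(scale=1.0, size=n) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# A MELD-like scalar score uses a fixed combination of labs; here we mimic
# that crudely with the first feature alone as the risk score.
auc_rf = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
auc_score = roc_auc_score(y_te, X_te[:, 0])
print(f"RF AUROC: {auc_rf:.2f}, scalar-score AUROC: {auc_score:.2f}")
```

Because the synthetic outcome contains a squared term and an interaction, the forest can exploit structure the single linear score cannot, mirroring the AUROC gap the review reports.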
2. Machine learning for predicting the outcome of terminal ballistics events
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh. Defence Technology (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 14-26 (13 pages)
Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1,000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than that achievable from any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: Machine learning; Artificial intelligence; Physics-informed machine learning; Terminal ballistics; Armour
3. Advancements in machine learning for material design and process optimization in the field of additive manufacturing
Authors: Hao-ran Zhou, Hao Yang, Huai-qian Li, Ying-chun Ma, Sen Yu, Jian Shi, Jing-chang Cheng, Peng Gao, Bo Yu, Zhi-quan Miao, Yan-peng Wei. China Foundry (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 101-115 (15 pages)
Additive manufacturing technology is highly regarded due to its advantages, such as high precision and the ability to address complex geometric challenges. However, the development of additive manufacturing processes is constrained by issues like unclear fundamental principles, complex experimental cycles, and high costs. Machine learning, as a novel artificial intelligence technology, has the potential to engage deeply in the development of additive manufacturing processes, assisting engineers in learning and developing new techniques. This paper provides a comprehensive overview of the research and applications of machine learning in the field of additive manufacturing, particularly in model design and process development. It first introduces the background and significance of machine learning-assisted design in additive manufacturing. It then delves into the application of machine learning in additive manufacturing, focusing on model design and process guidance. Finally, it summarizes and forecasts the development trends of machine learning technology in the field of additive manufacturing.
Keywords: Additive manufacturing; Machine learning; Material design; Process optimization; Intersection of disciplines; Embedded machine learning
4. Routine utilization of machine perfusion in liver transplantation: Ready for prime time?
Authors: Alessandro Parente, Keyue Sun, Philipp Dutkowski, AM James Shapiro, Andrea Schlegel. World Journal of Gastroenterology (SCIE, CAS), 2024, No. 11, pp. 1488-1493 (6 pages)
The last decade has been notable for increasing high-quality research and dramatic improvement in outcomes with dynamic liver preservation. Robust evidence from numerous randomized controlled trials has been pooled by meta-analyses, providing the highest available evidence on the protective effect of machine perfusion (MP) over static cold storage in liver transplantation (LT). Based on a protective effect with fewer complications and improved graft survival, the field has seen a paradigm shift in organ preservation. This editorial focuses on the role of MP in LT and how it could become the new "gold standard". Strong collaborative efforts are needed to explore its effects on long-term outcomes.
Keywords: Liver transplantation; Machine perfusion; Viability assessment; Hypothermic oxygenated perfusion; Normothermic machine perfusion
5. Machine learning model based on non-convex penalized huberized-SVM
Authors: Peng Wang, Ji Guo, Lin-Feng Li. Journal of Electronic Science and Technology (EI, CAS, CSCD), 2024, No. 1, pp. 81-94 (14 pages)
The support vector machine (SVM) is a classical machine learning method. Both the hinge loss and the least absolute shrinkage and selection operator (LASSO) penalty are usually used in traditional SVMs. However, the hinge loss is not differentiable, and the LASSO penalty does not have the oracle property. In this paper, the huberized loss is combined with non-convex penalties to obtain a model that has the advantages of both computational simplicity and the oracle property, contributing to higher accuracy than traditional SVMs. It is experimentally demonstrated that the two non-convex huberized-SVM methods, smoothly clipped absolute deviation huberized-SVM (SCAD-HSVM) and minimax concave penalty huberized-SVM (MCP-HSVM), outperform the traditional SVM method in terms of prediction accuracy and classifier performance. They are also superior in terms of variable selection, especially when there is a high linear correlation between the variables. When applied to the prediction of listed companies, the variables that can affect and predict financial distress are accurately filtered out. Among all the indicators, the per-share indicators have the greatest influence while the solvency indicators have the weakest. Listed companies can assess their financial situation with the indicators screened by our algorithm and issue early warnings of possible financial distress with higher precision.
Keywords: Huberized loss; Machine learning; Non-convex penalties; Support vector machine (SVM)
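The differentiability point the abstract makes can be seen directly from the huberized hinge loss itself: it replaces the hinge's kink at margin 1 with a quadratic segment of width delta, so gradients exist everywhere. A minimal numpy sketch of one common form of this loss follows (the delta value and sample margins are illustrative; the SCAD/MCP penalties the paper pairs with it are not shown).

```python
import numpy as np

def huberized_hinge(margin, delta=2.0):
    """Huberized (smoothed) hinge loss of the margin t = y * f(x).

    Zero for t > 1, quadratic on (1 - delta, 1], linear below:
    differentiable everywhere, unlike the plain hinge max(0, 1 - t).
    """
    t = np.asarray(margin, dtype=float)
    return np.where(
        t > 1, 0.0,
        np.where(t > 1 - delta,
                 (1 - t) ** 2 / (2 * delta),   # quadratic smoothing zone
                 1 - t - delta / 2),           # linear tail, hinge-like
    )

# Margins on the correct side, at the boundary, inside the quadratic
# zone, and deep on the wrong side:
print(huberized_hinge([2.0, 1.0, 0.0, -2.0], delta=2.0))
```

At the quadratic/linear junction t = 1 - delta both pieces and their first derivatives agree, which is exactly what restores differentiability.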
6. Prediction of lime utilization ratio of dephosphorization in BOF steelmaking based on online sequential extreme learning machine with forgetting mechanism
Authors: Runhao Zhang, Jian Yang, Han Sun, Wenkui Yang. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 508-517 (10 pages)
The machine learning models of multiple linear regression (MLR), support vector regression (SVR), and extreme learning machine (ELM), together with the proposed ELM models of online sequential ELM (OS-ELM) and OS-ELM with forgetting mechanism (FOS-ELM), are applied to the prediction of the lime utilization ratio of dephosphorization in the basic oxygen furnace steelmaking process. The ELM model exhibits the best performance compared with the MLR and SVR models. OS-ELM and FOS-ELM are applied for sequential learning and model updating. The optimal number of samples in the validity term of the FOS-ELM model is determined to be 1500, with the smallest population mean absolute relative error (MARE) value of 0.058226. The variable importance analysis reveals lime weight, initial P content, and hot metal weight as the most important variables for the lime utilization ratio. The lime utilization ratio increases with decreasing lime weight and with increasing initial P content and hot metal weight. A prediction system based on FOS-ELM was applied in actual industrial production for one month. The hit ratios of the predicted lime utilization ratio within the error ranges of ±1%, ±3%, and ±5% are 61.16%, 90.63%, and 94.11%, respectively. The coefficient of determination, MARE, and root mean square error are 0.8670, 0.06823, and 1.4265, respectively. The system exhibits desirable performance for applications in actual industrial production.
Keywords: Basic oxygen furnace steelmaking; Machine learning; Lime utilization ratio; Dephosphorization; Online sequential extreme learning machine; Forgetting mechanism
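The base learner underlying all the ELM variants above is simple enough to sketch in a few lines: a random, untrained hidden layer followed by a least-squares solve for the output weights. The toy regression target below is an assumption for demonstration; OS-ELM and the forgetting mechanism (recursive updates over arriving sample chunks, discarding chunks older than the validity term) are described in comments only.

```python
import numpy as np

rng = np.random.default_rng(2)

def elm_fit(X, y, n_hidden=50):
    """Basic extreme learning machine: random hidden layer, least-squares output.

    OS-ELM extends this by updating beta recursively as data chunks arrive;
    FOS-ELM additionally forgets chunks older than a fixed validity term.
    """
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the output weights are fitted
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy 1-D regression: the random-feature model should fit a smooth curve closely
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
model = elm_fit(X, y, n_hidden=50)
err = np.max(np.abs(elm_predict(model, X) - y))
print(err)
```

Because only the linear output layer is solved, training is a single least-squares problem, which is why sequential (chunk-by-chunk) update formulas for beta are cheap enough for online use in a steel plant.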
7. Comparative study of different machine learning models in landslide susceptibility assessment: A case study of Conghua District, Guangzhou, China
Authors: Ao Zhang, Xin-wen Zhao, Xing-yuezi Zhao, Xiao-zhan Zheng, Min Zeng, Xuan Huang, Pan Wu, Tuo Jiang, Shi-chang Wang, Jun He, Yi-yong Li. China Geology (CAS, CSCD), 2024, No. 1, pp. 104-115 (12 pages)
Machine learning is currently one of the research hotspots in the field of landslide prediction. To clarify and evaluate the differences in characteristics and prediction performance of different machine learning models, Conghua District, the district most prone to landslide disasters in Guangzhou, was selected for landslide susceptibility evaluation. The evaluation factors were selected using correlation analysis and the variance expansion factor method. Landslide models were constructed by applying four machine learning methods, namely Logistic Regression (LR), Random Forest (RF), Support Vector Machines (SVM), and Extreme Gradient Boosting (XGB). Comparative analysis and evaluation of the models were conducted through statistical indices and receiver operating characteristic (ROC) curves. The results showed that the LR, RF, SVM, and XGB models all have good predictive performance for landslide susceptibility, with area under the curve (AUC) values of 0.752, 0.965, 0.996, and 0.998, respectively. The XGB model had the highest predictive ability, followed by the RF, SVM, and LR models. The frequency ratio (FR) accuracy of the LR, RF, SVM, and XGB models was 0.775, 0.842, 0.759, and 0.822, respectively. The RF and XGB models were superior to the LR and SVM models, indicating that ensemble algorithms have better predictive ability than a single classification algorithm in regional landslide classification problems.
Keywords: Landslide susceptibility assessment; Machine learning; Logistic Regression; Random Forest; Support Vector Machines; XGBoost; Assessment model; Geological disaster investigation and prevention engineering
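The multi-model AUC comparison described above follows a standard scikit-learn pattern, sketched here on synthetic stand-ins for the evaluation factors (elevation, slope, rainfall, and so on); XGBoost is omitted to keep the sketch dependency-light, and the data are not the paper's.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for landslide / non-landslide slope units with
# eight hypothetical evaluation factors.
X, y = make_classification(n_samples=1500, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "SVM": SVC(probability=True, random_state=0),
}
aucs = {}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    # ROC AUC on held-out units, as in the paper's model comparison
    aucs[name] = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(aucs)
```

The same loop extends to any classifier exposing `predict_proba`, which is how a fourth (boosted) model would slot in.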
8. Machine learning for membrane design and discovery
Authors: Haoyu Yin, Muzi Xu, Zhiyao Luo, Xiaotian Bi, Jiali Li, Sui Zhang, Xiaonan Wang. Green Energy & Environment (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 54-70 (17 pages)
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norms of new materials for energy and the environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, with the current bottlenecks and potential solutions. Through an applications-based perspective on AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, spanning gas, liquid, and fuel cell separation membranes. Furthermore, best practices for integrating ML methods with specific application targets in membrane design and discovery are presented, with an ideal paradigm proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are also highlighted at the end.
Keywords: Machine learning; Membranes; AI for membrane; Data-driven design
9. Enhanced prediction of anisotropic deformation behavior using machine learning with data augmentation
Authors: Sujeong Byun, Jinyeong Yu, Seho Cheon, Seong Ho Lee, Sung Hyuk Park, Taekyung Lee. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 186-196 (11 pages)
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
Keywords: Plastic anisotropy; Compression; Annealing; Machine learning; Data augmentation
10. Thermal conductivity of GeTe crystals based on machine learning potentials
Authors: 张健, 张昊春, 李伟峰, 张刚. Chinese Physics B (SCIE, EI, CAS, CSCD), 2024, No. 4, pp. 104-107 (4 pages)
GeTe has attracted extensive research interest for thermoelectric applications. In this paper, we first train a neuroevolution potential (NEP) based on a dataset constructed by ab initio molecular dynamics, with the Gaussian approximation potential (GAP) as a reference. The phonon density of states is then calculated with the two machine learning potentials and compared with density functional theory results, with the GAP potential having higher accuracy. Next, the thermal conductivity of a GeTe crystal at 300 K is calculated by the equilibrium molecular dynamics method using both machine learning potentials, and both are in good agreement with the experimental results; however, the calculation with the NEP potential is about 500 times faster than with the GAP potential. Finally, the lattice thermal conductivity in the range of 300-600 K is calculated using the NEP potential. The lattice thermal conductivity decreases as the temperature increases due to the phonon anharmonic effect. This study provides a theoretical tool for the study of the thermal conductivity of GeTe.
Keywords: Machine learning potentials; Thermal conductivity; Molecular dynamics
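The equilibrium molecular dynamics route mentioned above typically means Green-Kubo: the thermal conductivity is proportional to the time integral of the heat-flux autocorrelation function. The numerical step can be sketched with a synthetic exponentially decaying autocorrelation (all numbers, units, and the single-exponential form are assumptions for illustration; prefactor conventions vary between MD codes).

```python
import numpy as np

# Green-Kubo (one Cartesian component, schematic):
#   kappa ~ V / (kB * T**2) * integral_0^inf <J(0) J(t)> dt
kB = 1.380649e-23              # Boltzmann constant, J/K
T, V = 300.0, 1e-26            # temperature (K) and cell volume (m^3), hypothetical
tau = 2e-12                    # 2 ps correlation time, synthetic
C0 = 1e-12                     # <J(0)^2> amplitude, synthetic units
t = np.linspace(0, 20e-12, 2001)
acf = C0 * np.exp(-t / tau)    # model heat-flux autocorrelation function

# Trapezoidal integration of the ACF; analytically this tends to C0 * tau
integral = np.sum((acf[1:] + acf[:-1]) / 2 * np.diff(t))
kappa = V / (kB * T ** 2) * integral
print(kappa)
```

In a real workflow the ACF comes from the MD heat-flux time series under the trained potential, and the integration window must be checked for convergence before reading off kappa.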
11. Reconstruction of poloidal magnetic field profiles in field-reversed configurations with machine learning in laser-driven ion-beam trace probe
Authors: 徐栩涛, 徐田超, 肖池阶, 张祖煜, 何任川, 袁瑞鑫, 许平. Plasma Science and Technology (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 83-87 (5 pages)
The diagnosis of the poloidal magnetic field (B_p) in the field-reversed configuration (FRC), which is promising for achieving efficient plasma confinement due to its high β, is a huge challenge because B_p is small and reverses around the core region. The laser-driven ion-beam trace probe (LITP) has recently been proven able to diagnose the B_p profile in FRCs, whereas the existing iterative reconstruction approach cannot handle measurement errors well. In this work, a machine learning approach, a fast-growing and powerful technology in automation and control, is applied to B_p reconstruction in FRCs based on LITP principles, and it performs better than the previous approach. The machine learning approach achieves a more accurate reconstruction of the B_p profile when 20% detector errors are considered, 15% B_p fluctuation is introduced, and the size of the detector is remarkably reduced. Therefore, machine learning could be a powerful support for LITP diagnosis of the magnetic field in magnetic confinement fusion devices.
Keywords: FRC; LITP; Poloidal magnetic field diagnostics; Machine learning
12. Machine learning in metal-ion battery research: Advancing material prediction, characterization, and status evaluation
Authors: Tong Yu, Chunyang Wang, Huicong Yang, Feng Li. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 191-204, I0006 (15 pages)
Metal-ion batteries (MIBs), including alkali metal-ion (Li+, Na+, and K+), multivalent metal-ion (Zn2+, Mg2+, and Al3+), metal-air, and metal-sulfur batteries, play an indispensable role in electrochemical energy storage. However, the performance of MIBs is significantly influenced by numerous variables, resulting in multi-dimensional and long-term challenges in the field of battery research and performance enhancement. Machine learning (ML), with its capability to solve intricate tasks and perform robust data processing, is now catalyzing a revolutionary transformation in the development of MIB materials and devices. In this review, we summarize the ML algorithms that have expedited research on MIBs over the past five years. We present an extensive overview of existing algorithms, elucidating their details, advantages, and limitations in various applications, which encompass electrode screening, material property prediction, electrolyte formulation design, electrode material characterization, manufacturing parameter optimization, and real-time battery status monitoring. Finally, we propose potential solutions and future directions for the application of ML in advancing MIB development.
Keywords: Metal-ion battery; Machine learning; Electrode materials; Characterization; Status evaluation
13. Assessment of compressive strength of jet grouting by machine learning
Authors: Esteban Diaz, Edgar Leonardo Salamanca-Medina, Roberto Tomas. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024, No. 1, pp. 102-111 (10 pages)
Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to economic cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary and unjustified strength, which is sometimes even subject to the results of test fields. The present paper presents an approach for prediction of the uniaxial compressive strength (UCS) of jet grouting columns based on the analysis of several machine learning algorithms on a database of 854 results mainly collected from different research papers. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the value of the compressive strength. Despite the complex mechanism that surrounds the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model enables optimal prediction of compressive strength values, with a significant improvement with respect to existing works. Consequently, this work proposes for the first time a reliable and easily applicable approach for estimating the compressive strength of jet grouting columns.
Keywords: Jet grouting; Ground improvement; Compressive strength; Machine learning
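The selected model, extremely randomized trees, is available off the shelf in scikit-learn as `ExtraTreesRegressor`; the sketch below mimics the paper's setup shape (854 records, soil type plus technique parameters) with entirely synthetic data and invented variable names.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 854  # matches the paper's database size; the records here are synthetic
# Hypothetical design variables: grout pressure, water/cement ratio,
# monitor withdrawal rate, and a coded soil type.
X = np.column_stack([
    rng.uniform(20, 50, n),      # injection pressure (MPa), assumed range
    rng.uniform(0.6, 1.2, n),    # water/cement ratio, assumed range
    rng.uniform(0.1, 0.6, n),    # withdrawal rate (m/min), assumed range
    rng.integers(0, 3, n),       # soil type code (e.g. sand/silt/clay)
])
# Synthetic UCS: falls with w/c ratio and rate, varies by soil, heavy noise
y = 20 / X[:, 1] - 10 * X[:, 2] + 3 * X[:, 3] + rng.normal(scale=3, size=n)

model = ExtraTreesRegressor(n_estimators=300, random_state=0)
r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
print(f"mean CV R^2: {r2:.2f}")
```

Cross-validated R^2 on noisy, dispersed data is the natural headline metric here, since the paper's whole point is squeezing predictive signal out of a high-scatter database.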
14. Recent advances in protein conformation sampling by combining machine learning with molecular simulation
Authors: 唐一鸣, 杨中元, 姚逸飞, 周运, 谈圆, 王子超, 潘瞳, 熊瑞, 孙俊力, 韦广红. Chinese Physics B (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 80-87 (8 pages)
The rapid advancement and broad application of machine learning (ML) have driven a groundbreaking revolution in computational biology. One of the most cutting-edge and important applications of ML is its integration with molecular simulations to improve the sampling efficiency of the vast conformational space of large biomolecules. This review focuses on recent studies that utilize ML-based techniques in the exploration of the protein conformational landscape. We first highlight the recent development of ML-aided enhanced sampling methods, including heuristic algorithms and neural networks designed to refine the selection of reaction coordinates for the construction of bias potentials, or to facilitate the exploration of unsampled regions of the energy landscape. Further, we review the development of autoencoder-based methods that combine molecular simulations and deep learning to expand the search for protein conformations. Lastly, we discuss cutting-edge methodologies for the one-shot generation of protein conformations with precise Boltzmann weights. Collectively, this review demonstrates the promising potential of machine learning in revolutionizing our insight into the complex conformational ensembles of proteins.
Keywords: Machine learning; Molecular simulation; Protein conformational space; Enhanced sampling
15. Smart Energy Management System Using Machine Learning
Authors: Ali Sheraz Akram, Sagheer Abbas, Muhammad Adnan Khan, Atifa Athar, Taher M. Ghazal, Hussam Al Hamadi. Computers, Materials & Continua (SCIE, EI), 2024, No. 1, pp. 959-973 (15 pages)
Energy management is an inspiring domain in the development of renewable energy sources. However, the growth of decentralized energy production is revealing increased complexity for power grid managers, requiring more quality and reliability to regulate electricity flows and less imbalance between electricity production and demand. The major objectives of an energy management system are to achieve optimum energy procurement and utilization throughout the organization, minimize energy costs without affecting production, and minimize environmental effects. Modern energy management is an essential and complex subject because of the excessive consumption in residential buildings, which necessitates energy optimization and increased user comfort. To address the issue of energy management, many researchers have developed various frameworks; while the objective of each framework was to sustain a balance between user comfort and energy consumption, this problem has not yet been fully solved because of its difficulty. An inclusive and Intelligent Energy Management System (IEMS) aims to provide overall energy efficiency regarding increased power generation, increased flexibility, increased renewable generation, improved energy consumption, reduced carbon dioxide emissions, improved stability, and reduced energy costs. Machine Learning (ML) is an emerging approach that may be beneficial for predicting energy efficiency with the assistance of the Internet of Energy (IoE) network. The IoE network plays a vital role in the energy sector by collecting effective data and usage, resulting in smart resource management. In this research work, an IEMS is proposed for Smart Cities (SC) using an ML technique to better resolve the energy management problem. The proposed system minimized energy consumption with its intelligent nature and provided better outcomes than previous approaches, with 92.11% accuracy and a 7.89% miss rate.
Keywords: Intelligent energy management system; Smart cities; Machine learning
16. Machine Learning-Based Decision-Making Mechanism for Risk Assessment of Cardiovascular Disease
Authors: Cheng Wang, Haoran Zhu, Congjun Rao. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 1, pp. 691-718 (28 pages)
Cardiovascular disease (CVD) has gradually become one of the main causes of harm to the life and health of residents. Exploring the influencing factors and risk assessment methods of CVD has become a general trend. In this paper, a machine learning-based decision-making mechanism for risk assessment of CVD is designed. In this mechanism, the logistic regression analysis method and a factor analysis model are used to select age, obesity degree, blood pressure, blood fat, blood sugar, smoking status, drinking status, and exercise status as the main pathogenic factors of CVD, and an index system of risk assessment for CVD is established. Then, a two-stage model combining K-means cluster analysis and random forest (RF) is proposed to evaluate and predict the risk of CVD, and the predicted results are compared with the methods of Bayesian discrimination, K-means cluster analysis, and RF. The results show that the prediction effect of the proposed two-stage model is better than that of the compared methods. Moreover, several suggestions for the government, the medical industry, and the public are provided based on the research results.
Keywords: CVD; Influencing factors; Risk assessment; Machine learning; Two-stage model
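The abstract does not detail how its two stages compose, so the sketch below shows one plausible wiring: K-means first groups patients into unsupervised risk strata, and the cluster label then joins the features fed to the random forest. The data are synthetic stand-ins for the listed risk factors.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for age, obesity degree, blood pressure, blood fat,
# blood sugar, smoking, drinking, and exercise status.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: unsupervised K-means groups patients into coarse risk strata.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_tr)

# Stage 2: the cluster label is appended as an extra feature for the RF
# (one assumed way of combining the stages; the paper may differ).
X_tr2 = np.column_stack([X_tr, km.predict(X_tr)])
X_te2 = np.column_stack([X_te, km.predict(X_te)])
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr2, y_tr)

acc = rf.score(X_te2, y_te)
print(f"two-stage accuracy: {acc:.2f}")
```

Fitting K-means on the training split only, then applying it to the test split, keeps the evaluation free of leakage, which is the detail most easily gotten wrong in staged pipelines.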
17. AutoRhythmAI: A Hybrid Machine and Deep Learning Approach for Automated Diagnosis of Arrhythmias
Authors: S. Jayanthi, S. Prasanna Devi. Computers, Materials & Continua (SCIE, EI), 2024, No. 2, pp. 2137-2158 (22 pages)
In healthcare, the persistent challenge of arrhythmias, a leading cause of global mortality, has sparked extensive research into the automation of detection using machine learning (ML) algorithms. However, traditional ML and AutoML approaches have revealed their limitations, notably regarding feature generalization and automation efficiency. This glaring research gap has motivated the development of AutoRhythmAI, an innovative solution that integrates both machine and deep learning to revolutionize the diagnosis of arrhythmias. Our approach encompasses two distinct pipelines tailored for binary-class and multi-class arrhythmia detection, effectively bridging the gap between data preprocessing and model selection. To validate our system, we have rigorously tested AutoRhythmAI using a multimodal dataset, surpassing the accuracy achieved using a single dataset and underscoring the robustness of our methodology. In the first pipeline, we employ signal filtering and ML algorithms for preprocessing, followed by data balancing and splitting for training. The second pipeline is dedicated to feature extraction and classification, utilizing deep learning models. Notably, we introduce the 'RRI-convoluted transformer model' as a novel addition for binary-class arrhythmias. An ensemble-based approach then amalgamates all models, considering their respective weights, resulting in an optimal model pipeline. In our study, the VGGRes model achieved impressive results in multi-class arrhythmia detection, with an accuracy of 97.39% and firm performance in precision (82.13%), recall (31.91%), and F1-score (82.61%). In the binary-class task, the proposed model achieved an outstanding accuracy of 96.60%. These results highlight the effectiveness of our approach in improving arrhythmia detection, with notably high accuracy and well-balanced performance metrics.
Keywords: automated machine learning; neural networks; deep learning; arrhythmias
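The abstract above describes an ensemble that amalgamates the individual models "considering their respective weights". A minimal sketch of that idea, assuming weighted averaging of per-model class probabilities (the model names, weights, and probabilities here are illustrative stand-ins, not values from the paper):

```python
import numpy as np

# Hypothetical class-probability outputs of two trained models
# for 4 samples and 2 classes (normal vs. arrhythmia).
probs_model_a = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7]])
probs_model_b = np.array([[0.8, 0.2], [0.4, 0.6], [0.7, 0.3], [0.1, 0.9]])

# Weights could be set proportional to each model's validation accuracy.
weights = np.array([0.6, 0.4])

# Weighted average of probabilities, then argmax for the final class.
ensemble = weights[0] * probs_model_a + weights[1] * probs_model_b
predictions = ensemble.argmax(axis=1)
print(predictions)  # -> [0 1 0 1]
```

The averaging step is where a weaker model (model B) can still flip a borderline decision if its confidence is high enough.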
Social Media-Based Surveillance Systems for Health Informatics Using Machine and Deep Learning Techniques:A Comprehensive Review and Open Challenges
18
Authors: Samina Amin, Muhammad Ali Zeb, Hani Alshahrani, Mohammed Hamdi, Mohammad Alsulami, Asadullah Shaikh. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 5, pp. 1167-1202 (36 pages)
Social media (SM) based surveillance systems, combined with machine learning (ML) and deep learning (DL) techniques, have shown potential for early detection of epidemic outbreaks. This review discusses the current state of SM-based surveillance methods for early epidemic outbreaks and the role of ML and DL in enhancing their performance. Every year, SM generates a large amount of data related to epidemic outbreaks, particularly Twitter data. This paper outlines the theme of SM analysis for tracking health-related issues and detecting epidemic outbreaks in SM, along with the ML and DL techniques that have been configured for this task. DL has emerged as a promising ML technique that adapts multiple layers of representations or features of the data and yields state-of-the-art extrapolation results. In recent years, alongside their success in many other application domains, both ML and DL have become popular in SM analysis. This paper provides an overview of epidemic outbreaks in SM and a comprehensive analysis of ML and DL approaches and their existing applications in SM analysis. Finally, this review offers suggestions, ideas, and proposals, and highlights the ongoing challenges in early outbreak detection that still need to be addressed.
Keywords: social media; epidemic; machine learning; deep learning; health informatics; pandemic
Machine learning-enhanced Monte Carlo and subset simulations for advanced risk assessment in transportation infrastructure
19
Authors: Furquan AHMAD, Pijush SAMUI, S. S. MISHRA. Journal of Mountain Science (SCIE, CSCD), 2024, No. 2, pp. 690-717 (28 pages)
The maintenance of safety and dependability in rail and road embankments is of utmost importance for the smooth operation of transportation networks. This study introduces a comprehensive methodology for soil slope stability evaluation, employing Monte Carlo Simulation (MCS) and Subset Simulation (SS) with the "UPSS 3.0 Add-in" in MS Excel. Focused on an 11.693-meter embankment with a soil slope (inclination ratio of 2H:1V), the investigation considers earthquake coefficients (kh) and pore water pressure ratios (ru) following Indian zoning requirements. The probability of slope failure increased considerably as the coefficient of variation (COV), seismic coefficients (kh), and pore water pressure ratios (ru) escalated. The SS approach proved exceptionally effective at estimating notably low probabilities of failure. Within computational modeling, the study optimized the worst-case scenario using ANFIS-GA, ANFIS-GWO, ANFIS-PSO, and ANFIS-BBO models. The ANFIS-PSO model exhibits exceptional accuracy (training R2 = 0.9011, RMSE = 0.0549; testing R2 = 0.8968, RMSE = 0.0615), emerging as the most promising. This study highlights the significance of conducting thorough risk assessments and offers practical insights into evaluating and improving the stability of soil slopes in transportation infrastructure. These findings contribute to enhanced safety and reliability in real-world situations.
Keywords: Monte Carlo simulation; subset simulation; machine learning; seismic coefficient
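The core of the Monte Carlo step described above is estimating a probability of failure by sampling uncertain soil parameters and counting realizations where the factor of safety drops below 1. A minimal sketch under assumed distributions (the lognormal strength and normal load here are illustrative, not the paper's embankment model or UPSS workflow):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of Monte Carlo realizations

# Illustrative limit state: factor of safety = resistance / load.
# Soil resistance modeled as lognormal (median 1.5, assumed COV),
# seismic/pore-pressure demand lumped into a normal load term.
resistance = rng.lognormal(mean=np.log(1.5), sigma=0.2, size=n)
load = rng.normal(loc=1.0, scale=0.1, size=n)

fs = resistance / load
p_failure = np.mean(fs < 1.0)  # fraction of realizations that fail
print(f"Estimated probability of failure: {p_failure:.4f}")
```

For the very small failure probabilities the abstract attributes to Subset Simulation, plain MCS needs enormous sample counts (roughly 10/p_f samples for a stable estimate), which is exactly the inefficiency SS addresses.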
Prediction of Damping Capacity Demand in Seismic Base Isolators via Machine Learning
20
Authors: Ayla Ocak, Umit Isıkdag, Gebrail Bekdas, Sinan Melih Nigdeli, Sanghun Kim, ZongWoo Geem. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 3, pp. 2899-2924 (26 pages)
Base isolators used in buildings provide both good acceleration reduction and structural vibration control. Base isolators may lose their damping capacity over time due to environmental or dynamic effects; this deterioration requires determining maintenance and repair needs and is important for long-term isolator life. In this study, an artificial intelligence prediction model was developed to determine the damage and maintenance-repair requirements of isolators resulting from environmental effects and dynamic factors over time. With the developed model, the required damping capacity of the isolator was estimated and compared with the capacity of the previously installed isolator, to detect any decrease in damping. For this purpose, a dataset was created by collecting the responses of base-isolated single-degree-of-freedom (SDOF) structures with different stiffnesses, damping ratios, and natural periods under far-fault earthquakes. The data are divided into 5 damping classes ranging from 10% to 50%. A machine learning model was trained on these damping classes using the structures' responses to random seismic vibrations. From the isolator behavior under randomly selected earthquakes, the recorded motion and structural acceleration against any seismic vibration were examined, and the decrease in damping capacity was estimated on a class basis. The performance loss of the isolators, separated according to their damping properties, was determined, and the reductions to be taken into account were quantified by class. Among various supervised machine learning classification algorithms, the algorithm providing the highest precision for the model was selected. The results show that the machine learning method predicts the damping of the isolator at a level exceeding 96% accuracy and is an effective method for deciding whether the damping capacity has decreased.
Keywords: vibration control; base isolation; machine learning; damping capacity
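The classification task above assigns an observed response to one of the damping classes (10%-50%). A toy sketch of the idea using the classical amplitude-decay feature of a damped SDOF free vibration and nearest-class assignment (the feature choice and classifier are simplifying assumptions, not the paper's supervised algorithms):

```python
import numpy as np

def cycle_amplitude_ratio(zeta):
    # For a damped SDOF in free vibration, successive peak amplitudes
    # satisfy x_{k+1}/x_k = exp(-2*pi*zeta / sqrt(1 - zeta**2)).
    return np.exp(-2 * np.pi * zeta / np.sqrt(1 - zeta**2))

# The five damping-ratio classes (10% .. 50%) used in the study.
classes = np.array([0.1, 0.2, 0.3, 0.4, 0.5])

def classify(observed_ratio):
    # Nearest-class assignment on the amplitude-decay feature.
    return classes[np.argmin(np.abs(cycle_amplitude_ratio(classes) - observed_ratio))]

# A response that decays like a 31%-damped system falls in the 30% class:
print(classify(cycle_amplitude_ratio(0.31)))  # -> 0.3
```

A drop from an isolator's installed class to a lower predicted class would then flag a loss of damping capacity, which is the maintenance signal the abstract describes.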