Journal Articles
171,731 articles found
1. Significant risk factors for intensive care unit-acquired weakness: A processing strategy based on repeated machine learning (Cited by: 9)
Authors: Ling Wang, Deng-Yan Long. World Journal of Clinical Cases (SCIE), 2024, Issue 7: 1235-1242 (8 pages).
BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process, even leading to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 d of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered. The relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy ratios were significantly higher in the ICU-AW group, demonstrating statistical significance. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both ICU stay and mechanical ventilation duration.
Keywords: Intensive care unit-acquired weakness, Risk factors, machine learning, PREVENTION, Strategies
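As an illustration of the kind of pipeline this abstract describes, the following is a minimal scikit-learn sketch of a multilayer perceptron scored by ROC-AUC. The three predictors and the synthetic labels are placeholders, not the study's data.

```python
# Hypothetical sketch: an MLP classifier evaluated with ROC-AUC, mirroring the
# kind of model the abstract describes. Features and labels are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
# Placeholder predictors: ICU length of stay (d), ventilation duration (d), age (yr)
X = np.column_stack([
    rng.exponential(10, n),   # length of ICU stay
    rng.exponential(5, n),    # duration of mechanical ventilation
    rng.normal(65, 12, n),    # age
])
# Synthetic label loosely tied to stay/ventilation, echoing the reported findings
logit = 0.15 * X[:, 0] + 0.2 * X[:, 1] - 3.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(32, 16),
                                    max_iter=1000, random_state=0))
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")
```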
2. High-throughput calculations combining machine learning to investigate the corrosion properties of binary Mg alloys (Cited by: 3)
Authors: Yaowei Wang, Tian Xie, Qingli Tang, Mingxu Wang, Tao Ying, Hong Zhu, Xiaoqin Zeng. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, Issue 4: 1406-1418 (13 pages).
Magnesium (Mg) alloys have shown great prospects as both structural and biomedical materials, while poor corrosion resistance limits their further application. In this work, to avoid time-consuming and laborious experimental trials, a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloys with intermetallics, from both the thermodynamic and kinetic perspectives. The stable binary Mg intermetallics with a low equilibrium potential difference with respect to the Mg matrix are first identified. Then, the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated, and the corrosion exchange current density is further calculated by a hydrogen evolution reaction (HER) kinetic model. Several intermetallics, e.g. Y3Mg, Y2Mg and La5Mg, are identified to be promising intermetallics which might effectively hinder the cathodic HER. Furthermore, machine learning (ML) models are developed to predict Mg intermetallics with proper hydrogen adsorption energy employing the work function (Wf) and the weighted first ionization energy (WFIE). The generalization of the ML models is tested on five new binary Mg intermetallics with an average root mean square error (RMSE) of 0.11 eV. This study not only predicts some promising binary Mg intermetallics which may suppress galvanic corrosion, but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloys, which can be extended to ternary Mg alloys or other alloy systems.
Keywords: Mg intermetallics, Corrosion property, HIGH-THROUGHPUT, Density functional theory, machine learning
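The ML step here regresses adsorption energy on two descriptors and reports RMSE on unseen compounds. Below is a minimal sketch of that step under stated assumptions: the descriptor ranges, the functional form of the synthetic target, and the gradient-boosting model are all placeholders, not the paper's method.

```python
# Hypothetical sketch: regress hydrogen adsorption energy on work function (Wf)
# and weighted first ionization energy (WFIE), then report test RMSE. Synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 200
W_f = rng.uniform(3.0, 5.0, n)    # work function (eV), placeholder range
WFIE = rng.uniform(5.0, 9.0, n)   # weighted first ionization energy (eV), placeholder
E_ads = 0.6 * W_f - 0.3 * WFIE + rng.normal(0, 0.05, n)  # synthetic target (eV)

X = np.column_stack([W_f, WFIE])
X_tr, X_te, y_tr, y_te = train_test_split(X, E_ads, test_size=0.25, random_state=1)
model = GradientBoostingRegressor(random_state=1).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"RMSE = {rmse:.3f} eV")  # the paper reports ~0.11 eV on new intermetallics
```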
3. Prediction model for corrosion rate of low-alloy steels under atmospheric conditions using machine learning algorithms (Cited by: 2)
Authors: Jingou Kuang, Zhilin Long. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024, Issue 2: 337-350 (14 pages).
This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS). The material properties of LAS, environmental factors, and exposure time were used as the input, while the corrosion rate was the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme gradient boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors that affected the corrosion rate were filtered using recursive feature elimination (RFE) as well as XGBoost methods. The established ML models exhibited better prediction performance and generalization ability via property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help with analyzing the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
Keywords: machine learning, low-alloy steel, atmospheric corrosion prediction, corrosion rate, feature fusion
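A minimal sketch of the XGBoost-plus-RFE-plus-SHAP workflow named in the abstract follows, using the xgboost and shap packages on synthetic data; the feature counts and data are assumptions, not the study's descriptors.

```python
# Hypothetical sketch: XGBoost with recursive feature elimination (RFE) and
# SHAP attribution, mirroring the abstract's workflow on synthetic data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from xgboost import XGBRegressor
import shap

X, y = make_regression(n_samples=300, n_features=12, n_informative=5, random_state=0)

# Keep the 5 dominant descriptors, ranked by XGBoost feature importances
selector = RFE(XGBRegressor(n_estimators=200, random_state=0), n_features_to_select=5)
selector.fit(X, y)
X_sel = selector.transform(X)

model = XGBRegressor(n_estimators=200, random_state=0).fit(X_sel, y)

# SHAP values quantify each descriptor's contribution to the predicted rate
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_sel)
print(shap_values.shape)  # (n_samples, n_selected_features)
```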
4. Machine learning with active pharmaceutical ingredient/polymer interaction mechanism: Prediction for complex phase behaviors of pharmaceuticals and formulations (Cited by: 2)
Authors: Kai Ge, Yiping Huang, Yuanhui Ji. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2024, Issue 2: 263-272 (10 pages).
The high-throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients (APIs) with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations. In this work, a developed machine-learning model efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors. Under the few-shot learning framework, thermodynamic theory (perturbed-chain statistical associating fluid theory) was used for data augmentation, and computational chemistry was applied for molecular descriptor screening. The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately, broaden the solubility data of APIs in polymers, and successfully reproduce the relationship between API solubility and the interaction mechanisms between API and polymer, which provides efficient guidance for the development of pharmaceutical formulations.
Keywords: Multi-task machine learning, Density functional theory, Hydrogen bond interaction, MISCIBILITY, SOLUBILITY
5. Machine learning applications in stroke medicine: advancements, challenges, and future prospectives (Cited by: 3)
Authors: Mario Daidone, Sergio Ferrantelli, Antonino Tuttolomondo. Neural Regeneration Research (SCIE, CAS, CSCD), 2024, Issue 4: 769-773 (5 pages).
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
Keywords: cerebrovascular disease, deep learning, machine learning, reinforcement learning, STROKE, stroke therapy, supervised learning, unsupervised learning
6. Machine learning-assisted efficient design of Cu-based shape memory alloy with specific phase transition temperature (Cited by: 2)
Authors: Mengwei Wu, Wei Yong, Cunqin Fu, Chunmei Ma, Ruiping Liu. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024, Issue 4: 773-785 (13 pages).
The martensitic transformation temperature is the basis for the application of shape memory alloys (SMAs), and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance. In this work, machine learning (ML) methods were utilized to accelerate the search for shape memory alloys with targeted properties (phase transition temperature). A group of component data was selected to design shape memory alloys using a reverse design method from numerous unexplored data. Component modeling and feature modeling were used to predict the phase transition temperature of the shape memory alloys. Experimental results for the shape memory alloys were obtained to verify the effectiveness of the support vector regression (SVR) model. The results show that the machine learning model can obtain target materials more efficiently and pertinently, and realize the accurate and rapid design of shape memory alloys with a specific target phase transition temperature. On this basis, the relationship between the phase transition temperature and material descriptors is analyzed, and it is proved that the key factors affecting the phase transition temperature of shape memory alloys are based on the strength of the bond energy between atoms. This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
Keywords: machine learning, support vector regression, shape memory alloys, martensitic transformation temperature
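For a concrete picture of the SVR step, here is a minimal scikit-learn sketch that maps alloy descriptors to a transformation temperature with a small grid search. The composition features, the synthetic temperature relation, and the hyperparameter grid are illustrative assumptions, not the paper's setup.

```python
# Hypothetical sketch: support vector regression for a martensitic
# transformation temperature, with cross-validated hyperparameter tuning.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
n = 150
X = rng.uniform(0, 1, size=(n, 3))   # placeholder composition fractions
T_ms = 400 + 120 * X[:, 0] - 80 * X[:, 1] + rng.normal(0, 5, n)  # synthetic Ms (K)

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(pipe,
                    {"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1, 1.0]},
                    cv=5)
grid.fit(X, T_ms)
print(grid.best_params_, grid.best_score_)
```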
7. Machine Learning Analysis of Impact of Western US Fires on Central US Hailstorms (Cited by: 1)
Authors: Xinming Lin, Jiwen Fan, Yuwei Zhang, Z. Jason Hou. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, Issue 7: 1450-1462 (13 pages).
Fires, including wildfires, harm air quality and essential public services like transportation, communication, and utilities. These fires can also influence atmospheric conditions, including temperature and aerosols, potentially affecting severe convective storms. Here, we investigate the remote impacts of fires in the western United States (WUS) on the occurrence of large hail (size: ≥2.54 cm) in the central US (CUS) over the 20-year period of 2001-20 using the machine learning (ML) methods Random Forest (RF) and Extreme Gradient Boosting (XGB). The developed RF and XGB models demonstrate high accuracy (>90%) and F1 scores of up to 0.78 in predicting large hail occurrences when WUS fires and CUS hailstorms coincide, particularly in four states (Wyoming, South Dakota, Nebraska, and Kansas). The key contributing variables identified by both ML models include the meteorological variables in the fire region (temperature and moisture), the westerly wind over the plume transport path, and the fire features (i.e., the maximum fire power and burned area). The results confirm a linkage between WUS fires and severe weather in the CUS, corroborating the findings of our previous modeling study conducted on case simulations with a detailed physics model.
Keywords: WILDFIRE, severe convective storm, HAILSTORM, machine learning
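The evaluation described here, RF and XGB classifiers scored with accuracy and F1, can be sketched in a few lines; the synthetic, class-imbalanced data below stand in for the meteorological and fire variables and are not the study's dataset.

```python
# Hypothetical sketch: Random Forest and XGBoost classifiers scored with
# accuracy and F1 on an imbalanced synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=8, weights=[0.8],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("XGB", XGBClassifier(eval_metric="logloss", random_state=0))]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(name, accuracy_score(y_te, pred), f1_score(y_te, pred))
```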
8. Machine learning for membrane design and discovery (Cited by: 1)
Authors: Haoyu Yin, Muzi Xu, Zhiyao Luo, Xiaotian Bi, Jiali Li, Sui Zhang, Xiaonan Wang. Green Energy & Environment (SCIE, EI, CAS, CSCD), 2024, Issue 1: 54-70 (17 pages).
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine Learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction), in various membrane systems, ranging from gas, liquid, and fuel cell separation membranes. Furthermore, the best practices of integrating ML methods and specific application targets in membrane design and discovery are presented with an ideal paradigm proposed. The challenges to be addressed and prospects of AI applications in membrane discovery are also highlighted in the end.
Keywords: machine learning, Membranes, AI for Membrane, DATA-DRIVEN DESIGN
9. Enhanced prediction of anisotropic deformation behavior using machine learning with data augmentation (Cited by: 1)
Authors: Sujeong Byun, Jinyeong Yu, Seho Cheon, Seong Ho Lee, Sung Hyuk Park, Taekyung Lee. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, Issue 1: 186-196 (11 pages).
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
Keywords: Plastic anisotropy, Compression, ANNEALING, machine learning, Data augmentation
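To make the GRU idea concrete, here is a minimal PyTorch sketch of a network that maps a deformation condition plus a strain grid to a flow curve. The GAN augmentation step is omitted, and the condition vector, shapes, and data are assumptions, not the paper's architecture.

```python
# Hypothetical sketch: a GRU that predicts stress at each strain step from a
# condition vector (e.g., annealing temperature, loading direction). Synthetic.
import torch
import torch.nn as nn

class FlowCurveGRU(nn.Module):
    def __init__(self, n_cond=3, hidden=64):
        super().__init__()
        self.gru = nn.GRU(input_size=n_cond + 1, hidden_size=hidden,
                          batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, cond, strain):
        # cond: (B, n_cond) condition vector; strain: (B, T, 1) strain grid
        x = torch.cat([cond.unsqueeze(1).expand(-1, strain.size(1), -1), strain],
                      dim=-1)
        out, _ = self.gru(x)
        return self.head(out).squeeze(-1)  # (B, T) predicted stress

model = FlowCurveGRU()
cond = torch.rand(8, 3)
strain = torch.linspace(0, 0.2, 50).view(1, 50, 1).expand(8, -1, -1)
stress = model(cond, strain)
print(stress.shape)  # torch.Size([8, 50])
```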
10. Heterogeneous decentralised machine unlearning with seed model distillation (Cited by: 1)
Authors: Guanhua Ye, Tong Chen, Quoc Viet Hung Nguyen, Hongzhi Yin. CAAI Transactions on Intelligence Technology (SCIE, EI), 2024, Issue 3: 608-619 (12 pages).
As recent information security legislation has endowed users with the unconditional right to be forgotten by any trained machine learning model, personalised IoT service providers have to take unlearning functionality into consideration. The most straightforward method to unlearn users' contributions is to retrain the model from the initial state, which is not realistic in high-throughput applications with frequent unlearning requests. Though some machine unlearning frameworks have been proposed to speed up the retraining process, they fail to match decentralised learning scenarios. A decentralised unlearning framework called heterogeneous decentralised unlearning framework with seed (HDUS) is designed, which uses distilled seed models to construct erasable ensembles for all clients. Moreover, the framework is compatible with heterogeneous on-device models, representing stronger scalability in real-world applications. Extensive experiments on three real-world datasets show that HDUS achieves state-of-the-art performance.
Keywords: data mining, data privacy, machine learning
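For readers unfamiliar with model distillation, the following is a generic PyTorch sketch of one knowledge-distillation step, the building block behind "distilled seed models"; it is a textbook KD loss, not the HDUS implementation.

```python
# Hypothetical sketch: a seed (student) model learns from a local (teacher)
# model's temperature-softened outputs via a KL-divergence loss.
import torch
import torch.nn.functional as F

def distill_step(seed_logits, local_logits, T=2.0):
    """KL divergence between temperature-softened class distributions."""
    p_teacher = F.softmax(local_logits / T, dim=-1)
    log_p_student = F.log_softmax(seed_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

seed_logits = torch.randn(16, 10, requires_grad=True)   # student outputs
local_logits = torch.randn(16, 10)                      # teacher outputs
loss = distill_step(seed_logits, local_logits)
loss.backward()
print(loss.item())
```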
11. Assessment of compressive strength of jet grouting by machine learning (Cited by: 1)
Authors: Esteban Diaz, Edgar Leonardo Salamanca-Medina, Roberto Tomas. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024, Issue 1: 102-111 (10 pages).
Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to economic cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary and unjustified strength, which is sometimes even subject to the results of test fields. This paper presents an approach for predicting the uniaxial compressive strength (UCS) of jet grouting columns based on the analysis of several machine learning algorithms applied to a database of 854 results, mainly collected from different research papers. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the value of the compressive strength. Despite the complex mechanism that surrounds the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model predicts the values of compressive strength with a significant improvement with respect to existing works. Consequently, this work proposes for the first time a reliable and easily applicable approach for estimating the compressive strength of jet grouting columns.
Keywords: Jet grouting, Ground improvement, Compressive strength, machine learning
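The chosen model, extremely randomized trees, is available in scikit-learn as ExtraTreesRegressor; here is a minimal sketch under stated assumptions (the three input features and their ranges are placeholders, only the database size of 854 comes from the abstract).

```python
# Hypothetical sketch: extremely randomized trees predicting jet-grout column
# UCS from soil type and operating parameters. Inputs are synthetic.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 854  # the paper's database size
soil_type = rng.integers(0, 3, n)   # e.g., 0=clay, 1=silt, 2=sand (placeholder)
cement = rng.uniform(150, 450, n)   # cement content, kg/m^3 (placeholder)
pressure = rng.uniform(20, 60, n)   # injection pressure, MPa (placeholder)
ucs = 2 + 0.02 * cement + 0.05 * pressure + soil_type + rng.normal(0, 1, n)

X = np.column_stack([soil_type, cement, pressure])
model = ExtraTreesRegressor(n_estimators=300, random_state=0)
print(cross_val_score(model, X, ucs, cv=5, scoring="r2").mean())
```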
12. Feature extraction for machine learning-based intrusion detection in IoT networks (Cited by: 1)
Authors: Mohanad Sarhan, Siamak Layeghy, Nour Moustafa, Marcus Gallagher, Marius Portmann. Digital Communications and Networks (SCIE, CSCD), 2024, Issue 1: 205-216 (12 pages).
A large number of network security breaches in IoT networks have demonstrated the unreliability of current Network Intrusion Detection Systems (NIDSs). Consequently, network interruptions and loss of sensitive data have occurred, which has led to an active research area for improving NIDS technologies. In an analysis of related works, it was observed that most researchers aim to obtain better classification results by using a set of untried combinations of Feature Reduction (FR) and Machine Learning (ML) techniques on NIDS datasets. However, these datasets differ in feature sets, attack types, and network design. Therefore, this paper aims to discover whether these techniques can be generalised across various datasets. Six ML models are utilised: a Deep Feed Forward (DFF), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Decision Tree (DT), Logistic Regression (LR), and Naive Bayes (NB). The accuracy of three Feature Extraction (FE) algorithms, Principal Component Analysis (PCA), Auto-encoder (AE), and Linear Discriminant Analysis (LDA), is evaluated using three benchmark datasets: UNSW-NB15, ToN-IoT and CSE-CIC-IDS2018. Although PCA and AE algorithms have been widely used, the determination of their optimal number of extracted dimensions has been overlooked. The results indicate that no single FE method or ML model can achieve the best scores for all datasets. The optimal number of extracted dimensions has been identified for each dataset, and LDA degrades the performance of the ML models on two datasets. The variance is used to analyse the extracted dimensions of LDA and PCA. Finally, this paper concludes that the choice of datasets significantly alters the performance of the applied techniques. We believe that a universal (benchmark) feature set is needed to facilitate further advancement and progress of research in this field.
Keywords: Feature extraction, machine learning, Network intrusion detection system, IOT
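A minimal sketch of the comparison pattern follows, pairing PCA and LDA front ends with one downstream classifier; the autoencoder branch is omitted, and the synthetic data are a stand-in for UNSW-NB15/ToN-IoT/CSE-CIC-IDS2018, so the dimension counts are assumptions.

```python
# Hypothetical sketch: comparing PCA and LDA as feature-extraction front ends
# for an intrusion-detection classifier, scored by F1. Synthetic data.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=3000, n_features=40, n_informative=10,
                           n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, fe in [("PCA", PCA(n_components=10)),
                 ("LDA", LinearDiscriminantAnalysis(n_components=1))]:
    Z_tr = fe.fit_transform(X_tr, y_tr)   # LDA is supervised; PCA ignores y
    Z_te = fe.transform(X_te)
    clf = RandomForestClassifier(random_state=0).fit(Z_tr, y_tr)
    print(name, f1_score(y_te, clf.predict(Z_te)))
```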
13. Machine Learning-Based Decision-Making Mechanism for Risk Assessment of Cardiovascular Disease (Cited by: 1)
Authors: Cheng Wang, Haoran Zhu, Congjun Rao. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, Issue 1: 691-718 (28 pages).
Cardiovascular disease (CVD) has gradually become one of the main causes of harm to the life and health of residents. Exploring the influencing factors and risk assessment methods of CVD has become a general trend. In this paper, a machine learning-based decision-making mechanism for risk assessment of CVD is designed. In this mechanism, the logistic regression analysis method and a factor analysis model are used to select age, obesity degree, blood pressure, blood fat, blood sugar, smoking status, drinking status, and exercise status as the main pathogenic factors of CVD, and an index system of risk assessment for CVD is established. Then, a two-stage model combining K-means cluster analysis and random forest (RF) is proposed to evaluate and predict the risk of CVD, and the predicted results are compared with those of Bayesian discrimination, K-means cluster analysis, and RF alone. The results show that the prediction effect of the proposed two-stage model is better than that of the compared methods. Moreover, several suggestions for the government, the medical industry, and the public are provided based on the research results.
Keywords: CVD, influencing factors, risk assessment, machine learning, two-stage model
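One common way to realize such a two-stage design is to cluster patients first and feed the cluster label to the supervised model; the sketch below shows that pattern with scikit-learn. It is an illustration of the general idea, not the paper's exact mechanism, and the features are synthetic stand-ins for the CVD indicators listed above.

```python
# Hypothetical sketch of a two-stage model: K-means clustering followed by a
# random forest that uses the cluster label as an extra feature.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1500, n_features=8, n_informative=5,
                           random_state=0)  # age, BMI, blood pressure, ... (placeholders)

# Stage 1: unsupervised grouping of patients
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Stage 2: supervised risk model with the cluster label appended
X_aug = np.column_stack([X, clusters])
rf = RandomForestClassifier(random_state=0)
print(cross_val_score(rf, X_aug, y, cv=5).mean())
```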
14. Accurate and efficient remaining useful life prediction of batteries enabled by physics-informed machine learning (Cited by: 1)
Authors: Liang Ma, Jinpeng Tian, Tieling Zhang, Qinghua Guo, Chunsheng Hu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, Issue 4: 512-521 (10 pages).
The safe and reliable operation of lithium-ion batteries necessitates the accurate prediction of remaining useful life (RUL). However, this task is challenging due to the diverse ageing mechanisms, various operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance. We demonstrate accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract ageing-correlated parameters that can describe and explain battery degradation from battery charging data. The parameters inform a deep neural network (DNN) to predict RUL with high accuracy and efficiency. The trained model is validated on 3 types of batteries working under 7 conditions, considering fully charged and partially charged cases. Using data from one cycle only, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower compared to two state-of-the-art data-driven methods, respectively. Besides its accuracy, the proposed method also outperforms existing methods in terms of efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
Keywords: Lithium-ion batteries, Remaining useful life, Physics-informed machine learning
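The physics-informed idea, fit a physical model to one charging curve, then let a network map the fitted parameters to RUL, can be sketched as follows. The charging-curve function, parameter trends, and RUL values are toy placeholders, not the paper's physics-based model.

```python
# Hypothetical sketch: extract ageing-correlated parameters from a charging
# curve by least-squares fitting, then regress RUL on those parameters.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.neural_network import MLPRegressor

def charge_curve(t, a, b, c):
    """Toy voltage-vs-time model; a, b, c play the role of ageing parameters."""
    return a * (1 - np.exp(-b * t)) + c

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 100)
params, ruls = [], []
for k in range(200):                       # 200 synthetic cells/cycles
    a, b, c = 1.0 - 0.002 * k, 3.0 + 0.01 * k, 3.2
    v = charge_curve(t, a, b, c) + rng.normal(0, 0.005, t.size)
    p, _ = curve_fit(charge_curve, t, v, p0=[1.0, 3.0, 3.0])
    params.append(p)
    ruls.append(1000 - 4 * k + rng.normal(0, 10))  # synthetic RUL in cycles

X, y = np.array(params), np.array(ruls)
dnn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
dnn.fit(X[:150], y[:150])
rmse = np.sqrt(np.mean((dnn.predict(X[150:]) - y[150:]) ** 2))
print(f"RMSE = {rmse:.1f} cycles")
```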
15. An Intelligent SDN-IoT Enabled Intrusion Detection System for Healthcare Systems Using a Hybrid Deep Learning and Machine Learning Approach (Cited by: 1)
Authors: R. Arthi, S. Krishnaveni, Sherali Zeadally. China Communications (SCIE, CSCD), 2024, Issue 10: 267-287 (21 pages).
The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day. Therefore, it is essential to make medical services connected to the Internet available in every remote location during these situations. Moreover, the security issues in the Internet of Medical Things (IoMT) used in these services make the situation even more critical, because cyberattacks on the medical devices might cause treatment delays or clinical failures. Hence, services in the healthcare ecosystem need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients with critical health conditions in remote areas. This research aims to develop an intelligent Software Defined Networks (SDN) enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, this system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of our proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
Keywords: deep neural network, healthcare, intrusion detection system, IOT, machine learning, software-defined networks
16. Leveraging machine learning for early recurrence prediction in hepatocellular carcinoma: A step towards precision medicine (Cited by: 1)
Authors: Abhimati Ravikulan, Kamran Rostami. World Journal of Gastroenterology (SCIE, CAS), 2024, Issue 5: 424-428 (5 pages).
The high rate of early recurrence in hepatocellular carcinoma (HCC) after curative surgical intervention poses a substantial clinical hurdle, impacting patient outcomes and complicating postoperative management. The advent of machine learning provides a unique opportunity to harness vast datasets, identifying subtle patterns and factors that elude conventional prognostic methods. Machine learning models, equipped with the ability to analyse intricate relationships within datasets, have shown promise in predicting outcomes in various medical disciplines. In the context of HCC, the application of machine learning to predict early recurrence holds potential for personalized postoperative care strategies. This editorial comments on a study exploring the merits and efficacy of random survival forests (RSF) in identifying significant risk factors for recurrence, stratifying patients at low and high risk of HCC recurrence, and comparing this approach with traditional Cox proportional hazards (CPH) models. In doing so, the study demonstrated that RSF models are superior to traditional CPH models in predicting recurrence of HCC and represent a giant leap towards precision medicine.
Keywords: machine learning, Artificial intelligence, Hepatocellular carcinoma, HEPATOLOGY, Early recurrence, Liver resection
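For readers who want to see the RSF-versus-CPH comparison in code, here is a minimal sketch using the scikit-survival package and its bundled WHAS500 dataset; it assumes that dataset's columns are numeric and is illustrative only, not the reviewed study's code or data.

```python
# Hypothetical sketch: random survival forest vs. Cox proportional hazards,
# both scored by Harrell's concordance index via each model's .score().
from sksurv.datasets import load_whas500
from sksurv.ensemble import RandomSurvivalForest
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sklearn.model_selection import train_test_split

X, y = load_whas500()                  # y is a structured array (event, time)
X = X.astype(float)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

rsf = RandomSurvivalForest(n_estimators=200, random_state=0).fit(X_tr, y_tr)
cox = CoxPHSurvivalAnalysis().fit(X_tr, y_tr)

print("RSF c-index:", rsf.score(X_te, y_te))
print("Cox c-index:", cox.score(X_te, y_te))
```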
17. Outsmarting Android Malware with Cutting-Edge Feature Engineering and Machine Learning Techniques (Cited by: 1)
Authors: Ahsan Wajahat, Jingsha He, Nafei Zhu, Tariq Mahmood, Tanzila Saba, Amjad Rehman Khan, Faten S. Alamri. Computers, Materials & Continua (SCIE, EI), 2024, Issue 4: 651-673 (23 pages).
The growing usage of Android smartphones has led to a significant rise in incidents of Android malware and privacy breaches. This escalating security concern necessitates the development of advanced technologies capable of automatically detecting and mitigating malicious activities in Android applications (apps). Such technologies are crucial for safeguarding user data and maintaining the integrity of mobile devices in an increasingly digital world. Current methods employed to detect sensitive data leaks in Android apps are hampered by two major limitations: they require substantial computational resources and are prone to a high frequency of false positives. This means that while attempting to identify security breaches, these methods often consume considerable processing power and mistakenly flag benign activities as malicious, leading to inefficiencies and reduced reliability in malware detection. The proposed approach includes a data preprocessing step that removes duplicate samples, manages unbalanced datasets, corrects inconsistencies, and imputes missing values to ensure data accuracy. The min-max method is then used to normalize numerical data, followed by feature vector extraction using the gain ratio and chi-squared test to identify and extract the most significant characteristics for an appropriate prediction model. This study focuses on extracting a subset of attributes best suited for the task and recommending a predictive model based on domain expert opinion. The proposed method is evaluated using the Drebin and TUANDROMD datasets, containing 15,036 and 4,464 benign and malicious samples, respectively. The empirical results show that the Random Forest (RF) and Support Vector Machine (SVC) classifiers achieved impressive accuracy rates of 98.9% and 98.8%, respectively, in detecting unknown Android malware. A sensitivity analysis experiment was also carried out on all three ML-based classifiers based on MAE, MSE, R2, and sensitivity parameters, resulting in flawless performance for both datasets. This approach has substantial potential for real-world applications and can serve as a valuable tool for preventing the spread of Android malware and enhancing mobile device security.
Keywords: Android malware detection, machine learning, SVC, K-Nearest Neighbors (KNN), RF
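A minimal scikit-learn sketch of this shape of pipeline follows: min-max scaling, chi-squared feature selection, then RF and SVC classifiers. The gain-ratio step is not shipped by scikit-learn and is approximated here by chi-squared scoring alone, and the data are synthetic, not Drebin or TUANDROMD.

```python
# Hypothetical sketch: min-max normalization, chi-squared feature selection,
# and RF/SVC classification, mirroring the abstract's pipeline shape.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.preprocessing import MinMaxScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=50, n_informative=12,
                           random_state=0)
X = MinMaxScaler().fit_transform(X)     # chi2 requires non-negative features
X_sel = SelectKBest(chi2, k=15).fit_transform(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, stratify=y, random_state=0)

for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("SVC", SVC())]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```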
18. Use of machine learning models for the prognostication of liver transplantation: A systematic review (Cited by: 2)
Authors: Gidion Chongo, Jonathan Soldera. World Journal of Transplantation, 2024, Issue 1: 164-188 (25 pages).
BACKGROUND: Liver transplantation (LT) is a life-saving intervention for patients with end-stage liver disease. However, the equitable allocation of scarce donor organs remains a formidable challenge. Prognostic tools are pivotal in identifying the most suitable transplant candidates. Traditionally, scoring systems like the model for end-stage liver disease have been instrumental in this process. Nevertheless, the landscape of prognostication is undergoing a transformation with the integration of machine learning (ML) and artificial intelligence models. AIM: To assess the utility of ML models in prognostication for LT, comparing their performance and reliability to established traditional scoring systems. METHODS: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a thorough and standardized literature search using the PubMed/MEDLINE database. Our search imposed no restrictions on publication year, age, or gender. Exclusion criteria encompassed non-English studies, review articles, case reports, conference papers, studies with missing data, or those exhibiting evident methodological flaws. RESULTS: Our search yielded a total of 64 articles, with 23 meeting the inclusion criteria. Among the selected studies, 60.8% originated from the United States and China combined. Only one pediatric study met the criteria. Notably, 91% of the studies were published within the past five years. ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values (ranging from 0.6 to 1) across all studies, surpassing the performance of traditional scoring systems. Random forest exhibited superior predictive capabilities for 90-d mortality following LT, sepsis, and acute kidney injury (AKI). In contrast, gradient boosting excelled in predicting the risk of graft-versus-host disease, pneumonia, and AKI. CONCLUSION: This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT, marking a significant evolution in the field of prognostication.
Keywords: Liver transplantation, machine learning models, PROGNOSTICATION, Allograft allocation, Artificial intelligence
19. Application effects of the ArcCHECK Machine QA tool in quality assurance of medical linear accelerators
Authors: Zhang Shangchao, Zeng Huaqu, Wang Siyang. Medical Equipment (医疗装备), 2024, Issue 7: 19-24 (6 pages).
OBJECTIVE: To evaluate the application effects of the ArcCHECK Machine QA tool in quality assurance of medical linear accelerators. METHODS: The ArcCHECK Machine QA tool and the ArcCHECK phantom were used to perform performance tests on a medical linear accelerator, covering gantry angle, gantry rotation speed, gantry rotation isocenter, consistency of multi-leaf collimator (MLC) and jaw positions, and beam flatness and symmetry during gantry rotation, in order to assess the tool's utility for accelerator quality assurance. RESULTS: In rotation mode, the mean gantry rotation speed was 3.6 deg/s, with a maximum deviation of about 0.5 deg/s. The mean radius formed by the gantry rotation isocenter was 0.4 mm, and the mean positive and negative deviations of the maximum MLC-to-jaw distance were 0.7 mm and -0.7 mm, respectively. In rotational beam-on mode, the flatness in the Y direction was 1.8%, the symmetry in the Y direction was 1.1%, and the symmetry in the X direction was 4.3%. CONCLUSION: The ArcCHECK Machine QA tool can be used for quality assurance of both routine and volumetric modulated arc therapy beam delivery on medical linear accelerators.
Keywords: ArcCHECK Machine QA tool, Quality assurance, Volumetric modulated arc therapy, Isocenter
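The flatness and symmetry figures above can be computed from a measured dose profile. Below is a hedged sketch using one common pair of definitions; QA protocols define these quantities differently, and this is not the ArcCHECK software's implementation.

```python
# Hypothetical sketch: beam flatness and point symmetry from a 1-D dose
# profile, evaluated over the central 80% of the field (one common convention).
import numpy as np

def flatness_symmetry(pos_mm, dose, field_frac=0.8):
    """Evaluate within the central `field_frac` of the profile."""
    lim = field_frac * pos_mm.max()
    m = np.abs(pos_mm) <= lim
    d = dose[m]
    flatness = 100 * (d.max() - d.min()) / (d.max() + d.min())
    # Symmetry: worst mirrored-pair difference about the central axis
    symmetry = 100 * np.max(np.abs(d - d[::-1]) / d)
    return flatness, symmetry

pos = np.linspace(-100, 100, 201)                     # mm, symmetric grid
profile = 1 - (pos / 300) ** 2 + 0.002 * (pos / 100)  # toy, slightly tilted profile
f, s = flatness_symmetry(pos, profile)
print(f"flatness = {f:.2f}%, symmetry = {s:.2f}%")
```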
20. Dynamic Hand Gesture-Based Person Identification Using Leap Motion and Machine Learning Approaches (Cited by: 1)
Authors: Jungpil Shin, Md. Al Mehedi Hasan, Md. Maniruzzaman, Taiki Watanabe, Issei Jozume. Computers, Materials & Continua (SCIE, EI), 2024, Issue 4: 1205-1222 (18 pages).
Person identification is one of the most vital tasks for network security. People are more concerned about their security due to traditional passwords becoming weaker or leaking in various attacks. In recent decades, fingerprints and faces have been widely used for person identification, which carries the risk of information leakage as a result of reproducing fingers or faces by taking a snapshot. Recently, people have focused on creating an identifiable pattern that will not be falsely reproducible, by capturing the psychological and behavioral information of a person using vision- and sensor-based techniques. In existing studies, most researchers used very complex patterns in this direction, which need special training and attention to remember, and failed to capture the psychological and behavioral information of a person properly. To overcome these problems, this research devised a novel dynamic hand gesture-based person identification system using a Leap Motion sensor. This study developed two hand gesture-based pattern datasets for performing the experiments, which contained more than 500 samples collected from 25 subjects. Various static and dynamic features were extracted from the hand geometry. Random forest was used to measure feature importance using the Gini index. Finally, a support vector machine was implemented for person identification, and its performance was evaluated using identification accuracy. The experimental results showed that the proposed system produced an identification accuracy of 99.8% for arbitrary hand gesture-based patterns and 99.6% for the same dynamic hand gesture-based patterns. This result indicates that the proposed system can be used for person identification in the field of security.
Keywords: Person identification, leap motion, hand gesture, random forest, support vector machine
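The two-step recipe, rank features by random-forest Gini importance and then classify with an SVM, can be sketched as follows. The subject and sample counts echo the abstract, but the features are synthetic stand-ins for Leap Motion hand-geometry measurements, and the top-20 cutoff is an assumption.

```python
# Hypothetical sketch: Gini-importance feature ranking with a random forest,
# followed by SVM-based person identification. Synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# 25 subjects, 500+ samples, as in the paper; features are placeholders
X, y = make_classification(n_samples=525, n_features=60, n_informative=20,
                           n_classes=25, n_clusters_per_class=1, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:20]   # Gini-ranked features

X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, stratify=y,
                                          random_state=0)
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
print("identification accuracy:", accuracy_score(y_te, svm.predict(X_te)))
```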