Journal articles: 5 results found
Data-Driven Learning for Data Rights, Data Pricing, and Privacy Computing (Cited: 2)
1
Authors: 徐基珉, 洪暖欣, 许哲宁, 赵洲, 吴超, 况琨, 王嘉平, 朱明杰, 周靖人, 任奎, 杨小虎, 卢策吾, 裴健, 沈向洋. 《Engineering》 SCIE EI CAS CSCD, 2023, Issue 6, pp. 66-76, M0004 (12 pages)
In recent years, data has become one of the most important factors of production in the digital economy. Unlike traditional factors of production, the digital nature of data makes it difficult to contract for and to trade. Establishing an efficient and standardized data-trading market system would therefore help reduce costs and raise productivity for all parties in the industry. Although many studies address data regulation and other data-trading issues such as privacy and pricing, few provide a comprehensive review of this work from the perspective of machine learning and data science. To give a complete and up-to-date understanding of the topic, this paper covers three key issues in the data-trading process: data rights, data pricing, and privacy computing. By clarifying the relationships among these topics, the paper presents a full picture of the data ecosystem, in which data are generated by data subjects such as individuals, research institutions, and governments, while data processors acquire the data for innovation or operational purposes and distribute the resulting revenue to the data subjects according to their respective ownership through appropriate pricing mechanisms. For artificial intelligence (AI) to benefit the long-term development of human society, AI algorithms must be evaluated against data protection (i.e., privacy protection) regulations to help build trustworthy AI systems for daily life.
Keywords: artificial intelligence systems; privacy computing; machine learning; data protection; traditional factors of production; data-driven learning; data-oriented; artificial intelligence algorithms
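To make the ecosystem described above concrete, the sketch below shows the simplest possible revenue split among data subjects in proportion to their ownership shares. It is purely illustrative and not a mechanism from the paper; the names `allocate_revenue` and `ownership_shares`, and all numbers, are hypothetical.

```python
# Toy illustration of revenue allocation proportional to data ownership.
# This is NOT a mechanism from the paper; it only illustrates the idea that
# a data processor's revenue can be split among data subjects by ownership share.

def allocate_revenue(revenue: float, ownership_shares: dict[str, float]) -> dict[str, float]:
    """Split `revenue` among data subjects in proportion to their ownership shares."""
    total = sum(ownership_shares.values())
    if total <= 0:
        raise ValueError("ownership shares must sum to a positive value")
    return {subject: revenue * share / total for subject, share in ownership_shares.items()}

# Example: an individual, a research institute, and a government agency contributed
# data used by a data processor whose product earned 1000 units of revenue.
shares = {"individual": 0.2, "institute": 0.5, "government": 0.3}
print(allocate_revenue(1000.0, shares))
# {'individual': 200.0, 'institute': 500.0, 'government': 300.0}
```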
HXPY: A High-Performance Data Processing Package for Financial Time-Series Data
2
Authors: 郭家栋, 彭靖姝, 苑航, 倪明选. 《Journal of Computer Science & Technology》 SCIE EI CSCD, 2023, Issue 1, pp. 3-24 (22 pages)
A tremendous amount of data is generated by global financial markets every day, and such time-series data needs to be analyzed in real time to explore its potential value. In recent years, we have witnessed the successful adoption of machine learning models on financial data, where the importance of accuracy and timeliness demands highly effective computing frameworks. However, traditional financial time-series data processing frameworks have shown performance degradation and adaptation issues, such as the handling of outliers caused by stock suspensions in Pandas and TA-Lib. In this paper, we propose HXPY, a high-performance data processing package with a C++/Python interface for financial time-series data. HXPY supports miscellaneous acceleration techniques such as streaming algorithms, vectorization instruction sets, and memory optimization, together with various functions such as time window functions, group operations, down-sampling operations, cross-section operations, row-wise or column-wise operations, shape transformations, and alignment functions. The results of benchmark and incremental analysis demonstrate the superior performance of HXPY compared with its counterparts. On datasets ranging from MiBs to GiBs, HXPY significantly outperforms other in-memory dataframe computing rivals, in some cases by up to hundreds of times.
Keywords: dataframe; time-series data; SIMD (single instruction, multiple data); CUDA (Compute Unified Device Architecture)
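As context for the comparison above, the sketch below shows the kind of workload HXPY targets, expressed with the Pandas baseline the abstract compares against: rolling time-window statistics, group operations, and down-sampling over financial bars. It does not use HXPY's own API, and the column names (`symbol`, `close`) and frequencies are assumptions.

```python
# Pandas baseline for the workload HXPY accelerates: rolling-window and
# group-wise statistics on financial time-series bars. Column names are assumed.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2023-01-02 09:30", periods=1000, freq="min")
df = pd.DataFrame({
    "symbol": rng.choice(["AAA", "BBB"], size=len(idx)),
    "close": 100 + rng.standard_normal(len(idx)).cumsum(),
}, index=idx)

# Per-symbol 20-bar rolling mean (time-window function combined with a group operation).
df["ma20"] = df.groupby("symbol")["close"].transform(lambda s: s.rolling(20).mean())

# Down-sampling: 5-minute bars per symbol (last close in each bucket).
bars_5min = df.groupby("symbol")["close"].resample("5min").last()

print(df.tail())
print(bars_5min.tail())
```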
On the principles of Parsimony and Self-consistency for the emergence of intelligence (Cited: 2)
3
Authors: Yi MA, Doris TSAO, Heung-Yeung SHUM. 《Frontiers of Information Technology & Electronic Engineering》 SCIE EI CSCD, 2022, Issue 9, pp. 1298-1323 (26 pages)
Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.
Keywords: intelligence; parsimony; self-consistency; rate reduction; deep networks; closed-loop transcription
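The "rate reduction" keyword refers to the coding-rate measure that makes the Parsimony principle "measurable and computable" in this line of work. The sketch below evaluates the standard coding-rate expression R(Z) = 1/2 · logdet(I + d/(nε²) · ZZᵀ) for a feature matrix Z with d features and n samples; the notation follows the rate-reduction literature this paper builds on and is an assumption, not code from the paper.

```python
# Coding rate R(Z) used in rate-reduction objectives: a computable proxy for how
# "spread out" (non-parsimonious) a set of learned features is. Notation assumed
# from the rate-reduction literature; not code from the paper itself.
import numpy as np

def coding_rate(Z: np.ndarray, eps: float = 0.5) -> float:
    """R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z @ Z.T), with Z of shape (d, n)."""
    d, n = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * (Z @ Z.T)
    _, logdet = np.linalg.slogdet(gram)
    return 0.5 * logdet

rng = np.random.default_rng(0)
Z_random = rng.standard_normal((64, 512))                                  # isotropic features: high rate
Z_lowrank = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 512))   # compressed features: lower rate
print(coding_rate(Z_random), coding_rate(Z_lowrank))
```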
Specialising neural network potentials for accurate properties and application to the mechanical response of titanium (Cited: 4)
4
Authors: Tongqi Wen, Rui Wang, Lingyu Zhu, Linfeng Zhang, Han Wang, David J. Srolovitz, Zhaoxuan Wu. 《npj Computational Materials》 SCIE EI CSCD, 2021, Issue 1, pp. 1908-1918 (11 pages)
Large scale atomistic simulations provide direct access to important materials phenomena not easily accessible to experiments or quantum mechanics-based calculation approaches. Accurate and efficient interatomic potentials are the key enabler, but their development remains a challenge for complex materials and/or complex phenomena. Machine learning potentials, such as the Deep Potential (DP) approach, provide robust means to produce general purpose interatomic potentials. Here, we provide a methodology for specialising machine learning potentials for high fidelity simulations of complex phenomena, where general potentials do not suffice. As an example, we specialise a general purpose DP method to describe the mechanical response of two allotropes of titanium (in addition to other defect, thermodynamic and structural properties). The resulting DP correctly captures the structures, energies, elastic constants and γ-lines of Ti in both the HCP and BCC structures, as well as properties such as dislocation core structures, vacancy formation energies, phase transition temperatures, and thermal expansion. The DP thus enables direct atomistic modelling of plastic and fracture behaviour of Ti. The approach to specialising the DP interatomic potential, DPspecX, for accurate reproduction of properties of interest "X", is general and extensible to other systems and properties.
Keywords: TITANIUM; enable; NEURAL
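A specialised DP model of this kind is normally consumed through the DeePMD-kit Python inference interface. The sketch below assumes a frozen Ti model file (`Ti_frozen_model.pb`, a hypothetical filename) and approximate HCP lattice parameters, and performs a single-point energy/force evaluation; it is a usage sketch under those assumptions, not code or data from the paper.

```python
# Single-point evaluation with a (hypothetical) frozen Deep Potential model for Ti,
# using the DeePMD-kit Python inference API. The model filename and lattice numbers
# are illustrative assumptions, not values from the paper.
import numpy as np
from deepmd.infer import DeepPot  # DeePMD-kit inference interface

dp = DeepPot("Ti_frozen_model.pb")           # hypothetical frozen model file

a, c = 2.95, 4.68                            # approximate HCP Ti lattice parameters (Angstrom)
cell = np.array([[a, 0.0, 0.0],
                 [a / 2, a * np.sqrt(3) / 2, 0.0],
                 [0.0, 0.0, c]])
coords = np.array([[0.0, 0.0, 0.0],
                   [a / 2, a / (2 * np.sqrt(3)), c / 2]])   # 2-atom HCP basis
atom_types = [0, 0]                          # single species: Ti mapped to type 0

# DeepPot.eval expects flattened coordinates and cell vectors per frame.
energy, forces, virial = dp.eval(coords.reshape(1, -1), cell.reshape(1, -1), atom_types)
print("energy (eV):", energy)
print("forces (eV/Angstrom):", forces.reshape(-1, 3))
```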
Deep potentials for materials science (Cited: 11)
5
Authors: Tongqi Wen, Linfeng Zhang, Han Wang, Weinan E, David J. Srolovitz. 《Materials Futures》, 2022, Issue 2, pp. 89-115 (27 pages)
To fill the gap between accurate (and expensive) ab initio calculations and efficient atomistic simulations based on empirical interatomic potentials, a new class of descriptions of atomic interactions has emerged and been widely applied: machine learning potentials (MLPs). One recently developed type of MLP is the deep potential (DP) method. In this review, we provide an introduction to DP methods in computational materials science. The theory underlying the DP method is presented along with a step-by-step introduction to their development and use. We also review materials applications of DPs in a wide range of materials systems. The DP Library provides a platform for the development of DPs and a database of extant DPs. We discuss the accuracy and efficiency of DPs compared with ab initio methods and empirical potentials.
Keywords: deep potential; atomistic simulation; machine learning potential; neural network
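The central idea behind MLPs such as DP is a per-atom energy predicted from a descriptor of each atom's local environment, summed into a total energy whose negative gradient gives the forces. The toy sketch below illustrates only that idea (one hand-made descriptor, random weights, a non-periodic cluster, finite-difference forces); it is not the DP architecture or any code from the review.

```python
# Toy machine-learning potential: total energy = sum of per-atom energies predicted
# from a simple local-environment descriptor; forces via finite differences.
# This illustrates the MLP concept only; it is NOT the Deep Potential architecture.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 2)) * 0.1, np.zeros(8)   # random "trained" weights (toy)
W2, b2 = rng.standard_normal(8) * 0.1, 0.0

def descriptor(positions: np.ndarray, i: int, rcut: float = 4.0) -> np.ndarray:
    """Two rotation-invariant features of atom i's neighbourhood: sums of 1/r and 1/r^2."""
    r = np.linalg.norm(np.delete(positions - positions[i], i, axis=0), axis=1)
    r = r[r < rcut]
    return np.array([np.sum(1.0 / r), np.sum(1.0 / r**2)])

def total_energy(positions: np.ndarray) -> float:
    """Sum per-atom energies E_i = small neural net applied to descriptor_i."""
    e = 0.0
    for i in range(len(positions)):
        hidden = np.tanh(W1 @ descriptor(positions, i) + b1)
        e += W2 @ hidden + b2
    return e

def forces(positions: np.ndarray, h: float = 1e-4) -> np.ndarray:
    """F = -dE/dR by central finite differences (real MLP codes use autograd)."""
    f = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for k in range(3):
            p = positions.copy(); p[i, k] += h; e_plus = total_energy(p)
            p[i, k] -= 2 * h;      e_minus = total_energy(p)
            f[i, k] = -(e_plus - e_minus) / (2 * h)
    return f

cluster = rng.uniform(0.0, 5.0, size=(6, 3))   # small non-periodic test cluster
print("E =", total_energy(cluster))
print("F =", forces(cluster))
```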