Journal articles: 4 articles found
1. Unique, Persistent, Resolvable: Identifiers as the Foundation of FAIR (cited by: 12)
Authors: Nick Juty, Sarala M. Wimalaratne, Stian Soiland-Reyes, John Kunze, Carole A. Goble, Tim Clark. Data Intelligence, 2020, No. 1, pp. 30-39, 302 (11 pages).
The FAIR principles describe characteristics intended to support access to and reuse of digital artifacts in the scientific research ecosystem. Persistent, globally unique identifiers, resolvable on the Web, and associated with a set of additional descriptive metadata, are foundational to FAIR data. Here we describe some basic principles and exemplars for their design, use and orchestration with other system elements to achieve FAIRness for digital research objects.
Keywords: identifiers, metadata, findability, FAIR data, data infrastructures
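The resolution mechanism summarised in this abstract can be illustrated with a short, hedged sketch: a persistent identifier is an HTTP(S) URI whose resolver redirects to the current location of the identified object. The resolvers used below (doi.org, identifiers.org) are real public services, but the example identifiers and the resolution behaviour shown are illustrative assumptions, not taken from this paper.

```python
# Minimal sketch, assuming network access. The identifiers below are only
# examples (a well-known DOI and an identifiers.org compact identifier) and
# are not the identifiers minted for this article.
import urllib.request

def resolve(pid_url: str) -> str:
    """Follow HTTP redirects from a persistent identifier URI and return
    the final landing URL (some publishers may reject HEAD requests)."""
    req = urllib.request.Request(pid_url, method="HEAD",
                                 headers={"User-Agent": "pid-resolution-demo/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.geturl()   # URL reached after following redirects

if __name__ == "__main__":
    examples = [
        "https://doi.org/10.1038/sdata.2016.18",   # a DOI (FAIR principles paper)
        "https://identifiers.org/taxonomy:9606",   # a compact identifier
    ]
    for pid in examples:
        print(pid, "->", resolve(pid))
```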
2. FAIR Computational Workflows (cited by: 4)
Authors: Carole Goble, Sarah Cohen-Boulakia, Stian Soiland-Reyes, Daniel Garijo, Yolanda Gil, Michael R. Crusoe, Kristian Peters, Daniel Schober. Data Intelligence, 2020, No. 1, pp. 108-121, 307-309 (17 pages).
Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products. They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance. These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right. This paper argues that FAIR principles for workflows need to address their specific nature in terms of their composition of executable software steps, their provenance, and their development.
Keywords: computational workflow, reproducibility, software, FAIR data, provenance
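As a loose illustration of workflows "creating metadata themselves during the processing of data" and "tracking and recording data provenance", the following sketch shows one possible way a single workflow step could emit a provenance record. The step name, field names, and JSON layout are assumptions made here for illustration; this is not the paper's recommended format.

```python
# Illustrative sketch (not from the paper): each step records its inputs,
# outputs, parameters, checksums and timestamps as a small JSON record.
import datetime
import hashlib
import json
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def run_step(name, func, inputs, outputs, params=None):
    """Run one workflow step and return a small provenance record."""
    params = params or {}
    in_meta = [{"path": str(p), "sha256": sha256(p)} for p in inputs]
    started = datetime.datetime.now(datetime.timezone.utc).isoformat()
    func(inputs, outputs, params)                  # the actual processing
    ended = datetime.datetime.now(datetime.timezone.utc).isoformat()
    out_meta = [{"path": str(p), "sha256": sha256(p)} for p in outputs]
    return {"step": name, "started": started, "ended": ended,
            "parameters": params, "inputs": in_meta, "outputs": out_meta}

def uppercase(inputs, outputs, params):
    # Toy "analysis": copy the input, upper-cased, to the output.
    outputs[0].write_text(inputs[0].read_text().upper())

if __name__ == "__main__":
    src, dst = Path("raw.txt"), Path("clean.txt")
    src.write_text("example record\n")
    prov = run_step("normalise-case", uppercase, [src], [dst])
    Path("provenance.json").write_text(json.dumps([prov], indent=2))
```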
3. Making Canonical Workflow Building Blocks Interoperable across Workflow Languages
Authors: Stian Soiland-Reyes, Genis Bayarri, Pau Andrio, Robin Long, Douglas Lowe, Ania Niewielska, Adam Hospital, Paul Groth. Data Intelligence (EI), 2022, No. 2, pp. 342-357 (16 pages).
We introduce the concept of Canonical Workflow Building Blocks (CWBB), a methodology of describing and wrapping computational tools, in order for them to be utilised in a reproducible manner from multiple workflow languages and execution platforms. The concept is implemented and demonstrated with the BioExcel Building Blocks library (BioBB), a collection of tool wrappers in the field of computational biomolecular simulation. Interoperability across different workflow languages is showcased through a protein Molecular Dynamics setup transversal workflow, built using this library and run with 5 different Workflow Manager Systems (WfMS). We argue such practice is a necessary requirement for FAIR Computational Workflows and an element of Canonical Workflow Frameworks for Research (CWFR) in order to improve widespread adoption and reuse of computational methods across workflow language barriers.
Keywords: scientific workflows, interoperable, FAIR, computational tools, containers, software packaging, FAIR Digital Object (FDO), BioExcel Building Blocks library (BioBB), Canonical Workflow Frameworks for Research (CWFR)
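The wrapping approach summarised above can be sketched, with heavy hedging, as a uniform "building block" interface: named file inputs and outputs plus a properties dictionary around a command-line tool, so the same block can be invoked from Python, CWL, Nextflow or another workflow language. This is not the actual BioBB API; the class names and the example GROMACS command are placeholders.

```python
# Loose sketch of the tool-wrapping pattern described in the abstract;
# not the real BioBB interface.
import subprocess
from pathlib import Path

class CanonicalBuildingBlock:
    """Base class: every block takes named input/output paths plus properties."""
    def __init__(self, inputs, outputs, properties=None):
        self.inputs = inputs          # dict of name -> Path
        self.outputs = outputs        # dict of name -> Path
        self.properties = properties or {}

    def build_command(self):
        raise NotImplementedError

    def launch(self):
        """Run the wrapped command-line tool and return its exit code."""
        return subprocess.run(self.build_command(), check=True).returncode

class EditConf(CanonicalBuildingBlock):
    """Example block wrapping a GROMACS 'gmx editconf' call (placeholder)."""
    def build_command(self):
        return ["gmx", "editconf",
                "-f", str(self.inputs["input_gro_path"]),
                "-o", str(self.outputs["output_gro_path"]),
                "-d", str(self.properties.get("distance_to_molecule", 1.0))]

if __name__ == "__main__":
    step = EditConf(inputs={"input_gro_path": Path("protein.gro")},
                    outputs={"output_gro_path": Path("protein_box.gro")},
                    properties={"distance_to_molecule": 1.2})
    # Print the command instead of launching, so the sketch runs without GROMACS.
    print(" ".join(step.build_command()))
```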
4. The Specimen Data Refinery: A Canonical Workflow Framework and FAIR Digital Object Approach to Speeding up Digital Mobilisation of Natural History Collections
Authors: Alex Hardisty, Paul Brack, Carole Goble, Laurence Livermore, Ben Scott, Quentin Groom, Stuart Owen, Stian Soiland-Reyes. Data Intelligence (EI), 2022, No. 2, pp. 320-341 (22 pages).
A key limiting factor in organising and using information from physical specimens curated in natural science collections is making that information computable, with institutional digitization tending to focus more on imaging the specimens themselves than on efficiently capturing computable data about them. Label data are traditionally manually transcribed today with high cost and low throughput, rendering such a task constrained for many collection-holding institutions at current funding levels. We show how computer vision, optical character recognition, handwriting recognition, named entity recognition and language translation technologies can be implemented into canonical workflow component libraries with findable, accessible, interoperable, and reusable (FAIR) characteristics. These libraries are being developed in a cloud-based workflow platform, the Specimen Data Refinery (SDR), founded on the Galaxy workflow engine, Common Workflow Language, Research Object Crates (RO-Crate) and WorkflowHub technologies. The SDR can be applied to specimens' labels and other artefacts, offering the prospect of greatly accelerated and more accurate data capture in computable form. Two kinds of FAIR Digital Objects (FDO) are created by packaging outputs of SDR workflows and workflow components as digital objects with metadata, a persistent identifier, and a specific type definition. The first kind of FDO are computable Digital Specimen (DS) objects that can be consumed/produced by workflows and other applications. A single DS is the input data structure submitted to a workflow that is modified by each workflow component in turn to produce a refined DS at the end. The Specimen Data Refinery provides a library of such components that can be used individually, or in series. To co-function, each library component describes the fields it requires from the DS and the fields it will in turn populate or enrich. The second kind of FDO, RO-Crates, gather and archive the diverse set of digital and real-world resources, configurations, and actions (the provenance) contributing to a unit of research work, allowing that work to be faithfully recorded and reproduced. Here we describe the Specimen Data Refinery with its motivating requirements, focusing on what is essential in the creation of canonical workflow component libraries and its conformance with the requirements of an emerging FDO Core Specification being developed by the FDO Forum.
Keywords: Digital Specimen, workflow, FAIR Digital Object, RO-Crate
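The Digital Specimen pattern described in this abstract, where each workflow component declares the DS fields it requires and the fields it populates or enriches, can be sketched as follows. This is not the SDR implementation; the field names, the OCR component, and the example identifier are hypothetical placeholders.

```python
# Illustrative sketch of the DS field-contract pattern, not the SDR code base.
from dataclasses import dataclass, field

@dataclass
class DigitalSpecimen:
    """Minimal stand-in for a computable Digital Specimen (DS) object."""
    pid: str                                   # persistent identifier
    fdo_type: str = "DigitalSpecimen"          # FDO type definition
    attributes: dict = field(default_factory=dict)

class Component:
    """A workflow component that declares its DS field contract."""
    requires = set()    # DS fields this component needs
    populates = set()   # DS fields this component fills in or enriches

    def run(self, ds):
        missing = self.requires.difference(ds.attributes)
        if missing:
            raise ValueError(f"{type(self).__name__} is missing DS fields: {missing}")
        return self.apply(ds)

    def apply(self, ds):
        raise NotImplementedError

class LabelOCR(Component):
    """Toy OCR step: requires a label image reference, populates its text."""
    requires = {"label_image_url"}
    populates = {"label_text"}

    def apply(self, ds):
        # A real component would call an OCR service here.
        ds.attributes["label_text"] = "<transcribed label text>"
        return ds

if __name__ == "__main__":
    ds = DigitalSpecimen(pid="https://example.org/ds/ABC123",
                         attributes={"label_image_url": "https://example.org/label.jpg"})
    for step in (LabelOCR(),):      # components can be applied in series
        ds = step.run(ds)
    print(ds.attributes)
```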