Funding: This study was supported by the National Natural Science Foundation of China under the project 'Research on the Dynamic Location of Receiver Points and Wave Field Separation Technology Based on Deep Learning in OBN Seismic Exploration' (No. 42074140).
Abstract: At present, the acquisition of seismic data is developing toward high-precision, high-density methods. However, complex natural environments and cultural factors in many exploration areas make uniform, dense acquisition difficult, so complete seismic data cannot always be collected in the field. Data reconstruction is therefore required during processing to ensure imaging accuracy. Deep learning, a rapidly developing field, offers clear advantages in feature extraction and modeling. In this study, a convolutional neural network (CNN) deep learning algorithm is applied to seismic data reconstruction. Based on the CNN algorithm and the characteristics of seismic data acquisition, two training strategies, supervised and unsupervised, are designed to reconstruct sparsely acquired seismic records. First, a supervised learning strategy is proposed for labeled data, wherein the complete seismic data are segmented as the input of the training set and randomly re-sampled before each training pass, thereby increasing the number of samples and the richness of features. Second, an unsupervised learning strategy based on large samples is proposed for unlabeled data, in which a rolling segmentation method updates the (pseudo-)labels and training parameters during training. Reconstruction tests on simulated and field data show that the CNN-based deep learning algorithm achieves better reconstruction quality and higher accuracy than compressed sensing based on the curvelet transform.
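The random re-sampling step of the supervised strategy can be sketched in a few lines. This is a minimal illustration in plain Python, not the authors' code; all names (`random_patches`, `record`) are hypothetical:

```python
import random

def random_patches(record, patch_h, patch_w, n_patches, seed=None):
    """Randomly crop training patches from a complete 2D shot record.

    Re-sampling a fresh set of patches before each training pass
    increases the number of samples and the richness of features,
    as in the supervised strategy described above.
    """
    rng = random.Random(seed)
    rows, cols = len(record), len(record[0])
    patches = []
    for _ in range(n_patches):
        r = rng.randrange(rows - patch_h + 1)  # random top-left corner
        c = rng.randrange(cols - patch_w + 1)
        patches.append([row[c:c + patch_w] for row in record[r:r + patch_h]])
    return patches

# toy "shot record": 8 traces x 8 time samples
record = [[float(i * 8 + j) for j in range(8)] for i in range(8)]
batch = random_patches(record, patch_h=4, patch_w=4, n_patches=16, seed=0)
```

In a real pipeline, each patch (or a decimated copy of it) would be fed to the CNN as a training sample, with the complete patch as the label.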
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41874146 and 42030103) and the Postgraduate Innovation Project of China University of Petroleum (East China) (No. YCX2021012).
Abstract: Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, and it is of profound significance for improving migration imaging quality, multiple suppression, and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low-dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low-dimensional manifold. Specifically, we use the dimension of the seismic patch manifold as a regularization term in the reconstruction problem and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial step of the proposed method is computing the dimension of the patch manifold. To this end, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints during reconstruction. Numerical experiments on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
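The idea that seismic patches concentrate on a low-dimensional set can be illustrated with a toy sketch: stack the overlapping patches of a section as vectors and estimate the rank of the resulting matrix. This is a simplified stand-in for the paper's low-rank approximation (pure Python, hypothetical names), not the LDMM algorithm itself:

```python
def extract_patches(data, p):
    """Collect all overlapping p x p patches of a 2D section as flat vectors."""
    rows, cols = len(data), len(data[0])
    return [
        [data[r + i][c + j] for i in range(p) for j in range(p)]
        for r in range(rows - p + 1)
        for c in range(cols - p + 1)
    ]

def numerical_rank(mat, tol=1e-9):
    """Rank via Gaussian elimination: a crude proxy for the dimension of
    the space the patch vectors span."""
    m = [row[:] for row in mat]
    n_rows, n_cols = len(m), len(m[0])
    rank = row = 0
    for col in range(n_cols):
        pivot = next((r for r in range(row, n_rows) if abs(m[r][col]) > tol), None)
        if pivot is None:
            continue
        m[row], m[pivot] = m[pivot], m[row]
        for r in range(n_rows):
            if r != row and abs(m[r][col]) > tol:
                f = m[r][col] / m[row][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[row])]
        row += 1
        rank += 1
        if row == n_rows:
            break
    return rank

# smooth toy "section": identical linearly ramped traces
section = [[float(c + 1) for c in range(6)] for _ in range(6)]
patches = extract_patches(section, p=3)  # 16 patches, each a 9-vector
```

Here the 9-dimensional patch vectors span only a 2-dimensional subspace, which is the kind of low dimensionality the LDMM regularizer exploits.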
Funding: Supported by the National Science and Technology Major Project (No. 2016ZX05024001003), the Innovation Consortium Project of China Petroleum, and Southwest Petroleum University (No. 2020CX010201).
Abstract: Oil and gas seismic exploration has to adopt irregular acquisition geometries to adapt to increasingly complex geological conditions and surface environments. However, irregular acquisition leaves gaps in the recorded data, which require high-precision regularization. In this paper, the sparsity of the signal in a transform domain, as used in compressed sensing theory, is exploited to recover the missing signal, involving sparse transform basis optimization and threshold modeling. First, the paper analyzes and compares the effects of six sparse transform bases on the reconstruction accuracy and efficiency of irregular seismic data and establishes the quantitative relationship between the sparse transform and reconstruction accuracy and efficiency. Second, an adaptive threshold modeling method based on the sparse coefficients is provided to improve reconstruction accuracy. Test results show that the method adapts well to different seismic data and sparse transform bases. An f-x domain reconstruction method using the effective frequency samples is studied to address the problem of low computational efficiency. A parallel computing strategy combining the curvelet transform with OpenMP is further proposed, which substantially improves computational efficiency while preserving reconstruction accuracy. Finally, field acquisition data are used to verify the proposed method. The results indicate that the proposed strategy can solve the regularization problem of irregular seismic data in production and improve the imaging quality of the target layer economically and efficiently.
Abstract: BACKGROUND Lutetium has been shown to be an important potential innovation in pre-treated metastatic castration-resistant prostate cancer. Two clinical trials have evaluated lutetium thus far (TheraP and VISION, with 99 and 385 patients, respectively), but their results are discordant. AIM To synthesize the available evidence on the effectiveness of lutetium in pre-treated metastatic castration-resistant prostate cancer, and to test the application of a new artificial intelligence technique that synthesizes effectiveness based on reconstructed patient-level data. METHODS We employed a new artificial intelligence method (the shiny method) to pool the survival data of these two trials and evaluate to what extent the lutetium cohorts differed from one another. The shiny technique employs an original reconstruction of individual patient data from the Kaplan-Meier curves. The progression-free survival graphs of the two lutetium cohorts were analyzed and compared. RESULTS The estimated hazard ratio favored the VISION trial; the difference was statistically significant (P < 0.001). These results indicate that further studies on lutetium are needed because the survival data of the two trials published thus far are conflicting. CONCLUSION Our study confirms the feasibility of reconstructing patient-level data from survival graphs in order to generate survival statistics.
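The core reconstruction idea can be shown in a heavily simplified sketch (not the authors' shiny implementation): given digitized Kaplan-Meier survival probabilities and the initial number at risk, interval event counts can be back-calculated, here under a no-censoring assumption:

```python
def events_from_km(surv, n0):
    """Back-calculate events per interval from digitized Kaplan-Meier
    survival probabilities, assuming no censoring.

    This is a toy simplification of individual-patient-data
    reconstruction from survival curves; real reconstructions also
    use the numbers-at-risk table to account for censoring.
    """
    events, n_risk, prev = [], n0, 1.0
    for s in surv:
        d = round(n_risk * (1.0 - s / prev))  # KM step: S_i = S_{i-1}(1 - d/n)
        events.append(d)
        n_risk -= d
        prev = s
    return events

# survival drops 1.0 -> 0.8 -> 0.6 in a cohort of 100 patients
counts = events_from_km([0.8, 0.6], 100)
```

Once per-patient event times are reconstructed this way for both cohorts, standard survival statistics (e.g., a hazard ratio) can be computed on the pooled data.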
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA19030402), the Key Special Projects for International Cooperation in Science and Technology Innovation between Governments (Grant No. 2017YFE0133600), and the Beijing Municipal Natural Science Foundation Youth Project 8214066: Application Research of Beijing Road Visibility Prediction Based on Machine Learning Methods.
Abstract: We propose a novel machine learning approach to reconstruct meshless surface wind speed fields, i.e., to reconstruct the surface wind speed at any location, based on meteorological background fields and geographical information. The random forest method is selected to develop the machine learning data reconstruction model (MLDRM-RF) for wind speeds over Beijing during 2015-19. We use temporal, geospatial-attribute, and meteorological background-field features as inputs. The wind speed field can be reconstructed at any station in the region not used in the training process, allowing cross-validation of model performance. The evaluation considers the spatial distribution of, and seasonal variations in, the root mean squared error (RMSE) of the reconstructed wind speed field across Beijing. The average RMSE is 1.09 m s^(-1), considerably smaller than the result (1.29 m s^(-1)) obtained with inverse distance weighting (IDW) interpolation. Finally, we extract the important feature permutations by the method of mean decrease in impurity (MDI) and discuss the reasonableness of the model predictions. MLDRM-RF is a reasonable approach with excellent potential for the improved reconstruction of historical surface wind speed fields at arbitrary grid resolutions. Such a model is needed in many wind applications, such as wind energy and aviation safety assessments.
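The IDW baseline against which MLDRM-RF is compared can be sketched as follows; this is a generic inverse-distance-weighting interpolator (names and the power parameter are illustrative, not taken from the paper):

```python
def idw(x, y, stations, power=2):
    """Inverse distance weighting: estimate the value at (x, y) as a
    weighted mean of station values, with weights 1 / d**power."""
    num = den = 0.0
    for sx, sy, val in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return val  # query point coincides with a station
        w = d2 ** (-power / 2)
        num += w * val
        den += w
    return num / den

# two toy stations with wind speeds 2.0 and 4.0 m/s
stations = [(0.0, 0.0, 2.0), (1.0, 0.0, 4.0)]
at_station = idw(0.0, 0.0, stations)   # exact station value
midpoint = idw(0.5, 0.0, stations)     # equal weights -> the mean
```

Unlike IDW, the random forest model can also use non-spatial predictors (time, terrain attributes, background fields), which is what the RMSE comparison above measures.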
Funding: The first author thanks the Brazilian National Council for Scientific and Technological Development for the post-doctoral scholarship (155814/2018-4).
Abstract: Streamflow forecasting in drylands is challenging. Data are scarce, catchments are highly human-modified, and streamflow exhibits strong nonlinear responses to rainfall. The goal of this study was to evaluate monthly and seasonal streamflow forecasting in two large catchments of the Jaguaribe River Basin in the Brazilian semi-arid region. We adopted four lead times: one month ahead at the monthly scale, and two, three, and four months ahead at the seasonal scale. Gaps in the historical streamflow series were filled using rainfall-runoff modelling. Time series modelling techniques were then applied: the locally constant and locally averaged predictors, the k-nearest-neighbours algorithm (k-NN), and the autoregressive (AR) model. The reliability criterion for the validation results is that the forecast must be more skillful than streamflow climatology. Our approach outperformed the streamflow climatology for all monthly streamflows; on average, the former was 25% better than the latter. The seasonal streamflow forecasting (SSF) was also reliable (on average, 20% better than the climatology), failing slightly only for the high-flow season of one catchment (6% worse than the climatology). Considering an uncertainty envelope (probabilistic forecasting), which was considerably narrower than the data standard deviation, the streamflow forecasting performance increased by about 50% at both scales. The forecast errors were mainly driven by the streamflow intra-seasonality at the monthly scale and by the forecast lead time at the seasonal scale. The best-fit and worst-fit time series models were the k-NN approach and the AR model, respectively. The rainfall-runoff modelling outputs played an important role in improving streamflow forecasting for one streamgauge with 35% of data gaps. The developed data-driven approach is mathematically and computationally very simple, demands few resources for operational implementation, and is applicable to other dryland watersheds. Our findings may become part of drought forecasting systems and potentially help allocate water months in advance. Moreover, the developed strategy can serve as a baseline for more complex streamflow forecast systems.
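The best-performing technique, k-NN, can be sketched as an analogue forecast: find the k past windows most similar to the latest one and average their successors. This is a generic illustration, not the authors' code:

```python
def knn_forecast(series, window, k):
    """k-nearest-neighbours analogue forecast for the next value.

    Compares the latest `window` values against every historical window
    whose successor is known, and averages the successors of the k
    closest matches (squared Euclidean distance).
    """
    query = series[-window:]
    candidates = []
    for i in range(len(series) - window):  # windows with a known successor
        w = series[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(w, query))
        candidates.append((dist, series[i + window]))
    candidates.sort(key=lambda t: t[0])
    best = candidates[:k]
    return sum(s for _, s in best) / len(best)

# a toy periodic "streamflow" series: the pattern 1, 2, 3 repeats
series = [1.0, 2.0, 3.0, 1.0, 2.0, 3.0, 1.0, 2.0]
next_value = knn_forecast(series, window=2, k=1)
```

For a perfectly periodic series the nearest analogue is exact; on real streamflow the averaging over k neighbours smooths out noise in the analogues.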
Funding: Supported by the National “973” Program of China (No. 2013CB733100) and the National Natural Science Foundation of China (No. 61008033).
Abstract: A method based on slope stitching for the measurement of a large off-axis parabolic trough collector is proposed and applied to the surface shape reconstructed from gradient data acquired with the reverse Hartmann test. The entire reflector is divided into three sections with overlapping zones along the concentration direction. A mathematical model for the slope stitching algorithm is developed. An improved reconstruction method, combining iterative fitting of Zernike slope polynomials with the Southwell integration algorithm, is used to recover the real three-dimensional (3D) shape of the collector. The efficiency and validity of the improved reconstruction method and the stitching algorithm are experimentally verified.
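Slope-to-shape integration can be illustrated in one dimension. The sketch below is a 1-D trapezoidal simplification of zonal (Southwell-type) integration, not the paper's 3-D algorithm with Zernike slope fitting:

```python
def integrate_slopes(slopes, dx):
    """Recover a 1-D height profile from measured slopes by trapezoidal
    integration, anchored at height 0 at the first sample.

    Zonal methods such as Southwell's solve the analogous 2-D problem
    as a least-squares system over a grid of slope measurements.
    """
    h = [0.0]
    for i in range(1, len(slopes)):
        h.append(h[-1] + 0.5 * (slopes[i - 1] + slopes[i]) * dx)
    return h

# parabola h(x) = x^2 has slope 2x; sample at x = 0..4 with dx = 1
slopes = [0.0, 2.0, 4.0, 6.0, 8.0]
profile = integrate_slopes(slopes, dx=1.0)
```

Because the trapezoidal rule is exact for linear integrands, the parabolic profile is recovered exactly here; real gradient data additionally require stitching and least-squares fitting to suppress measurement noise.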
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2020YFB1711101) and the Anhui Provincial University Natural Science Foundation Key Project (Grant No. KJ2019A127).
Abstract: An online model was proposed to identify the reasons behind changes in the energy consumption of the reheating furnace of a steel processing plant. The heat conversion of the furnace was analyzed and integrated with the furnace's fuel consumption to obtain a model of the energy consumption. Combined with the mechanism analysis, the basic parameters affecting energy consumption were determined, and four key influencing factors were obtained: furnace output, furnace charging temperature, furnace tapping temperature, and steel type. A specific method for calculating the contribution of each influencing factor was derived to define the conditions of the baseline energy consumption, while online data were used to calculate the energy value and the actual performance value of the baseline energy consumption. The contribution of each influencing factor was determined through normalization. A cloud platform was used for database reconstruction and programming to realize online intelligent evaluation of the energy consumption of the reheating furnace. Finally, a case study evaluating the practical energy consumption of a steel rolling furnace in a steel plant was presented. The intelligent evaluation results were quantified and displayed online, and the system's ability to reduce production line energy consumption was demonstrated.
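The normalization step for factor contributions can be sketched as follows. This is a generic illustration under the assumption that each factor's deviation from baseline has already been converted to an energy delta; the factor names mirror the four factors above but the numbers are hypothetical:

```python
def contribution_shares(deltas):
    """Normalize each factor's absolute energy deviation to a share of
    the total, so contributions can be ranked and displayed online."""
    total = sum(abs(d) for d in deltas.values())
    if total == 0.0:
        return {k: 0.0 for k in deltas}
    return {k: abs(d) / total for k, d in deltas.items()}

# hypothetical per-factor deviations from baseline energy consumption
deltas = {
    "furnace_output": 3.0,
    "charging_temperature": -1.0,
    "tapping_temperature": 1.0,
    "steel_type": -5.0,
}
shares = contribution_shares(deltas)
```

The signs indicate whether a factor raised or lowered consumption; the normalized magnitudes identify which factor dominates the change.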