Using a GIS spatial statistical analysis method, with ArcGIS software as the analysis tool, and taking the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution characteristics of the diseased crops were analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and downstream of main rivers. The correlation between adjacent diseased plots was weak, so infection by pests and diseases was excluded, and the major cause of incidence was likely river pollution.
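The claim that adjacent diseased plots are weakly correlated rests on a spatial autocorrelation statistic of the kind ArcGIS computes. As a rough illustration only (the abstract does not name the statistic used, and all data below are invented), global Moran's I over a plot adjacency matrix captures the idea:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a 1-D array of plot values and an
    n x n spatial-weights matrix (0 = not adjacent)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()
    num = n * (z @ w @ z)
    den = w.sum() * (z @ z)
    return num / den

# Four plots in a line with an alternating diseased/healthy pattern:
# perfect negative spatial autocorrelation for this toy adjacency.
vals = [1.0, 0.0, 1.0, 0.0]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i(vals, w))  # -1.0
```

Values near 0 indicate the weak neighbor-to-neighbor correlation reported above; values near +1 would instead suggest contagious spread.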
Some geophysical parameters, such as those related to gravitation and the geomagnetic field, could change during solar eclipses. In order to observe geomagnetic fluctuations, geomagnetic measurements were carried out in a limited time frame during the partial solar eclipse that occurred on 2011 January 4 and was observed in Canakkale and Ankara, Turkey. Additionally, records of the geomagnetic field spanning 24 hours, obtained from another observatory (in Iznik, Turkey), were also analyzed to check for any peculiar variations. In the data processing stage, a polynomial fit, following the application of a running average routine, was applied to the geomagnetic field data sets. The data sets indicated a characteristic decrease at the beginning of the solar eclipse, and this decrease correlates well with previous geomagnetic field measurements taken during the total solar eclipse observed in Turkey on 2006 March 29. The behavior of the geomagnetic field is also consistent with previous observations in the literature. These analyses suggest that eclipses can exert a shielding effect on the geomagnetic field of the Earth.
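The two-step processing described (running average, then polynomial fit, then inspection of the residual for eclipse-related departures) can be sketched in a few lines. The cadence, window length, polynomial order and field values below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical 1-minute geomagnetic field samples (nT): slow drift plus fluctuation.
t = np.arange(240, dtype=float)                  # minutes since start
field = 47000.0 + 0.01 * t + np.sin(t / 15.0)

# Step 1: running average (simple boxcar) to suppress short-period noise.
win = 11
kernel = np.ones(win) / win
smoothed = np.convolve(field, kernel, mode="valid")
t_s = t[win // 2 : win // 2 + smoothed.size]

# Step 2: low-order polynomial fit to the smoothed series; the residual
# is what one would inspect for an eclipse-onset decrease.
coeffs = np.polyfit(t_s, smoothed, deg=3)
trend = np.polyval(coeffs, t_s)
residual = smoothed - trend
print(residual.size)
```

The order of operations matters: smoothing first keeps high-frequency noise from leaking into the fitted trend.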
Noise is a significant component of any millimeter-wave molecular line datacube. Analyzing the noise improves our understanding of its characteristics and further contributes to scientific discoveries. We measured the noise level of a single datacube from MWISP and performed statistical analyses. We identified the major factors that increase the noise level of a single datacube, including bad channels, edge effects, baseline distortion and line contamination. Cleaning algorithms were applied to remove or reduce these noise components. As a result, we obtained a cleaned datacube in which the noise follows a positively skewed normal distribution. We further analyzed the noise structure of a 3D mosaicked datacube in the range l = 40.7° to 43.3° and b = −2.3° to 0.3°, and found that noise in the final mosaicked datacube is mainly characterized by noise fluctuation among the cells.
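Positive skew of the cleaned noise distribution can be checked with the third standardized moment. The sketch below uses a synthetic right-skewed sample (a lognormal draw) purely as a stand-in for per-voxel noise estimates, not MWISP data:

```python
import numpy as np

def sample_skewness(x):
    """Third standardized moment: > 0 indicates a positively skewed
    distribution, as reported for the cleaned datacube's noise."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    return (z**3).mean() / (z**2).mean() ** 1.5

rng = np.random.default_rng(0)
noise = rng.lognormal(mean=0.0, sigma=0.5, size=5000)
print(sample_skewness(noise) > 0)  # True
```

A symmetric Gaussian sample would give a value near zero, so this single number is a quick sanity check that the cleaning left the expected residual distribution.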
We introduce a decision tree method called Random Forests for multiwavelength data classification. The data were adopted from different databases, including the Sloan Digital Sky Survey (SDSS) Data Release 5, USNO, FIRST and ROSAT. We then studied the discrimination of quasars from stars and the classification of quasars, stars and galaxies, using samples from the optical and radio bands and from the optical and X-ray bands. Moreover, feature selection and feature weighting based on Random Forests were investigated, and the performances based on different input patterns were compared. The experimental results show that Random Forests is an effective method for astronomical object classification and can be applied to other classification problems faced in astronomy. In addition, Random Forests offers further advantages through its built-in capabilities, e.g. feature selection, feature weighting and outlier detection.
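A minimal Random Forests classification with the "free" feature weighting the abstract mentions can be sketched with scikit-learn. The synthetic features below merely stand in for multiwavelength colors and flags; the paper's actual samples and implementation are not reproduced:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for multiwavelength features (optical colors, radio/X-ray measures).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
# Feature weighting falls out of the fit for free:
print(clf.feature_importances_.argsort()[::-1][:3])  # indices of top-3 features
```

The impurity-based `feature_importances_` vector is what makes forests attractive for feature selection: uninformative bands can be dropped before retraining.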
The Chinese Space Station Telescope (CSST) spectroscopic survey aims to deliver high-quality low-resolution (R > 200) slitless spectra for hundreds of millions of targets down to a limiting magnitude of about 21 mag, distributed within a large survey area (17,500 deg^2) and covering a wide wavelength range (255–1000 nm in three bands: GU, GV and GI). As slitless spectroscopy precludes the use of wavelength calibration lamps, wavelength calibration is one of the most challenging issues in the reduction of slitless spectra, yet it plays a key role in measuring precise radial velocities of stars and redshifts of galaxies. In this work, we propose a star-based method that can monitor and correct for possible errors in the CSST wavelength calibration using normal scientific observations, taking advantage of the facts that (i) about ten million stars with reliable radial velocities are now available thanks to spectroscopic surveys like LAMOST, (ii) the large field of view of CSST enables efficient observations of such stars in a short period of time, and (iii) radial velocities of such stars can be reliably measured using only a narrow segment of a CSST spectrum. We demonstrate that it is possible to achieve a wavelength calibration precision of a few km s^(-1) for the GU band, and about 10 to 20 km s^(-1) for the GV and GI bands, with only a few hundred velocity standard stars. Applications of the method to other surveys are also discussed.
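The link between a radial-velocity residual on a standard star and a wavelength-calibration error is the Doppler relation dλ/λ = v/c. A small helper (the function name is ours) makes the scaling concrete:

```python
# Velocity standards with known RVs translate an RV residual into a
# wavelength-calibration error via dlambda / lambda = dv / c.
C_KM_S = 299792.458  # speed of light, km/s

def rv_residual_to_wavelength_error(dv_km_s, wavelength_nm):
    """Wavelength offset (nm) implied by an RV residual dv at a given wavelength."""
    return wavelength_nm * dv_km_s / C_KM_S

# A few-km/s calibration error near 300 nm (GU band) corresponds to only
# a few thousandths of a nanometer.
print(rv_residual_to_wavelength_error(5.0, 300.0))
```

Averaging such residuals over a few hundred standards beats the single-star RV precision down, which is why so few calibrators suffice.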
PLS (Partial Least Squares) regression is introduced for the automatic estimation of fundamental stellar spectral parameters. It extracts the spectral components most correlated with the parameters (Teff, log g and [Fe/H]) and sets up a linear regression function from spectra to the corresponding parameters. Considering the properties of stellar spectra and of the PLS algorithm, we present a piecewise PLS regression method for the estimation of stellar parameters, composed of one PLS model for Teff and seven PLS models for log g and [Fe/H] estimation. Its performance is investigated in extensive experiments on flux-calibrated and continuum-normalized spectra at different signal-to-noise ratios (SNRs) and resolutions. The results show that the piecewise PLS method is robust for spectra at the medium resolution of 0.23 nm. For low-resolution 0.5 nm and 1 nm spectra, it achieves competitive results at higher SNR. Experiments using ELODIE spectra at 0.23 nm resolution illustrate that our piecewise PLS models trained with MILES spectra are efficient for O–G stars: for flux-calibrated spectra, the systematic offsets are 3.8%, 0.14 dex, and −0.09 dex for Teff, log g and [Fe/H], with error scatters of 5.2%, 0.44 dex and 0.38 dex, respectively; for continuum-normalized spectra, the systematic offsets are 3.8%, 0.12 dex, and −0.13 dex, with error scatters of 5.2%, 0.49 dex and 0.41 dex, respectively. The PLS method is rapid, easy to use, and does not rely as strongly on a tight parameter grid of templates to reach high precision as Artificial Neural Networks or minimum distance methods do.
Predicting the seeing of astronomical observations provides hints of optical imaging quality in the near future and facilitates flexible scheduling of observation tasks to maximize the use of astronomical observatories. Traditional approaches to seeing prediction mostly rely on regional weather models to capture in-dome optical turbulence patterns. Thanks to the development of data gathering and aggregation facilities at astronomical observatories in recent years, data-driven approaches are becoming increasingly feasible and attractive for predicting astronomical seeing. This paper systematically investigates data-driven approaches to seeing prediction by leveraging various big data techniques, from traditional statistical modeling and machine learning to emerging deep learning methods, on the monitoring data of the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST). The raw monitoring data are preprocessed to allow for big data modeling. We then formulate the seeing prediction task under each type of modeling framework and develop prediction models using representative big data techniques, including ARIMA and Prophet for statistical modeling, MLP and XGBoost for machine learning, and LSTM, GRU and Transformer for deep learning. We perform empirical studies on the developed models with a variety of feature configurations, yielding notable insights into the applicability of big data techniques to the seeing prediction task.
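The time-series formulation behind the statistical baselines can be illustrated with something even simpler than ARIMA: a least-squares AR(1) fit, x[t+1] ≈ a·x[t] + b. The "seeing" series below is synthetic and mean-reverting; none of the coefficients come from LAMOST data:

```python
import numpy as np

def fit_ar1(series):
    """Least-squares AR(1) fit x[t+1] ~ a*x[t] + b; a deliberately simple
    baseline against which ARIMA/LSTM-style models can be judged."""
    x = np.asarray(series, dtype=float)
    A = np.column_stack([x[:-1], np.ones(x.size - 1)])
    (a, b), *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    return a, b

rng = np.random.default_rng(1)
# Synthetic seeing series: mean-reverting around 1.2 arcsec with a = 0.8.
x = np.empty(500)
x[0] = 1.2
for t in range(499):
    x[t + 1] = 1.2 + 0.8 * (x[t] - 1.2) + 0.05 * rng.normal()

a, b = fit_ar1(x)
print(round(a, 2))
```

If such a one-parameter persistence model already predicts well, the extra machinery of deep sequence models must justify itself against it, which is exactly the kind of comparison the paper's empirical study performs.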
The extraction of high-temperature regions in active regions (ARs) is an important means of understanding the mechanism of coronal heating. The main observational tracer of high-temperature emission in ARs is the Fe XVIII emission line in the 94 Å channel of the Atmospheric Imaging Assembly. However, the diagnostic algorithms for Fe XVIII, including the differential emission measure (DEM) and the DEM-based linear diagnostics proposed by Del, have long been limited, and their results differ from predictions. In this paper, building on earlier work, we use an outlier detection approach to establish the nonlinear correlation between the 94 Å channel and the 171, 193 and 211 Å channels. A neural network based on the 171, 193 and 211 Å channels is constructed to model the low-temperature emission in the ARs of the 94 Å channel. The predicted values are regarded as the low-temperature component of 94 Å and are subtracted from the observed 94 Å emission to obtain its outlier component, i.e., Fe XVIII. The outlier components obtained by the neural network are then compared with the Fe XVIII obtained by the DEM and by Del's method; a high degree of similarity is found, demonstrating the reliability of the neural network for extracting the high-temperature components of ARs, though many differences remain. To analyze the differences among the three methods, we subtract the Fe XVIII obtained by the DEM and by Del's method from that obtained by the neural network, and compare the residuals with Fe XIV emission in the temperature range 6.1–6.45 MK. A strong similarity is found, which shows that the Fe XVIII obtained by the DEM and Del's method still contains a large low-temperature component dominated by Fe XIV, whereas the Fe XVIII obtained by the neural network is relatively pure.
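The predict-and-subtract scheme can be sketched with a plain linear model standing in for the neural network (an assumption for illustration: the network's only role here is to predict the cool 94 Å signal from the cooler channels). All images below are synthetic:

```python
import numpy as np

def hot_component(i94, i171, i193, i211):
    """Fit the 'cool' part of the 94 A image as a linear combination of
    the cooler channels, then return the clipped positive residual.
    A linear stand-in for the neural-network mapping described above."""
    X = np.column_stack([i171.ravel(), i193.ravel(), i211.ravel(),
                         np.ones(i94.size)])
    coef, *_ = np.linalg.lstsq(X, i94.ravel(), rcond=None)
    cool = (X @ coef).reshape(i94.shape)
    return np.clip(i94 - cool, 0.0, None)

rng = np.random.default_rng(0)
shape = (64, 64)
i171, i193, i211 = (rng.gamma(2.0, 50.0, shape) for _ in range(3))
# Hypothetical hot patch added on top of a cool-dominated 94 A image.
i94 = 0.01 * i171 + 0.02 * i193 + 0.005 * i211
i94[20:30, 20:30] += 50.0
hot = hot_component(i94, i171, i193, i211)
print(hot[22:28, 22:28].mean() > hot[:10, :10].mean())  # True
```

The residual is large only where the hot patch was injected, which is the behavior the outlier-component (Fe XVIII) extraction relies on.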
In pulsar astronomy, detecting effective pulsar signals among numerous pulsar candidates is an important research topic. Starting from space-based X-ray pulsar signals, we propose the two-dimensional autocorrelation profile map (2D-APM) feature modelling method, which applies epoch folding to the autocorrelation function of X-ray signals and expands the time-domain information along the period axis. A uniform criterion for setting the time resolution of the period axis handles pulsar signals without any prior information. Compared with the traditional profile, the model has strong noise resistance, richer information and consistent characteristics. The new feature is simulated with double Gaussian components, and the characteristic distribution of the model is revealed to be closely related to the distance between the double peaks of the profile. Next, a deep convolutional neural network (DCNN) named Inception-ResNet is built. According to the order of the peak separation and the number of arriving photons, 30 data sets based on the Poisson process are simulated to construct the training set, and observations of PSRs B0531+21, B0540−69 and B1509−58 from the Rossi X-ray Timing Explorer (RXTE) are selected to generate the test set. The training and test sets contain 30,000 and 5400 samples, respectively. After reaching stable convergence, more than 99% of the pulsar signals are recognized and more than 99% of the interference is successfully rejected, which verifies the strong agreement between the network and the feature model and the high potential of the proposed method in searching for pulsars.
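The epoch-folding step at the heart of the 2D-APM feature can be sketched as follows. For simplicity this folds a raw photon-count series rather than its autocorrelation function, on invented toy data with a known period:

```python
import numpy as np

def epoch_fold(values, period, n_bins):
    """Fold a regularly sampled series on a trial period (in samples)
    and average the values within each phase bin."""
    t = np.arange(values.size)
    phase = (t % period) / period
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    profile = np.zeros(n_bins)
    for b in range(n_bins):
        profile[b] = values[bins == b].mean()
    return profile

# Toy pulsed signal: Poisson background, period of 50 samples, with extra
# counts concentrated in the first tenth of each cycle.
rng = np.random.default_rng(0)
x = rng.poisson(2.0, 5000).astype(float)
x[np.arange(5000) % 50 < 5] += 5.0
prof = epoch_fold(x, period=50, n_bins=10)
print(prof.argmax())  # 0 (the pulse phase)
```

Repeating the fold over a grid of trial periods, one profile per row, is what builds the two-dimensional map that the DCNN then classifies.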
Stellar classification and radius estimation are crucial for understanding the structure of the Universe and stellar evolution. With the advent of the era of astronomical big data, multimodal data are available and theoretically effective for stellar classification and radius estimation. The problem is how to improve performance on these tasks by jointly using multimodal data; existing research primarily uses single-modal data. To this end, this paper proposes a model, Multi-Modal SCNet, and its ensemble model, Multimodal Ensemble for Stellar Classification and Regression (MESCR), for improving stellar classification and radius estimation performance by fusing two modalities of data. A typical phenomenon in this problem is that some types of stars have evidently more samples than others, and this imbalance has negative effects on model performance. Therefore, this work utilizes a weighted sampling strategy to deal with the imbalance issue in MESCR. Evaluation experiments on a test set give a classification accuracy of 96.1% for MESCR, and the radius estimation performance in Mean Absolute Error and σ is 0.084 dex and 0.149 R_⊙, respectively. Moreover, we assessed the uncertainty of model predictions, confirming good consistency within a reasonable deviation range. Finally, we applied our model to 50,871,534 SDSS stars without spectra and published a new catalog.
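A common form of weighted sampling for class imbalance draws samples with probability inversely proportional to class frequency, so rare stellar types appear as often as common ones in expectation. The sketch below shows that idea on invented labels; the paper's exact weighting scheme is not specified here:

```python
import numpy as np

def inverse_frequency_weights(labels):
    """Per-sample sampling probabilities proportional to 1 / class frequency."""
    labels = np.asarray(labels)
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts))
    w = np.array([1.0 / freq[l] for l in labels])
    return w / w.sum()

# 90 common G stars, 9 M stars, a single rare O star.
y = np.array(["G"] * 90 + ["M"] * 9 + ["O"] * 1)
w = inverse_frequency_weights(y)
# Each class now carries equal total probability mass (1/3 each).
print(round(w[y == "O"].sum(), 3))  # 0.333
```

Passing such weights to a sampler (e.g. a weighted random sampler in a training loop) equalizes the classes without discarding any majority-class data.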
The Yellow River Basin of China is a key region containing myriad interactions between human activities and the natural environment. Industrialization and urbanization promote socio-economic development, but they have also generated a series of environmental and ecological issues in this basin. Previous research has evaluated urban resilience at the national, regional, urban agglomeration, city, and prefecture levels, but not at the watershed level. To address this research gap and elevate the urban resilience level of the Yellow River Basin, we constructed an urban resilience evaluation index system from five dimensions: industrial resilience, social resilience, environmental resilience, technological resilience, and organizational resilience. The entropy weight method was used to comprehensively evaluate urban resilience in the Yellow River Basin. The exploratory spatial data analysis method was employed to study the spatiotemporal differences in urban resilience in the Yellow River Basin in 2010, 2015, and 2020. Furthermore, grey correlation analysis was utilized to explore the factors influencing these differences. The results are as follows: (1) the overall level of urban resilience in the Yellow River Basin was relatively low but showed an increasing trend during 2010–2015, and significant spatial differences were observed, with a higher resilience level in the eastern region and a low-to-medium resilience level in the western region; (2) the differences among resilience dimensions were noticeable, with industrial resilience and social resilience being relatively highly developed, whereas organizational resilience and environmental resilience were relatively weak; and (3) the correlation ranking of resilience influencing factors was as follows: science and technology level > administrative power > openness > market forces. This research can provide a basis for improving the resilience level of cities in the Yellow River Basin and contribute to the high-quality development of the region.
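The entropy weight method used above has a compact closed form: indicators whose values vary more across cities carry more information (lower entropy) and receive larger weights. A sketch on an invented city-by-indicator matrix:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (alternatives x indicators) matrix of
    positive indicator values: lower-entropy (more discriminating)
    indicators receive higher weight."""
    P = X / X.sum(axis=0)                      # column-normalize to shares
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)    # normalized entropy per indicator
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()

# Toy matrix: 4 cities x 3 indicators; only the third varies substantially.
X = np.array([[10.0, 20.0,  1.0],
              [10.0, 21.0,  5.0],
              [10.0, 19.0, 20.0],
              [10.0, 20.0, 80.0]])
w = entropy_weights(X)
print(w.argmax())  # 2: the most discriminating indicator gets the top weight
```

In practice the raw indicators are first normalized (and negative indicators inverted) before this step; that preprocessing is omitted here.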
It is well known that nonparametric estimation of the regression function is highly sensitive to the presence of even a small proportion of outliers in the data. To handle atypical observations when the covariates of the nonparametric component are functional, robust estimates for the regression parameter and regression operator are introduced. The main purpose of the paper is to consider data-driven methods for selecting the number of neighbors, in order to make the proposed procedures fully automatic. We use the k-Nearest Neighbors (kNN) procedure to construct the kernel estimator of the proposed robust model. Under some regularity conditions, we state consistency results for the kNN functional estimators, which are uniform in the number of neighbors (UINN). Furthermore, a simulation study and an empirical application to real data on octane gasoline predictions are carried out to illustrate the higher predictive performance and usefulness of the kNN approach.
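The kNN building block of such estimators can be sketched in plain NumPy. The robust M-estimation step and the data-driven choice of k are deliberately omitted; this shows only the neighbor search and local averaging on toy "curves":

```python
import numpy as np

def knn_regress(X_train, y_train, x_new, k):
    """Plain kNN regression: average the responses of the k training
    curves closest to x_new in L2 distance. (A robust version would
    replace this mean by an M-estimator; the neighbor search is the same.)"""
    d = np.linalg.norm(X_train - x_new, axis=1)
    idx = np.argsort(d)[:k]
    return y_train[idx].mean()

rng = np.random.default_rng(0)
# Toy functional covariates: 200 "curves", each discretized on 30 points.
X = rng.normal(size=(200, 30))
y = X[:, :5].sum(axis=1)          # response driven by the first 5 points
x0 = X[0]
# With k = 1 the nearest curve is x0 itself, so y[0] is recovered exactly.
print(knn_regress(X, y, x0, k=1) == y[0])  # True
```

Treating each discretized curve as a vector is the standard computational shortcut for functional covariates; only the distance (here L2) need change for other functional semi-metrics.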
Spectrum denoising is an important procedure for large-scale spectroscopic surveys. This work proposes a novel stellar spectrum denoising method based on deep Bayesian modeling. Our model comprises a prior distribution for each stellar subclass, a spectrum generator and a flow-based noise model. The method takes the noise correlation structure into account and is not susceptible to strong sky emission lines or cosmic rays. Moreover, it naturally handles spectra with missing flux values without ad hoc imputation. The proposed method is evaluated on real stellar spectra from the Sloan Digital Sky Survey (SDSS), covering a comprehensive list of common stellar subclasses, and compared to a standard denoising auto-encoder. Our method demonstrates superior performance to the standard denoising auto-encoder in both denoising quality and missing flux imputation, and may help improve the accuracy of stellar classification and physical parameter measurement when applied during data preprocessing.
Any change in the technical or environmental conditions of observation may bias observed climatic variables away from their precise values. The common name for these biases is inhomogeneity (IH). IHs usually appear as sudden shifts or gradual trends in the time series of a variable, and the timing of a shift indicates the date of change in the conditions of observation. The seasonal cycle of radiation intensity often causes a marked seasonal cycle in the IHs of observed temperature series, since a substantial portion of them is directly or indirectly connected to radiation changes in the micro-environment of the thermometer. Therefore, the magnitudes of temperature IHs tend to be larger in summer than in winter. A new homogenisation method (ACMANT) has recently been developed that treats the seasonal changes of IH sizes in temperature time series in a special way. ACMANT is a further development of the Caussinus-Mestre method, one of the most effective of the known homogenisation methods. ACMANT applies a bivariate test to search for the timings of IHs; the two variables are the annual mean temperature and the amplitude of the seasonal temperature cycle. ACMANT contains several further innovations whose efficiencies are tested with the benchmark of the COST ES0601 project. The paper describes the properties and operation of ACMANT and presents some verification results, which show that ACMANT has outstandingly high performance. ACMANT is recommended for homogenising networks of monthly temperature series observed at mid or high geographical latitudes, because the harmonic seasonal cycle of IH size is valid only for these series.
Forbush decrease (FD), discovered by Scott E. Forbush about 80 years ago, refers to the non-repetitive short-term depression in Galactic cosmic ray (GCR) flux, presumed to be associated with large-scale perturbations in the solar wind and interplanetary magnetic field (IMF). It is the most spectacular variability in GCR intensity and serves as a compass for investigators seeking solar-terrestrial relationships. The method of selection and validation of FD events is very important to cosmic ray (CR) scientists. We have deployed new computer software to determine the amplitude and timing of FDs from daily-averaged CR data at the Oulu Neutron Monitor station. The code selected 230 FDs between 1998 and 2002. To validate the new automated FD catalog, the relationship between FD amplitude and the IMF, solar wind speed (SWS) and geomagnetic storm indices (Dst, Kp, ap) is tested here. A two-dimensional regression analysis indicates a significant linear relationship between large FDs (CR(%) ≤ −3) and both solar wind data and geomagnetic storm indices in the present sample. The implications of the relationships among these parameters are discussed.
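A much-simplified FD selector conveys the idea behind such catalog software: flag days whose percentage deviation from a running baseline drops below a threshold. The baseline choice (a 27-day boxcar, matching the solar rotation period) and all data below are our illustrative assumptions, not the actual algorithm of the deployed code:

```python
import numpy as np

def find_forbush_decreases(daily_counts, threshold=-3.0):
    """Indices of days whose percentage deviation from a 27-day running
    baseline falls below `threshold` (in percent)."""
    x = np.asarray(daily_counts, dtype=float)
    base = np.convolve(x, np.ones(27) / 27, mode="same")
    dev = 100.0 * (x - base) / base
    return np.flatnonzero(dev <= threshold)

# Synthetic neutron-monitor series with one sharp 5% depression at day 100.
rate = np.full(200, 6400.0)
rate[100:104] *= 0.95
events = find_forbush_decreases(rate)
print(events)  # the four depressed days
```

A production selector would additionally merge consecutive flagged days into single events and record onset time and amplitude for each.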
Six high-resolution TiO-band image sequences from the New Vacuum Solar Telescope (NVST) are used to investigate the properties of intergranular bright points (igBPs). We detect the igBPs using a Laplacian and morphological dilation algorithm (LMD) and automatically track them using a three-dimensional segmentation algorithm, and then investigate the morphologic, photometric and dynamic properties of igBPs in terms of equivalent diameter, intensity contrast, lifetime, horizontal velocity, diffusion index, motion range and motion type. The statistical results confirm previous studies based on G-band or TiO-band igBPs from other telescopes. These results illustrate that TiO data from the NVST are stable and reliable, and are suitable for studying igBPs. In addition, our method is feasible for detecting and tracking igBPs with TiO data from the NVST. With the aid of vector magnetograms obtained from the Solar Dynamics Observatory/Helioseismic and Magnetic Imager, the properties of igBPs are found to be strongly influenced by their embedded magnetic environments. The areal coverage, size and intensity contrast values of igBPs are generally larger in regions with higher magnetic flux. However, the dynamics of igBPs, including the horizontal velocity, diffusion index, ratio of motion range and index of motion type, are generally larger in the regions with lower magnetic flux. This suggests that the absence of strong magnetic fields in the medium makes it possible for the igBPs to look smaller and weaker, diffuse faster, and move faster and further along a straighter path.
Orbital correlation of space objects is one of the most important elements in space object identification. Using the orbital elements, we provide correlation criteria to determine whether objects are coplanar, co-orbital or the same. We analyze the prediction error of the correlation parameters for different orbital types and propose an orbital correlation method for space objects. The method is validated using two-line elements and multisatellite launching data. The experimental results show that the proposed method is effective, especially for space objects in near-circular orbits.
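A coplanarity criterion can be made concrete with a standard spherical-trigonometry identity: the angle between two orbital planes follows from their inclinations i and right ascensions of the ascending node Ω via cos θ = cos i₁ cos i₂ + sin i₁ sin i₂ cos(Ω₁ − Ω₂). This is a textbook identity, not necessarily the paper's exact parameterization:

```python
import math

def plane_angle_deg(i1, raan1, i2, raan2):
    """Angle (deg) between two orbital planes given inclination and RAAN (deg),
    i.e. the angle between the orbit normal vectors."""
    i1, o1, i2, o2 = map(math.radians, (i1, raan1, i2, raan2))
    c = (math.cos(i1) * math.cos(i2)
         + math.sin(i1) * math.sin(i2) * math.cos(o1 - o2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))  # clamp rounding

# Identical planes give 0; a 1-degree RAAN offset at i = 98 deg gives ~1 deg.
print(plane_angle_deg(98.0, 10.0, 98.0, 10.0))
print(plane_angle_deg(98.0, 10.0, 98.0, 11.0))
```

A correlation scheme would then declare two TLE-derived objects coplanar when this angle falls below a tolerance chosen from the prediction-error analysis.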
Funding (MWISP datacube noise analysis): supported by the National Key R&D Program of China (2017YFA0402701) and the Key Research Program of Frontier Sciences of CAS (QYZDJ-SSW-SLH047); partially supported by the National Natural Science Foundation of China (Grant No. U2031202).
Funding (Random Forests classification): supported by the National Natural Science Foundation of China under Grant Nos. 10473013, 90412016 and 10778724, and by the 863 project under Grant No. 2006AA01A120.
Funding (CSST wavelength calibration): supported by the National Key Basic R&D Program of China (2019YFA0405500), the National Natural Science Foundation of China (No. 11603002), and Beijing Normal University (No. 310232102).
Funding (piecewise PLS stellar parameter estimation): supported by the National Natural Science Foundation of China.
Funding: Supported by the National Natural Science Foundation of China (U1931207, 61602278 and 61702306), the Sci. & Tech. Development Fund of Shandong Province of China (2016ZDJS02A11, ZR2017BF015 and ZR2017MF027), the Humanities and Social Science Research Project of the Ministry of Education (18YJAZH017), the Taishan Scholar Program of Shandong Province, and the Science and Technology Support Plan of Youth Innovation Team of Shandong Higher School (2019KJN024).
Abstract: Predicting the seeing of astronomical observations can provide hints about the quality of optical imaging in the near future and facilitate flexible scheduling of observation tasks, maximizing the use of astronomical observatories. Traditional approaches to seeing prediction mostly rely on regional weather models to capture in-dome optical turbulence patterns. Thanks to the development of data gathering and aggregation facilities at astronomical observatories in recent years, data-driven approaches to predicting astronomical seeing are becoming increasingly feasible and attractive. This paper systematically investigates data-driven approaches to seeing prediction by leveraging various big data techniques, from traditional statistical modeling and machine learning to newly emerging deep learning methods, on the monitoring data of the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST). The raw monitoring data are preprocessed to allow for big data modeling. We then formulate the seeing prediction task under each type of modeling framework and develop seeing prediction models using representative big data techniques, including ARIMA and Prophet for statistical modeling, MLP and XGBoost for machine learning, and LSTM, GRU and Transformer for deep learning. We perform empirical studies on the developed models with a variety of feature configurations, yielding notable insights into the applicability of big data techniques to the seeing prediction task.
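As a baseline for the statistical-modeling branch (ARIMA belongs to this family), a least-squares autoregressive forecaster is easy to sketch. This is a simplified stand-in under stated assumptions, not the paper's implementation:

```python
# Hedged sketch: AR(p) fit by ordinary least squares and a one-step forecast.
# A pure-AR model is the "AR" part of ARIMA, without differencing or MA terms.
import numpy as np

def fit_ar(series, p):
    """Least-squares AR(p) with intercept: x_t ~ c + a1*x_(t-1) + ... + ap*x_(t-p)."""
    s = np.asarray(series, dtype=float)
    y = s[p:]
    lags = np.column_stack([s[p - k:len(s) - k] for k in range(1, p + 1)])
    X = np.column_stack([np.ones(len(y)), lags])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [c, a1, ..., ap]

def forecast_one(series, coef):
    """One-step-ahead forecast from the most recent p values."""
    p = len(coef) - 1
    recent = np.asarray(series, dtype=float)[-p:][::-1]  # x_(t-1), ..., x_(t-p)
    return coef[0] + recent @ coef[1:]
```

On seeing monitoring data, such a baseline is what the deep models (LSTM, GRU, Transformer) would have to beat to justify their extra complexity.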
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. U2031140, 11873027, and 12073077.
Abstract: The extraction of high-temperature regions in active regions (ARs) is an important means to help understand the mechanism of coronal heating. An important observational probe of high-temperature emission in ARs is the main Fe XVIII emission line in the 94 Å channel of the Atmospheric Imaging Assembly (AIA). However, the diagnostic algorithms for Fe XVIII, including the differential emission measure (DEM) and the DEM-based linear diagnostics proposed by Del Zanna, have long been greatly limited, and the results obtained differ from predictions. In this paper, building on earlier research, we use an outlier detection method to establish the nonlinear correlation between the 94 Å channel and the 171, 193, and 211 Å channels. A neural network based on the 171, 193, and 211 Å channels is constructed to reproduce the low-temperature emission in the ARs of the 94 Å channel. The predicted results are regarded as the low-temperature component of 94 Å and are subtracted from the 94 Å images to obtain the outlier component of 94 Å, i.e., Fe XVIII. The outlier components obtained by the neural network are then compared with the Fe XVIII obtained by the DEM and by Del Zanna's method, and a high similarity is found, which supports the reliability of the neural network in extracting the high-temperature components of ARs, though many differences remain. To analyze the differences between the Fe XVIII obtained by the three methods, we subtract the Fe XVIII obtained by the DEM and by Del Zanna's method from that obtained by the neural network and compare the residuals with the results of Fe XIV in the temperature range of 6.1-6.45 MK. A strong similarity is found, which shows that the Fe XVIII obtained by the DEM and by Del Zanna's method still contains a large low-temperature component dominated by Fe XIV, while the Fe XVIII obtained by the neural network is relatively pure.
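The outlier-component idea can be illustrated with a deliberately simplified stand-in: model the 94 Å signal as a function of the cool channels and treat the residual as the hot excess. The paper uses a neural network for this mapping; the linear fit below is only a sketch with illustrative names:

```python
# Hedged sketch: estimate the cool contribution to AIA 94 Å from the
# 171/193/211 Å channels by linear least squares, and return the residual
# as a stand-in for the hot (Fe XVIII-like) component. The actual method
# replaces this linear model with a trained neural network.
import numpy as np

def hot_excess(i94, i171, i193, i211):
    """Residual of 94 Å after removing a fitted cool-channel blend."""
    X = np.column_stack([np.ones_like(i171), i171, i193, i211])
    coef, *_ = np.linalg.lstsq(X, i94, rcond=None)
    return i94 - X @ coef
```

Wherever the true 94 Å emission is fully explained by cool plasma, this residual is near zero; positive excess then flags candidate high-temperature pixels.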
Funding: Funded by the National Natural Science Foundation of China (Grant No. 11973021).
Abstract: In pulsar astronomy, detecting effective pulsar signals among numerous pulsar candidates is an important research topic. Starting from space X-ray pulsar signals, the two-dimensional autocorrelation profile map (2D-APM) feature modelling method is proposed, which utilizes epoch folding of the autocorrelation function of X-ray signals and expands the time-domain information along the periodic axis. A uniform criterion for setting the time resolution of the periodic axis addresses pulsar signals without any prior information. Compared with the traditional profile, the model has strong anti-noise ability, a greater abundance of information and consistent characteristics. The new feature is simulated with double Gaussian components, and the characteristic distribution of the model is revealed to be closely related to the distance between the double peaks of the profile. Next, a deep convolutional neural network (DCNN) named Inception-ResNet is built. According to the order of the peak separation and the number of arriving photons, 30 data sets based on the Poisson process are simulated to construct the training set, and observation data of PSRs B0531+21, B0540-69 and B1509-58 from the Rossi X-ray Timing Explorer (RXTE) are selected to generate the test set. The training set and test set contain 30,000 and 5400 samples, respectively. After achieving convergence stability, more than 99% of the pulsar signals are recognized and more than 99% of the interference is successfully rejected, which verifies the high degree of agreement between the network and the feature model and the high potential of the proposed method in searching for pulsars.
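Epoch folding, a building block of the 2D-APM feature, can be sketched as a phase histogram of photon arrival times at a trial period (a simplification of the autocorrelation-based construction described above; names are illustrative):

```python
# Hedged sketch: epoch folding of photon arrival times at a trial period.
# If the trial period matches the true period, counts pile up in the phase
# bins of the pulse; otherwise the histogram flattens out.
def epoch_fold(times, period, nbins):
    """Fold arrival times into nbins phase bins at the given trial period."""
    counts = [0] * nbins
    for t in times:
        phase = (t % period) / period           # phase in [0, 1)
        counts[min(int(phase * nbins), nbins - 1)] += 1
    return counts
```

The 2D map described above extends this idea by folding the autocorrelation function and scanning along the period axis, giving the DCNN a richer input than a single folded profile.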
Funding: Supported by the National Natural Science Foundation of China (12261141689, 12273075, and 12373108), the National Key R&D Program of China (No. 2019YFA0405502), and science research grants from the China Manned Space Project (No. CMS-CSST-2021-B05).
Abstract: Stellar classification and radius estimation are crucial for understanding the structure of the Universe and stellar evolution. With the advent of the era of astronomical big data, multimodal data are available and theoretically effective for stellar classification and radius estimation. The problem is how to improve the performance of these tasks by jointly using the multimodal data. However, existing research primarily focuses on using single-modal data. To this end, this paper proposes a model, Multi-Modal SCNet, and its ensemble model, Multimodal Ensemble for Stellar Classification and Regression (MESCR), to improve stellar classification and radius estimation performance by fusing two modalities of data. In this problem, a typical phenomenon is that the sample numbers of some types of stars are evidently larger than those of others. This imbalance has negative effects on model performance. Therefore, this work utilizes a weighted sampling strategy to deal with the imbalance issue in MESCR. Evaluation experiments are conducted on a test set for MESCR: the classification accuracy is 96.1%, and the radius estimation performance in terms of Mean Absolute Error and σ is 0.084 dex and 0.149 R_(⊙), respectively. Moreover, we assessed the uncertainty of the model predictions, confirming good consistency within a reasonable deviation range. Finally, we applied our model to 50,871,534 SDSS stars without spectra and published a new catalog.
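A common form of the weighted sampling strategy mentioned above is inverse-class-frequency weighting; whether MESCR uses exactly this scheme is not stated, so treat the sketch as illustrative:

```python
# Hedged sketch: per-sample weights inversely proportional to class frequency,
# normalized so the average weight is 1. Rare classes get sampled (or weighted
# in the loss) more heavily, counteracting the imbalance described above.
from collections import Counter

def inverse_frequency_weights(labels):
    """Return one weight per sample; rare-class samples get larger weights."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return [n / (k * counts[y]) for y in labels]
```

These weights can feed a weighted random sampler or a weighted loss; both are standard ways to realize the strategy the abstract names.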
Funding: Supported by the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences.
Abstract: The Yellow River Basin of China is a key region that contains myriad interactions between human activities and the natural environment. Industrialization and urbanization promote social-economic development, but they have also generated a series of environmental and ecological issues in this basin. Previous research has evaluated urban resilience at the national, regional, urban agglomeration, city, and prefecture levels, but not at the watershed level. To address this research gap and elevate the Yellow River Basin's urban resilience level, we constructed an urban resilience evaluation index system covering five dimensions: industrial resilience, social resilience, environmental resilience, technological resilience, and organizational resilience. The entropy weight method was used to comprehensively evaluate urban resilience in the Yellow River Basin. The exploratory spatial data analysis method was employed to study the spatiotemporal differences in urban resilience in the Yellow River Basin in 2010, 2015, and 2020. Furthermore, the grey correlation analysis method was utilized to explore the factors influencing these differences. The results of this study are as follows: (1) the overall level of urban resilience in the Yellow River Basin was relatively low but showed an increasing trend during 2010–2015, and significant spatial distribution differences were observed, with a higher resilience level in the eastern region and a low-to-medium resilience level in the western region; (2) the differences in urban resilience were noticeable, with industrial resilience and social resilience being relatively highly developed, whereas organizational resilience and environmental resilience were relatively weak; and (3) the correlation ranking of resilience influencing factors was: science and technology level > administrative power > openness > market forces. This research can provide a basis for improving the resilience level of cities in the Yellow River Basin and contribute to the high-quality development of the region.
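The entropy weight method used above has a compact standard form: normalize each indicator column, compute its information entropy, and weight indicators by their degree of diversification. A minimal sketch, assuming positive, larger-is-better indicator values:

```python
# Hedged sketch of the entropy weight method: indicators whose values vary
# more across cities (lower entropy) receive higher weights.
import numpy as np

def entropy_weights(X):
    """X: m alternatives x n indicators (positive values). Returns n weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise proportions
    m = X.shape[0]
    with np.errstate(divide='ignore', invalid='ignore'):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    E = -plogp.sum(axis=0) / np.log(m)         # entropy per indicator, in [0, 1]
    d = 1.0 - E                                # degree of diversification
    return d / d.sum()
```

Real applications first rescale raw indicators (and invert smaller-is-better ones) before this step; that preprocessing is omitted here.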
Abstract: It is well known that the nonparametric estimation of the regression function is highly sensitive to the presence of even a small proportion of outliers in the data. To address the problem of atypical observations when the covariates of the nonparametric component are functional, robust estimates for the regression parameter and regression operator are introduced. The main purpose of the paper is to consider data-driven methods for selecting the number of neighbors, in order to make the proposed procedures fully automatic. We use the k Nearest Neighbors (kNN) procedure to construct the kernel estimator of the proposed robust model. Under some regularity conditions, we state consistency results for the kNN functional estimators, which are uniform in the number of neighbors (UINN). Furthermore, a simulation study and an empirical application to real data on octane gasoline predictions are carried out to illustrate the higher predictive performance and usefulness of the kNN approach.
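The kNN estimator with a data-driven number of neighbors can be sketched as follows. The median is used here as a simple robust local estimate and leave-one-out error drives the choice of k; this illustrates the idea for scalar covariates, not the paper's functional estimator:

```python
# Hedged sketch: robust kNN regression (local median) with a data-driven
# choice of k via leave-one-out cross-validation. Names are illustrative.
import numpy as np

def knn_predict(x_train, y_train, x0, k):
    """Robust kNN estimate at x0: median response of the k nearest covariates."""
    x = np.asarray(x_train, dtype=float)
    y = np.asarray(y_train, dtype=float)
    idx = np.argsort(np.abs(x - x0))[:k]
    return float(np.median(y[idx]))

def select_k_loo(x_train, y_train, k_grid):
    """Pick k minimizing leave-one-out squared prediction error."""
    x = np.asarray(x_train, dtype=float)
    y = np.asarray(y_train, dtype=float)
    best_k, best_err = k_grid[0], float('inf')
    for k in k_grid:
        errs = []
        for i in range(len(x)):
            xi, yi = np.delete(x, i), np.delete(y, i)
            errs.append((knn_predict(xi, yi, x[i], k) - y[i]) ** 2)
        err = float(np.mean(errs))
        if err < best_err:
            best_k, best_err = k, err
    return best_k
```

In the functional setting, the absolute difference in `knn_predict` would be replaced by a semi-metric between curves; the selection logic is otherwise the same.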
Funding: Funded by the National Natural Science Foundation of China (Grant Nos. 11873066 and U1731109).
Abstract: Spectrum denoising is an important procedure for large-scale spectroscopic surveys. This work proposes a novel stellar spectrum denoising method based on deep Bayesian modeling. The construction of our model includes a prior distribution for each stellar subclass, a spectrum generator and a flow-based noise model. Our method takes into account the noise correlation structure, and it is not susceptible to strong sky emission lines and cosmic rays. Moreover, it is able to naturally handle spectra with missing flux values without ad hoc imputation. The proposed method is evaluated on real stellar spectra from the Sloan Digital Sky Survey (SDSS) with a comprehensive list of common stellar subclasses and compared to the standard denoising autoencoder. Our denoising method demonstrates superior performance to the standard denoising autoencoder in terms of denoising quality and missing flux imputation. It may potentially help improve the accuracy of classification and physical parameter measurements of stars when applied during data preprocessing.
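The claim about handling missing flux naturally can be contrasted with the simplest baseline: smoothing that skips missing values rather than imputing them. The sketch below is such a baseline for comparison, not the proposed Bayesian model:

```python
# Hedged sketch: a NaN-aware moving average, the kind of crude baseline a
# deep Bayesian denoiser would be compared against. Missing flux is marked
# with NaN and simply excluded from each window.
import numpy as np

def nan_aware_smooth(flux, window):
    """Moving average over `window` pixels, ignoring NaN (missing) flux."""
    flux = np.asarray(flux, dtype=float)
    out = np.empty_like(flux)
    half = window // 2
    for i in range(len(flux)):
        seg = flux[max(0, i - half): i + half + 1]
        good = seg[~np.isnan(seg)]
        out[i] = good.mean() if good.size else np.nan
    return out
```

Unlike this baseline, the generative model described above can both denoise and impute the missing pixels from the learned subclass prior.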
Abstract: Any change in the technical or environmental conditions of observations may result in bias from the precise values of observed climatic variables. The common name of these biases is inhomogeneity (IH). IHs usually appear in the form of sudden shifts or gradual trends in the time series of any variable, and the timing of a shift indicates the date of the change in observing conditions. The seasonal cycle of radiation intensity often causes a marked seasonal cycle in the IHs of observed temperature time series, since a substantial portion of them has a direct or indirect connection to radiation changes in the micro-environment of the thermometer. Therefore the magnitudes of temperature IHs tend to be larger in summer than in winter. A new homogenisation method (ACMANT) has recently been developed which treats the seasonal changes of IH sizes in temperature time series in a special way. ACMANT is a further development of the Caussinus-Mestre method, one of the most effective tools among known homogenising methods. ACMANT applies a bivariate test for locating the timings of IHs; the two variables are the annual mean temperature and the amplitude of the seasonal temperature cycle. ACMANT contains several further innovations whose efficiencies are tested with the benchmark of the COST ES0601 project. The paper describes the properties and operation of ACMANT and presents some verification results. The results show that ACMANT has outstandingly high performance. ACMANT is a recommended method for homogenising networks of monthly temperature time series observed at mid or high geographical latitudes, because the harmonic seasonal cycle of IH size is valid only for these time series.
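At the heart of shift-type homogenisation is a single-breakpoint search that maximizes a between-segment contrast statistic. A crude sketch of that one step follows; ACMANT's actual bivariate, multi-break Caussinus-Mestre-style test is considerably more elaborate:

```python
# Hedged sketch: locate the single most likely mean-shift point in a series
# by maximizing a sample-size-weighted squared difference of segment means.
def best_shift_point(series):
    """Return the index i that best splits the series into two mean levels."""
    n = len(series)
    total = float(sum(series))
    best_i, best_stat = None, -1.0
    left = 0.0
    for i in range(1, n):
        left += series[i - 1]
        m1 = left / i                       # mean before the candidate break
        m2 = (total - left) / (n - i)       # mean after the candidate break
        stat = i * (n - i) / n * (m1 - m2) ** 2
        if stat > best_stat:
            best_i, best_stat = i, stat
    return best_i
```

Full homogenisation methods apply such searches hierarchically to relative (candidate-minus-reference) series and then adjust the detected segments.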
Abstract: A Forbush decrease (FD), discovered by Scott E. Forbush about 80 years ago, is a non-repetitive short-term depression in Galactic cosmic ray (GCR) flux, presumed to be associated with large-scale perturbations in the solar wind and interplanetary magnetic field (IMF). It is the most spectacular variability in GCR intensity and appears to be the compass for investigators seeking solar-terrestrial relationships. The method of selecting and validating FD events is very important to cosmic ray (CR) scientists. We have deployed new computer software to determine the amplitude and timing of FDs from daily-averaged CR data at the Oulu Neutron Monitor station. The code selected 230 FDs between 1998 and 2002. In an attempt to validate the new automated FD catalog, the relationship between the amplitude of FDs and the IMF, solar wind speed (SWS) and geomagnetic storm indices (Dst, Kp, ap) is tested here. A two-dimensional regression analysis indicates a significant linear relationship between large FDs (CR(%) ≤ -3) and solar wind data and geomagnetic storm indices in the present sample. The implications of the relationships among these parameters are discussed.
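A minimal automated FD selector in the spirit described above flags days whose percentage deviation from a trailing baseline falls below a threshold; the actual software's selection criteria are more involved, and the parameters here are illustrative:

```python
# Hedged sketch: flag candidate Forbush-decrease days in daily-averaged
# cosmic-ray counts. `window` sets the trailing baseline; `threshold_pct`
# is the depression threshold (e.g. -3 for "large" events, as in the text).
def detect_decreases(daily_counts, window, threshold_pct):
    """Return (day_index, percent_deviation) for days at or below threshold."""
    events = []
    for i in range(window, len(daily_counts)):
        base = sum(daily_counts[i - window:i]) / window
        pct = 100.0 * (daily_counts[i] - base) / base
        if pct <= threshold_pct:
            events.append((i, pct))
    return events
```

A production catalog would additionally merge consecutive flagged days into single events and record onset, minimum and recovery times.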
Funding: The authors acknowledge support received from the National Natural Science Foundation of China (Nos. 11573012, 11303011, 11263004, 11163004 and U1231205) and the Open Research Program of the Key Laboratory of Solar Activity of the Chinese Academy of Sciences (Nos. KLSA201414 and KLSA201505).
Abstract: Six high-resolution TiO-band image sequences from the New Vacuum Solar Telescope (NVST) are used to investigate the properties of intergranular bright points (igBPs). We detect the igBPs using a Laplacian and morphological dilation algorithm (LMD), automatically track them using a three-dimensional segmentation algorithm, and then investigate the morphologic, photometric and dynamic properties of igBPs in terms of equivalent diameter, intensity contrast, lifetime, horizontal velocity, diffusion index, motion range and motion type. The statistical results confirm previous studies based on G-band or TiO-band igBPs from other telescopes. These results illustrate that TiO data from the NVST are stable and reliable, and are suitable for studying igBPs. In addition, our method is feasible for detecting and tracking igBPs with TiO data from the NVST. With the aid of vector magnetograms obtained from the Solar Dynamics Observatory/Helioseismic and Magnetic Imager, the properties of igBPs are found to be strongly influenced by their embedded magnetic environments. The areal coverage, size and intensity contrast values of igBPs are generally larger in regions with higher magnetic flux. However, the dynamics of igBPs, including the horizontal velocity, diffusion index, ratio of motion range and index of motion type, are generally larger in regions with lower magnetic flux. This suggests that the absence of strong magnetic fields in the medium makes the igBPs look smaller and weaker, diffuse faster, and move faster and further along straighter paths.
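The Laplacian step of an LMD-style detector can be sketched with a discrete five-point Laplacian plus a threshold; the published algorithm additionally applies morphological dilation and 3D segmentation, so this is only the first stage, with illustrative names:

```python
# Hedged sketch: mark candidate bright-point pixels where the discrete
# Laplacian is strongly negative, i.e. the pixel is much brighter than its
# four neighbors. Periodic boundary handling via np.roll is a simplification.
import numpy as np

def laplacian_bright_points(img, thresh):
    """Boolean mask of pixels whose 5-point Laplacian is below -thresh."""
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap < -thresh
```

A full LMD pipeline would dilate this mask to recover whole bright-point areas and then link detections across frames into 3D (x, y, t) objects for tracking.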
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11572166 and 61401515).
Abstract: Orbital correlation of space objects is one of the most important elements in space object identification. Using the orbital elements, we provide correlation criteria to determine whether objects are coplanar, co-orbital or identical. We analyze the prediction error of the correlation parameters for different orbital types and propose an orbital correlation method for space objects. The method is validated using two-line elements and multi-satellite launching data. The experimental results show that the proposed method is effective, especially for space objects in near-circular orbits.
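A coplanarity criterion built from orbital elements can be sketched by comparing inclinations and right ascensions of the ascending node (RAAN) within tolerances; the tolerance value below is illustrative, not taken from the paper:

```python
# Hedged sketch: two orbits share a plane when their inclinations and RAANs
# agree within a tolerance. The RAAN difference is wrapped onto [0, 180] deg.
def is_coplanar(incl1_deg, raan1_deg, incl2_deg, raan2_deg, tol_deg=1.0):
    """Coarse coplanarity test on inclination and RAAN, in degrees."""
    d_i = abs(incl1_deg - incl2_deg)
    d_o = abs(raan1_deg - raan2_deg) % 360.0
    d_o = min(d_o, 360.0 - d_o)                 # shortest angular distance
    return d_i <= tol_deg and d_o <= tol_deg
```

Co-orbital and same-object tests would tighten this further by also comparing semi-major axis, eccentricity and phase, with tolerances scaled to the element prediction errors the abstract mentions.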