The detection of hypersonic targets usually confronts a range migration (RM) issue before coherent integration (CI). Traditional methods aimed at correcting RM to achieve CI mainly consider the narrow-band radar condition. However, with the increasing requirement for far-range detection, the time-bandwidth product, which corresponds to the radar's mean power, should be increased in practical applications. Thus, the echo signal exhibits the scale effect (SE) in large time-bandwidth-product situations, degrading both intra- and inter-pulse integration performance. To eliminate SE and correct RM, this paper proposes an effective algorithm, i.e., the scaled location rotation transform (ScLRT). ScLRT can remove SE to obtain matched pulse compression (PC) as well as correct RM to complete CI via the location rotation transform, implemented by seeking the actual rotation angle. Compared to traditional coherent detection algorithms, ScLRT can address the SE problem to achieve better detection/estimation capabilities. Finally, this paper presents several simulations to assess the viability of ScLRT.
Building emission reduction is an important way to achieve China's carbon peaking and carbon neutrality goals. Aiming at the problem of low-carbon, economic operation of a photovoltaic energy storage building system, a multi-time-scale optimal scheduling strategy based on model predictive control (MPC) is proposed that takes load optimization into account. First, load optimization is achieved by controlling the charging time of electric vehicles and adjusting the air-conditioning operating temperature, and a model of the photovoltaic energy storage building system is constructed to derive a day-ahead scheduling strategy with the lowest daily operating cost. Second, considering day-ahead to intra-day source-load prediction errors, an intra-day rolling optimal scheduling strategy based on MPC is proposed that dynamically corrects the day-ahead dispatch results to stabilize system power fluctuations and promote photovoltaic consumption. Finally, the effectiveness of the proposed scheduling strategy is verified using an office building on a summer work day as an example. The results show that the strategy reduces the total operating cost of the photovoltaic energy storage building system by 17.11%, improves carbon emission reduction by 7.99%, and achieves a photovoltaic consumption rate of 98.57%, improving the system's low-carbon and economic performance.
Due to the impact of source-load prediction power errors and uncertainties, the actual operation of the park will fluctuate widely compared with the expected state, preventing it from achieving the expected economy. This paper constructs an operating simulation model of park power grid operation considering demand response and proposes a multi-time-scale operating simulation method that combines day-ahead optimization and model predictive control (MPC). In the day-ahead stage, an operating simulation plan that comprehensively considers user-side comfort and operating costs is proposed with a long-term time scale of 15 min. To cope with the power fluctuations of photovoltaics, wind turbines, and conventional load, MPC is used in the intra-day stage to track and correct the day-ahead operating simulation plan on a rolling basis so that it matches the actual operating status of the park. Finally, the validity and economy of the operating simulation strategy are verified through the analysis of numerical examples.
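The intra-day rolling correction that both of the MPC studies above rely on can be illustrated with a toy receding-horizon loop. The plan, forecast errors, horizon length, and ramp limit below are invented for this sketch; the clamping step is a crude stand-in for the quadratic program a real MPC would solve at each interval.

```python
# Illustrative rolling correction of a day-ahead plan: at each step,
# the dispatcher looks at the plan over a short horizon, corrects the
# immediate target by the latest forecast error, and applies only the
# first move, limited by a ramp constraint.

def rolling_correction(plan, forecast_error, horizon=4, ramp=0.5):
    """Return an intra-day dispatch that tracks `plan` while limiting
    the change between consecutive steps to `ramp`."""
    dispatch = []
    prev = plan[0]
    for t in range(len(plan)):
        window = plan[t:t + horizon]      # a real MPC optimizes over this
        target = window[0] + forecast_error[t]
        move = max(-ramp, min(ramp, target - prev))  # ramp-limited first move
        prev = prev + move
        dispatch.append(prev)
    return dispatch

plan = [10, 10, 12, 12, 11, 11]            # day-ahead setpoints (kW, say)
err = [0.0, 0.3, -0.8, 0.2, 0.0, -0.1]     # intra-day forecast corrections
print(rolling_correction(plan, err))
```

The key property of the loop, which carries over to the full optimization-based version, is that every applied move respects the ramp limit while the dispatch drifts toward the error-corrected plan.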
The lethal dose LD<sub>50</sub> represents the most important experimental value for acute toxicity. The simple logarithmic calculation -log<sub>10</sub> LD<sub>50</sub> leads to the possible poison power pLD. As with the pH or pK value for acids, or the scale of earthquake intensities, the logarithm helps make large differences of orders of magnitude easier to grasp by rendering them comparable. The higher the pLD value, the higher the power of the poison. An increase of the pLD value by 1 stands for a tenfold increase in toxicity. The lethal acute dose of water, one of the most important and at the same time non-toxic substances of all, is about one tenth of the body weight. This leads to a possible pLD value for water of 1, an ideal starting value for a logarithmic poison scale.
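The calculation can be written out directly. Expressing LD<sub>50</sub> as a fraction of body weight is our reading of the water example (one tenth of body weight giving pLD = 1), not a formal unit convention stated in the abstract.

```python
import math

# pLD = -log10(LD50), with LD50 taken here as a fraction of body
# weight so that water (LD50 ~ 0.1 of body weight) anchors the
# scale at pLD = 1.

def pld(ld50_fraction_of_body_weight):
    return -math.log10(ld50_fraction_of_body_weight)

print(pld(0.1))    # water: 1.0
print(pld(0.01))   # a tenfold more toxic substance: 2.0
```

A one-unit increase in pLD corresponds exactly to a tenfold drop in the lethal dose, which is the comparability property the abstract emphasizes.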
School-based universal screening for behavioral/emotional risk is a necessary first step to providing services in an educational setting for students with emotional and behavioral disorders (EBDs). Psychometric properties are critical to decisions about choosing a screening instrument. The purpose of the present study was to examine the psychometric properties of the Student Risk Screening Scale for Internalizing and Externalizing behaviors (SRSS-IE). Participants included 3145 students and their teachers. Item-level analyses of the current sample supported the retention of all items. The internal consistency of the SRSS items ranged from 0.83 to 0.85. Convergent validity between the SRSS-IE and a well-established screening tool, the Strengths and Difficulties Questionnaire (SDQ), was found for the total score (r = 0.70). Additionally, the results of this study demonstrate strong social validity, suggesting that the SRSS-IE is a useful and functional screening tool. We conclude that the SRSS-IE is a valid and reliable instrument for assessing the level of emotional and behavioral difficulties among elementary students.
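As a rough illustration of the internal-consistency statistic reported here, Cronbach's alpha can be computed from item-level scores. The three items and five student ratings below are fabricated for the sketch; the actual SRSS-IE items and data are not reproduced.

```python
from statistics import pvariance

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
# of the total score). Values near the 0.83-0.85 range reported in
# the study indicate good internal consistency.

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-student totals
    item_var = sum(pvariance(scores) for scores in items)     # sum of item variances
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

items = [
    [3, 2, 3, 1, 2],   # item 1 ratings across five students (fabricated)
    [3, 1, 3, 1, 2],   # item 2
    [2, 2, 3, 1, 1],   # item 3
]
print(round(cronbach_alpha(items), 3))
```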
This paper introduces the two Upsilon constants to the reader. Their usefulness is described with respect to acting as coupling constants between the CMB temperature and the Hubble constant. In addition, this paper summarizes the current state of quantum cosmology with respect to the Flat Space Cosmology (FSC) model. Although the FSC quantum cosmology formulae were published in 2018, they are only rearrangements and substitutions of the other assumptions into the original FSC Hubble temperature formula. In a real sense, this temperature formula was the first quantum cosmology formula developed since Hawking's black hole temperature formula. A recent development in the last month proves that the FSC Hubble temperature formula can be derived from the Stefan-Boltzmann law. Thus, this Hubble temperature formula effectively unites some quantum developments with the general relativity model inherent in FSC. More progress towards unification is expected in the near future.
Here we present the foundations of the Scale-Symmetric Theory (SST), i.e., the fundamental phase transitions of the initial inflation field, the atom-like structure of baryons, and different types of black holes. Within SST we show that the transition from the nuclear strong interactions in off-shell Higgs boson production to the nuclear weak interactions causes the real total width of the Higgs boson from the Higgs line shape (i.e., 3.3 GeV) to decrease to 4.3 MeV, which is the illusory total width. Moreover, there appear some glueballs/condensates with an energy of 3.3 GeV that accompany the production of the off-shell Higgs bosons.
Convolutional neural networks (CNNs) are widely used in image classification tasks, but their increasing model size and computation make them challenging to implement on embedded systems with constrained hardware resources. To address this issue, the MobileNetV1 network was developed, which employs depthwise convolution to reduce network complexity. MobileNetV1 uses a stride of 2 in several convolutional layers to decrease the spatial resolution of feature maps, thereby lowering computational costs. However, this stride setting can lead to a loss of spatial information, particularly affecting the detection and representation of smaller objects or finer details in images. To maintain the trade-off between complexity and model performance, a lightweight convolutional neural network with hierarchical multi-scale feature fusion based on the MobileNetV1 network is proposed. The network consists of two main subnetworks. The first subnetwork uses a depthwise dilated separable convolution (DDSC) layer to learn image features with fewer parameters, which results in a lightweight and computationally inexpensive network. Furthermore, the depthwise dilated convolution in the DDSC layer effectively expands the field of view of the filters, allowing them to incorporate a larger context. The second subnetwork is a hierarchical multi-scale feature fusion (HMFF) module that uses a parallel multi-resolution branch architecture to process the input feature map and extract multi-scale feature information from the input image.
Experimental results on the CIFAR-10, Malaria, and KvasirV1 datasets demonstrate that the proposed method is efficient, reducing the network parameters and computational cost by 65.02% and 39.78%, respectively, while maintaining the network performance compared to the MobileNetV1 baseline.
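The parameter savings from the depthwise separable factorization that MobileNetV1 (and the DDSC layer above) builds on can be checked by simple counting. The kernel size and channel counts below are arbitrary examples, not the paper's configuration, and bias terms are omitted.

```python
# A standard k x k convolution couples every input channel to every
# output channel; the depthwise separable factorization splits this
# into a per-channel spatial filter plus a 1 x 1 channel mixer.

def standard_conv_params(k, c_in, c_out):
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    depthwise = k * k * c_in   # one k x k filter per input channel
    pointwise = c_in * c_out   # 1 x 1 convolution mixes channels
    return depthwise + pointwise

k, c_in, c_out = 3, 64, 128
std = standard_conv_params(k, c_in, c_out)
sep = depthwise_separable_params(k, c_in, c_out)
print(std, sep, round(100 * (1 - sep / std), 1))  # parameters saved, %
```

For this layer the factorization keeps roughly one ninth of the parameters, which is the mechanism behind the large reductions reported in the abstract.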
In the nonparametric data envelopment analysis literature, scale elasticity is evaluated in two alternative ways: using either the technical efficiency model or the cost efficiency model. This evaluation becomes problematic in several situations, for example (a) when input proportions change in the long run, (b) when inputs are heterogeneous, and (c) when firms face ex-ante price uncertainty in making their production decisions. To address these situations, a scale elasticity evaluation was performed using a value-based cost efficiency model. However, this alternative value-based scale elasticity evaluation is sensitive to the uncertainty and variability underlying input and output data. Therefore, in this study, we introduce a stochastic cost-efficiency model based on chance-constrained programming to develop a value-based measure of the scale elasticity of firms facing data uncertainty. An illustrative empirical application to the Indian banking industry, comprising 71 banks over eight years (1998–2005), was made to compare inferences about their efficiency and scale properties. The key findings are as follows: First, the deterministic model and our proposed stochastic model yield distinctly different results concerning the efficiency and scale elasticity scores at various tolerance levels of the chance constraints. However, both models yield the same results at a tolerance level of 0.5, implying that the deterministic model is a special case of the stochastic model in that it reveals the same efficiency and returns-to-scale characterizations of banks. Second, the stochastic model generates higher efficiency scores for inefficient banks than its deterministic counterpart. Third, public banks exhibit higher efficiency than private and foreign banks. Finally, public and old private banks mostly exhibit either decreasing or constant returns to scale, whereas foreign and new private banks experience either increasing or decreasing returns to scale. Although the application of our proposed stochastic model is illustrative, it can potentially be applied to all firms in information- and distribution-intensive industries with high fixed costs, which have ample potential for reaping scale and scope benefits.
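The finding that the stochastic model coincides with the deterministic one at a tolerance level of 0.5 follows from the standard deterministic equivalent of a normal chance constraint, sketched below with placeholder numbers (not the banking data): for a normally distributed quantity with mean mu and standard deviation sigma, Pr(quantity <= b) >= 1 - alpha is equivalent to mu + z(1 - alpha) * sigma <= b, and the safety margin z(1 - alpha) * sigma vanishes at alpha = 0.5.

```python
from statistics import NormalDist

# Deterministic equivalent of a normal chance constraint: the
# effective left-hand side the solver must keep below b. At
# alpha = 0.5 the normal quantile z(0.5) is zero, so the constraint
# reduces to the deterministic mean constraint mu <= b.

def deterministic_equivalent_lhs(mu, sigma, alpha):
    z = NormalDist().inv_cdf(1 - alpha)
    return mu + z * sigma

print(deterministic_equivalent_lhs(10.0, 2.0, 0.05))  # tightened above the mean
print(deterministic_equivalent_lhs(10.0, 2.0, 0.5))   # equals the mean: 10.0
```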
The monopile is the most common foundation used to support offshore wind turbines. In the marine environment, local scour due to combined currents and waves is a significant issue that must be considered in the design of wind turbine foundations. In this paper, a full-scale numerical model was developed and validated based on field data from Rudong, China. The scour development around monopiles was investigated, and the effects of waves and the Reynolds number Re were analyzed. Several formulas for predicting the scour depth in the literature were evaluated. It is found that waves can accelerate scour development even if the KC number is small (0.78 < KC < 1.57). Formulas obtained from small-scale model tests may be unsafe or wasteful when applied in practical design due to the scale effect. A new equation for predicting the scour depth based on the average pile Reynolds number (Rea) is proposed and validated with field data. The equilibrium scour depth predicted using the proposed equation is evaluated and compared with those from nine equations in the literature. It is demonstrated that the values predicted by the proposed equation and by the S/M (Sheppard/Melville) equation are closer to the field data.
Due to the scale effect, the uniform distribution of reagents in a continuous flow reactor deteriorates when the channel is enlarged to tens of millimeters. A microfluidic field strategy was proposed to produce high mixing efficiency in large-scale channels. A 3D spiral baffle structure (3SBS) was designed and optimized to form a microfluidic field disturbed by continuous secondary flow in a millimeter-scale Y-shaped tube mixer (YSTM). The enhancement effect of the 3SBS in liquid-liquid homogeneous chemical processes was verified and evaluated through a combination of simulation and experiment. Compared with a 1 mm YSTM, a 10 mm YSTM with the 3SBS increased the treatment capacity by 100 times and shortened the basic complete mixing time by 0.85 times, which demonstrates the potential of the microfluidic field strategy for the enhancement and scale-up of liquid-liquid homogeneous chemical processes.
With the rapid development of advanced industries, such as the microelectronics and optics sectors, the functional feature size of devices/components has been decreasing from micro to nanometric, and even to the atomic and close-to-atomic scale (ACS), for higher performance, smaller volume, and lower energy consumption. By this time, a great many quantum structures have been proposed, with not only an extreme scale of several or even single atoms, but also a nearly ideal lattice structure with no material defects. There is little doubt that such structures will play a critical role in next-generation products, which shows an urgent demand for atomic and close-to-atomic scale manufacturing (ACSM). Laser machining is one of the most important approaches widely used in engineering and scientific research. It is highly efficient and applicable to most kinds of materials. Moreover, its processing scale covers a huge range from millimeters to nanometers, and has already touched the atomic level. The laser-material interaction mechanism, as the foundation of laser machining, determines the machining accuracy and surface quality. It becomes much more sophisticated and dominant as the processing scale decreases, which is systematically reviewed in this article. In general, the mechanisms of laser-induced material removal are classified into ablation, CE, and atomic desorption, with a decrease in scale from above microns to angstroms. The effects of processing parameters on both the fundamental material response and the machined surface quality are discussed, as well as theoretical methods to simulate and understand the underlying mechanisms. Examples at the nanometric to atomic scale are provided, which demonstrate the capability of laser machining to achieve ultimate precision and to become a promising approach to ACSM.
Extra-large scale multiple-input multiple-output (XL-MIMO) for beyond-fifth/sixth generation mobile communications is a promising technology to provide Tbps data transmission and stable access service. However, the extremely large antenna array aperture gives rise to the channel near-field effect, resulting in a deteriorated data rate and other challenges in practical communication systems. Meanwhile, multi-panel MIMO technology has attracted extensive attention due to its flexible configuration, low hardware cost, and wider coverage. By combining XL-MIMO with a multi-panel array structure, we construct multi-panel XL-MIMO and apply it to massive Internet of Things (IoT) access. First, we model the multi-panel XL-MIMO-based near-field channels for massive IoT access scenarios, where the electromagnetic waves corresponding to different panels have different angles of arrival/departure (AoAs/AoDs). Then, by exploiting the sparsity of the near-field massive IoT access channels, we formulate a compressed-sensing-based joint active user detection (AUD) and channel estimation (CE) problem, which is solved by the AMP-EM-MMV algorithm. The simulation results exhibit the superiority of the AMP-EM-MMV-based joint AUD and CE scheme over the baseline algorithms.
Many tree planting programmes have long been initiated to increase forest cover and mitigate the effects of global climate change. Successful planting requires careful planning at the project level, including the use of suitable species with favourable traits. However, there is a paucity of improvement data for tropical tree species. An experimental common garden of Shorea leprosula was established to study traits related to growth performance, which are key factors in planting success. Seedlings of S. leprosula were collected from nine geographical forest reserves. To study the effects of genetic variation, seedlings were planted in a common environment following a randomized complete block design. From performance data collected in 2017‒2019, one population showed the highest coefficient for relative height growth, significantly higher than most of the other populations. Interestingly, this population from Beserah also exhibited the lowest coefficient for scale insect infestation. This study provides preliminary results on growth performance and susceptibility to scale insect infestation in S. leprosula and reports the first common garden experiment conducted on a dipterocarp species. It lays a foundation for future genome-wide studies.
Landform elements with varying morphologies and spatial arrangements are recognized as feature indicators for landform classification and play a critical role in geomorphological studies. The differential geometry method has been extensively applied in prior landform element research, but its efficacy in differentiating similar morphological characteristics remains inadequate to date. To reduce reliance on geomorphometric variables and increase awareness of landform patterns, the geomorphons method was developed in a previous study, producing a landform reclassification map from a lookup table. Besides, to address the problem of feature similarity, hierarchical classification was proposed and effectively utilized for terrain recognition through an analytical strategy of fuzzy gradient features. Combining the advantages of these two approaches, a hierarchical framework is proposed in this study for landform element pattern recognition that considers both morphology and hierarchy. First, the local triplet patterns derived from geomorphons were enhanced by setting a flatness threshold and subsequently adopted for primary landform element recognition. Then, as geomorphic units with the same morphology possess different spatial analytical scales, the unidentified landform elements were determined under the principle of scale adaptation by calculating spatial correlation and entropy information. To validate the proposed method, sampling points were randomly selected from NASADEM data and checked against a real 3D terrain model. Quantitative results of landform element pattern recognition demonstrate that our approach reaches above 77% average accuracy. Additionally, it delineates local details more effectively than geomorphons in visual assessment, yielding a 7% accuracy improvement overall.
Snakes move by crawling, and their living environment, with its numerous branches and stones, causes considerable wear to snake scales. Snake scales carry a variety of surface structures and morphologies that help them avoid severe wear. Among these, research on the keeled structure of snake scales has been missing. Therefore, in this research, the wear-resistance improvement provided by the keeled structure on snake scales and by the overlapped distribution of snake scales is investigated. Keeled and smooth snake scales were 3D printed and distributed on a substrate in overlapped or paralleled arrangements. Besides these four samples with keeled/smooth scales in overlapped/paralleled distributions, a reference sample of the same thickness was prepared. In the tribology test, the number of grooves on samples with keeled structures was higher than on samples with smooth surfaces, which indicates that the keeled structure dramatically enhances the wear resistance of snake scales, especially during wear in the vertical direction. The surface morphology observations corroborated the tribology results. In addition, the bottom portion of the keeled snake scales can be protected by the keeled structure. Moreover, the overlapped distribution can protect the central region of snake scales and provide double-layer protection for the snake body. Overall, the keeled structure and the overlapped distribution play a significant part in improving the wear resistance of snake skin. These findings enhance knowledge of reptile-mimetic surface structures and may facilitate applications such as military uniforms for high-wear conditions.
High-resolution precipitation data are conducive to objectively describing the spatial-temporal variability of regional precipitation, and the study of downscaling techniques and spatial scale effects can provide technical and theoretical support for improving the spatial resolution and accuracy of satellite precipitation data. In this study, we used a machine learning algorithm combined with a regression algorithm, RF-PLS (Random Forest-Partial Least Squares), to construct a downscaling model and obtain three types of high-resolution TRMM (Tropical Rainfall Measuring Mission) downscaled precipitation data for the years 2000-2017, at 250 m, 500 m, and 1 km. The scale effects associated with topographic and geomorphological features in the study area were analysed. Finally, we described the spatial and temporal variation of precipitation based on the optimal TRMM downscaled precipitation data. The results showed that: 1) The linear relationships between the TRMM downscaled precipitation data obtained by each of the three downscaling models (PLS, RF, and RF-PLS) and the precipitation at the observation stations were improved compared with those between the original TRMM data and the station precipitation; the accuracy of the RF-PLS model was better than that of the other two models. 2) Based on the RF-PLS model, the resolution of the TRMM data was increased to three different scales (250 m, 500 m, and 1 km), considering the scale effects of topographic and geomorphological features; the precipitation simulation at a spatial resolution of 500 m was better than at the other two scales. 3) Annual precipitation was highest in areas with extremely high mountains, followed by medium-high mountains, high mountains, medium mountains, medium-low mountains, plains, low mountains, and basins.
The impact of lag effects produced by disturbances on primary production has been a major concern among ecologists during the last decade. Sudden and extreme climatic events impose drastic reductions in the radial growth of trees, as evidenced in tree-ring series. Dendrochronological samples are obtained at the tree level but analyzed at an aggregated scale (i.e., mean chronologies), although aggregating tree-ring chronologies at a regional scale may reduce the possibility of studying the variability of individual tree responses to drought by amplifying the average population response. Here, we conducted experimental research in which 370 trees of 5 species were analyzed to assess the potential statistical and scaling issues that may occur when using regression-based methods to analyze ecosystem responses to disturbances. Drought legacy effects were quantified at individual and aggregated scales. Then, lag effects were validated using confidence and prediction intervals to identify values falling outside the certainty of the climate-growth model. Individual-scale legacy effects contrasted with confidence intervals were commonly distributed across species but were scarce when compared with prediction intervals. The analysis of aggregated-scale legacies detected significant growth reductions when validated using prediction intervals; however, individual-scale legacy lag effects were not detected. This finding directly contrasts with the results obtained when using an aggregated scale. Our results provide empirical evidence of how aggregating ecological data to infer processes that emerge at an individual scale can lead to distorted conclusions. We therefore encourage the use of individual-based statistical and ecological procedures to analyze tree rings as a means of further understanding ecosystem responses to disturbances.
Natural fish scales demonstrate outstanding mechanical efficiency owing to their elaborate architectures and thereby may serve as ideal prototypes for the architectural design of man-made materials. Here, bioinspired magnesium composites with fish-scale-like orthogonal plywood and double-Bouligand architectures were developed by pressureless infiltration of a magnesium melt into the woven contextures of continuous titanium fibers. The composites exhibit enhanced strength and work-hardening ability compared to those estimated from a simple mixture of their constituents at ambient to elevated temperatures. In particular, the double-Bouligand architecture can effectively deflect cracking paths, alleviate strain localization, and adaptively reorient titanium fibers within the magnesium matrix during deformation of the composite, representing a successful implementation of the property-optimizing mechanisms of fish scales. The strength of the composites, specifically the effect of their bioinspired architectures, was interpreted based on an adaptation of classical laminate theory. This study may offer a feasible approach for developing new bioinspired metal-matrix composites with improved performance and provide theoretical guidance for their architectural designs.
Funding: supported by the National Natural Science Foundation of China (62101099); the Chinese Postdoctoral Science Foundation (2021M690558, 2022T150100, 2018M633352, 2019T120825); the Young Elite Scientist Sponsorship Program (YESS20200082); the Aeronautical Science Foundation of China (2022Z017080001); the Open Foundation of Science and Technology on Electronic Information Control Laboratory; and the Natural Science Foundation of Sichuan Province (2023NSFSC1386).
Funding: Supported by the Science and Technology Project of State Grid Shanxi Electric Power Research Institute: Research on Data-Driven New Power System Operation Simulation and Multi-Agent Control Strategy (52053022000F).
Abstract: Due to the impact of source-load prediction errors and uncertainties, the actual operation of the park will fluctuate widely compared with the expected state, preventing it from achieving the expected economy. This paper constructs an operation simulation model of the park power grid considering demand response and proposes a multi-time-scale operation simulation method that combines day-ahead optimization and model predictive control (MPC). In the day-ahead stage, an operation simulation plan that comprehensively considers user-side comfort and operating costs is proposed with a long-term time scale of 15 min. To cope with the power fluctuations of photovoltaics, wind turbines, and conventional load, MPC is used in the intra-day stage to track and roll-correct the day-ahead operation simulation plan so that it matches the actual operating status of the park. Finally, the validity and economy of the operation simulation strategy are verified through numerical examples.
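The intra-day stage described above follows the standard receding-horizon pattern: at each 15-min step, observe the realized source-load error, re-plan over a short horizon, and apply only the first corrected setpoint. The sketch below uses a hypothetical function name, and the inner optimization is replaced by a trivial proportional spread purely to show the loop structure; it is not the paper's formulation.

```python
import numpy as np

def rolling_correction(day_ahead_plan, actual_net_load, horizon=4):
    """Receding-horizon tracking of a day-ahead dispatch plan.

    At each 15-min step, the realized net load is observed and the
    setpoints over the next `horizon` steps are re-planned; only the
    first corrected setpoint is applied (the MPC principle).
    """
    n = len(day_ahead_plan)
    applied = np.zeros(n)
    for t in range(n):
        # forecast error revealed at step t
        error = actual_net_load[t] - day_ahead_plan[t]
        # stand-in for the horizon optimization: spread the correction
        # over the remaining horizon, then apply only the first move
        correction = error / min(horizon, n - t)
        applied[t] = day_ahead_plan[t] + correction
    return applied
```

With `horizon=1` the plan is fully corrected at every step, i.e., the controller tracks the realized load exactly.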
Abstract: The lethal dose LD<sub>50</sub> represents the most important experimental value for acute toxicity. The simple logarithmic calculation pLD = -log<sub>10</sub> LD<sub>50</sub> leads to the possible poison power pLD. As with the pH or pK value for acids, or the scale of earthquake intensities, the logarithm helps make differences spanning orders of magnitude easier to grasp, since they become more comparable. The higher the pLD value, the higher the power of the poison. An increase of the pLD value by 1 stands for a tenfold increase in toxicity. The acute lethal dose of water, one of the most important and at the same time most non-toxic substances of all, is about one tenth of the body weight. This leads to a possible pLD value for water of 1, an ideal starting value for a logarithmic poison scale.
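The pLD scale above is a one-line computation. Assuming LD<sub>50</sub> is expressed as a fraction of body weight (kg of substance per kg of body weight, so water's roughly 0.1 gives pLD = 1), it can be sketched as:

```python
import math

def pLD(ld50_fraction_of_body_weight):
    """Poison power: negative decadic logarithm of LD50 expressed as a
    fraction of body weight (kg substance per kg body weight)."""
    return -math.log10(ld50_fraction_of_body_weight)

# Water: acute lethal dose ~ one tenth of body weight -> pLD(0.1) = 1.
# A tenfold more toxic substance gains exactly +1 on the scale.
```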
Abstract: School-based universal screening for behavioral/emotional risk is a necessary first step to providing services in an educational setting for students with emotional and behavioral disorders (EBDs). Psychometric properties are critical to making decisions about choosing a screening instrument. The purpose of the present study was to examine the psychometric properties of the Student Risk Screening Scale for Internalizing and Externalizing behaviors (SRSS-IE). Participants included 3145 students and their teachers. Item-level analyses of the current sample supported the retention of all items. The internal consistency of the SRSS items ranged from 0.83 to 0.85. Convergent validity between the SRSS-IE and a well-established screening tool, the Strengths and Difficulties Questionnaire (SDQ), was found for the total score (r = 0.70). Additionally, the results of this study demonstrate strong social validity, suggesting the SRSS-IE to be a useful and functional screening tool. We conclude that the SRSS-IE is a valid and reliable instrument for assessing the level of emotional and behavioral difficulties among elementary students.
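Internal-consistency figures like those quoted above are conventionally computed as Cronbach's alpha. A minimal implementation of the standard formula (not the authors' code) is:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)   # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly parallel items (every item identical across respondents) yield alpha = 1; values of 0.83-0.85 as reported indicate good reliability for a screener.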
Abstract: This paper introduces the two Upsilon constants to the reader. Their usefulness is described with respect to acting as coupling constants between the CMB temperature and the Hubble constant. In addition, this paper summarizes the current state of quantum cosmology with respect to the Flat Space Cosmology (FSC) model. Although the FSC quantum cosmology formulae were published in 2018, they are only rearrangements and substitutions of the other assumptions into the original FSC Hubble temperature formula. In a real sense, this temperature formula was the first quantum cosmology formula developed since Hawking's black hole temperature formula. A recent development proves that the FSC Hubble temperature formula can be derived from the Stefan-Boltzmann law. Thus, this Hubble temperature formula effectively unites some quantum developments with the general relativity model inherent in FSC. More progress towards unification is expected in the near future.
Abstract: Here we present the foundations of the Scale-Symmetric Theory (SST), i.e., the fundamental phase transitions of the initial inflation field, the atom-like structure of baryons, and the different types of black holes. Within SST we show that the transition from the nuclear strong interactions in off-shell Higgs boson production to the nuclear weak interactions causes the real total width of the Higgs boson from the Higgs line shape (i.e., 3.3 GeV) to decrease to 4.3 MeV, which is the illusory total width. Moreover, glueballs/condensates with an energy of 3.3 GeV appear that accompany the production of the off-shell Higgs bosons.
Abstract: Convolutional neural networks (CNNs) are widely used in image classification tasks, but their increasing model size and computation make them challenging to implement on embedded systems with constrained hardware resources. To address this issue, the MobileNetV1 network was developed, which employs depthwise convolution to reduce network complexity. MobileNetV1 uses a stride of 2 in several convolutional layers to decrease the spatial resolution of feature maps, thereby lowering computational costs. However, this stride setting can lead to a loss of spatial information, particularly affecting the detection and representation of smaller objects or finer details in images. To maintain the trade-off between complexity and model performance, a lightweight convolutional neural network with hierarchical multi-scale feature fusion based on the MobileNetV1 network is proposed. The network consists of two main subnetworks. The first subnetwork uses a depthwise dilated separable convolution (DDSC) layer to learn image features with fewer parameters, which results in a lightweight and computationally inexpensive network. Furthermore, the depthwise dilated convolution in the DDSC layer effectively expands the field of view of the filters, allowing them to incorporate a larger context. The second subnetwork is a hierarchical multi-scale feature fusion (HMFF) module that uses a parallel multi-resolution branch architecture to process the input feature map and extract multi-scale feature information from the input image. Experimental results on the CIFAR-10, Malaria, and KvasirV1 datasets demonstrate that the proposed method is efficient, reducing the network parameters and computational cost by 65.02% and 39.78%, respectively, while maintaining network performance compared to the MobileNetV1 baseline.
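The parameter saving that motivates depthwise separable convolution (and the DDSC layer above, since dilation does not change the weight count) can be checked with the standard textbook counts:

```python
def conv_params(k, c_in, c_out):
    """Standard k x k convolution: k*k*c_in*c_out weights (bias ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise (one k x k filter per input channel) followed by a
    pointwise 1x1 convolution: k*k*c_in + c_in*c_out weights."""
    return k * k * c_in + c_in * c_out

# Example layer: 3x3 kernel, 128 -> 256 channels.
# Standard: 294,912 weights; separable: 33,920 (about 8.7x fewer).
```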
Abstract: In the nonparametric data envelopment analysis literature, scale elasticity is evaluated in two alternative ways: using either the technical efficiency model or the cost efficiency model. This evaluation becomes problematic in several situations, for example (a) when input proportions change in the long run, (b) when inputs are heterogeneous, and (c) when firms face ex-ante price uncertainty in making their production decisions. To address these situations, a scale elasticity evaluation was performed using a value-based cost efficiency model. However, this alternative value-based scale elasticity evaluation is sensitive to the uncertainty and variability underlying input and output data. Therefore, in this study, we introduce a stochastic cost efficiency model based on chance-constrained programming to develop a value-based measure of the scale elasticity of firms facing data uncertainty. An illustrative empirical application to the Indian banking industry, comprising 71 banks over eight years (1998–2005), was made to compare inferences about their efficiency and scale properties. The key findings are as follows: First, the deterministic model and our proposed stochastic model yield distinctly different results concerning the efficiency and scale elasticity scores at various tolerance levels of the chance constraints. However, both models yield the same results at a tolerance level of 0.5, implying that the deterministic model is a special case of the stochastic model in that it reveals the same efficiency and returns-to-scale characterizations of banks. Second, the stochastic model generates higher efficiency scores for inefficient banks than its deterministic counterpart. Third, public banks exhibit higher efficiency than private and foreign banks. Finally, public and old private banks mostly exhibit either decreasing or constant returns to scale, whereas foreign and new private banks experience either increasing or decreasing returns to scale. Although the application of our proposed stochastic model is illustrative, it can potentially be applied to all firms in information- and distribution-intensive industries with high fixed costs, which have ample potential for reaping scale and scope benefits.
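The tolerance-level-0.5 finding above follows directly from the deterministic equivalent of a normal chance constraint: P(a'x <= b) >= 1 - alpha becomes mu'x + Phi^-1(1 - alpha) * sigma_x <= b, and Phi^-1(0.5) = 0, so the safety margin vanishes and the deterministic model is recovered. A sketch of the margin term (an illustrative helper, not the study's full DEA model):

```python
from scipy.stats import norm

def deterministic_equivalent_margin(sigma_x, tolerance):
    """Safety margin added to a linear constraint when the chance
    constraint P(a'x <= b) >= 1 - tolerance is converted to its
    deterministic equivalent under normally distributed coefficients:
        mu'x + norm.ppf(1 - tolerance) * sigma_x <= b
    At tolerance = 0.5 the margin is zero (deterministic case)."""
    return norm.ppf(1 - tolerance) * sigma_x
```

For tolerance levels below 0.5 the margin is positive, tightening the constraint, which is why efficiency scores shift as the tolerance varies.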
Funding: Financially supported by the National Natural Science Foundation of China (Grant No. 52378329).
Abstract: The monopile is the most common foundation used to support offshore wind turbines. In the marine environment, local scour due to combined currents and waves is a significant issue that must be considered in the design of wind turbine foundations. In this paper, a full-scale numerical model was developed and validated based on field data from Rudong, China. The scour development around monopiles was investigated, and the effects of waves and the Reynolds number Re were analyzed. Several formulas for predicting the scour depth in the literature were evaluated. It is found that waves can accelerate scour development even if the KC number is small (0.78 < KC < 1.57). Formulas obtained from small-scale model tests may be unsafe or wasteful when applied in practical design due to the scale effect. A new equation for predicting the scour depth based on the average pile Reynolds number (Re<sub>a</sub>) is proposed and validated with field data. The equilibrium scour depth predicted using the proposed equation is evaluated and compared with those from nine equations in the literature. It is demonstrated that the values predicted by the proposed equation and by the S/M (Sheppard/Melville) equation are closest to the field data.
Funding: Supported by the National Key Research and Development Program of China (2021YFC2101900 and 2019YFA0905000), the National Natural Science Foundation of China (21908094, 21776130 and 22078150), the Nanjing International Joint Research and Development Project (202002037), and the Top-notch Academic Programs Project of Jiangsu Higher Education Institutions.
Abstract: Due to the scale effect, the uniform distribution of reagents in a continuous flow reactor deteriorates when the channel is enlarged to tens of millimeters. A microfluidic field strategy was proposed to produce high mixing efficiency in a large-scale channel. A 3D spiral baffle structure (3SBS) was designed and optimized to form a microfluidic field disturbed by continuous secondary flow in a millimeter-scale Y-shaped tube mixer (YSTM). The enhancement effect of the 3SBS in liquid-liquid homogeneous chemical processes was verified and evaluated through a combination of simulation and experiment. Compared with a 1 mm YSTM, a 10 mm YSTM with the 3SBS increased the treatment capacity by 100 times and shortened the basic complete mixing time by 0.85 times, which demonstrates the potential of the microfluidic field strategy for the enhancement and scale-up of liquid-liquid homogeneous chemical processes.
Funding: Supported by the National Natural Science Foundation of China (Nos. 52035009 and 52105475).
Abstract: With the rapid development of advanced industries, such as the microelectronics and optics sectors, the functional feature size of devices/components has been decreasing from the micro to the nanometric scale, and even to the atomic and close-to-atomic scale (ACS), for higher performance, smaller volume, and lower energy consumption. By this time, a great many quantum structures have been proposed, with not only an extreme scale of several or even a single atom, but also a nearly ideal lattice structure with no material defects. There is little doubt that such structures will play a critical role in next-generation products, which shows an urgent demand for atomic and close-to-atomic scale manufacturing (ACSM). Laser machining is one of the most important approaches, widely used in engineering and scientific research. It is highly efficient and applicable to most kinds of materials. Moreover, its processing scale covers a huge range from millimeters to nanometers and has already touched the atomic level. The laser-material interaction mechanism, as the foundation of laser machining, determines the machining accuracy and surface quality. It becomes much more sophisticated and dominant as the processing scale decreases, which is systematically reviewed in this article. In general, the mechanisms of laser-induced material removal are classified into ablation, Coulomb explosion (CE), and atomic desorption, as the scale decreases from above microns to angstroms. The effects of processing parameters on both the fundamental material response and the machined surface quality are discussed, as well as theoretical methods to simulate and understand the underlying mechanisms. Examples at the nanometric to atomic scale are provided, which demonstrate the capability of laser machining to achieve the ultimate precision and become a promising approach to ACSM.
Funding: Supported by the National Key Research and Development Program of China under Grants 2021YFB1600500, 2021YFB3201502, and 2022YFB3207704; the Natural Science Foundation of China (NSFC) under Grants U2233216, 62071044, 61827901, 62088101, and 62201056; the Shandong Province Natural Science Foundation under Grant ZR2022YQ62; and the Beijing Nova Program and the Beijing Institute of Technology Research Fund Program for Young Scholars under Grant XSQD-202121009.
Abstract: Extra-large scale multiple-input multiple-output (XL-MIMO) for beyond-fifth/sixth-generation mobile communications is a promising technology for providing Tbps data transmission and stable access service. However, the extremely large antenna array aperture gives rise to the channel near-field effect, resulting in a deteriorated data rate and other challenges in practical communication systems. Meanwhile, multi-panel MIMO technology has attracted extensive attention due to its flexible configuration, low hardware cost, and wider coverage. By combining XL-MIMO and the multi-panel array structure, we construct multi-panel XL-MIMO and apply it to massive Internet of Things (IoT) access. First, we model the multi-panel XL-MIMO-based near-field channels for massive IoT access scenarios, where the electromagnetic waves corresponding to different panels have different angles of arrival/departure (AoAs/AoDs). Then, by exploiting the sparsity of the near-field massive IoT access channels, we formulate a compressed sensing based joint active user detection (AUD) and channel estimation (CE) problem, which is solved by the AMP-EM-MMV algorithm. The simulation results exhibit the superiority of the AMP-EM-MMV based joint AUD and CE scheme over the baseline algorithms.
Funding: Supported by the Government of Malaysia under the 10th and 11th Malaysia Plan.
Abstract: Many tree planting programmes have been initiated to increase forest cover and mitigate the effects of global climate change. Successful planting requires careful planning at the project level, including the use of suitable species with favourable traits. However, there is a paucity of improvement data for tropical tree species. An experimental common garden of Shorea leprosula was established to study traits related to growth performance, which are key factors in planting success. Seedlings of S. leprosula were collected from nine geographical forest reserves. To study the effects of genetic variation, seedlings were planted in a common environment following a randomized complete block design. From performance data collected in 2017‒2019, one population showed the highest coefficient for relative height growth, significantly higher than most of the other populations. Interestingly, this population from Beserah also exhibited the lowest coefficient for scale insect infestation. This study provides preliminary results on growth performance and susceptibility to scale insect infestation in S. leprosula and represents the first common garden experiment conducted on a dipterocarp species. It lays a foundation for future genome-wide studies.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41930102, 41971339, and 41771423) and the Shandong University of Science and Technology Research Fund (No. 2019TDJH103).
Abstract: Landform elements with varying morphologies and spatial arrangements are recognized as feature indicators for landform classification and play a critical role in geomorphological studies. The differential geometry method has been extensively applied in prior landform element research, but its efficacy in differentiating similar morphological characteristics remains inadequate to date. To reduce reliance on geomorphometric variables and increase awareness of landform patterns, the geomorphons method was developed in a previous study, producing a specific landform reclassification map based on a lookup table. In addition, to address the problem of feature similarity, hierarchical classification was proposed and effectively utilized for terrain recognition through the analytical strategy of fuzzy gradient features. Thus, combining the advantages of these two aspects, a hierarchical framework is proposed in this study for landform element pattern recognition that considers both morphology and hierarchy. First, the local triplet patterns derived from geomorphons are enhanced by setting a flatness threshold and subsequently adopted for primary landform element recognition. Then, as geomorphic units with the same morphology possess different spatial analytical scales, unidentified landform elements are determined under the principle of scale adaptation by calculating spatial correlation and entropy information. To verify the effectiveness of the proposed method, sampling points were randomly selected from NASADEM data and then validated against a real 3D terrain model. Quantitative results of landform element pattern recognition demonstrate that our approach can reach above 77% average accuracy. Additionally, it delineates local details more effectively than geomorphons in visual assessment, resulting in a 7% accuracy improvement at the overall scale.
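The "local triplet patterns" above code each neighbour of a cell as higher, lower, or flat relative to the centre, gated by the flatness threshold. The following simplified 3x3 sketch shows only this thresholding idea; true geomorphons use line-of-sight zenith/nadir angles over a search radius, so this is not the full method.

```python
import numpy as np

def local_ternary_pattern(window, flat_threshold):
    """Geomorphon-style local pattern for a 3x3 elevation window:
    each of the 8 neighbours is coded +1 (higher than the centre by more
    than the flatness threshold), -1 (lower by more than it), or 0 (flat)."""
    center = window[1, 1]
    # neighbours clockwise from top-left
    idx = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    diffs = np.array([window[i] - center for i in idx])
    return np.where(diffs > flat_threshold, 1,
                    np.where(diffs < -flat_threshold, -1, 0))
```

A flat window yields the all-zero pattern; a local peak yields all -1, and the 3^8 possible codes are what a lookup table then maps to landform classes.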
Funding: Funded by the Key Scientific and Technological Program of Ningbo City (No. 2021Z108) and the Yongjiang Talent Introduction Programme (No. 2021A-154-G).
Abstract: Snakes move by crawling, and their living environment, with its numerous branches and stones, causes considerable wear on snake scales. Snake scales carry a variety of surface structures and morphologies that help them avoid severe wear. Among these, research on the keeled structure of snake scales has been missing. Therefore, in this research, the wear resistance conferred by the keeled structure of snake scales and by the overlapped distribution of snake scales is investigated. Keeled and smooth snake scales were 3D printed and distributed on the substrate in overlapped or paralleled ways. Besides these four samples with keeled/smooth scales and overlapped/paralleled distributions, there was also a reference sample of the same thickness. Based on the tribology test, the number of grooves on samples with keeled structures is higher than that on samples with smooth surfaces, which indicates that the keeled structure dramatically enhances the wear resistance of snake scales, especially during wear in the vertical direction. The surface morphology experiment strongly corroborated the result of the tribology test. In addition, the bottom portion of the keeled snake scales can be protected by the keeled structure. Moreover, the overlapped distribution can protect the central region of snake scales and provide double-layer protection for the snake body. Overall, the keeled structure and the overlapped distribution play a significant part in improving the wear resistance of snake skin. These findings can enhance knowledge of reptile-mimetic surface structures and facilitate the application of military uniforms under high-wear conditions.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41941017 and 41877522), the National Key Research and Development Program of China (Grant No. 2021YFE0116800), and the Jiangsu Province Key R&D Program (Social Development) Project of China (Grant No. BE2019776).
Abstract: High-resolution precipitation data help to objectively describe the spatial-temporal variability of regional precipitation, and the study of downscaling techniques and spatial scale effects can provide technical and theoretical support for improving the spatial resolution and accuracy of satellite precipitation data. In this study, we used a machine learning algorithm combined with a regression algorithm, RF-PLS (Random Forest-Partial Least Squares), to construct a downscaling model and obtain three types of high-resolution TRMM (Tropical Rainfall Measuring Mission) downscaled precipitation data for the years 2000-2017, at 250 m, 500 m, and 1 km. The scale effects associated with the topographic and geomorphological features of the study area were analysed. Finally, we described the spatial and temporal variation of precipitation based on the optimal TRMM downscaled precipitation data. The results showed that: 1) The linear relationships between the TRMM downscaled precipitation data obtained by each of the three downscaling models (PLS, RF, and RF-PLS) and the precipitation at the observation stations were improved compared with those between the original TRMM data and the station precipitation, and the accuracy of the RF-PLS model was better than that of the other two models. 2) Based on the RF-PLS model, the resolution of the TRMM data was increased to three different scales (250 m, 500 m, and 1 km), considering the scale effects of topographic and geomorphological features; the precipitation simulation at a spatial resolution of 500 m was better than at the other two scales. 3) Annual precipitation was highest in areas with extremely high mountains, followed by medium-high mountains, high mountains, medium mountains, medium-low mountains, plains, low mountains, and basins.
Abstract: The impact of lag effects produced by disturbances on primary production has been a major concern among ecologists during the last decade. Sudden and extreme climatic events impose drastic reductions in the radial growth of trees, as evidenced in tree-ring series. Dendrochronological samples are obtained at the tree level but analyzed at an aggregated scale (i.e., mean chronologies), although aggregating tree-ring chronologies at a regional scale may reduce the possibility of studying the variability of individual tree responses to drought by amplifying the average population response. Here, we conducted experimental research in which 370 trees of 5 species were analyzed to assess the potential statistical and scaling issues that may occur when using regression-based methods to analyze ecosystem responses to disturbances. Drought legacy effects were quantified at individual and aggregated scales. Then, lag effects were validated using confidence and prediction intervals to identify values falling outside the certainty of the climate-growth model. Individual-scale legacy effects contrasted with confidence intervals were commonly distributed across species but were scarce when compared with prediction intervals. The analysis of aggregated-scale legacies detected significant growth reductions when validated using prediction intervals; however, individual-scale lag effects were not detected, in direct contrast with the results obtained at the aggregated scale. Our results provide empirical evidence on how aggregating ecological data to infer processes that emerge at an individual scale can lead to distorted conclusions. We therefore encourage the use of individual-based statistical and ecological procedures to analyze tree rings as a means of further understanding ecosystem responses to disturbances.
Funding: Financially supported by the National Key R&D Program of China under Grant Number 2020YFA0710404; the National Natural Science Foundation of China under Grant Number 51871216; the KC Wong Education Foundation (GJTD-2020-09); the Liao Ning Revitalization Talents Program; the State Key Laboratory for Modification of Chemical Fibers and Polymer Materials at Donghua University; the Opening Project of the Jiangsu Province Key Laboratory of High-End Structural Materials under Grant Number hsm1801; the Opening Project of the National Key Laboratory of Shock Wave and Detonation Physics under Grant Number 6142A03203002; the Youth Innovation Promotion Association CAS; and the Multi-University Research Initiative under Grant Number AFOSR-FA9550-151-0009 from the Air Force Office of Scientific Research.
Abstract: Natural fish scales demonstrate outstanding mechanical efficiency owing to their elaborate architectures and may thereby serve as ideal prototypes for the architectural design of man-made materials. Here, bioinspired magnesium composites with fish-scale-like orthogonal plywood and double-Bouligand architectures were developed by pressureless infiltration of a magnesium melt into woven contextures of continuous titanium fibers. The composites exhibit enhanced strength and work-hardening ability compared with those estimated from a simple mixture of their constituents, at ambient to elevated temperatures. In particular, the double-Bouligand architecture can effectively deflect cracking paths, alleviate strain localization, and adaptively reorient the titanium fibers within the magnesium matrix during deformation of the composite, representing a successful implementation of the property-optimizing mechanisms of fish scales. The strength of the composites, specifically the effect of their bioinspired architectures, was interpreted based on an adaptation of classical laminate theory. This study may offer a feasible approach for developing new bioinspired metal-matrix composites with improved performance and provide theoretical guidance for their architectural designs.