On the basis of analyzing the reliability of the organization's management chain of large and medium-sized projects in the construction period, the paper studies the factors influencing the reliability of the organization management chain. Corresponding to the four elements of the management chain – “Management Loop”, “Management Link”, “Management Chain”, and “Management Network” – these factors can be summarized as project main body, interface management, connection sequence, and management model. The paper then researches the specific influencing factors from the above-mentioned four aspects.
Based on the practice of oil and gas exploration in the Huizhou Sag of the Pearl River Mouth Basin, the geochemical indexes of source rocks were measured, the reservoir development morphology was restored, the rocks and minerals were characterized microscopically, the measured trap sealing indexes were compared, the biomarker compounds of crude oil were extracted, the genesis of condensate gas was identified, and the reservoir-forming conditions were examined. On this basis, the Paleogene Enping Formation in the Huizhou 26 subsag was systematically analyzed for the potential of oil and gas resources, the development characteristics of large-scale high-quality conglomerate reservoirs, the trapping effectiveness of faults, the hydrocarbon migration and accumulation model, and the formation conditions and exploration targets of large- and medium-sized glutenite-rich oil and gas fields. The research results cover four aspects. First, the Paleogene Wenchang Formation in the Huizhou 26 subsag develops extensive and thick high-quality source rocks of semi-deep to deep lacustrine subfacies, which have typical hydrocarbon expulsion characteristics of "great oil generation in the early stage and huge gas expulsion in the late stage", providing a sufficient material basis for hydrocarbon accumulation in the Enping Formation. Second, under the joint control of the steep slope zone and transition zone of the fault within the sag, the large-scale near-source glutenite reservoirs are highly heterogeneous, with the development scale dominated hierarchically by three factors (favorable facies zone, particle component, and microfracture). The (subaqueous) distributary channels near the fault system, with equal grains, a low mud content (<5%), and a high content of feldspar composition, are conducive to the development of sweet spot reservoirs. Third, the strike-slip pressurization trap covered by stable lake flooding mudstone is a necessary condition for oil and gas preservation, and the NE and nearly EW faults oblique to the principal stress have the best control on traps. Fourth, the spatiotemporal configuration of high-quality source rocks, fault transport/sealing, and glutenite reservoirs controls the degree of hydrocarbon enrichment. From top to bottom, three hydrocarbon accumulation units, i.e. the low-fill zone, transition zone, and high-fill zone, are recognized. The main area of the channel in the nearly pressurized source-connecting fault zone is favorable for large-scale hydrocarbon enrichment. The research results suggest a new direction for the exploration of large-scale glutenite-rich reservoirs in the Enping Formation of the Pearl River Mouth Basin, and present a major breakthrough in oil and gas exploration.
Based on the geological and geophysical data of Mesozoic oil-gas exploration in the sea area of the Bohai Bay Basin and the high-yield volcanic oil and gas wells discovered since 2019, this paper methodically summarizes the formation conditions of large- and medium-sized Cretaceous volcanic oil and gas reservoirs in the Bohai Sea. Research shows that the Mesozoic large intermediate-felsic lava and intermediate-felsic composite volcanic edifices in the Bohai Sea are the material basis for the formation of large-scale volcanic reservoirs. The upper subfacies of the effusive facies and the cryptoexplosive breccia subfacies of the volcanic conduit facies within vent-proximal facies belts are favorable for large-scale volcanic reservoir formation. Two types of efficient reservoirs, characterized by high porosity with medium to low permeability and by medium porosity with medium to low permeability, are the core of the formation of large- and medium-sized volcanic reservoirs. The reservoir with high porosity and medium to low permeability is formed by intermediate-felsic vesicular lava or cryptoexplosive breccia superimposed by intensive dissolution. The reservoir with medium porosity and medium to low permeability is formed by intense tectonism superimposed by fluid dissolution. Weathering and tectonic transformation are the main formation mechanisms for large- and medium-sized volcanic reservoirs in the study area. The low-source “source-reservoir draping type” is the optimum source-reservoir configuration relationship for large- and medium-sized volcanic reservoirs. Favorable volcanic facies, efficient reservoirs, and source-reservoir draping configuration relationships exist on the periphery of the Bozhong Sag, and the large intermediate-felsic lava and intermediate-felsic composite volcanic edifices close to strike-slip faults and their branch faults are the main directions of future exploration.
Since the 1950s, when the Turing Test was introduced, there has been notable progress in machine language intelligence. Language modeling, crucial for AI development, has evolved from statistical to neural models over the last two decades. Recently, transformer-based Pre-trained Language Models (PLMs) have excelled in Natural Language Processing (NLP) tasks by leveraging large-scale training corpora. Increasing the scale of these models enhances performance significantly, introducing abilities like in-context learning that smaller models lack. The advancement in Large Language Models, exemplified by the development of ChatGPT, has made significant impacts both academically and industrially, capturing widespread societal interest. This survey provides an overview of the development and prospects from Large Language Models (LLMs) to Large Multimodal Models (LMMs). It first discusses the contributions and technological advancements of LLMs in the field of natural language processing, especially in text generation and language understanding. It then turns to LMMs, which integrate various data modalities such as text, images, and sound, demonstrating advanced capabilities in understanding and generating cross-modal content and paving new pathways for the adaptability and flexibility of AI systems. Finally, the survey highlights the prospects of LMMs in terms of technological development and application potential, while also pointing out challenges in data integration and cross-modal understanding accuracy, providing a comprehensive perspective on the latest developments in this field.
The variations of the frontogenetic trend of a cold filament induced by the cross-filament wind and wave fields are studied by a non-hydrostatic large eddy simulation. Five cases with different strengths of wind and wave fields are studied. The results show that the intense wind and wave fields further break the symmetries of submesoscale flow fields and suppress the levels of filament frontogenesis. The changes of secondary circulation directions—that is, the conversion between the convergence and divergence of the surface cross-filament currents with the downwelling and upwelling jets in the filament center—are associated with the inertial oscillation. The filament frontogenesis and frontolysis caused by the changes of secondary circulation directions may periodically sharpen and smooth the gradient of submesoscale flow fields. The lifecycle of the cold filament may include multiple stages of filament frontogenesis and frontolysis.
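For orientation, the frontogenetic trend referred to above is commonly diagnosed with a frontogenesis function, the Lagrangian tendency of the squared horizontal buoyancy gradient. The display below is an assumed, standard diagnostic written in generic notation, not necessarily the paper's exact definition.

```latex
% A standard frontogenesis-function diagnostic (assumed, generic form; not
% necessarily the paper's exact definition): F > 0 marks frontogenesis
% (sharpening of the filament's buoyancy gradient), F < 0 frontolysis.
\[
F = \frac{D}{Dt}\Bigl(\tfrac{1}{2}\,\lvert \nabla_{h} b \rvert^{2}\Bigr),
\qquad
b = -\,g\,\frac{\rho - \rho_{0}}{\rho_{0}},
\]
% with b the buoyancy, \rho the density, \rho_0 a reference density, and
% \nabla_h the horizontal gradient operator.
```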
Huntington's disease (HD) is a hereditary neurodegenerative disorder for which there is currently no effective treatment available. Consequently, the development of appropriate disease models is critical to thoroughly investigate disease progression. The genetic basis of HD involves the abnormal expansion of CAG repeats in the huntingtin (HTT) gene, leading to the expansion of a polyglutamine repeat in the HTT protein. Mutant HTT carrying the expanded polyglutamine repeat undergoes misfolding and forms aggregates in the brain, which precipitate selective neuronal loss in specific brain regions. Animal models play an important role in elucidating the pathogenesis of neurodegenerative disorders such as HD and in identifying potential therapeutic targets. Due to the marked species differences between rodents and larger animals, substantial efforts have been directed toward establishing large animal models for HD research. These models are pivotal for advancing the discovery of novel therapeutic targets, enhancing effective drug delivery methods, and improving treatment outcomes. We have explored the advantages of utilizing large animal models, particularly pigs, in previous reviews. Since then, however, significant progress has been made in developing more sophisticated animal models that faithfully replicate the typical pathology of HD. In the current review, we provide a comprehensive overview of large animal models of HD, incorporating recent findings regarding the establishment of HD knock-in (KI) pigs and their genetic therapy. We also explore the utilization of large animal models in HD research, with a focus on sheep, non-human primates (NHPs), and pigs. Our objective is to provide valuable insights into the application of these large animal models for the investigation and treatment of neurodegenerative disorders.
The recent interest in the deployment of Generative AI applications that use large language models (LLMs) has brought to the forefront significant privacy concerns, notably the leakage of Personally Identifiable Information (PII) and other confidential or protected information that may have been memorized during training, specifically during a fine-tuning or customization process. We describe different black-box attacks from potential adversaries and study their impact on the amount and type of information that may be recovered from commonly used and deployed LLMs. Our research investigates the relationship between PII leakage, memorization, and factors such as model size, architecture, and the nature of attacks employed. The study utilizes two broad categories of attacks: PII leakage-focused attacks (auto-completion and extraction attacks) and memorization-focused attacks (various membership inference attacks). The findings from these investigations are quantified using an array of evaluative metrics, providing a detailed understanding of LLM vulnerabilities and the effectiveness of different attacks.
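As a rough illustration of the memorization-focused attack category named above, the sketch below scores a candidate string by the average loss a causal language model assigns to it and flags unusually low loss as possible training-set membership. It is a generic loss-threshold membership inference test, not the authors' protocol; the model name, threshold value, and example string are placeholders.

```python
# Minimal sketch of a loss-threshold membership inference test against a causal LM.
# Generic illustration only, not the attack protocol from the paper above.
# Model name, threshold, and example string are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"   # placeholder model
THRESHOLD = 3.5       # placeholder loss threshold; in practice tuned on held-out data

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

def candidate_loss(text: str) -> float:
    """Average token-level cross-entropy the model assigns to `text`."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)  # HF computes the shifted LM loss internally
    return out.loss.item()

def looks_memorized(text: str) -> bool:
    """Flag `text` as a likely training member if its loss is unusually low."""
    return candidate_loss(text) < THRESHOLD

print(looks_memorized("John Doe's phone number is 555-0123."))  # toy candidate string
```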
In order to improve rib stability, the failure criteria and instability mode of a thick coal seam with an inter-band rock layer are analysed in this study. A three-dimensional mechanical model is established for the rib by considering the rock layer. A safety factor is defined for the rib, and it is observed that the safety factor exhibits a positive correlation with the thickness and strength of the inter-band rock. A calculation method for determining critical parameters of the rock layer is presented to ensure rib stability. It is revealed that incomplete propagation of the fracture at the hard rock constitutes a fundamental prerequisite for ensuring rib stability. The influence of the position of the inter-band rock in the coal seam on the failure mechanism of the rib was thoroughly investigated by developing a series of physical models for the rib at the face area. The best position for the inter-band rock in the coal seam is at a height of 1.5 m away from the roof line, which tends to provide a good stability state for the rib. For different inter-band rock positions, two ways of controlling the rib, namely increasing support stiffness and flexible grouting reinforcement, are proposed.
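The abstract does not reproduce the rib safety factor itself; the display below is only a generic, assumed illustration of how such a limit-equilibrium factor is usually structured (resisting capacity over driving load), consistent with the stated positive correlation with inter-band rock thickness and strength.

```latex
% Generic structure of a limit-equilibrium safety factor (illustrative
% assumption; the paper's rib-specific definition is not reproduced above).
% Stability requires F_s > 1, and terms that raise the resisting capacity,
% such as a thicker or stronger inter-band rock layer, raise F_s:
\[
F_{s} = \frac{R(\text{inter-band rock thickness, strength}, \ldots)}
             {S(\text{abutment load on the rib})},
\qquad F_{s} > 1 .
\]
```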
This article elucidates the concept of large model technology, summarizes the research status of large model technology both domestically and internationally, provides an overview of the application status of large models in vertical industries, outlines the challenges and issues confronted in applying large models in the oil and gas sector, and offers prospects for the application of large models in the oil and gas industry. The existing large models can be briefly divided into three categories: large language models, visual large models, and multimodal large models. The application of large models in the oil and gas industry is still in its infancy. Based on open-source large language models, some oil and gas enterprises have released large language model products using methods like fine-tuning and retrieval augmented generation. Scholars have attempted to develop scenario-specific models for oil and gas operations by using visual/multimodal foundation models. A few researchers have constructed pre-trained foundation models for seismic data processing and interpretation, as well as core analysis. The application of large models in the oil and gas industry faces challenges such as data quantity and quality that are currently insufficient to support the training of large models, high research and development costs, and poor algorithm autonomy and controllability. The application of large models should be guided by the needs of the oil and gas business, taking the application of large models as an opportunity to improve data lifecycle management, enhance data governance capabilities, promote the construction of computing power, strengthen the construction of “artificial intelligence + energy” composite teams, and boost the autonomy and controllability of large model technology.
For samples in the gaseous state at room temperature and ambient pressure, mature technology has been developed to encapsulate them in a diamond anvil cell (DAC). However, the large volume press (LVP) can only treat samples whose starting materials are in solid or liquid form. We have achieved stable encapsulation and reaction treatment of carbon dioxide in a centimeter-sized sample chamber for a long time (over 10 min) under conditions of temperature higher than 1200 °C and pressure over 5 GPa, through the use of an integrated low-temperature freezing and rapid compression sealing method for LVP cell assemblies. This technology can also be applied to the packaging of other gaseous or liquid samples, such as ammonia, sulfur dioxide, water, etc., in LVP devices.
There has been much research on fixture layout optimization for large thin-walled parts. Current research focuses on the positioning problem, i.e., optimizing the positions of a constant number of fixtures. However, how to determine the number of fixtures is ignored. In most cases, the number of fixtures located on large thin-walled parts is determined based on engineering experience, which leads to an excessive number of fixtures and extra waste. Therefore, this paper constructs an optimization model to minimize the number of fixtures. Constraints are set in the optimization model to ensure that the part deformation is within the surface profile tolerance. In addition, the assembly gap between two parts is also controlled. To conduct the optimization, this paper develops an improved particle swarm optimization (IPSO) algorithm by integrating the shrinkage factor and adaptive inertia weight. In the algorithm, particles are encoded according to the fixture positions. Each dimension of the particle is assigned to a sub-region by constraining the optional position range of each fixture, which improves the optimization efficiency. Finally, a case study on ship curved panel assembly is provided to prove that our method can optimize the number of fixtures while meeting the assembly quality requirements. This research proposes a method to optimize the number of fixtures, which can reduce the number of fixtures and achieve deformation control at the same time.
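As a sketch of the two ingredients named above, a constriction ("shrinkage") factor and an adaptive inertia weight, the minimal particle swarm loop below shows one common way they enter the velocity update. The objective, bounds, and constants are placeholders and do not represent the paper's fixture-layout model, which additionally encodes positions by sub-region and enforces deformation and assembly-gap constraints.

```python
# Minimal particle swarm update combining a constriction ("shrinkage") factor
# with a linearly decreasing (adaptive) inertia weight, in the spirit of the
# IPSO described above. Objective, bounds, and constants are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def objective(x: np.ndarray) -> float:
    """Placeholder cost: the paper's model would count fixtures and penalize
    deformation/assembly-gap constraint violations instead."""
    return float(np.sum(x**2))

DIM, N_PARTICLES, N_ITERS = 6, 20, 100
LOW, HIGH = -5.0, 5.0                      # per-dimension position bounds ("sub-regions")
C1 = C2 = 2.05
PHI = C1 + C2
CHI = 2.0 / abs(2.0 - PHI - np.sqrt(PHI**2 - 4.0 * PHI))   # Clerc constriction factor
W_MAX, W_MIN = 0.9, 0.4                    # adaptive inertia weight range

pos = rng.uniform(LOW, HIGH, (N_PARTICLES, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for it in range(N_ITERS):
    w = W_MAX - (W_MAX - W_MIN) * it / (N_ITERS - 1)        # linearly decreasing inertia
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = CHI * (w * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos))
    pos = np.clip(pos + vel, LOW, HIGH)                     # keep each dimension in its range
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best cost:", pbest_val.min())
```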
Most viruses and transposons serve as effective carriers for the introduction of foreign DNA up to 11 kb into vertebrate genomes. However, their activity markedly diminishes with payloads exceeding 11 kb. Expanding the payload capacity of transposons could facilitate more sophisticated cargo designs, improving the regulation of expression and minimizing mutagenic risks associated with molecular therapeutics, metabolic engineering, and transgenic animal production. In this study, we improved the Tol2 transposon by increasing protein expression levels using a translational enhancer (QBI SP163, ST) and enhanced the nuclear targeting ability using the nuclear localization protein H2B (SHT). The modified Tol2 and ST transposon efficiently integrated large DNA cargos into human cell cultures (H1299), comparable to the well-established super PiggyBac system. Furthermore, mRNA from ST and SHT showed a significant increase in transgene delivery efficiency of large DNA payloads (8 kb, 14 kb, and 24 kb) into zebrafish (Danio rerio). This study presents a modified Tol2 transposon as an enhanced nonviral vector for the delivery of large DNA payloads in transgenic applications.
In order to effectively reduce energy consumption and increase driving range, new energy vehicles, represented by Tesla, have greatly stimulated the application of integrated magnesium (Mg) alloy die casting technology in automobiles. Previously, the application of Mg alloys in automobiles, especially in automotive cockpit components, was quite extensive, but it almost disappeared for a period of time due to relatively high cost, causing a certain degree of information loss in the application technology of Mg alloy parts in automobiles. The rapid development of automotive technology has led to higher requirements for automotive components compared with traditional ones. Therefore, both the components themselves and the Mg alloy materials and die casting processes face increasing challenges and need to be upgraded. In addition, owing to its high integration characteristics, the application of Mg alloy die casting technology in large-sized and thin-walled automotive parts has inherent advantages and urgently needs to be expanded. Indeed, this necessitates exploring advanced Mg alloys and new product structures and optimizing die casting processes. This article summarizes and analyzes the development status of thin-walled and large-sized die casting Mg alloy parts in passenger car cockpits, together with the corresponding material selection methods, die casting processes, and mold design techniques. Furthermore, this work will aid researchers in establishing a comprehensive understanding of the manufacture of thin-walled and large-sized die casting Mg alloy parts in automobile cockpits. It will also assist them in developing new Mg alloys with improved comprehensive performance and new processes to meet the high requirements for die casting automotive components.
Shallow convection plays an important role in transporting heat and moisture from the near-surface to higher altitudes, yet its parameterization in numerical models remains a great challenge, partly due to the lack of high-resolution observations. This study describes a large eddy simulation (LES) dataset for four shallow convection cases that differ primarily in inversion strength, which can be used as a surrogate for real data. To reduce the uncertainty in LES modeling, three different large eddy models were used, including SAM (System for Atmospheric Modeling), WRF (Weather Research and Forecasting model), and UCLA-LES. Results show that the different models generally exhibit similar behavior for each shallow convection case, despite some differences in the details of the convective structure. In addition to grid-averaged fields, conditionally sampled variables, such as in-cloud moisture and vertical velocity, are also provided, which are indispensable for calculation of the entrainment/detrainment rate. Considering the essentiality of the entraining/detraining process in the parameterization of cumulus convection, the dataset presented in this study is potentially useful for validation and improvement of the parameterization of shallow convection.
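For orientation, entrainment and detrainment rates are often diagnosed from such conditionally sampled fields with bulk-plume relations of the following assumed, standard form; this is shown for context only and is not necessarily the exact diagnostic procedure used for the dataset.

```latex
% Bulk-plume relations commonly used to diagnose entrainment/detrainment from
% conditionally sampled LES fields (assumed, standard form; not necessarily
% the dataset's exact procedure):
\[
\frac{1}{M}\frac{\partial M}{\partial z} = \varepsilon - \delta,
\qquad
\frac{\partial \varphi_{c}}{\partial z} = -\,\varepsilon\,(\varphi_{c} - \varphi_{e}),
\]
% where M is the cloud mass flux, \varepsilon and \delta the fractional
% entrainment and detrainment rates, and \varphi_c, \varphi_e the in-cloud and
% environmental values of a conserved variable.
```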
Hierarchical networks are frequently encountered in animal groups, gene networks, and artificial engineering systems such as multiple robots, unmanned vehicle systems, smart grids, wind farm networks, and so forth. The structure of a large directed hierarchical network is often strongly influenced by reverse edges from lower- to higher-level nodes, such as lagging birds' howl in a flock or the opinions of lower-level individuals feeding back to higher-level ones in a social group. This study reveals that, for most large-scale real hierarchical networks, the majority of the reverse edges do not affect the synchronization process of the entire network; the synchronization process is influenced only by a small part of these reverse edges along specific paths. More surprisingly, a single effective reverse edge can slow down the synchronization of a huge hierarchical network by over 60%. The effect of such edges depends not on the network size but only on the average in-degree of the involved subnetwork. The overwhelming majority of active reverse edges turn out to have some kind of “bunching” effect on the information flows of hierarchical networks, which slows down synchronization processes. This finding refines the current understanding of the role of reverse edges in many natural, social, and engineering hierarchical networks, which might be beneficial for precisely tuning the synchronization rhythms of these networks. Our study also proposes an effective way to attack a hierarchical network by adding a malicious reverse edge to it and provides some guidance for protecting a network by screening out the specific small proportion of vulnerable nodes.
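As a toy numerical illustration of the slowdown effect described above (not the paper's model or measure), the sketch below compares linear consensus dynamics x' = -Lx on a directed chain with and without a single reverse edge from the lowest-level node back to the root, using the smallest nonzero real part of the Laplacian eigenvalues as a proxy for synchronization speed.

```python
# Toy illustration (not the paper's model or metric): one reverse edge from the
# lowest-level node back to the root of a directed chain slows the linear
# consensus dynamics x' = -L x, as seen from the smallest nonzero real part of
# the eigenvalues of the in-degree Laplacian L.
import numpy as np

def chain_laplacian(n: int, reverse_edge: bool) -> np.ndarray:
    """In-degree Laplacian of a directed chain 0 -> 1 -> ... -> n-1,
    optionally with one reverse edge (n-1) -> 0."""
    L = np.zeros((n, n))
    for i in range(1, n):
        L[i, i] += 1.0
        L[i, i - 1] -= 1.0
    if reverse_edge:
        L[0, 0] += 1.0
        L[0, n - 1] -= 1.0
    return L

def convergence_rate(L: np.ndarray) -> float:
    """Smallest nonzero real part of eig(L): larger means faster consensus."""
    re = np.sort(np.linalg.eigvals(L).real)
    return float(re[re > 1e-9][0])

n = 20
r0 = convergence_rate(chain_laplacian(n, reverse_edge=False))
r1 = convergence_rate(chain_laplacian(n, reverse_edge=True))
print(f"rate without reverse edge: {r0:.3f}")   # 1.000
print(f"rate with one reverse edge: {r1:.3f}")  # ~0.049, a large slowdown
```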
In designing efficient perovskite solar cells (PSCs), the selection of suitable electron transport layers (ETLs) is critical to the final device performance, as they determine the driving force for selective charge extraction. SnO_(2) nanoparticle (NP)-based ETLs have been a popular choice for PSCs due to superior electron mobility, but their relatively deep-lying conduction band energy levels (ECB) result in substantial potential loss. Meanwhile, TiO_(2) NPs establish favorable band alignment owing to a shallower ECB, but their low intrinsic mobility and abundant surface trap sites impede the final performance. For this reason, constructing a cascaded bilayer ETL is highly desirable for efficient PSCs, as it can rearrange energy levels and exploit the advantages of each individual ETL. In this study, we prepare SnO_(2) NPs and acetylacetone-modified TiO_(2) (Acac-TiO_(2)) NPs and implement them as a bilayer SnO_(2)/Acac-TiO_(2) (BST) ETL to assemble a cascaded energy band structure. SnO_(2) contributes to rapid charge carrier transport from high electron mobility, while Acac-TiO_(2) minimizes band offset and effectively suppresses interfacial recombination. Accordingly, the optimized BST ETL generates a synergistic influence and delivers power conversion efficiency (PCE) as high as 23.14%, with open-circuit voltage (V_(oc)) reaching 1.14 V. Furthermore, the BST ETL is transferred to a large scale, and the corresponding mini module demonstrates a peak performance of 18.39% PCE from a 25 cm^(2) aperture area. Finally, the BST-based mini module exhibits excellent stability, maintaining 83.1% of its initial efficiency after 1000 h under a simultaneous 1 Sun light-soaking and damp heat (85 ℃/RH 85%) environment.
We present a large deviation theory that characterizes the exponential estimate for rare events in stochastic dynamical systems in the limit of weak noise. We aim to consider a next-to-leading-order approximation for more accurate calculation of the mean exit time by computing large deviation prefactors with the aid of machine learning. More specifically, we design a neural network framework to compute quasipotential, most probable paths and prefactors based on the orthogonal decomposition of a vector field. We corroborate the higher effectiveness and accuracy of our algorithm with two toy models. Numerical experiments demonstrate its powerful functionality in exploring the internal mechanism of rare events triggered by weak random fluctuations.
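For orientation, the quantities named above fit the standard weak-noise framework sketched below in generic notation (an assumed formulation, not necessarily the paper's exact setup): the drift is split orthogonally into a gradient part, which fixes the quasipotential V, and the mean exit time is exponential in V, with the prefactor supplying the next-to-leading-order correction.

```latex
% Generic weak-noise setting (assumed notation, not necessarily the paper's):
\[
dX_t = f(X_t)\,dt + \sqrt{\varepsilon}\,dW_t,
\qquad
f = -\nabla U + \ell, \quad \nabla U \cdot \ell = 0 .
\]
% The gradient part of this orthogonal decomposition fixes the quasipotential,
% V(x) = 2\,(U(x) - U(x_s)), and the mean exit time from the basin D of the
% stable state x_s is exponential in V, with the prefactor C supplying the
% next-to-leading-order correction targeted above:
\[
\mathbb{E}[\tau_D] \;\simeq\; C\,
\exp\!\Bigl(\tfrac{1}{\varepsilon}\,\min_{y \in \partial D} V(y)\Bigr),
\qquad \varepsilon \to 0 .
\]
```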
We are concerned with the large-time behavior of 3D quasilinear hyperbolic equations with nonlinear damping. The main novelty of this paper is two-fold. First, we prove the optimal decay rates of the second and third order spatial derivatives of the solution, which are the same as those of the heat equation and, in particular, are faster than those of previous related works. Second, for well-chosen initial data, we also show that the lower optimal L^(2) convergence rate of the k (∈[0,3])-order spatial derivatives of the solution is (1+t)^(-(2+2k)/4). Therefore, our decay rates are optimal in this sense. The proofs are based on the Fourier splitting method, low-frequency and high-frequency decomposition, and delicate energy estimates.
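Written in display form, the lower-bound statement above reads roughly as follows; this is only a transcription of the rate as given in the abstract, with generic constants, not an independent derivation.

```latex
% Transcription of the stated lower bound (c a generic positive constant,
% u the solution); the upper-bound rates for k = 2, 3 are stated to coincide
% with those of the heat equation:
\[
\bigl\|\nabla^{k} u(t)\bigr\|_{L^{2}} \;\ge\; c\,(1+t)^{-\frac{2+2k}{4}},
\qquad k \in [0,3],
\]
% for well-chosen initial data.
```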
This paper presents the design and verification of a dual-mode core driven fan stage (CDFS) and high-load compressor with a large flow regulation range. In view of the large flow regulation range of the two modes and the high average stage load coefficient, this paper investigates the design technology of a dual-mode high-efficiency compressor with a large flow regulation range and a high-load compressor with an average stage load coefficient of 0.504. Building upon this research, the design of the dual-mode CDFS and four-stage compressor is completed, and three-dimensional numerical simulation of the two modes is carried out. Finally, a performance experiment is conducted to verify the results of the three-dimensional numerical simulation. The experiment results show that the compressor performance is improved over the whole range of working conditions by using the new design method, which realizes the complete fusion design of the CDFS and high-pressure compressor (HPC). The matching mechanism of the stage characteristics of the single and double bypass modes and the influence of different adjustment angles on performance are studied comprehensively. Furthermore, the design effectively reduces the length and weight of the compressor and breaks through key technologies such as the high-load compressor with an average stage load coefficient of 0.504. These findings provide valuable data and a methodological foundation for the development of the next generation of aeroengines.
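For context, the stage load coefficient quoted above is conventionally defined as the stage work (stagnation enthalpy rise) normalized by the square of the blade speed; the display below is the common textbook form, assumed here since the paper's exact normalization is not given (some texts include a factor of 1/2).

```latex
% Textbook definition of the stage loading coefficient (assumed normalization;
% some references use U^2/2 in the denominator instead of U^2):
\[
\psi = \frac{\Delta h_{0}}{U^{2}},
\]
% with \Delta h_0 the stagnation enthalpy rise per stage and U the blade speed,
% so an average \psi of 0.504 across the four stages indicates a highly loaded
% compressor.
```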
Bulked-segregant analysis by deep sequencing (BSA-seq) is a widely used method for mapping QTL (quantitative trait loci) due to its simplicity, speed, cost-effectiveness, and efficiency. However, the ability of BSA-seq to detect QTL is often limited by inappropriate experimental designs, as evidenced by numerous practical studies. Most BSA-seq studies have utilized small to medium-sized populations, with F_(2) populations being the most common choice. Nevertheless, theoretical studies have shown that using a large population with an appropriate pool size can significantly enhance the power and resolution of QTL detection in BSA-seq, with F_(3) populations offering notable advantages over F_(2) populations. To provide an experimental demonstration, we tested the power of BSA-seq to identify QTL controlling days from sowing to heading (DTH) in a 7200-plant rice F_(3) population in two environments, with a pool size of approximately 500. Each experiment identified 34 QTL, an order of magnitude greater than reported in most BSA-seq experiments, of which 23 were detected in both experiments, with 17 of these located near 41 previously reported QTL and eight cloned genes known to control DTH in rice. These results indicate that QTL mapping by BSA-seq in large F_(3) populations and multi-environment experiments can achieve high power, resolution, and reliability.
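As a minimal sketch of the statistic class behind BSA-seq QTL detection, the code below computes a per-SNP Delta(SNP-index) between the two phenotypic bulks and smooths it along each chromosome; it is a generic illustration of the approach, not the study's pipeline, and the column names and window size are hypothetical.

```python
# Minimal sketch of the Delta(SNP-index) statistic commonly used in BSA-seq
# (generic illustration of the method class, not the pipeline used in the study).
# Input: per-SNP alternate/total read counts for the two phenotypic bulks.
import pandas as pd

def snp_index(alt: pd.Series, total: pd.Series) -> pd.Series:
    """Fraction of reads carrying the alternate allele at each SNP."""
    return alt / total

def delta_snp_index(df: pd.DataFrame, window: int = 25) -> pd.DataFrame:
    """Delta(SNP-index) = index(high bulk) - index(low bulk), smoothed with a
    sliding window of `window` SNPs per chromosome; peaks suggest QTL regions."""
    out = df.copy()
    out["idx_high"] = snp_index(df["alt_high"], df["total_high"])
    out["idx_low"] = snp_index(df["alt_low"], df["total_low"])
    out["delta"] = out["idx_high"] - out["idx_low"]
    out["delta_smooth"] = (
        out.groupby("chrom")["delta"]
           .transform(lambda s: s.rolling(window, center=True, min_periods=1).mean())
    )
    return out

# Toy usage with hypothetical columns: chrom, pos, alt_high, total_high, alt_low, total_low
toy = pd.DataFrame({
    "chrom": ["chr1"] * 5,
    "pos": [100, 200, 300, 400, 500],
    "alt_high": [30, 32, 35, 10, 12],
    "total_high": [60, 60, 60, 60, 60],
    "alt_low": [28, 15, 12, 30, 31],
    "total_low": [60, 60, 60, 60, 60],
})
print(delta_snp_index(toy, window=3)[["pos", "delta", "delta_smooth"]])
```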