Solar Wind Charge eXchange (SWCX) X-ray emission in the heliosphere and Earth's exosphere is a hard-to-avoid signal in soft X-ray observations of astrophysical targets. On the other hand, the X-ray imaging possibilities offered by the SWCX process have led to an increasing number of future dedicated space missions for investigating solar wind-terrestrial interactions and magnetospheric interfaces. In both cases, accurate modelling of the SWCX emission is key to correctly interpreting its signal and removing it from observations when needed. In this paper, we compile solar wind abundance measurements from ACE for different solar wind types, and atomic data from the literature, including charge exchange cross-sections and emission probabilities, used for calculating the compound cross-section α for the SWCX X-ray emission. We calculate α values for charge exchange with H and He, relevant to soft X-ray energy bands (0.1-2.0 keV), for various solar wind types and solar cycle conditions.
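Schematically, the compound cross-section described in this abstract is an abundance-weighted sum over solar wind ion species of the charge-exchange cross-section times the photon yield of the radiative cascade. The sketch below illustrates that combination with placeholder numbers; the ion data are invented, not the ACE abundances or literature cross-sections used in the paper.

```python
# Schematic compound cross-section for SWCX: abundance-weighted sum over ion
# species of (charge-exchange cross-section) x (photon yield of the cascade).
# All numbers below are placeholders, not measured abundances or cross-sections.

def compound_cross_section(ions):
    total = 0.0
    for ion in ions:
        # photon yield per charge-exchange event: sum of line emission probabilities
        yield_per_cx = sum(line["prob"] for line in ion["lines"])
        total += ion["abundance"] * ion["sigma_cm2"] * yield_per_cx
    return total

ions = [
    {"abundance": 3e-4, "sigma_cm2": 3e-15,          # hypothetical ion species 1
     "lines": [{"energy_eV": 561, "prob": 0.6},
               {"energy_eV": 568, "prob": 0.2}]},
    {"abundance": 1e-4, "sigma_cm2": 4e-15,          # hypothetical ion species 2
     "lines": [{"energy_eV": 653, "prob": 0.7}]},
]
alpha = compound_cross_section(ions)   # cm^2 per neutral atom, schematically
```

For a band-resolved α(E) of the kind used with the 0.1-2.0 keV bands, each line's contribution would be accumulated per energy bin rather than summed into one scalar.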
Solar wind charge exchange (SWCX) is the process of solar wind highly charged ions exchanging charges with neutral components and generating soft X-rays. Recently, detecting the SWCX emission from the magnetosphere has been proposed as a new technique to study the magnetosphere using panoramic soft X-ray imaging. To better prepare for the data analysis of upcoming magnetospheric soft X-ray imaging missions, this paper compares the magnetospheric SWCX emission obtained by two methods in an XMM-Newton observation during which the solar wind changed dramatically. The two methods differ in the data used to fit the diffuse X-ray background (DXB) parameters in spectral analysis. The method adding data from the ROSAT All-Sky Survey (RASS) is called the RASS method. The method using quiet observation data is called the Quiet method, where quiet observations usually refer to observations made by the same satellite of the same target but under weaker solar wind conditions. Results show that the spectral compositions of the magnetospheric SWCX emission obtained by the two methods are very similar, and the changes in intensity over time are highly consistent, although the intensity obtained by the RASS method is about 2.68±0.56 keV cm^(-2) s^(-1) sr^(-1) higher than that obtained by the Quiet method. Since the DXB intensity obtained by the RASS method is about 2.84±0.74 keV cm^(-2) s^(-1) sr^(-1) lower than that obtained by the Quiet method, and the linear correlation coefficient between the differences of SWCX and DXB obtained by the two methods in different energy bands is close to -1, the differences in magnetospheric SWCX can be fully attributed to the differences in the fitted DXB. The difference between the two methods is most significant when the energy is less than 0.7 keV, which is also the main energy band of SWCX emission. In addition, the difference between the two methods is not related to the SWCX intensity and, to some extent, to solar wind conditions, because SWCX intensity typically varies with the solar wind. In summary, both methods are robust and reliable, and the choice between them should be made based on the best available options.
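The anti-correlation argument above (a per-band correlation coefficient close to -1 between the SWCX and DXB differences) can be illustrated with synthetic numbers; the values below are made up for the sketch, not the fitted intensities from the observation.

```python
import numpy as np

# Synthetic illustration of the anti-correlation argument: per energy band,
# the RASS-minus-Quiet difference in fitted SWCX mirrors the difference in
# fitted DXB with opposite sign. All numbers are invented for the sketch.
d_swcx = np.array([2.9, 2.1, 1.4, 0.8, 0.3])              # per-band SWCX difference
d_dxb = -d_swcx + np.random.default_rng(0).normal(0.0, 0.05, 5)

r = np.corrcoef(d_swcx, d_dxb)[0, 1]                      # expected close to -1
```

When the fitted total flux is conserved, any flux the model moves out of the DXB components must appear in the SWCX component, which is exactly the near-perfect anti-correlation the paper reports.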
In the cloud environment, ensuring a high level of data security is in high demand. Data planning storage optimization is part of the whole security process in the cloud environment. It enables data security by avoiding the risk of data loss and data overlapping. The development of data flow scheduling approaches in the cloud environment that take security parameters into account is insufficient. In our work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which must periodically collect information on the state of the network links. The second is the monitoring agent, which must analyze and classify that information, make a decision on the state of each link, and transmit this information to the scheduler. The third is the scheduler, which must consider the previous information to transfer user data, including fair distribution and reliable paths. It should be noted that each part of the proposed model requires the development of its own algorithms. In this article, we are interested in the development of data transfer algorithms, including fair distribution with consideration of a stable link state. These algorithms are based on the grouping of transmitted files and an iterative method. The proposed algorithms show good performance in obtaining an approximate solution to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME) algorithm, with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
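The abstract does not spell out the internals of the HME algorithm, so as an illustration of the general grouped-dispatch idea, here is a minimal greedy sketch that assigns files (largest first) to the link with the smallest projected completion time. This is a hypothetical simplification for intuition, not the paper's algorithm.

```python
# Greedy sketch of grouped file dispatch: each link has a measured bandwidth,
# and each file goes to the link with the smallest projected finish time.
# Illustrative only; not the HME algorithm from the paper.

def dispatch(files_mb, link_bw_mbps):
    loads = [0.0] * len(link_bw_mbps)            # projected transfer time per link
    plan = [[] for _ in link_bw_mbps]
    for size in sorted(files_mb, reverse=True):  # largest-first balancing
        i = min(range(len(loads)), key=lambda k: loads[k] + size / link_bw_mbps[k])
        loads[i] += size / link_bw_mbps[i]
        plan[i].append(size)
    return plan, max(loads)                      # max(loads) is the makespan

plan, makespan = dispatch([80, 50, 40, 30, 10], [100, 50])
```

With these inputs the fast link receives the 80, 40, and 30 MB files and the slow link the rest, giving a makespan of 1.5 s; an exact optimum for this NP-hard assignment problem would require search rather than a greedy pass, which is why the paper compares approximate algorithms.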
A launching system with a filter cartridge structure was proposed to improve the muzzle velocity of the projectile. The combustion chamber of the launching system is divided into two fixed chambers: one is located in the breech, and the other is arranged in the barrel. The breech chamber charge was ignited first, and the charges in the auxiliary chambers were ignited by the high-temperature, high-pressure combustible gas trailing the projectile. In this way, the combustible gas in the auxiliary chambers could compensate for the pressure drop caused by the movement of the projectile. The proposed device features the advantage of launching a projectile with high muzzle velocity without exceeding the maximum pressure in the chamber. In order to obtain the internal ballistic characteristics of the launch system, some critical structural parameters, such as the length of the filter cartridge auxiliary charge, the combustion degree of the propellant in the chamber, and the length of the barrel, are discussed. The experimental results show that with increased auxiliary charge length, a pressure plateau or even a secondary pressure peak can be formed, which is less than the peak pressure. The projectile velocity increased by 23.57%, 14.64%, and 7.65% when the diaphragm thickness was 0 mm, 1 mm, and 2 mm, respectively. The muzzle velocity of the projectile can be increased by 13.42% by increasing the length of the barrel. Under the same charge condition, with the increase of barrel length, the energy utilization rate of the propellant increases by 28.64%.
This paper proposes a type of double-layer shaped charge liner fabricated using chemical vapor deposition (CVD) that has tungsten as its inner liner. The feasibility of this design was evaluated through penetration tests. Double-layer charge liners were fabricated by using CVD to deposit tungsten layers on the inner surfaces of pure T2 copper liners. The microstructures of the tungsten layers were analyzed using a scanning electron microscope (SEM). The feasibility analysis was carried out with pulsed X-rays, a slug-retrieval test, and static penetration tests. The shaped charge jet formation and penetration behavior of the inner tungsten-coated double-layer liner were studied by numerical simulation. The results showed that the double-layer liners could form well-shaped jets. The errors between the X-ray test results and the numerical results were within 11.07%. The slug-retrieval test found that the retrieved slug was similar to the numerically simulated slug. Compared with the traditional pure copper shaped charge jet, the penetration depth of the double-layer shaped charge liner increased by 11.4% and more than 10.8%, respectively. In summary, the test results are good, and the numerical simulation is in good agreement with the tests, which verifies the feasibility of using the CVD method to fabricate double-layer charge liners with a high-density, high-strength refractory metal as the inner liner.
The state-selective cross section data are useful for understanding and modeling the X-ray emission in celestial observations. In the present work, using cold target recoil ion momentum spectroscopy, we investigated for the first time the state-selective single electron capture processes for S^(q+)-He and S^(q+)-H_(2) (q = 11-15) collision systems at an impact energy of q×20 keV and obtained the relative state-selective cross sections. The results indicate that only a few principal quantum states of the projectile energy levels are populated in a single electron capture process. In particular, an increase of the projectile charge state leads to the population of states with higher principal quantum numbers. It is also shown that the experimental averaged n-shell populations are reproduced well by the over-barrier model. The database is openly available in Science Data Bank at 10.57760/sciencedb.j00113.00091.
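The over-barrier model mentioned above has a standard classical formulation (in atomic units) that predicts the dominant capture shell. The sketch below uses one common form of that model, not necessarily the exact variant applied in the paper; the ionization potential 0.9036 a.u. corresponds to helium (24.59 eV).

```python
import math

# Classical over-barrier estimate of the most-populated n-shell in single
# electron capture (atomic units). One common formulation:
#   capture radius   R_c = (2*sqrt(q) + 1) / I_t
#   binding energy   E_b = I_t + (q - 1) / R_c
#   capture shell    n   = q / sqrt(2 * E_b)

def over_barrier_n(q, ionization_au):
    r_c = (2.0 * math.sqrt(q) + 1.0) / ionization_au   # barrier-suppression radius
    e_b = ionization_au + (q - 1.0) / r_c              # binding on the projectile
    return q / math.sqrt(2.0 * e_b)                    # hydrogen-like shell estimate

n_est = over_barrier_n(15, 0.9036)   # S15+ on He
```

For q = 15 this lands near n = 7, and the estimate grows with q, matching the abstract's observation that higher projectile charge states populate higher principal quantum numbers.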
In the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) is a conglomerate of physical servers whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB), orchestrates DC selection. Failure to route user requests to suitable DCs transforms the CSB itself into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Among the numerous CSB policies, implementation grapples with challenges such as cost and availability. This article undertakes a holistic review of diverse CSB policies while surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it clarifies the various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares the many DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summary, this review underscores the paramount importance of adept CSB policies in DC selection, highlighting the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
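Concretely, many DC selection policies of the kind surveyed reduce to scoring each datacenter on weighted QoS metrics and picking the best. The toy sketch below uses hypothetical metric names and weights, not any specific policy from the review.

```python
# Toy weighted-score CSB policy: score each DC on inverse latency, inverse cost,
# and availability, then pick the highest score. Metrics and weights are
# illustrative assumptions, not taken from any surveyed policy.

def select_dc(dcs, w_lat=0.5, w_cost=0.3, w_avail=0.2):
    def score(dc):
        return (w_lat * (1.0 / dc["latency_ms"])
                + w_cost * (1.0 / dc["cost_per_hr"])
                + w_avail * dc["availability"])
    return max(dcs, key=score)

dcs = [
    {"name": "dc-east", "latency_ms": 40, "cost_per_hr": 0.10, "availability": 0.999},
    {"name": "dc-west", "latency_ms": 90, "cost_per_hr": 0.06, "availability": 0.995},
]
best = select_dc(dcs)
```

A real broker would normalize each metric before weighting; with raw inverse values, as here, the cheapest DC dominates, which is itself a small illustration of why policy design details matter.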
The cloud type product 2B-CLDCLASS-LIDAR based on CloudSat and CALIPSO from June 2006 to May 2017 is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated at low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those for ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
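The correlation analysis behind conclusion (5) amounts to correlating monthly anomaly time series of cloud frequency against temperature. A synthetic-data sketch of that step, with made-up anomalies rather than the CloudSat/CALIPSO record:

```python
import numpy as np

# Sketch of the anomaly-correlation step: correlate monthly cloud-frequency
# anomalies with temperature anomalies. The series are synthetic, built to
# mimic the signs reported in the abstract (water +, ice -).
rng = np.random.default_rng(7)
temp_anom = rng.normal(0.0, 1.0, 120)                        # 10 years, monthly
water_cloud_anom = 0.6 * temp_anom + rng.normal(0.0, 0.5, 120)
ice_cloud_anom = -0.6 * temp_anom + rng.normal(0.0, 0.5, 120)

r_water = np.corrcoef(temp_anom, water_cloud_anom)[0, 1]     # positive
r_ice = np.corrcoef(temp_anom, ice_cloud_anom)[0, 1]         # negative
```

In the study this correlation is evaluated level by level through the troposphere; the sketch collapses that to a single pair of series to show the computation.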
An intense laser pulse focused onto a plasma can excite nonlinear plasma waves. Under appropriate conditions, electrons from the background plasma are trapped in the plasma wave and accelerated to ultra-relativistic velocities. This scheme is called a laser wakefield accelerator. In this work, we present results from a laser wakefield acceleration experiment using a petawatt-class laser to excite the wakefields as well as nanoparticles to assist the injection of electrons into the accelerating phase of the wakefields. We find that a 10-cm-long, nanoparticle-assisted laser wakefield accelerator can generate 340 pC, 10±1.86 GeV electron bunches with a 3.4 GeV rms convolved energy spread and a 0.9 mrad rms divergence. It can also produce bunches with lower energies in the 4-6 GeV range.
The access control scheme is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, complex permission updates, and so on. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys. A credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of policy hiding, the research proposes a fine-grained access control method based on users, user attributes, and file structure, which optimizes the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proved to be secure. Comparative analysis and experimental results show that the proposed scheme has higher efficiency and more functions, and can meet the needs of medical institutions.
This paper focuses on the task of few-shot 3D point cloud semantic segmentation. Despite some progress, this task still encounters many issues due to the insufficient samples given, e.g., incomplete object segmentation and inaccurate semantic discrimination. To tackle these issues, we first leverage part-whole relationships in the task of 3D point cloud semantic segmentation to capture semantic integrity, which is empowered by dynamic capsule routing with a module of 3D Capsule Networks (CapsNets) in the embedding network. Concretely, the dynamic routing amalgamates geometric information of the 3D point cloud data to construct higher-level feature representations, which capture the relationships between object parts and their wholes. Secondly, we design a multi-prototype enhancement module to enhance prototype discriminability. Specifically, the single-prototype enhancement mechanism is expanded to a multi-prototype enhancement version for capturing rich semantics. Besides, the shot-correlation within each category is calculated via the interaction of different samples to enhance intra-category similarity. Ablation studies prove that the involved part-whole relations and the proposed multi-prototype enhancement module help to achieve complete object segmentation and improve semantic discrimination. Moreover, under the integration of these two modules, quantitative and qualitative experiments on two public benchmarks, S3DIS and ScanNet, indicate the superior performance of the proposed framework on the task of 3D point cloud semantic segmentation compared to some state-of-the-art methods.
As the extensive use of cloud computing raises questions about the security of any personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources on various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) while implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors, Hyper-V and KVM, in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), Advanced Encryption Standard (AES), Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The study's findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations with various file sizes. The findings emphasize how crucial it is to pick a hypervisor that is appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus more on how various hypervisors perform while handling cryptographic workloads.
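A benchmark of this kind is essentially a timing harness run over several file sizes inside each hypervisor's guest. The sketch below uses SHA-256 as a stand-in workload so it stays dependency-free; the ciphers actually benchmarked in the study (RSA, AES, TripleDES, CAST-128, Blowfish, Twofish) require third-party libraries.

```python
import hashlib
import os
import time

# Minimal timing harness for comparing cryptographic throughput across file
# sizes. SHA-256 stands in for the study's ciphers, which need external
# libraries; the harness itself is the point of the sketch.

def time_workload(workload, sizes_bytes, repeats=3):
    results = {}
    for size in sizes_bytes:
        data = os.urandom(size)
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            workload(data)
            best = min(best, time.perf_counter() - t0)  # best-of-N damps noise
        results[size] = best
    return results

timings = time_workload(lambda d: hashlib.sha256(d).digest(), [2**14, 2**18])
```

Running the same harness on each hypervisor and comparing the per-size timings (and CPU counters) is the shape of the comparison the study reports.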
logical testing model and resource lifecycle information, generate test cases and complete parameters, and alleviate inconsistency issues through parameter inference. We also propose a method of analyzing test results using joint state codes and call stack information, which compensates for the shortcomings of traditional analysis methods. We apply our method to testing REST services, including OpenStack, an open-source cloud operating platform, for experimental evaluation. We found a series of inconsistencies, known vulnerabilities, and new, previously unknown logical defects.
Cavitation is a prevalent phenomenon in the domain of ship and ocean engineering, predominantly occurring in the tail flow fields of high-speed rotating propellers and on the surfaces of high-speed underwater vehicles. The re-entrant jet and the compression wave resulting from the collapse of cavity vapour are pivotal factors contributing to cavity instability. Concurrently, these phenomena significantly modulate the evolution of the cavitation flow. In this paper, numerical investigations of cloud cavitation over a Clark-Y hydrofoil were conducted, utilizing the Large Eddy Simulation (LES) turbulence model and the Volume of Fluid (VOF) method within the OpenFOAM framework. A comparative analysis of results obtained at different angles of attack is undertaken. A discernible augmentation in cavity thickness is observed with escalation of the attack angle, alongside a progressive intensification of the pressure at the leading edge of the hydrofoil contributing to the suction force. These results can serve as a fundamental point of reference for a deeper comprehension of cloud cavitation dynamics.
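Cavitation regimes like the cloud cavitation studied here are conventionally characterized by the cavitation number sigma = (p_inf - p_v) / (0.5 * rho * U^2). A quick calculation with illustrative water-tunnel-like values, not the paper's actual flow conditions:

```python
# Cavitation number: ratio of the static pressure margin above vapour pressure
# to the free-stream dynamic pressure. Lower sigma means cavitation is easier.

def cavitation_number(p_inf, p_vapor, rho, velocity):
    return (p_inf - p_vapor) / (0.5 * rho * velocity**2)

# Illustrative values: water at ~25 C (p_v ~ 3.17 kPa, rho ~ 997 kg/m^3),
# free-stream 10 m/s at 43 kPa ambient pressure.
sigma = cavitation_number(43_000.0, 3_170.0, 997.0, 10.0)   # ~0.8
```

Values of sigma around 0.8 on a Clark-Y foil are in the range where cloud cavitation, with its periodic re-entrant jet shedding, is commonly reported.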
Reactive armour is a very efficient add-on armour against shaped charge threats. Explosive reactive armour consists of one or several plates that are accelerated by an explosive. Similar but less violent acceleration of plates can also be achieved in a completely inert reactive armour. To be efficient against elongated jets, the motion of the plates needs to be inclined against the jet such that a sliding contact between the jet and the plates is established. This sliding contact causes a deflection and thinning of the jet. Under certain circumstances, the contact will become unstable, leading to severe disturbances of the jet. These disturbances drastically reduce the jet's penetration performance, and it is therefore of interest to study the conditions that lead to an unstable contact. Previous studies on the interaction between shaped charge jets and flyer plates have shown that it is mainly the forward-moving plate in an explosive reactive armour that is effective in disturbing the jet. This is usually attributed to the higher plate-to-jet mass flux ratio involved in the collision with the forward-moving plate compared to the backward-moving plate. For slow-moving plates, as occur in inert reactive armour, the difference in mass flux between the forward- and backward-moving plates is much smaller, and it is therefore of interest to study whether factors other than the mass flux influence the protection capability. In this work, experiments have been performed in which a plate is accelerated along its length, interacting with a shaped charge jet that is fired at an oblique angle to the plate's normal, either against or along the plate's velocity. The arrangement corresponds to a jet interacting with a flyer plate from a reactive armour, with the exception that the collision velocity is the same for both types of obliquity in these experiments. The experiments show that the disturbances of the jet are different in the two cases even though the collision velocities are the same. Numerical simulations of the interaction support this observation. The difference is attributed to the character of the contact pressure in the interaction region. For a backward-moving plate, the maximum contact pressure is obtained at the beginning of the interaction zone, and the contact pressure is therefore higher upstream than downstream of the jet, while the opposite is true for a forward-moving plate. A negative interface pressure gradient with respect to the jet motion results in a more stable flow than a positive one, which means that the jet-plate contact is more stable for a backward-moving plate than for a forward-moving plate. A forward-moving plate is thus more effective in disturbing the jet than a backward-moving plate, not only because of the higher plate-to-jet mass flux ratio but also because of the character of the contact with the jet.
To study the effects of the initiation position on the damage and fracture characteristics of linear-charge blasting, blasting model experiments were conducted using computed tomography scanning and three-dimensional reconstruction methods. Fractal damage theory was used to quantify the crack distribution and damage degree of sandstone specimens after blasting. The results showed that, for both inverse and top initiation, the plugging medium of the borehole is effective due to compression deformation and sliding frictional resistance, and the energy of the explosive gas near the top of the borehole is consumed. This affects the effective crushing of rock near the top of the borehole, where the extent of damage to Sections I and II is less than that of Sections III and IV. In addition, the analysis revealed that under top initiation, the reflected tensile damage of the rock at the free face at the top of the borehole, together with the compression deformation of the plug and friction, consumes more blasting energy, resulting in lower blasting energy efficiency for top initiation. As a result, the overall damage degree of the specimens in the top-initiation group was significantly smaller than that in the inverse-initiation group. Under inverse initiation, the blasting energy efficiency is greater, causing the specimen to experience greater damage. Therefore, in the engineering practice of rock tunnel cut blasting, to utilize blasting energy effectively and enhance rock fragmentation, the inverse-initiation method is recommended. In addition, in three-dimensional (3D) rock blasting, the bottom of the borehole shows obvious end effects under inverse initiation, and the crack distribution at the bottom of the borehole is trumpet-shaped. The occurrence of the end effect in the 3D linear-charge blasting model experiment is related to the initiation position and the blocking condition.
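The fractal quantification of crack distributions mentioned above typically rests on a box-counting dimension estimate: count the boxes a crack map occupies at several scales and fit log N(s) against log(1/s). A minimal sketch on a synthetic crack map (a straight line, whose box-counting dimension should come out near 1):

```python
import numpy as np

# Box-counting dimension on a binary crack map. The map here is a synthetic
# straight "crack" across a 256x256 grid; a real analysis would use the
# segmented crack voxels from the CT reconstruction.

grid = np.zeros((256, 256), dtype=bool)
idx = np.arange(256)
grid[idx, idx] = True                       # diagonal line as the crack

def box_count_dimension(image, scales=(2, 4, 8, 16, 32)):
    counts = []
    for s in scales:
        h, w = image.shape
        boxes = image[: h - h % s, : w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())   # occupied boxes at scale s
    # slope of log N versus log(1/s) estimates the fractal dimension
    return np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)[0]

dim = box_count_dimension(grid)
```

A dense branching crack network would yield a dimension well above 1, which is how a single fractal-dimension number can rank the damage degree of different specimens.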
Free-standing covalent organic framework (COF) nanofilms exhibit a remarkable ability to rapidly intercalate/de-intercalate Li^(+) in lithium-ion batteries, while simultaneously exposing abundant active sites in supercapacitors. The development of these nanofilms offers a promising solution to the persistent challenge of imbalanced charge storage kinetics between the battery-type anode and the capacitor-type cathode in lithium-ion capacitors (LICs). Herein, for the first time, custom-made COF_(BTMB-TP) and COF_(TAPB-BPY) nanofilms are synthesized as the anode and cathode, respectively, for an all-COF nanofilm-structured LIC. The COF_(BTMB-TP) nanofilm with strongly electronegative -CF_(3) groups enables tuning of the partial electron cloud density for Li^(+) migration to ensure a rapid anode kinetic process. The thickness-regulated cathodic COF_(TAPB-BPY) nanofilm can match the anodic COF nanofilm in capacity. Due to the aligned 1D channels, 2D aromatic skeleton, and accessible active sites of the COF nanofilms, the whole COF_(TAPB-BPY)//COF_(BTMB-TP) LIC demonstrates a high energy density of 318 mWh cm^(-3) at a high power density of 6 W cm^(-3), excellent rate capability, and good cycle stability with a capacity retention rate of 77% after 5000 cycles. The COF_(TAPB-BPY)//COF_(BTMB-TP) LIC represents a new benchmark for currently reported film-type LICs and even film-type supercapacitors. After comprehensive exploration via ex situ XPS, ^(7)Li solid-state NMR analyses, and DFT calculations, it is found that the COF_(BTMB-TP) nanofilm facilitates the reversible conversion of semi-ionic to ionic C–F bonds during lithium storage. COF_(BTMB-TP) exhibits a strong interaction with Li^(+) due to the C–F, C=O, and C–N bonds, facilitating Li^(+) desolvation and absorption from the electrolyte. This work addresses the challenge of imbalanced charge storage kinetics and capacity between the anode and cathode and also paves the way for future miniaturized and wearable LIC devices.
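As a quick consistency check on the headline figures, a device storing 318 mWh cm^(-3) can sustain its quoted 6 W cm^(-3) power density for energy/power, about 3.2 minutes; the arithmetic is the standard Ragone-plot relation between energy and power density.

```python
# Runtime at peak power = volumetric energy density / volumetric power density.
energy_wh_per_cm3 = 318.0 / 1000.0     # 318 mWh cm^-3 converted to Wh cm^-3
power_w_per_cm3 = 6.0                  # 6 W cm^-3
runtime_min = energy_wh_per_cm3 / power_w_per_cm3 * 60.0   # minutes at peak power
```

Sustaining minutes-long discharge at supercapacitor-level power density is what places this LIC between batteries and supercapacitors on a Ragone plot.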
Recently, there have been some attempts to use Transformers in 3D point cloud classification. In order to reduce computations, most existing methods focus on local spatial attention, but ignore content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering the sampled points with similar features into the same class and computing the self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code for this paper is available at https://github.com/yahuiliu99/PointConT.
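The content-based attention idea, clustering points by feature similarity and attending only within each cluster, can be sketched in a few lines of numpy. This is a toy single-assignment-step version for intuition, not the actual PointConT implementation.

```python
import numpy as np

# Toy content-based attention: assign points to feature-space clusters, then
# compute scaled dot-product self-attention separately inside each cluster.
# Attention cost drops from O(N^2) to roughly O(N^2 / n_clusters).

rng = np.random.default_rng(0)
feats = rng.normal(size=(64, 16))          # 64 points, 16-dim features

def cluster_attention(x, n_clusters=4):
    centroids = x[:n_clusters]             # naive initialization for the sketch
    dists = ((x[:, None, :] - centroids[None]) ** 2).sum(-1)
    assign = dists.argmin(axis=1)          # nearest centroid in feature space
    out = np.empty_like(x)
    for c in range(n_clusters):
        members = assign == c
        xc = x[members]
        if xc.size == 0:                   # guard against an empty cluster
            continue
        scores = xc @ xc.T / np.sqrt(x.shape[1])
        scores = np.exp(scores - scores.max(axis=1, keepdims=True))
        attn = scores / scores.sum(axis=1, keepdims=True)   # row-wise softmax
        out[members] = attn @ xc           # attention-weighted feature mix
    return out

y = cluster_attention(feats)
```

Because cluster membership is decided by features rather than coordinates, two distant but similar points can end up attending to each other, which is exactly the limitation of purely spatial local attention that the paper targets.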
Funding: supported by NNSFC grants 42322408, 42188101 and 42074202; the Strategic Pioneer Program on Space Science, CAS (Grant No. XDA15350201); in part by the Research Fund from the Chinese Academy of Sciences and the Specialized Research Fund for State Key Laboratories of China; the Young Elite Scientists Sponsorship Program (CAST-Y202045); and Royal Society grant DHFR1211068.
Abstract: Solar wind charge exchange (SWCX) is the process by which high-charge-state solar wind ions exchange charge with neutral components and generate soft X-rays. Recently, detecting SWCX emission from the magnetosphere has been proposed as a new technique to study the magnetosphere using panoramic soft X-ray imaging. To better prepare for the data analysis of upcoming magnetospheric soft X-ray imaging missions, this paper compares the magnetospheric SWCX emission obtained by two methods in an XMM-Newton observation during which the solar wind changed dramatically. The two methods differ in the data used to fit the diffuse X-ray background (DXB) parameters in the spectral analysis. The method adding data from the ROSAT All-Sky Survey (RASS) is called the RASS method. The method using quiet observation data is called the Quiet method, where quiet observations refer to observations made by the same satellite of the same target but under weaker solar wind conditions. Results show that the spectral compositions of the magnetospheric SWCX emission obtained by the two methods are very similar, and the changes in intensity over time are highly consistent, although the intensity obtained by the RASS method is about 2.68±0.56 keV cm^(-2) s^(-1) sr^(-1) higher than that obtained by the Quiet method. Since the DXB intensity obtained by the RASS method is about 2.84±0.74 keV cm^(-2) s^(-1) sr^(-1) lower than that obtained by the Quiet method, and the linear correlation coefficient between the differences in SWCX and DXB obtained by the two methods in different energy bands is close to -1, the differences in magnetospheric SWCX can be fully attributed to the differences in the fitted DXB. The difference between the two methods is most significant below 0.7 keV, which is also the main energy band of SWCX emission. In addition, the difference between the two methods is not related to the SWCX intensity, and only to some extent to solar wind conditions, because SWCX intensity typically varies with the solar wind. In summary, both methods are robust and reliable, and the choice between them should be made based on the best available options.
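The key consistency argument above — that per-band SWCX differences and DXB differences between the two fitting methods anti-correlate with a coefficient near -1 — can be sketched numerically. The band values below are synthetic, purely to illustrate the check, not the paper's fitted intensities.

```python
# Sketch of the anti-correlation check: if the per-band differences
# Delta_SWCX and Delta_DXB (RASS minus Quiet) have Pearson r ~ -1,
# the SWCX discrepancy is explained by the background fit.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# synthetic per-energy-band differences [keV cm^-2 s^-1 sr^-1]
delta_swcx = [1.9, 1.2, 0.6, 0.3, 0.1]
delta_dxb  = [-2.0, -1.3, -0.7, -0.3, -0.1]

r = pearson_r(delta_swcx, delta_dxb)
print(f"r = {r:.3f}")   # close to -1
```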
Funding: the Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia, for funding this research work through Project Number IFP-2022-34.
Abstract: In the cloud environment, ensuring a high level of data security is in high demand. Data planning storage optimization is part of the whole security process in the cloud environment. It enables data security by avoiding the risk of data loss and data overlapping. The development of data flow scheduling approaches in the cloud environment that take security parameters into account is insufficient. In our work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which must periodically collect information on the state of the network links. The second is the monitoring agent, which must analyze and classify that information, make a decision on the state of the link, and finally transmit this information to the scheduler. The third is the scheduler, which must consider the previous information to transfer user data, including fair distribution and reliable paths. It should be noted that each part of the proposed model requires the development of its own algorithms. In this article, we are interested in the development of data transfer algorithms, including fairness of distribution with consideration of a stable link state. These algorithms are based on the grouping of transmitted files and the iterative method. The proposed algorithms obtain an approximate solution to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME) algorithm, with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
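The abstract's file-grouping-plus-iteration idea can be illustrated with a small greedy dispatch sketch. This is a hypothetical stand-in written in the spirit of grouped transfer heuristics; the actual HME algorithm is defined in the paper and may differ substantially.

```python
# Hypothetical sketch: sort files by size, bundle them into groups,
# then assign each group to the currently least-loaded link
# (approximate makespan minimization). Names are illustrative only.

def greedy_grouped_dispatch(file_sizes, n_links, group_size=2):
    ordered = sorted(file_sizes, reverse=True)
    groups = [ordered[i:i + group_size]
              for i in range(0, len(ordered), group_size)]
    loads = [0.0] * n_links            # transfer volume per link
    plan = [[] for _ in range(n_links)]
    for g in groups:
        k = loads.index(min(loads))    # least-loaded (most available) link
        plan[k].extend(g)
        loads[k] += sum(g)
    return plan, loads

plan, loads = greedy_grouped_dispatch([9, 7, 5, 4, 3, 2, 1, 1], n_links=3)
print(loads)
```

A real scheduler would replace the static `loads` metric with the link-state information collected by the monitoring agent.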
Funding: financially supported by the National Natural Science Foundation of China under Project Nos. 51874267 and 12272374, and the Fundamental Research Funds for the Central Universities under Project Nos. WK2480000008, WK2480000007, and WK2320000049.
Abstract: A launching system with a filter cartridge structure was proposed to improve the muzzle velocity of the projectile. The combustion chamber of the launching system is divided into two fixed chambers: one located in the breech and the other arranged in the barrel. The breech chamber charge was ignited first, and the charges in the auxiliary chambers were ignited by the high-temperature, high-pressure combustible gas trailing the projectile. In this way, the combustible gas in the auxiliary chambers could compensate for the pressure drop caused by the movement of the projectile. The proposed device offers the advantage of launching a projectile at high muzzle velocity without exceeding the maximum pressure in the chamber. To obtain the internal ballistic characteristics of the launch system, critical structural parameters, such as the length of the filter cartridge auxiliary charge, the combustion degree of the propellant in the chamber, and the length of the barrel, are discussed. The experimental results show that with increased auxiliary charge length, a pressure plateau or even a secondary peak pressure can be formed, which is less than the first peak pressure. The projectile velocity increased by 23.57%, 14.64%, and 7.65% when the diaphragm thickness was 0 mm, 1 mm, and 2 mm, respectively. The muzzle velocity of the projectile can be increased by 13.42% by increasing the length of the barrel. Under the same charge conditions, with an increase in barrel length, the energy utilization rate of the propellant increases by 28.64%.
Funding: funded by the China Postdoctoral Science Foundation (Grant No. 2022M721614) and the opening project of the State Key Laboratory of Explosion Science and Technology, Beijing Institute of Technology (Grant No. KFJJ23-07M).
Abstract: This paper proposes a type of double-layer shaped charge liner, fabricated using chemical vapor deposition (CVD), that has tungsten as its inner liner. The feasibility of this design was evaluated through penetration tests. Double-layer charge liners were fabricated by using CVD to deposit tungsten layers on the inner surfaces of pure T2 copper liners. The microstructures of the tungsten layers were analyzed using a scanning electron microscope (SEM). The feasibility analysis was carried out via pulsed X-rays, a slug-retrieval test, and static penetration tests. The shaped-charge jet formation and penetration behavior of the inner tungsten-coated double-layer liner were studied by numerical simulation. The results showed that the double-layer liners could form well-shaped jets. The errors between the X-ray test results and the numerical results were within 11.07%. The slug-retrieval test found that the retrieved slug was similar to the numerically simulated slug. Compared with the traditional pure copper shaped-charge jet, the penetration depth of the double-layer shaped charge liner increased by 11.4% and >10.8%, respectively. In summary, the test results are good, and the numerical simulation is in good agreement with the tests, which verifies the feasibility of using the CVD method to fabricate double-layer charge liners with a high-density, high-strength refractory metal as the inner liner.
Funding: Project supported by the National Key Research and Development Program of China (Grant No. 2017YFA0402400); the National Natural Science Foundation of China (Grant Nos. 11974358 and 11934004); the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB34020000); and the Heavy Ion Research Facility in Lanzhou (HIRFL).
Abstract: State-selective cross-section data are useful for understanding and modeling the X-ray emission in celestial observations. In the present work, using cold target recoil ion momentum spectroscopy, we investigated for the first time the state-selective single-electron capture processes for S^(q+)–He and S^(q+)–H_(2) (q = 11–15) collision systems at an impact energy of q×20 keV and obtained the relative state-selective cross-sections. The results indicate that only a few principal quantum states of the projectile are populated in a single-electron capture process. In particular, an increase in the projectile charge state leads to the population of states with higher principal quantum numbers. It is also shown that the experimental averaged n-shell populations are reproduced well by the over-barrier model. The database is openly available in Science Data Bank at 10.57760/sciencedb.j00113.00091.
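The over-barrier model invoked above has a well-known textbook form that predicts the most-probable capture shell n from the projectile charge and the target ionization potential. The sketch below implements that generic single-crossing estimate; the paper may use a different or more refined model variant.

```python
# Classical over-barrier estimate of the most-probable capture shell n
# for a projectile of charge q on a neutral target with ionization
# potential Ip (atomic units). Textbook form, not the paper's exact model.
import math

def overbarrier_n(q, Ip):
    Rc = (2 * math.sqrt(q) + 1) / Ip   # capture radius (a.u.)
    En = Ip + (q - 1) / Rc             # binding energy on the projectile
    return q / math.sqrt(2 * En)       # hydrogenic n with that binding

# He target: Ip = 0.9036 a.u. (24.59 eV)
for q in (11, 13, 15):
    print(q, round(overbarrier_n(q, 0.9036), 2))
```

Consistent with the abstract, the predicted n grows with the projectile charge state: higher q populates higher principal quantum numbers.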
Abstract: Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers, whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB) orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges like costs and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse resoundingly underscores the paramount importance of adept CSB policies in DC selection, highlighting the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
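To make the broker's job concrete, here is a toy weighted-score selection policy of the kind such surveys compare. The datacenter names, weights, and metrics are hypothetical; real CSB policies range from round-robin to multi-criteria optimization, and metrics would need normalization to a common scale before weighting.

```python
# Illustrative CSB policy: pick the datacenter minimizing a weighted
# latency/cost score. All names and numbers are hypothetical.

def select_dc(datacenters, w_latency=0.7, w_cost=0.3):
    """datacenters: list of (name, latency_ms, cost_per_hour) tuples."""
    def score(dc):
        _, latency, cost = dc
        return w_latency * latency + w_cost * cost
    return min(datacenters, key=score)[0]

dcs = [("dc-east", 40.0, 1.2), ("dc-west", 25.0, 2.0), ("dc-eu", 90.0, 0.8)]
print(select_dc(dcs))
```

Shifting the weights toward cost would flip the choice toward the cheapest datacenter, which is precisely the kind of QoS trade-off the surveyed policies formalize.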
Funding: supported in part by the National Natural Science Foundation of China (Grant No. 42105127); the Special Research Assistant Project of the Chinese Academy of Sciences; and the National Key Research and Development Plans of China (Grant Nos. 2019YFC1510304 and 2016YFE0201900-02).
Abstract: The cloud type product 2B-CLDCLASS-LIDAR, based on CloudSat and CALIPSO data from June 2006 to May 2017, is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated in low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those for ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
Funding: supported by the Air Force Office of Scientific Research Grant No. FA9550-17-1-0264; the DOE, Office of Science, Fusion Energy Sciences under Contract No. DE-SC0021125; and the U.S. Department of Energy Grant No. DE-SC0011617. D. A. Jarozynski, E. Brunetti, B. Ersfeld, and S. Yoffe would like to acknowledge support from the U.K. EPSRC (Grant Nos. EP/J018171/1 and EP/N028694/1), the European Union's Horizon 2020 research and innovation program under Grant Agreement No. 871124 (Laserlab-Europe) and EuPRAXIA (Grant No. 653782), and funding from the N8 research partnership and EPSRC (Grant No. EP/T022167/1).
Abstract: An intense laser pulse focused onto a plasma can excite nonlinear plasma waves. Under appropriate conditions, electrons from the background plasma are trapped in the plasma wave and accelerated to ultra-relativistic velocities. This scheme is called a laser wakefield accelerator. In this work, we present results from a laser wakefield acceleration experiment using a petawatt-class laser to excite the wakefields, as well as nanoparticles to assist the injection of electrons into the accelerating phase of the wakefields. We find that a 10-cm-long, nanoparticle-assisted laser wakefield accelerator can generate 340 pC, 10±1.86 GeV electron bunches with a 3.4 GeV rms convolved energy spread and a 0.9 mrad rms divergence. It can also produce bunches with lower energies in the 4–6 GeV range.
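The quoted beam parameters imply a few back-of-envelope numbers worth making explicit: the electron count in a 340 pC bunch, the total kinetic energy carried at 10 GeV per electron, and the relative size of the 3.4 GeV rms spread.

```python
# Back-of-envelope arithmetic on the quoted beam parameters.
charge_C = 340e-12            # 340 pC bunch charge
energy_eV = 10e9              # 10 GeV per electron
e = 1.602176634e-19           # elementary charge [C]

n_electrons = charge_C / e
bunch_energy_J = charge_C * energy_eV   # q [C] * V [eV/e] = energy [J]
rel_spread = 3.4 / 10.0                 # 3.4 GeV rms spread on 10 GeV

print(f"{n_electrons:.2e} electrons, {bunch_energy_J:.2f} J, "
      f"{rel_spread:.0%} relative spread")
```

So the bunch carries about 3.4 J of kinetic energy in roughly two billion electrons, with a ~34% relative energy spread.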
Funding: This research was funded by the National Natural Science Foundation of China (Grant No. 62162039) and the Shaanxi Provincial Key R&D Program, China (Grant No. 2020GY-041).
Abstract: An access control scheme is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of a single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys. A credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of policy hiding, the research proposes a fine-grained access control method based on users, user attributes, and file structure, optimizing the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proved to be secure. Comparative analysis and experimental results show that the proposed scheme has higher efficiency and more functionality, and can meet the needs of medical institutions.
Funding: This work is supported by the National Natural Science Foundation of China under Grant No. 62001341, the Natural Science Foundation of Jiangsu Province under Grant No. BK20221379, and the Jiangsu Engineering Research Center of Digital Twinning Technology for Key Equipment in Petrochemical Process under Grant No. DTEC202104.
Abstract: This paper focuses on the task of few-shot 3D point cloud semantic segmentation. Despite some progress, this task still encounters many issues due to the insufficient samples given, e.g., incomplete object segmentation and inaccurate semantic discrimination. To tackle these issues, we first leverage part-whole relationships in the task of 3D point cloud semantic segmentation to capture semantic integrity, empowered by dynamic capsule routing with a 3D Capsule Network (CapsNet) module in the embedding network. Concretely, the dynamic routing amalgamates geometric information of the 3D point cloud data to construct higher-level feature representations, which capture the relationships between object parts and their wholes. Secondly, we design a multi-prototype enhancement module to enhance prototype discriminability. Specifically, the single-prototype enhancement mechanism is expanded to a multi-prototype enhancement version for capturing rich semantics. Besides, the shot correlation within each category is calculated via the interaction of different samples to enhance intra-category similarity. Ablation studies prove that the involved part-whole relations and the proposed multi-prototype enhancement module help to achieve complete object segmentation and improve semantic discrimination. Moreover, with the integration of these two modules, quantitative and qualitative experiments on two public benchmarks, S3DIS and ScanNet, indicate the superior performance of the proposed framework on the task of 3D point cloud semantic segmentation compared to some state-of-the-art methods.
Abstract: As the extensive use of cloud computing raises questions about the security of any personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources on various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) while implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors, Hyper-V and KVM, in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), Advanced Encryption Standard (AES), Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), BLOWFISH, and TwoFish. The study's findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations with various file sizes. The findings emphasize how crucial it is to pick a hypervisor appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus more on how various hypervisors perform while handling cryptographic workloads.
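A comparison like the one above rests on a benchmarking harness: run a cryptographic workload over payloads of increasing size and record CPU time under each hypervisor. The sketch below shows the harness shape only; SHA-256 hashing stands in for the ciphers, since AES, Blowfish, etc. require a third-party library, and the measured times are machine-dependent.

```python
# Minimal CPU-time benchmark harness (SHA-256 as a stand-in workload).
import hashlib
import os
import time

def bench(workload, payload):
    """CPU seconds spent running workload on payload."""
    t0 = time.process_time()
    workload(payload)
    return time.process_time() - t0

def hash_workload(data):
    h = hashlib.sha256()
    h.update(data)
    return h.digest()

for size in (1 << 16, 1 << 20, 1 << 22):   # 64 KiB, 1 MiB, 4 MiB
    payload = os.urandom(size)
    print(f"{size:>8} bytes: {bench(hash_workload, payload):.4f} s CPU")
```

Running the same harness inside Hyper-V and KVM guests, with real cipher workloads substituted in, yields the kind of per-file-size CPU comparison the study reports.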
Abstract: …logical testing model and resource lifecycle information, generate test cases and complete parameters, and alleviate inconsistency issues through parameter inference. Further, we propose a method of analyzing test results using joint state codes and call stack information, which compensates for the shortcomings of traditional analysis methods. We apply our method to testing REST services, including OpenStack, an open-source cloud operating platform, for experimental evaluation. We found a series of inconsistencies, known vulnerabilities, and new, previously unknown logical defects.
Funding: supported by the National Natural Science Foundation of China (Nos. 12202011 and 12332014) and the China Postdoctoral Science Foundation (No. 2022M710190).
Abstract: Cavitation is a prevalent phenomenon in ship and ocean engineering, predominantly occurring in the tail flow fields of high-speed rotating propellers and on the surfaces of high-speed underwater vehicles. The re-entrant jet and the compression wave resulting from the collapse of cavity vapour are pivotal factors contributing to cavity instability. Concurrently, these phenomena significantly modulate the evolution of cavitation flow. In this paper, numerical investigations into cloud cavitation over a Clark-Y hydrofoil were conducted, utilizing the Large Eddy Simulation (LES) turbulence model and the Volume of Fluid (VOF) method within the OpenFOAM framework. A comparative analysis of results obtained at different angles of attack is undertaken. A discernible augmentation in cavity thickness is observed concomitant with escalation of the attack angle, alongside a progressive intensification in pressure at the leading edge of the hydrofoil, contributing to the suction force. These results can serve as a fundamental point of reference for a deeper comprehension of cloud cavitation dynamics.
Funding: funded by the Swedish Armed Forces under Contract No. AT.9220620.
Abstract: Reactive armour is a very efficient add-on armour against shaped charge threats. Explosive reactive armour consists of one or several plates that are accelerated by an explosive. A similar but less violent acceleration of plates can also be achieved in a completely inert reactive armour. To be efficient against elongated jets, the motion of the plates needs to be inclined relative to the jet such that a sliding contact between the jet and the plates is established. This sliding contact causes a deflection and thinning of the jet. Under certain circumstances, the contact becomes unstable, leading to severe disturbances on the jet. These disturbances drastically reduce the jet penetration performance, and it is therefore of interest to study the conditions that lead to an unstable contact. Previous studies on the interaction between shaped charge jets and flyer plates have shown that it is mainly the forward-moving plate in an explosive reactive armour that is effective in disturbing the jet. This is usually attributed to the higher plate-to-jet mass flux ratio involved in the collision with the forward-moving plate compared to the backward-moving plate. For slow-moving plates, as occur in inert reactive armour, the difference in mass flux between the forward- and backward-moving plates is much smaller, and it is therefore of interest to study whether factors other than the mass flux influence the protection capability. In this work, experiments have been performed in which a plate is accelerated along its length, interacting with a shaped charge jet that is fired at an oblique angle to the plate's normal, either against or along the plate's velocity. The arrangement corresponds to a jet interacting with a flyer plate from a reactive armour, with the exception that the collision velocity is the same for both types of obliquity in these experiments. The experiments show that the disturbances on the jet are different in the two cases even though the collision velocities are the same. Numerical simulations of the interaction support this observation. The difference is attributed to the character of the contact pressure in the interaction region. For a backward-moving plate, the maximum contact pressure is obtained at the beginning of the interaction zone, and the contact pressure is therefore higher upstream than downstream of the jet, while the opposite is true for a forward-moving plate. A negative interface pressure gradient with respect to the jet motion results in a more stable flow than a positive one, which means that the jet-plate contact is more stable for a backward-moving plate than for a forward-moving plate. A forward-moving plate is thus more effective in disturbing the jet than a backward-moving plate, not only because of the higher plate-to-jet mass flux ratio but also because of the character of the contact with the jet.
Funding: supported by the National Natural Science Foundation of China (No. 52204085) and the Interdisciplinary Research Project for Young Teachers of USTB, Fundamental Research Funds for the Central Universities (No. FRF-IDRY-21-006).
Abstract: To study the effects of the initiation position on the damage and fracture characteristics of linear-charge blasting, blasting model experiments were conducted in this study using computed tomography scanning and three-dimensional (3D) reconstruction methods. Fractal damage theory was used to quantify the crack distribution and damage degree of sandstone specimens after blasting. The results showed that, for both inverse and top initiation, due to compression deformation and sliding frictional resistance, the plugging medium of the borehole is effective: the energy of the explosive gas near the top of the borehole is consumed, which affects the effective crushing of rock near the top of the borehole, so the extent of damage to Sections Ⅰ and Ⅱ is less than that of Sections Ⅲ and Ⅳ. In addition, the analysis revealed that under top initiation, the reflected tensile damage of the rock at the free face at the top of the borehole, together with the compression deformation of the plug and friction, consumes more blasting energy, resulting in lower blasting energy efficiency for top initiation. As a result, the overall damage degree of the specimens in the top-initiation group was significantly smaller than that in the inverse-initiation group. Under inverse initiation, the blasting energy efficiency is greater, causing the specimen to experience greater damage. Therefore, in the engineering practice of rock tunnel cut blasting, to utilize blasting energy effectively and enhance rock fragmentation, the inverse-initiation method is recommended. In addition, in 3D rock blasting, the bottom of the borehole shows obvious end effects under inverse initiation, and the crack distribution at the bottom of the borehole is trumpet-shaped. The occurrence of an end effect in the 3D linear-charge blasting model experiment is related to the initiation position and the blocking condition.
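The fractal damage quantification mentioned above typically rests on a box-counting dimension of the crack pattern. The sketch below shows the generic box-counting estimator on a synthetic straight-line "crack" (whose dimension should come out near 1); the paper's specific fractal damage formulation may differ.

```python
# Box-counting estimate of a fractal dimension for a 2D point set in
# [0,1]^2: count occupied boxes N(s) at several box sizes s and fit
# the slope of log N(s) against log(1/s).
import math

def box_count_dimension(points, scales):
    xs, ys = [], []
    for s in scales:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope

line = [(i / 1000.0, i / 1000.0) for i in range(1000)]  # diagonal "crack"
d = box_count_dimension(line, scales=[1/4, 1/8, 1/16, 1/32])
print(round(d, 2))
```

Applied to CT-reconstructed crack maps, a higher fitted dimension corresponds to a denser, more space-filling crack network, i.e., a greater damage degree.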
Funding: We are grateful to the National Natural Science Foundation of China (Grant Nos. 22375056 and 52272163), the Key R&D Program of Hebei (Grant No. 216Z1201G), the Natural Science Foundation of Hebei Province (Grant Nos. E2022208066 and B2021208014), and the Key R&D Program of the Hebei Technological Innovation Center of Chiral Medicine (Grant No. ZXJJ20220105).
Abstract: Free-standing covalent organic framework (COF) nanofilms exhibit a remarkable ability to rapidly intercalate/de-intercalate Li^(+) in lithium-ion batteries, while simultaneously exposing abundant active sites in supercapacitors. The development of these nanofilms offers a promising solution to the persistent challenge of imbalanced charge storage kinetics between the battery-type anode and the capacitor-type cathode in lithium-ion capacitors (LICs). Herein, for the first time, custom-made COFBTMB-TP and COFTAPB-BPY nanofilms are synthesized as the anode and cathode, respectively, for an all-COF nanofilm-structured LIC. The COFBTMB-TP nanofilm, with strongly electronegative –CF3 groups, enables tuning of the partial electron cloud density for Li^(+) migration to ensure a rapid anode kinetic process. The thickness-regulated cathodic COFTAPB-BPY nanofilm can match the anodic COF nanofilm in capacity. Due to the aligned 1D channels, 2D aromatic skeleton, and accessible active sites of the COF nanofilms, the whole COFTAPB-BPY//COFBTMB-TP LIC demonstrates a high energy density of 318 mWh cm^(−3) at a high power density of 6 W cm^(−3), excellent rate capability, and good cycle stability with a capacity retention rate of 77% after 5000 cycles. The COFTAPB-BPY//COFBTMB-TP LIC represents a new benchmark for currently reported film-type LICs and even film-type supercapacitors. After comprehensive exploration via ex situ XPS, ^(7)Li solid-state NMR analyses, and DFT calculations, it is found that the COFBTMB-TP nanofilm facilitates the reversible conversion of semi-ionic to ionic C–F bonds during lithium storage. COFBTMB-TP exhibits a strong interaction with Li^(+) due to the C–F, C=O, and C–N bonds, facilitating Li^(+) desolvation and absorption from the electrolyte. This work addresses the challenge of imbalanced charge storage kinetics and capacity between the anode and cathode and also paves the way for future miniaturized and wearable LIC devices.
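A quick consistency check on the quoted figures: an energy density delivered at a given power density implies a full-discharge time of E/P. The arithmetic below uses only the two numbers stated in the abstract.

```python
# Implied full-discharge time at the quoted energy and power densities.
energy_mWh_per_cm3 = 318.0    # quoted volumetric energy density
power_W_per_cm3 = 6.0         # quoted volumetric power density

energy_J_per_cm3 = energy_mWh_per_cm3 * 3.6   # 1 mWh = 3.6 J
discharge_s = energy_J_per_cm3 / power_W_per_cm3

print(f"{energy_J_per_cm3:.1f} J cm^-3, ~{discharge_s:.0f} s discharge")
```

A roughly three-minute full discharge at that power density is consistent with the fast, capacitor-like kinetics the all-COF design targets.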
Funding: supported in part by the National Natural Science Foundation of China (Grant No. 61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003).
Abstract: Recently, there have been some attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering sampled points with similar features into the same class and computing self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. The source code of this paper is available at https://github.com/yahuiliu99/PointConT.
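The computational pattern behind the content-based idea — group points by feature similarity rather than spatial proximity, then attend only within each group — can be sketched without any deep-learning framework. This toy uses fixed centroids and identity Q=K=V projections, so it illustrates only the cluster-then-attend structure, not the learned PointConT model.

```python
# Toy sketch of content-based grouping + per-group self-attention.
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [v / s for v in es]

def attention(feats):
    """Single-head self-attention with identity Q=K=V over 1-D features."""
    out = []
    for q in feats:
        scores = softmax([sum(a * b for a, b in zip(q, k)) for k in feats])
        out.append([sum(w * v[i] for w, v in zip(scores, feats))
                    for i in range(len(q))])
    return out

def cluster_by_content(feats, centroids):
    """Assign each feature vector to its nearest centroid (by L2)."""
    groups = [[] for _ in centroids]
    for f in feats:
        j = min(range(len(centroids)),
                key=lambda c: sum((a - b) ** 2
                                  for a, b in zip(f, centroids[c])))
        groups[j].append(f)
    return groups

feats = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
groups = cluster_by_content(feats, centroids=[[1.0, 0.0], [0.0, 1.0]])
attended = [attention(g) for g in groups]   # attention within each cluster
print([len(g) for g in groups])
```

Because attention runs only inside each cluster, its cost scales with the cluster size rather than the full point count, which is the trade-off the abstract describes between long-range dependencies and complexity.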