This paper presents a new method of using a convolutional neural network (CNN) in machine learning to identify brand consistency through product appearance variation. In Experiment 1, we collected fifty mouse devices from the past thirty-five years from a renowned company to build a dataset consisting of product pictures with pre-defined design features of their appearance and functions. Results show that it is a challenge to distinguish periods for the subtle evolution of the mouse devices with such traditional methods as time series analysis and principal component analysis (PCA). In Experiment 2, we applied deep learning to predict the extent of product appearance variation among mouse devices of various brands. The investigation collected 6,042 images of mouse devices and divided them into the Early Stage and the Late Stage. Results show the highest accuracy of 81.4% with the CNN model, and the evaluation score of brand style consistency is 0.36, implying that the brand consistency score converted from the CNN accuracy rate is not always perfect in the real world. The relationship between product appearance variation, brand style consistency, and evaluation score is beneficial for predicting new product styles and future product style roadmaps. In addition, the CNN heat maps highlight the critical areas of design features of different styles, providing alternative clues related to the blurred boundary. The study provides insights into practical problems for designers, manufacturers, and marketers in product design. It not only contributes to the scientific understanding of design development, but also provides industry professionals with practical tools and methods to improve the design process and maintain brand consistency. Designers can use these techniques to find features that influence brand style, then capture these features as innovative design elements and maintain core brand values.
Accurate cropland information is critical for agricultural planning and production, especially in food-stressed countries like China. Although widely used medium-to-high-resolution satellite-based cropland maps have been developed from various remotely sensed data sources over the past few decades, considerable discrepancies exist among these products both in total area and in spatial distribution of croplands, impeding further applications of these datasets. The factors influencing their inconsistency are also unknown. In this study, we evaluated the consistency and accuracy of six cropland maps widely used in China in circa 2020, including three state-of-the-art 10-m products (i.e., Google Dynamic World, ESRI Land Cover, and ESA WorldCover) and three 30-m ones (i.e., GLC_FCS30, GlobeLand 30, and CLCD). We also investigated the effects of landscape fragmentation, climate, and agricultural management. Validation using a ground-truth sample revealed that the 10-m-resolution WorldCover provided the highest accuracy (92.3%). These maps collectively overestimated Chinese cropland area by up to 56%. Up to 37% of the land showed spatial inconsistency among the maps, concentrated mainly in mountainous regions and attributed to the varying accuracy of cropland maps, cropland fragmentation, and management practices such as irrigation. Our work sheds light on the promotion of future cropland mapping efforts, especially in highly inconsistent regions.
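The spatial-inconsistency figure (up to 37% of the land) can be made concrete with a toy computation. This is a hypothetical sketch, not the study's code; `inconsistency_fraction` and the tiny binary "maps" below are invented for illustration:

```python
import numpy as np

# A pixel is "inconsistent" when the maps do not all agree on
# cropland (1) vs. non-cropland (0) at that location.
def inconsistency_fraction(maps):
    """maps: list of equally shaped binary arrays (1 = cropland)."""
    stack = np.stack(maps)                     # shape (n_maps, H, W)
    agree = np.all(stack == stack[0], axis=0)  # True where all maps agree
    return 1.0 - agree.mean()

# Three toy 2x2 "maps" that disagree on one of four pixels.
m1 = np.array([[1, 0], [1, 1]])
m2 = np.array([[1, 0], [1, 0]])
m3 = np.array([[1, 0], [1, 1]])
print(inconsistency_fraction([m1, m2, m3]))  # → 0.25
```

In practice the same per-pixel agreement test would be run over co-registered rasters after resampling them to a common grid.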
Domain adaptation (DA) aims to find a subspace where the discrepancies between the source and target domains are reduced. Based on this subspace, the classifier trained by the labeled source samples can classify unlabeled target samples well. Existing approaches leverage graph embedding learning to explore such a subspace. Unfortunately, due to 1) the interaction of the consistency and specificity between samples, and 2) the joint impact of degenerated features and incorrect labels in the samples, the existing approaches might assign unsuitable similarity, which restricts their performance. In this paper, we propose an approach called adaptive graph embedding with consistency and specificity (AGE-CS) to cope with these issues. AGE-CS consists of two methods, i.e., graph embedding with consistency and specificity (GECS) and adaptive graph embedding (AGE). GECS jointly learns the similarity of samples under the geometric distance and semantic similarity metrics, while AGE adaptively adjusts the relative importance between the geometric distance and semantic similarity during the iterations. By AGE-CS, the neighborhood samples with the same label are rewarded, while the neighborhood samples with different labels are punished. As a result, compact structures are preserved, and advanced performance is achieved. Extensive experiments on five benchmark datasets demonstrate that the proposed method performs better than other graph embedding methods.
As one of the major threats to the current DeFi (Decentralized Finance) ecosystem, a reentrant attack induces data inconsistency in the victim smart contract, enabling attackers to steal on-chain assets from DeFi projects, which can severely damage the confidence of blockchain investors. However, protecting DeFi projects from reentrant attacks is very difficult, since generating a call loop within the highly automated DeFi ecosystem is highly practicable. Existing research mainly focuses on the detection of reentrant vulnerabilities during code testing, and no method can guarantee the absence of reentrant vulnerabilities. In this paper, we introduce a database lock mechanism to isolate the correlated smart contract states from other operations in the same contract, so that we can prevent attackers from abusing an inconsistent smart contract state. Compared to the existing resolutions of front-running, code audit, and modifier, our method guarantees protection results with better flexibility. We further evaluate our method on a number of de facto reentrant attacks observed from Etherscan. The results prove that our method can efficiently prevent reentrant attacks with low running cost.
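The lock idea can be caricatured in a few lines of Python — an illustrative stand-in only, since the paper targets smart contracts rather than Python, and `non_reentrant` and `withdraw` are invented names:

```python
import functools

# A guard refuses re-entry into a state-mutating function while a prior
# call is still in progress, so an external call made before the state
# update (the classic reentrancy window) cannot abuse the stale state.
def non_reentrant(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if wrapper._locked:
            raise RuntimeError("reentrant call blocked")
        wrapper._locked = True
        try:
            return func(*args, **kwargs)
        finally:
            wrapper._locked = False
    wrapper._locked = False
    return wrapper

balance = {"victim": 100}

@non_reentrant
def withdraw(amount, external_call=None):
    if balance["victim"] >= amount:
        if external_call:          # external call happens BEFORE the
            external_call()        # balance update -- the attack window
        balance["victim"] -= amount

def attack():                      # tries to re-enter withdraw()
    try:
        withdraw(60)
    except RuntimeError:
        pass                       # the reentrant attempt is blocked

withdraw(60, external_call=attack)
print(balance["victim"])  # → 40: only the outer withdrawal took effect
```

Without the guard, the inner `withdraw(60)` would see the stale balance of 100 and both calls would succeed, draining 120 from an account holding 100.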
The study explores the asymptotic consistency of the James-Stein shrinkage estimator obtained by shrinking a maximum likelihood estimator. We use Hansen’s approach to show that the James-Stein shrinkage estimator converges asymptotically to some multivariate normal distribution with shrinkage effect values. We establish that the rate of convergence is of order O(n^(-1/2)), hence the James-Stein shrinkage estimator is √n-consistent. We then visualise its consistency by studying the asymptotic behaviour using simulation plots in R for the mean squared error of the maximum likelihood estimator and the shrinkage estimator. The latter graphically shows a lower mean squared error compared to that of the maximum likelihood estimator.
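A minimal NumPy sketch of the simulation comparison (the abstract's plots are in R; the dimension, seed, and true mean below are arbitrary assumptions, not the study's settings):

```python
import numpy as np

# Compare the MSE of the MLE (here a single normal draw per trial) with
# the positive-part James-Stein shrinkage estimator for a p-dim mean.
rng = np.random.default_rng(0)
p, sigma2, n_trials = 10, 1.0, 20000
theta = np.ones(p)                                  # true mean vector

x = rng.normal(theta, np.sqrt(sigma2), size=(n_trials, p))  # MLE draws
norm2 = np.sum(x**2, axis=1, keepdims=True)
shrink = np.maximum(0.0, 1.0 - (p - 2) * sigma2 / norm2)    # JS factor
js = shrink * x                                     # shrink toward zero

mse_mle = np.mean(np.sum((x - theta) ** 2, axis=1))
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))
print(mse_mle > mse_js)  # → True: shrinkage dominates the MLE for p >= 3
```

The MLE's risk here is approximately p = 10, while the shrinkage estimator achieves a strictly smaller risk, mirroring the graphical comparison described above.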
“The Fundamental Rights and Obligations of Citizens”, the title of Chapter II of the current Constitution of the PRC, and the stipulation that citizens must fulfill certain obligations while enjoying rights have triggered many debates. Considering the historical origin, constitutional philosophy, and the text and structure of the Constitution, the special provisions of the current Constitution are influenced by the principle of consistency of rights and obligations. The principle of consistency of rights and obligations in the Constitution has a complex connotation. Although the principle effectively connects the public and private spheres, it ignores the diversity and differences of the interests and elements contained in the Constitution, the asymmetry of the normative status of fundamental rights and fundamental obligations, and the right of citizens to self-determination of personal interests. The principle of consistency of rights and obligations should be purposefully narrowed and concretized: in the context of public-private integration and risk-society prevention, the principle can be used as a supplement to the functional system of the Constitution; in the field of fundamental political obligations, the principle should be in line with the requirements of the state to respect and protect human rights; in the field of fundamental social obligations, the exercise of fundamental rights by individuals is protected by the Constitution as long as they comply with the law and do not infringe upon the interests of the social community. The principle of the consistency of rights and obligations is only used as a negative constituent of the determination of rights and the basis for the third-party effect of fundamental rights.
The classical propositional calculus (often also called “zero-order logic”) is the most fundamental two-valued logical system. It is necessary for constructing the classical calculus of quantifiers (often also called the “classical calculus of predicates” or “first-order logic”), which in turn is necessary for constructing the classical functional calculus. This last one is used for the formalization of the Arithmetic System. At the beginning of this paper, we introduce notation and recall certain well-known notions (among others, the notions of the operation of consequence, a system, consistency in the traditional sense, and consistency in the absolute sense) and certain well-known theorems. Next, we establish that classical propositional calculus is an inconsistent theory.
Image segmentation is a key and fundamental problem in image processing, computer graphics, and computer vision. Level-set-based methods for image segmentation are widely used for their topology flexibility and proper mathematical formulation. However, the poor performance of existing level set models on noisy images and weak boundaries limits their application in image segmentation. In this paper, we present a region consistency constraint term to measure the regional consistency on both sides of the boundary; this term defines the boundary of the image within a range and hence increases the stability of the level set model. The term allows existing level set models to significantly improve the efficiency of their algorithms when segmenting images with noise and weak boundaries. Furthermore, this constraint term enables edge-based level set models to overcome their sensitivity to the initial contour. The experimental results show that our algorithm is efficient for image segmentation and outperforms existing state-of-the-art methods on images with noise and weak boundaries.
Reliability enhancement testing (RET) is an accelerated testing approach that hastens the performance degradation process to surface inherent defects of design and manufacture. An important hypothesis is that the degradation mechanism in RET is the same as the one under normal stress conditions. In order to check the consistency of the two mechanisms, we conduct two enhancement tests with a missile servo system as the object of study, and preprocess the two sets of test data to establish accelerated degradation models with respect to the temperature change rate, which is assumed to be the main applied stress on the servo system during natural storage. Based on the accelerated degradation models and the natural storage profile of the servo system, we provide and demonstrate a procedure to check the consistency of the two mechanisms by checking the correlation and difference of the two sets of degradation data. The results indicate that the two degradation mechanisms are significantly consistent with each other.
Given the global lack of effective analysis methods for the impact of design parameter tolerance on performance deviation in the vehicle proof-of-concept stage, it is difficult to decompose performance tolerance into design parameter tolerance. This study proposes a set of consistency analysis methods for vehicle steering performance. The process of consistency analysis and control of automotive performance in the conceptual design phase is proposed for the first time. A vehicle dynamics model is constructed, and the multi-objective optimization software Isight is used to optimize the steering performance of the car. Sensitivity analysis is used to optimize the design performance value. The tolerance interval of the performance is obtained by comparing the original car performance value with the optimized value. With the help of layer-by-layer decomposition theory and interval mathematics, automotive performance tolerance is decomposed into design parameter tolerance. Through simulation and real vehicle experiments, the validity of the consistency analysis and control method presented in this paper is verified. The decomposition from performance tolerance to design parameter tolerance can thus be achieved at the conceptual design stage.
This paper introduces uncertainty theory to deal with non-deterministic factors in ranking alternatives. The uncertain variable method (UVM) and the definition of consistency for uncertainty comparison matrices are proposed. A simple yet pragmatic approach for testing whether or not an uncertainty comparison matrix is consistent is put forward. In cases where an uncertainty comparison matrix is inconsistent, an algorithm is used to generate a consistent matrix, from which the uncertainty weights can then be derived. The final ranking is given by the uncertainty weights if they are acceptable; otherwise we rely on the ranks of the expected values of the uncertainty weights instead. Three numerical examples, including a hierarchical (AHP) decision problem, are examined to illustrate the validity and practicality of the proposed methods.
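As a crisp analogue of such a consistency test, Saaty's classical consistency ratio for an ordinary pairwise comparison matrix can be computed as follows. This is illustrative only; the paper's test for uncertainty comparison matrices is a different construction:

```python
import numpy as np

# Saaty's consistency ratio: a matrix is acceptably consistent when
# CR = CI / RI < 0.1, where CI = (lambda_max - n) / (n - 1).
def consistency_ratio(A):
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)   # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    return ci / ri

# A perfectly consistent 3x3 matrix (a_ij = w_i / w_j) has CR ~ 0.
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])
print(consistency_ratio(A) < 0.1)  # → True: below the 0.1 threshold
```

An inconsistent matrix (e.g., one with a preference cycle) would yield a CR well above 0.1, triggering the kind of repair step the abstract describes.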
The purpose of this study is to investigate and quantify some possible sources of dispersion of 120 mm APFSDS tank ammunition both experimentally and numerically. This paper aims to point out the most influential source during the In-Bore Balloting Motion phase as well as the External Ballistics phase of the ammunition and quantifies its effect on dispersion. Data obtained from flight trials are critically analysed, and parameters affecting dispersion such as initial yaw/pitch rates, yaw/pitch damping, plane start angle, launch spin, clearance, centre of gravity shift, dynamic imbalance angle, cross wind, etc. are observed and, later on, studied in detail by extensive External Ballistics Monte Carlo (EBMC) simulation and Six Degree of Freedom (6-DOF) trajectory analysis. In-Bore Balloting Motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw. A correlation between first maximum yaw and residual spin is observed. Results of the data analysis are used in design modification of the existing ammunition. A number of designs are evaluated numerically before freezing five designs for further soundings. These designs are critically assessed in terms of their comparative performance during the in-bore travel and external ballistics phases. Results are validated by free flight trials for the finalised design.
Identifying inter-frame forgery is a hot topic in video forensics. In this paper, we propose a method based on the assumption that the correlation coefficients of gray values are consistent in an original video, while in forgeries this consistency is destroyed. We first extract the consistency of correlation coefficients of gray values (CCCoGV for short) after normalization and quantization as a distinguishing feature to identify inter-frame forgeries. Then we test the CCCoGV on a large database with the help of SVM (Support Vector Machine). Experimental results show that the proposed method is efficient in classifying original videos and forgeries. Furthermore, the proposed method also performs well in classifying frame insertion and frame deletion forgeries.
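The underlying premise — gray-value correlations between consecutive frames vary smoothly in an original video and break abruptly at a splice — can be sketched with hypothetical toy data (this is not the paper's CCCoGV pipeline, and the frame sizes and noise levels are invented):

```python
import numpy as np

def frame_correlations(frames):
    """frames: list of 2-D gray-value arrays; returns the correlation
    coefficient between each pair of consecutive frames."""
    return [np.corrcoef(a.ravel(), b.ravel())[0, 1]
            for a, b in zip(frames, frames[1:])]

rng = np.random.default_rng(1)
base = rng.random((8, 8))
# Four smoothly evolving frames, then a foreign (inserted) frame.
frames = [base + 0.01 * i * rng.random((8, 8)) for i in range(4)]
frames.append(rng.random((8, 8)))            # unrelated inserted frame
corrs = frame_correlations(frames)
print(corrs[-1] < min(corrs[:-1]))  # → True: the splice breaks consistency
```

A detector would flag the abrupt drop in the correlation sequence as a candidate insertion or deletion point before passing the feature to a classifier.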
Consistency degree calculation is established on the basis of a known correspondence, but in real life the correspondence is generally unknown, so how to calculate the consistency of two models under an unknown correspondence becomes a problem. For this condition, we must account for the influence of different possible correspondences. In this paper we obtain the relations of transitions based on event relations using branching processes, and build a behavioral matrix of relations. Based on permutations of the behavioral matrix, we express different correspondences, and define a new formula to compute the maximal consistency degree of two workflow nets. Additionally, this paper uses an example to illustrate these definitions and computations as well as their advantages.
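A brute-force toy version of maximizing consistency over unknown correspondences can be sketched as follows, under the assumption that each model is given as a small relation matrix; this is not the branching-process construction, and `maximal_consistency` is an invented helper:

```python
import itertools
import numpy as np

# Search over all correspondences (simultaneous row/column permutations
# of one behavioral matrix) for the one that best matches the other,
# and report the best fraction of agreeing relation entries.
def maximal_consistency(A, B):
    n = A.shape[0]
    best = 0.0
    for perm in itertools.permutations(range(n)):
        P = B[np.ix_(perm, perm)]            # B under this correspondence
        best = max(best, np.mean(A == P))    # fraction of matching entries
    return best

A = np.array([[0, 1, 1],
              [0, 0, 1],
              [0, 0, 0]])
B = np.array([[0, 0, 1],
              [1, 0, 1],
              [0, 0, 0]])   # A with elements 0 and 1 relabeled
print(maximal_consistency(A, B))  # → 1.0
```

Exhaustive permutation search is factorial in the number of transitions, which is exactly why a closed-form maximal consistency degree, as proposed above, is preferable for larger nets.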
Intuitionistic fuzzy preference relation (IFPR) is a suitable technique for decision makers (DMs) to express fuzzy preference information. This paper aims to provide a group decision making method where DMs use IFPRs to indicate their preferences with uncertain weights. To begin with, a model to derive weight vectors of alternatives from IFPRs based on multiplicative consistency is presented. Specifically, for any IFPR, by minimizing its absolute deviation from the corresponding consistent IFPR, the weight vectors are generated. Secondly, a method to determine the relative weights of DMs depending on preference information is developed. After that, we prioritize the alternatives based on the obtained weights, considering the risk preference of DMs. Finally, this approach is applied to the problem of technical risk assessment of armored equipment to illustrate the applicability and superiority of the proposed method.
In this note we consider some basic, yet unusual, issues pertaining to the accuracy and stability of numerical integration methods to follow the solution of first order and second order initial value problems (IVP). Included are remarks on multiple solutions, multi-step methods, effect of initial value perturbations, as well as slowing and advancing the computed motion in second order problems.
With the rapid development of Unmanned Aerial Vehicle (UAV) technology, change detection methods based on UAV images have been extensively studied. However, the imaging of UAV sensors is susceptible to environmental interference, which leads to large differences in the appearance of the same object across UAV images. Overcoming these discrepancies between UAV images is crucial to improving the accuracy of change detection. To address this issue, a novel unsupervised change detection method based on structural consistency and the Generalized Fuzzy Local Information C-means Clustering Model (GFLICM) was proposed in this study. Within this method, the establishment of a graph-based structural consistency measure allowed for the detection of change information by comparing structure similarity between UAV images. The local variation coefficient was introduced and a new fuzzy factor was reconstructed, after which the GFLICM algorithm was used to analyze difference images. Finally, change detection results were analyzed qualitatively and quantitatively. To measure the feasibility and robustness of the proposed method, experiments were conducted using two data sets from the cities of Yangzhou and Nanjing. The experimental results show that the proposed method can improve the overall accuracy of change detection and reduce the false alarm rate when compared with other state-of-the-art change detection methods.
Inclusion of dissipation and memory mechanisms, non-classical elasticity and thermal effects in the currently used plate/shell mathematical models require that we establish if these mathematical models can be derived ...Inclusion of dissipation and memory mechanisms, non-classical elasticity and thermal effects in the currently used plate/shell mathematical models require that we establish if these mathematical models can be derived using the conservation and balance laws of continuum mechanics in conjunction with the corresponding kinematic assumptions. This is referred to as thermodynamic consistency of the mathematical models. Thermodynamic consistency ensures thermodynamic equilibrium during the evolution of the deformation. When the mathematical models are thermodynamically consistent, the second law of thermodynamics facilitates consistent derivations of constitutive theories in the presence of dissipation and memory mechanisms. This is the main motivation for the work presented in this paper. In the currently used mathematical models for plates/shells based on the assumed kinematic relations, energy functional is constructed over the volume consisting of kinetic energy, strain energy and the potential energy of the loads. The Euler’s equations derived from the first variation of the energy functional for arbitrary length when set to zero yield the mathematical model(s) for the deforming plates/shells. Alternatively, principle of virtual work can also be used to derive the same mathematical model(s). For linear elastic reversible deformation physics with small deformation and small strain, these two approaches, based on energy functional and the principle of virtual work, yield the same mathematical models. These mathematical models hold for reversible mechanical deformation. 
In this paper, we examine whether the currently used plate/shell mathematical models with the corresponding kinematic assumptions can be derived using the conservation and balance laws of classical or non-classical continuum mechanics. The mathematical models based on Kirchhoff hypothesis (classical plate theory, CPT) and first order shear deformation theory (FSDT) that are representative of most mathematical models for plates/shells are investigated in this paper for their thermodynamic consistency. This is followed by the details of a general and higher order thermodynamically consistent plate/shell thermoelastic mathematical model that is free of a priori consideration of kinematic assumptions and remains valid for very thin as well as thick plates/shells with comprehensive nonlinear constitutive theories based on integrity. Model problem studies are presented for small deformation behavior of linear elastic plates in the absence of thermal effects and the results are compared with CPT and FSDT mathematical models.
A procedure to evaluate the quality consistency of generic drugs based on the impurity profile and similarity analysis methods is presented in this paper. Nifedipine extended-release tablets from six generic factories in China were used to evaluate uniformity with the original drug. The procedure includes: choice of chromatographic methods, data collection and conformity testing, evaluation of the intra-batch similarity of drugs, evaluation of generic drugs against the original drug, and weighted similarity evaluation of generic drugs. The data were collected via high-performance liquid chromatography (HPLC), and then analysed by correlation coefficient, cosine, principal component analysis (PCA), and hierarchical clustering analysis (HCA). It is more suitable to use peak areas as the vector when calculating the similarity of impurity profiles. After weighting the peak areas of the unspecified impurities in a further evaluation of generic quality, the generic level of different factories was differentiated and the best generic factory was identified.
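The cosine measure used there can be illustrated with hypothetical peak-area vectors (the numbers below are invented, not the study's HPLC data):

```python
import numpy as np

# Cosine similarity of two impurity profiles represented as vectors of
# chromatographic peak areas: 1.0 means identical relative composition.
def cosine_similarity(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

original = np.array([12.0, 3.5, 0.8, 0.2])   # peak areas, original drug
generic_a = np.array([11.5, 3.6, 0.9, 0.2])  # close impurity profile
generic_b = np.array([2.0, 9.0, 5.0, 4.0])   # divergent impurity profile

sim_a = cosine_similarity(original, generic_a)
sim_b = cosine_similarity(original, generic_b)
print(sim_a > sim_b)  # → True: batch A matches the original more closely
```

Because cosine similarity compares direction rather than magnitude, it captures the relative impurity pattern; weighting individual peaks, as described above, then sharpens the discrimination between factories.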
In this article, we focus on the semi-parametric error-in-variables model with missing responses, where the yi are response variables missing at random, observed at fixed design points, the ζi are potential variables observed with measurement errors μi, and the unknown slope parameter β and nonparametric component g(·) need to be estimated. We choose two different approaches to estimate β and g(·). Under appropriate conditions, we study the strong consistency of the proposed estimators.
Funding: Supported in part by a grant, PHA1110214, from MOE, Taiwan.
Funding: This work was supported by the National Natural Science Foundation of China (72221002, 42271375), the Strategic Priority Research Program (XDA28060100), and the Informatization Plan Project (CAS-WX2021PY-0109) of the Chinese Academy of Sciences.
Funding: Supported in part by the Key-Area Research and Development Program of Guangdong Province (2020B010166006), the National Natural Science Foundation of China (61972102), the Guangzhou Science and Technology Plan Project (023A04J1729), and the Science and Technology Development Fund (FDCT), Macao SAR (015/2020/AMJ).
Funding: Supported by the National Key Research and Development Plan (Grant No. 2018YFB1800701), the Key-Area Research and Development Program of Guangdong Province (2020B0101090003), the CCF-NSFOCUS Kunpeng Scientific Research Fund (CCF-NSFOCUS 2021010), the National Natural Science Foundation of China (Grant Nos. 61902083, 62172115, 61976064), the Guangdong Higher Education Innovation Group (2020KCXTD007), the Guangzhou Higher Education Innovation Group (No. 202032854), and the Guangzhou Fundamental Research Plan of "Municipal-School" Jointly Funded Projects (No. 202102010445).
Abstract: As one of the major threats to the current DeFi (Decentralized Finance) ecosystem, the reentrant attack induces data inconsistency in the victim smart contract, enabling attackers to steal on-chain assets from DeFi projects, which can seriously harm the confidence of blockchain investors. However, protecting DeFi projects from reentrant attacks is very difficult, since generating a call loop within the highly automated DeFi ecosystem is quite practicable. Existing research mainly focuses on detecting reentrant vulnerabilities during code testing, and no method can guarantee the absence of reentrant vulnerabilities. In this paper, we introduce the database lock mechanism to isolate the correlated smart contract states from other operations in the same contract, so that attackers are prevented from abusing inconsistent smart contract state. Compared to the existing resolutions of front-running, code audit, and modifier, our method guarantees protection results with better flexibility. We further evaluate our method on a number of de facto reentrant attacks observed on Etherscan. The results prove that our method can efficiently prevent reentrant attacks at a lower running cost.
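The lock idea described in this abstract can be illustrated with a minimal, hypothetical Python sketch (not the paper's on-chain implementation): a guard flag plays the role of the database lock, and the vulnerable ordering (external call before the state update) is exactly what the lock protects against. All names here (`Vault`, `drain`) are invented for illustration.

```python
class Vault:
    """Toy model of a contract balance guarded by a state lock."""

    def __init__(self, balance, guarded=True):
        self.balance = balance
        self.guarded = guarded
        self._locked = False  # stands in for the database-style lock

    def withdraw(self, amount, on_transfer):
        if self.guarded and self._locked:
            raise RuntimeError("reentrant call blocked by state lock")
        self._locked = True
        try:
            if amount <= self.balance:
                on_transfer()            # external call happens BEFORE
                self.balance -= amount   # the state update (vulnerable order)
        finally:
            self._locked = False

def drain(vault, amount, times=2):
    """Attacker callback: re-enter withdraw while the balance
    has not yet been decremented."""
    state = {"n": 0}
    def attack():
        if state["n"] < times - 1:
            state["n"] += 1
            try:
                vault.withdraw(amount, attack)
            except RuntimeError:
                pass  # the lock stopped the nested call
    vault.withdraw(amount, attack)
```

Without the guard, re-entering `withdraw` passes the balance check twice and the attacker withdraws more than the vault holds; with the guard, the nested call is rejected and only the legitimate withdrawal goes through.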
Abstract: This study explores the asymptotic consistency of the James-Stein shrinkage estimator obtained by shrinking a maximum likelihood estimator. We use Hansen's approach to show that the James-Stein shrinkage estimator converges asymptotically to a multivariate normal distribution with shrinkage-effect values. We establish that the rate of convergence is of order n^(-1/2), hence the James-Stein shrinkage estimator is √n-consistent. We then visualise this consistency by studying the asymptotic behaviour, using simulation plots in R of the mean squared error of the maximum likelihood estimator and the shrinkage estimator. The latter graphically shows a lower mean squared error than that of the maximum likelihood estimator.
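The MSE comparison this abstract visualises in R can be reproduced with a short simulation of the classical positive-part James-Stein estimator (a standard textbook form, not the paper's exact R code); the parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein estimator: shrink a single observation
    x ~ N(theta, sigma2 * I) toward the origin; requires dimension p >= 3."""
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.dot(x, x))
    return shrink * x

p, n_trials = 10, 2000
theta = np.full(p, 0.5)          # true mean (illustrative choice)
mse_mle, mse_js = 0.0, 0.0
for _ in range(n_trials):
    x = rng.normal(theta, 1.0)   # the MLE of theta is x itself
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((james_stein(x) - theta) ** 2)
mse_mle /= n_trials              # approx. p for the MLE
mse_js /= n_trials               # strictly smaller for p >= 3
```

Across trials the shrinkage estimator's total squared error comes out below the MLE's, mirroring the dominance result the abstract describes.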
Abstract: "The Fundamental Rights and Obligations of Citizens", the title of Chapter II of the current Constitution of the PRC, and the stipulation that citizens must fulfill certain obligations while enjoying rights have triggered many debates. Considering the historical origin, constitutional philosophy, and the text and structure of the Constitution, the special provisions of the current Constitution are influenced by the principle of consistency of rights and obligations. This principle has a complex connotation in the Constitution. Although it effectively connects the public and private spheres, it ignores the diversity and differences of the interests and elements contained in the Constitution, the asymmetry of the normative status of fundamental rights and fundamental obligations, and the right of citizens to self-determination of personal interests. The principle of consistency of rights and obligations should therefore be purposefully narrowed and concretized: in the context of public-private integration and risk-society prevention, it can serve as a supplement to the functional system of the Constitution; in the field of fundamental political obligations, it should be in line with the requirement that the state respect and protect human rights; in the field of fundamental social obligations, the exercise of fundamental rights by individuals is protected by the Constitution as long as they comply with the law and do not infringe upon the interests of the social community. The principle of consistency of rights and obligations is only to be used as a negative constituent in the determination of rights and as the basis for the effect of fundamental rights against third parties.
Abstract: The classical propositional calculus (often also called "zero-order logic") is the most fundamental two-valued logical system. It is needed to construct the classical calculus of quantifiers (often also called the "classical calculus of predicates" or "first-order logic"), which in turn is needed to construct the classical functional calculus; this last one is used for the formalization of the Arithmetic System. At the beginning of this paper, we introduce notation and recall certain well-known notions (among others, the notions of the operation of consequence, a system, consistency in the traditional sense, and consistency in the absolute sense) and certain well-known theorems. Next, we establish that the classical propositional calculus is an inconsistent theory.
Funding: Supported in part by the NSFC-Zhejiang Joint Fund of the Integration of Informatization and Industrialization (U1609218), NSFC (61772312, 61373078, 61772253), the Key Research and Development Project of Shandong Province (2017GGX10110), and the NSF of Shandong Province (ZR2016FM21, ZR2016FM13).
Abstract: Image segmentation is a key and fundamental problem in image processing, computer graphics, and computer vision. Level-set-based methods are widely used for image segmentation because of their topological flexibility and proper mathematical formulation. However, the poor performance of existing level set models on noisy images and weak boundaries limits their application in image segmentation. In this paper, we present a region consistency constraint term that measures the regional consistency on both sides of the boundary. This term confines the boundary of the image within a range and hence increases the stability of the level set model. The term enables existing level set models to significantly improve their efficiency in segmenting images with noise and weak boundaries. Furthermore, this constraint term allows edge-based level set models to overcome their sensitivity to the initial contour. Experimental results show that our algorithm is efficient for image segmentation and outperforms existing state-of-the-art methods on images with noise and weak boundaries.
Funding: Supported by the Natural Science Foundation of Hunan Province (2018JJ2282).
Abstract: Reliability enhancement testing (RET) is an accelerated testing technique that hastens the performance degradation process to surface inherent defects of design and manufacture. An important hypothesis is that the degradation mechanism under RET is the same as that under normal stress conditions. To check the consistency of the two mechanisms, we conducted two enhancement tests with a missile servo system as the object of study, and preprocessed the two sets of test data to establish accelerated degradation models with respect to the temperature change rate, which is assumed to be the main applied stress on the servo system during natural storage. Based on the accelerated degradation models and the natural storage profile of the servo system, we provide and demonstrate a procedure to check the consistency of the two mechanisms by examining the correlation and difference of the two sets of degradation data. The results indicate that the two degradation mechanisms are significantly consistent with each other.
Abstract: Given the general lack of effective methods for analyzing the impact of design parameter tolerance on performance deviation at the vehicle proof-of-concept stage, it is difficult to decompose performance tolerance into design parameter tolerance. This study proposes a set of consistency analysis methods for vehicle steering performance. A process for the consistency analysis and control of automotive performance in the conceptual design phase is proposed for the first time. A vehicle dynamics model is constructed, and the multi-objective optimization software Isight is used to optimize the steering performance of the car. Sensitivity analysis is used to optimize the design performance values. The tolerance interval of the performance is obtained by comparing the original car performance values with the optimized values. With the help of layer-by-layer decomposition theory and interval mathematics, automotive performance tolerance is decomposed into design parameter tolerance. The validity of the consistency analysis and control method presented in this paper is verified through simulation and real-vehicle experiments. The decomposition from performance tolerance to parameter tolerance can thus be achieved at the conceptual design stage.
Abstract: This paper introduces uncertainty theory to deal with non-deterministic factors in ranking alternatives. The uncertain variable method (UVM) and a definition of consistency for uncertainty comparison matrices are proposed. A simple yet pragmatic approach for testing whether an uncertainty comparison matrix is consistent is put forward. In cases where an uncertainty comparison matrix is inconsistent, an algorithm is used to generate a consistent matrix, from which the uncertainty weights can be derived. The final ranking is given by the uncertainty weights if they are acceptable; otherwise we rely on the ranks of the expected values of the uncertainty weights instead. Three numerical examples, including a hierarchical (AHP) decision problem, are examined to illustrate the validity and practicality of the proposed methods.
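This paper's test is for uncertainty comparison matrices; as a deterministic point of reference, the classical Saaty consistency ratio for an ordinary pairwise comparison matrix can be sketched as follows. The function and thresholds below are the standard AHP ones, not the paper's uncertain-variable method.

```python
import numpy as np

def consistency_ratio(A):
    """Saaty consistency ratio for a reciprocal pairwise comparison
    matrix A (A[j, i] == 1 / A[i, j]); CR < 0.1 is the usual
    acceptance threshold in AHP."""
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)  # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # random index
    return ci / ri

# A perfectly consistent matrix built from weights w: A[i, j] = w[i] / w[j].
w = np.array([0.5, 0.3, 0.2])
A = np.outer(w, 1.0 / w)

# Perturb one pair of judgments to introduce inconsistency.
B = A.copy()
B[0, 1], B[1, 0] = 3.0, 1.0 / 3.0
```

The consistent matrix `A` yields a ratio of essentially zero, while the perturbed `B` yields a positive ratio, which is the kind of signal a repair algorithm (like the one described in the abstract) would act on.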
Abstract: The purpose of this study is to investigate and quantify possible sources of dispersion of 120 mm APFSDS tank ammunition, both experimentally and numerically. The paper aims to identify the most influential source during the in-bore balloting motion phase as well as the external ballistics phase of the ammunition, and quantifies its effect on dispersion. Data obtained from flight trials are critically analysed, and parameters affecting dispersion, such as initial yaw/pitch rates, yaw/pitch damping, plane start angle, launch spin, clearance, centre-of-gravity shift, dynamic imbalance angle, and cross wind, are observed and later studied in detail through extensive External Ballistics Monte Carlo (EBMC) simulation and six-degree-of-freedom (6-DOF) trajectory analysis. In-bore balloting motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw; a correlation between first maximum yaw and residual spin is observed. The results of the data analysis are used in design modifications for the existing ammunition. A number of designs were evaluated numerically before freezing five designs for further soundings. These designs are critically assessed in terms of their comparative performance during in-bore travel and the external ballistics phase. The results are validated by free-flight trials for the finalised design.
Abstract: Identifying inter-frame forgery is a hot topic in video forensics. In this paper, we propose a method based on the assumption that the correlation coefficients of gray values are consistent in an original video, while in forgeries this consistency is destroyed. We first extract the consistency of correlation coefficients of gray values (CCCoGV for short), after normalization and quantization, as a distinguishing feature to identify inter-frame forgeries. We then test the CCCoGV feature on a large database with the help of a support vector machine (SVM). Experimental results show that the proposed method is efficient in classifying original videos and forgeries. Furthermore, the proposed method also performs well in classifying frame insertion and frame deletion forgeries.
Funding: Supported in part by the National Key R&D Program of China (2017YFB1001804), the Shanghai Science and Technology Innovation Action Plan Project (16511100900), and the National Natural Science Foundation of China (61572360).
Abstract: Consistency degree calculation is established on the basis of a known correspondence, but in practice the correspondence is generally unknown, so calculating the consistency of two models under an unknown correspondence becomes a problem. In this case, the unknown correspondence must be analyzed, since different correspondences influence the result. In this paper we obtain the relations of transitions from event relations using branching processes, and build a behavioral matrix of these relations. Based on permutations of the behavioral matrix, we express different correspondences and define a new formula to compute the maximal consistency degree of two workflow nets. Additionally, the paper illustrates these definitions and computations, as well as their advantages, with an example.
Funding: Partly supported by the National Natural Science Foundation of China (71371053) and the Social Science Foundation of Fujian Province (FJ2015C111).
Abstract: The intuitionistic fuzzy preference relation (IFPR) is a suitable technique for decision makers (DMs) to express fuzzy preference information. This paper aims to provide a group decision making method in which DMs use IFPRs to indicate their preferences with uncertain weights. To begin with, a model is presented that derives weight vectors of alternatives from IFPRs based on multiplicative consistency. Specifically, for any IFPR, the weight vectors are generated by minimizing its absolute deviation from the corresponding consistent IFPR. Secondly, a method is developed to determine the relative weights of DMs depending on their preference information. After that, we prioritize the alternatives based on the obtained weights, considering the risk preference of DMs. Finally, the approach is applied to the problem of technical risk assessment of armored equipment to illustrate the applicability and superiority of the proposed method.
Abstract: In this note we consider some basic, yet unusual, issues pertaining to the accuracy and stability of numerical integration methods used to follow the solution of first-order and second-order initial value problems (IVP). Included are remarks on multiple solutions, multi-step methods, the effect of initial value perturbations, as well as slowing and advancing the computed motion in second-order problems.
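A concrete illustration of the stability issues this note discusses (a toy of my own construction, not taken from the note): integrating the second-order IVP x'' = -x, explicit Euler spuriously gains energy every step, while semi-implicit (symplectic) Euler keeps the energy bounded near its exact value of 0.5.

```python
def euler_oscillator(steps, h, method="explicit"):
    """Integrate x'' = -x with x(0) = 1, x'(0) = 0 and return the final
    energy E = (x^2 + v^2) / 2; the exact value is 0.5 for all time."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        if method == "explicit":
            # Explicit Euler: amplitude grows by sqrt(1 + h^2) per step.
            x, v = x + h * v, v - h * x
        else:
            # Symplectic (semi-implicit) Euler: update v first, then x
            # with the new v; a nearby quadratic invariant is conserved.
            v = v - h * x
            x = x + h * v
    return 0.5 * (x * x + v * v)
```

With step size h = 0.1 over 1000 steps, explicit Euler inflates the energy by orders of magnitude, while the symplectic variant stays within a few percent of 0.5 — a small change in update order with a dramatic effect on long-time stability.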
Funding: National Natural Science Foundation of China (No. 62101219); Natural Science Foundation of Jiangsu Province (Nos. BK20201026, BK20210921); Science Foundation of Jiangsu Normal University (No. 19XSRX006); Open Research Fund of Jiangsu Key Laboratory of Resources and Environmental Information Engineering (No. JS202107).
Abstract: With the rapid development of unmanned aerial vehicle (UAV) technology, change detection methods based on UAV images have been extensively studied. However, the imaging of UAV sensors is susceptible to environmental interference, which leads to great differences in the appearance of the same object between UAV images. Overcoming these discrepancies is crucial to improving the accuracy of change detection. To address this issue, a novel unsupervised change detection method based on structural consistency and the generalized fuzzy local information C-means clustering model (GFLICM) is proposed in this study. Within this method, a graph-based structural consistency measure allows change information to be detected by comparing the structural similarity between UAV images. A local variation coefficient is introduced and a new fuzzy factor is reconstructed, after which the GFLICM algorithm is used to analyze difference images. Finally, the change detection results are analyzed qualitatively and quantitatively. To measure the feasibility and robustness of the proposed method, experiments were conducted using two data sets from the cities of Yangzhou and Nanjing. The experimental results show that the proposed method improves the overall accuracy of change detection and reduces the false alarm rate when compared with other state-of-the-art change detection methods.
Abstract: The inclusion of dissipation and memory mechanisms, non-classical elasticity, and thermal effects in the currently used plate/shell mathematical models requires that we establish whether these models can be derived using the conservation and balance laws of continuum mechanics in conjunction with the corresponding kinematic assumptions. This is referred to as the thermodynamic consistency of the mathematical models. Thermodynamic consistency ensures thermodynamic equilibrium during the evolution of the deformation. When the mathematical models are thermodynamically consistent, the second law of thermodynamics facilitates consistent derivations of constitutive theories in the presence of dissipation and memory mechanisms. This is the main motivation for the work presented in this paper. In the currently used mathematical models for plates/shells based on assumed kinematic relations, an energy functional is constructed over the volume, consisting of the kinetic energy, the strain energy, and the potential energy of the loads. The Euler equations derived from the first variation of the energy functional, when set to zero, yield the mathematical model(s) for the deforming plates/shells. Alternatively, the principle of virtual work can be used to derive the same mathematical model(s). For linear elastic reversible deformation physics with small deformation and small strain, these two approaches, based on the energy functional and the principle of virtual work, yield the same mathematical models, which hold for reversible mechanical deformation. In this paper, we examine whether the currently used plate/shell mathematical models with the corresponding kinematic assumptions can be derived using the conservation and balance laws of classical or non-classical continuum mechanics.
The mathematical models based on Kirchhoff hypothesis (classical plate theory, CPT) and first order shear deformation theory (FSDT) that are representative of most mathematical models for plates/shells are investigated in this paper for their thermodynamic consistency. This is followed by the details of a general and higher order thermodynamically consistent plate/shell thermoelastic mathematical model that is free of a priori consideration of kinematic assumptions and remains valid for very thin as well as thick plates/shells with comprehensive nonlinear constitutive theories based on integrity. Model problem studies are presented for small deformation behavior of linear elastic plates in the absence of thermal effects and the results are compared with CPT and FSDT mathematical models.
Abstract: A procedure for evaluating the quality consistency of generic drugs based on impurity profiles and similarity analysis methods is presented in this paper. Nifedipine extended-release tablets from six generic factories in China were used to evaluate consistency with the original drug. The procedure includes: choice of chromatographic methods; data collection and conformity testing; evaluation of the intra-batch similarity of drugs; evaluation of generic drugs against the original drug; and weighted similarity evaluation of generic drugs. The data were collected via high-performance liquid chromatography (HPLC) and then analyzed using the correlation coefficient, cosine similarity, principal component analysis (PCA), and hierarchical clustering analysis (HCA). Peak areas proved more suitable as the vector components when calculating the similarity of impurity profiles. After weighting the peak areas of the unspecified impurities in a further evaluation of generic quality, the generic level of the different factories was differentiated and the best generic factory was identified.
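The cosine-similarity step of such a procedure is easy to sketch: treat each drug's HPLC peak areas (aligned by retention time) as a vector and compare angles. The peak-area numbers below are invented for illustration, not the paper's data.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two impurity-profile vectors,
    e.g. HPLC peak areas aligned by retention time."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

original = [120.0, 15.0, 8.0, 3.0]   # peak areas of the reference drug
generic_a = [118.0, 16.0, 7.5, 3.2]  # impurity profile close to the reference
generic_b = [120.0, 2.0, 40.0, 0.5]  # divergent impurity profile
```

Here `generic_a` scores very close to 1 against the reference while `generic_b` scores noticeably lower, which is how such a similarity score separates more and less consistent generics before the weighted evaluation stage.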
Abstract: In this article, we focus on the semi-parametric error-in-variables model with missing responses: y_i = ξ_iβ + g(t_i) + ε_i, x_i = ξ_i + μ_i, where y_i are the response variables, missing at random; t_i are design points; ξ_i are the potential variables, observed with measurement errors μ_i; and the unknown slope parameter β and the nonparametric component g(·) need to be estimated. We use two different approaches to estimate β and g(·). Under appropriate conditions, we study the strong consistency of the proposed estimators.