The dissemination of information across various locations is a ubiquitous occurrence; however, prevalent methodologies for multi-source identification frequently overlook the fact that sources may initiate dissemination at distinct initial moments. Although there are many research results on multi-source identification, the challenge of locating sources with varying initiation times from a limited subset of observational nodes remains unresolved. In this study, we present the backward spread tree theorem and the source centrality theorem, and develop a backward spread centrality algorithm to identify all the information sources that trigger the spread at different start times. The proposed algorithm does not require prior knowledge of the number of sources, yet it can estimate both the initial spread moment and the spread duration. The core concept of this algorithm is to infer suspected sources through the source centrality theorem and then to locate the true sources among the suspected ones with linear programming. Extensive experiments on synthetic and real network simulations corroborate the superiority of our method in terms of both efficacy and efficiency. Furthermore, we find that our method remains robust irrespective of the number of sources and the average degree of the network. Compared with classical and state-of-the-art source identification methods, our method generally improves the AUROC value by 0.1 to 0.2.
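For concreteness, here is a minimal Python sketch of the suspected-source scoring step, assuming an unweighted contact graph, unit spreading delay, and observer nodes with recorded infection times; the variance-of-residuals centrality below is an illustrative stand-in for the paper's source centrality theorem, and the linear-programming refinement stage is omitted.

```python
# Rank candidate sources by how consistently shortest-path delays explain
# the infection times recorded at observers. A true source with unit
# spreading delay gives a near-constant residual (its unknown start time),
# so a low residual variance marks a plausible source.
import networkx as nx

def suspected_sources(G, observations, top_k=5):
    scores = {}
    for candidate in G.nodes:
        lengths = nx.single_source_shortest_path_length(G, candidate)
        residuals = [t - lengths[obs] for obs, t in observations.items()
                     if obs in lengths]
        if len(residuals) < 2:
            continue
        mean = sum(residuals) / len(residuals)
        scores[candidate] = sum((r - mean) ** 2 for r in residuals)
    return sorted(scores, key=scores.get)[:top_k]

G = nx.barabasi_albert_graph(200, 3, seed=1)
obs = {10: 4, 50: 6, 120: 5, 7: 3}   # observer node -> infection time
print(suspected_sources(G, obs))
```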
This paper presents a novel framework for understanding time as an emergent phenomenon arising from quantum information dynamics. We propose that the flow of time and its directional arrow are intrinsically linked to the growth of quantum complexity and the evolution of entanglement entropy in physical systems. By integrating principles from quantum mechanics, information theory, and holography, we develop a comprehensive theory that explains how time can emerge from timeless quantum processes. Our approach unifies concepts from quantum mechanics, general relativity, and thermodynamics, providing new perspectives on longstanding puzzles such as the black hole information paradox and the arrow of time. We derive modified Friedmann equations that incorporate quantum information measures, offering novel insights into cosmic evolution and the nature of dark energy. The paper presents a series of experimental proposals to test key aspects of this theory, ranging from quantum simulations to cosmological observations. Our framework suggests a deeply information-theoretic view of the universe, challenging our understanding of the nature of reality and opening new avenues for technological applications in quantum computing and sensing. This work contributes to the ongoing quest for a unified theory of quantum gravity and information, potentially with far-reaching implications for our understanding of space, time, and the fundamental structure of the cosmos.
Supply chain management is an essential part of an organisation's sustainability programme. Understanding the environmental, social, and economic impact and viability of one's suppliers and purchasers is becoming progressively more important as all industries move towards greater sustainability potential. To handle such developments in supply chain management, fuzzy settings and their generalisations play an important role. Keeping this role in mind, the aim of this study is to analyse the role and involvement of complex q-rung orthopair normal fuzzy (CQRONF) information in supply chain management. The major contribution of this theory is to introduce the notions of confidence CQRONF weighted averaging, confidence CQRONF ordered weighted averaging, confidence CQRONF hybrid averaging, confidence CQRONF weighted geometric, confidence CQRONF ordered weighted geometric, and confidence CQRONF hybrid geometric operators, and to examine their various properties and results. Furthermore, with the help of the CRITIC and VIKOR models, we develop the novel CQRONF-CRITIC-VIKOR model and perform a sensitivity analysis of the initiated method. Moreover, using the proposed operators, we construct a multi-attribute decision-making tool for finding a beneficial sustainable supplier to handle complex dilemmas. Finally, the initiated operators' efficiency is demonstrated by comparative analysis.
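For flavor, a hedged sketch of a confidence-weighted q-rung orthopair averaging operator follows, using real-valued membership grades only; the paper's CQRONF operators additionally carry complex phase terms and normal fuzzy parameters, which are omitted here.

```python
# Confidence-weighted q-rung orthopair fuzzy averaging (simplified):
# each argument is a (membership, non-membership) pair with mu**q + nu**q <= 1,
# and each expert's weight is scaled by a confidence level in [0, 1].
def cq_rof_weighted_avg(pairs, weights, confidences, q=3):
    mu_term, nu_term = 1.0, 1.0
    for (mu, nu), w, c in zip(pairs, weights, confidences):
        mu_term *= (1.0 - mu ** q) ** (w * c)   # averaging part for membership
        nu_term *= nu ** (w * c)                # geometric part for non-membership
    return (1.0 - mu_term) ** (1.0 / q), nu_term

print(cq_rof_weighted_avg([(0.8, 0.4), (0.6, 0.5)], [0.6, 0.4], [0.9, 0.7]))
```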
Aim To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools in general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results The structural complexity index and two related factors, i.e., the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri-net-based model of the system were derived. An application example was presented. Conclusion The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
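The following toy sketch illustrates the kind of indices the paper defines, assuming a Petri net given as place/transition sets and an arc list; the specific element and connection complexity formulas below are placeholders, not the paper's definitions.

```python
# Hypothetical structural-complexity indices from Petri net parameters:
# the element factor counts net elements, and the connection factor
# measures arc density against the bipartite maximum.
def structural_complexity(places, transitions, arcs):
    n_elements = len(places) + len(transitions)
    element_factor = n_elements                        # placeholder formula
    max_arcs = 2 * len(places) * len(transitions)
    connection_factor = len(arcs) / max_arcs if max_arcs else 0.0
    return element_factor * (1.0 + connection_factor)  # placeholder blend

places = {"p1", "p2", "p3"}
transitions = {"t1", "t2"}
arcs = [("p1", "t1"), ("t1", "p2"), ("p2", "t2"), ("t2", "p3")]
print(structural_complexity(places, transitions, arcs))
```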
In this paper, we investigate the regularity of the spreading of information and public opinions towards two competing products in complex networks. By building a mathematical model and simulating its evolution process, we have found the statistical regularity of the support rates for the two different products at the steady stage. The research shows that the strength of public opinion spreading is proportional to the final support rate of a product.
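A minimal simulation in the spirit of the described model is sketched below: two competing products spread on a small-world network, and each node adopts the locally dominant opinion, with product A's messages weighted by a "spreading strength" parameter. The update rule and parameter values are illustrative assumptions, not the paper's model.

```python
# Asynchronous opinion updates: a random node polls its neighbours and
# adopts the side whose strength-weighted votes dominate.
import random
import networkx as nx

def simulate(n=500, k=6, p_rewire=0.1, strength_a=0.6, steps=20_000, seed=0):
    rng = random.Random(seed)
    G = nx.watts_strogatz_graph(n, k, p_rewire, seed=seed)
    opinion = {v: rng.choice("AB") for v in G.nodes}
    nodes = list(G.nodes)
    for _ in range(steps):
        v = rng.choice(nodes)
        nbrs = list(G.neighbors(v))
        if not nbrs:
            continue
        votes_a = sum(strength_a for u in nbrs if opinion[u] == "A")
        votes_b = sum(1 - strength_a for u in nbrs if opinion[u] == "B")
        opinion[v] = "A" if votes_a > votes_b else "B"
    return sum(1 for v in opinion.values() if v == "A") / n

print(f"final support for A: {simulate():.2f}")
```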
Because of the developed economy and lush vegetation in southern China, the following obstacles or difficulties exist in remote sensing land surface classification: 1) diverse surface composition types; 2) undulating terrain; 3) small, fragmented land parcels; 4) indistinguishable shadows of surface objects. A top priority is to clarify how to use the concept of big data (data mining technology) and various new technologies and methods to make complex-surface remote sensing information extraction develop in the direction of automation, refinement, and intelligence. To achieve these research objectives, this paper takes Gaofen-2 satellite data produced in China as the data source and complex-surface remote sensing information extraction technology as the research object, and intelligently analyzes the remote sensing information of complex surfaces after completing data collection and preprocessing. The specific extraction methods are as follows: 1) extraction of fractal texture features of fractional Brownian motion; 2) extraction of color features; 3) extraction of vegetation indices; 4) construction of feature vectors and the corresponding classification. In this paper, fractal texture features, color features, vegetation features, and spectral features of remote sensing images are combined to form a composite feature vector. This raises the dimensionality of the features, increases the separability of remote sensing classes, and thus improves the classification accuracy of remote sensing images. The method is suitable for remote sensing information extraction of complex surfaces in southern China and can be extended to other complex surface areas in the future.
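The feature-combination step can be pictured with the short sketch below, in which per-pixel color, vegetation-index, texture, and spectral features are concatenated into one vector before classification; the simple variance-based texture measure is a stand-in for the paper's fractal (fractional Brownian motion) features.

```python
# Stack heterogeneous per-pixel features into one combined feature vector.
import numpy as np

def combined_features(band_red, band_nir, band_green, band_blue):
    ndvi = (band_nir - band_red) / (band_nir + band_red + 1e-9)
    color = np.stack([band_red, band_green, band_blue], axis=-1)
    gray = color.mean(axis=-1)
    texture = (gray - gray.mean()) ** 2      # stand-in for fractal texture
    spectral = np.stack([band_red, band_nir], axis=-1)
    return np.concatenate(
        [color, ndvi[..., None], texture[..., None], spectral], axis=-1)

rng = np.random.default_rng(0)
bands = [rng.random((64, 64)) for _ in range(4)]
print(combined_features(*bands).shape)   # (64, 64, 7)
```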
Objective: To explore the characteristics of acupoints in the treatment of stroke using complex network and pointwise mutual information methods. Methods: The complex network and pointwise mutual information system developed by the China Academy of Chinese Medical Sciences was used to analyze the specific acupoints, their compatibility, frequency, etc. Results: 174 acupuncture-moxibustion prescriptions were collected, involving 163 acupoints. Among them, eighteen acupoints were used more than 30 times, such as Hegu (LI4), Zusanli (ST36), Quchi (LI11), and Fengshi (GB31). Thirty-one acupoint combinations were used more than 15 times, such as Quchi (LI11) with Zusanli (ST36), Quchi (LI11) with Jianyu (LI15), and Hegu (LI4) with Quchi (LI11). The most commonly used treatment method for stroke is to dredge the Yangming and Shaoyang meridians by needling multiple acupoints located on these two meridians. The commonly used acupoints are mainly distributed on the limbs, head, and face. The most commonly used type of specific acupoint is the intersection acupoint, and the usage frequency of specific acupoints is higher than that of non-specific acupoints. Conclusion: Dredging the collaterals, dispelling wind-evil, and restoring consciousness are the main principles for the treatment of stroke. Specific acupoints on the head, face, and limbs may be the main targeted acupoints, and combining the Yang meridians with other meridians is needed to improve the effects. The Yangming and Shaoyang meridians are the most used meridians, and Hegu (LI4), Quchi (LI11), and Zusanli (ST36) are the most used acupoints.
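The frequency and pairwise co-occurrence analysis can be illustrated as follows, assuming prescriptions are given as lists of acupoint codes; pointwise mutual information (PMI) is shown as one plausible scoring of combinations, not necessarily the exact statistic of the cited system.

```python
# Count single-acupoint frequencies and pair co-occurrences, then score
# a pair by its pointwise mutual information across prescriptions.
from collections import Counter
from itertools import combinations
from math import log

prescriptions = [["LI4", "ST36", "LI11"], ["LI11", "LI15"],
                 ["LI4", "LI11"], ["ST36", "GB31", "LI4"]]   # toy data

n = len(prescriptions)
freq = Counter(p for rx in prescriptions for p in set(rx))
pair_freq = Counter(frozenset(c) for rx in prescriptions
                    for c in combinations(set(rx), 2))

def pmi(a, b):
    joint = pair_freq[frozenset((a, b))] / n
    return log(joint / ((freq[a] / n) * (freq[b] / n))) if joint else float("-inf")

print(freq.most_common(3))
print(f"PMI(LI4, LI11) = {pmi('LI4', 'LI11'):.2f}")
```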
Since the Manufacturing Execution System (MES) is a bridge linking the upper planning system of the enterprise and the control system of the shop floor, various kinds of information with different characteristics flow through the system. The information environment of MES and its effect on MES scheduling are analyzed. A methodological proposal is given to address the problem of agile scheduling in a complex information environment, based on which a microeconomic-market and game-theoretic model-based scheduling approach is presented. The future development of this method is also discussed.
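A market-based scheduling step of the kind the proposal describes might look like the sketch below, where jobs are auctioned and machines bid their expected completion times; the cost model and bidding rule are assumptions, not the paper's game-theoretic formulation.

```python
# Auction-style dispatching: jobs are offered longest-first, each machine
# bids its completion time for the job, and the lowest bid wins.
def auction_schedule(jobs, machines):
    load = {m: 0.0 for m in machines}
    assignment = {}
    for job, t in sorted(jobs.items(), key=lambda kv: -kv[1]):
        bids = {m: load[m] + t for m in machines}   # each machine's bid
        winner = min(bids, key=bids.get)
        assignment[job] = winner
        load[winner] = bids[winner]
    return assignment, load

jobs = {"J1": 4.0, "J2": 2.5, "J3": 3.0, "J4": 1.0}
print(auction_schedule(jobs, ["M1", "M2"]))
```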
This work introduces a modification to the Heisenberg Uncertainty Principle (HUP) by incorporating quantum complexity, including potential nonlinear effects. Our theoretical framework extends the traditional HUP to consider the complexity of quantum states, offering a more nuanced understanding of measurement precision. By adding a complexity term to the uncertainty relation, we explore nonlinear modifications such as polynomial, exponential, and logarithmic functions. Rigorous mathematical derivations demonstrate the consistency of the modified principle with classical quantum mechanics and quantum information theory. We investigate the implications of this modified HUP for various aspects of quantum mechanics, including quantum metrology, quantum algorithms, quantum error correction, and quantum chaos. Additionally, we propose experimental protocols to test the validity of the modified HUP, evaluating their feasibility with current and near-term quantum technologies. This work highlights the importance of quantum complexity in quantum mechanics and provides a refined perspective on the interplay between complexity, entanglement, and uncertainty in quantum systems. The modified HUP has the potential to stimulate interdisciplinary research at the intersection of quantum physics, information theory, and complexity theory, with significant implications for the development of quantum technologies and the understanding of the quantum-to-classical transition.
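One illustrative form of such a complexity-augmented relation (our notation, not necessarily the paper's exact expression) is:

```latex
% C(psi) is a quantum-complexity measure and f a monotone modification;
% the polynomial, exponential, and logarithmic choices mirror the
% nonlinear variants mentioned in the abstract.
\[
  \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + f\bigl(\mathcal{C}(\psi)\bigr)\Bigr),
  \qquad
  f(\mathcal{C}) \in \Bigl\{\alpha\,\mathcal{C}^{\,n},\;\; \alpha\bigl(e^{\mathcal{C}}-1\bigr),\;\; \alpha\ln(1+\mathcal{C})\Bigr\}.
\]
```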
The survivability of computer systems should be guaranteed in order to improve their operational efficiency, especially the efficiency of their critical functions. This paper proposes a decentralized mechanism based on Software-Defined Architecture (SDA). The concepts of critical functions and critical states are defined, and the critical functional parameters of the target system are collected and analyzed. Experiments based on the analysis results are performed to reconfigure the implementations of the whole system. A formal model is presented for analyzing and improving the survivability of the system, and the problem investigated in this paper is reduced to an optimization problem of increasing the system survival time.
With the skyrocketing development of technologies, many issues arise in the information security quantitative evaluation (ISQE) of complex heterogeneous information systems (CHISs). The development of CHISs urgently calls for an ISQE model based on security-critical components to improve the efficiency of system security evaluation. In this paper, we summarize the implications of critical components in different fields and propose a recognition algorithm for security-critical components based on a threat attack tree to support the ISQE process. The evaluation model establishes a framework for the ISQE of CHISs that is updated iteratively. Firstly, with the support of asset identification and topology data, we rank the security importance of each asset based on the threat attack tree and obtain the set of security-critical components of the CHIS. Then, we build the evaluation indicator tree of the evaluation target and propose an ISQE algorithm based on the coefficient of variation to calculate the security quality value of the CHIS. Moreover, we present a novel indicator of measurement uncertainty to better supervise the performance of the proposed model. Simulation results show the advantages of the proposed algorithm in the evaluation of CHISs.
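A minimal sketch of the coefficient-of-variation weighting step follows: indicators with greater relative dispersion across components receive larger weights, and the security quality value is a weighted sum. The indicator values and the linear scoring rule are illustrative assumptions.

```python
# Coefficient-of-variation (CV) weighting: weight_j = cv_j / sum(cv),
# then score each component as the weighted sum of its indicators.
import numpy as np

def isqe_score(indicator_matrix):
    """indicator_matrix: rows = components, cols = normalised indicators."""
    X = np.asarray(indicator_matrix, dtype=float)
    cv = X.std(axis=0) / (X.mean(axis=0) + 1e-12)
    weights = cv / cv.sum()
    return X @ weights, weights

X = [[0.9, 0.4, 0.7],
     [0.6, 0.8, 0.7],
     [0.7, 0.5, 0.6]]
scores, w = isqe_score(X)
print(np.round(w, 3), np.round(scores, 3))
```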
Information diffusion in online social networks is driven by users forwarding information, and latency exists widely in user spreading behaviors. Little work has been done to reveal the effect of latency on the diffusion process. In this paper, we propose a propagation model in which nodes may suspend their spreading actions for a waiting period of stochastic length; these latent nodes may later recover their activity. Meanwhile, the mechanism of forwarding information is also introduced into the diffusion model. Mean-field analysis and numerical simulations indicate that our model yields three nontrivial results. First, the spreading threshold does not correlate with latency in either homogeneous or heterogeneous networks, but depends on the spreading and refractory parameters. Furthermore, latency affects the diffusion process and changes the infection scale: a large or small latency parameter leads to a larger final diffusion extent, but the intrinsic dynamics differ. Large latency implies forwarding information rapidly, while small latency prevents nodes from dropping out of interactions. In addition, betweenness is a better descriptor than coreness and degree for identifying influential nodes in the model with latency. These results are helpful for understanding some collective phenomena of the diffusion process and for taking measures to restrain rumors in social networks.
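A compact stochastic sketch of the latency mechanism is given below: spreading nodes may enter a latent (suspended) state for a geometrically distributed waiting period before resuming activity. The rates and the test network are illustrative, not the paper's parameters.

```python
# SIR-like spreading with a latent state L: infected (I) nodes infect
# susceptible (S) neighbours, may suspend to L, and L nodes wake back to I;
# R is absorbing, so the process terminates with probability one.
import random
import networkx as nx

def spread(G, beta=0.3, to_latent=0.2, wake=0.5, recover=0.1, seed=0):
    rng = random.Random(seed)
    state = {v: "S" for v in G.nodes}
    state[next(iter(G.nodes))] = "I"
    while any(s in ("I", "L") for s in state.values()):
        for v in list(G.nodes):
            if state[v] == "I":
                for u in G.neighbors(v):
                    if state[u] == "S" and rng.random() < beta:
                        state[u] = "I"
                if rng.random() < to_latent:
                    state[v] = "L"          # suspend spreading
                elif rng.random() < recover:
                    state[v] = "R"
            elif state[v] == "L" and rng.random() < wake:
                state[v] = "I"              # resume activity
    return sum(s == "R" for s in state.values()) / G.number_of_nodes()

print(spread(nx.erdos_renyi_graph(300, 0.02, seed=1)))
```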
In order to meet the demand for testability analysis and evaluation of complex equipment under small-sample tests in the equipment life cycle, the hierarchical hybrid testability modeling and evaluation method (HHTME), which combines the testability structure model (TSM) with the testability Bayesian networks model (TBNM), is presented. Firstly, the testability network topology of complex equipment is built using the hierarchical hybrid testability modeling method. Secondly, the prior conditional probability distribution between network nodes is determined through expert experience. Then the Bayesian method is used to update the conditional probability distribution according to historical test information, virtual simulation information, and similar-product information. Finally, the learned hierarchical hybrid testability model (HHTM) is used to estimate the testability of the equipment. Compared with the results of other modeling methods, the relative deviation of the HHTM is only 0.52%, and its evaluation result is the most accurate.
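The Bayesian updating step can be pictured with the toy sketch below, where an expert-elicited Beta prior on a node's fault-detection probability is updated with pooled pass/fail evidence from history tests, virtual simulation, and similar products; the simple counting scheme is an illustrative assumption, not the paper's full network model.

```python
# Beta-Binomial updating: each evidence source contributes successes and
# trials, and the posterior mean estimates the detection probability.
from dataclasses import dataclass

@dataclass
class BetaPrior:
    alpha: float   # prior pseudo-successes (from expert experience)
    beta: float    # prior pseudo-failures

    def update(self, successes, trials):
        return BetaPrior(self.alpha + successes,
                         self.beta + trials - successes)

    @property
    def mean(self):
        return self.alpha / (self.alpha + self.beta)

prior = BetaPrior(8.0, 2.0)                 # expert belief around 0.8
posterior = (prior.update(18, 20)           # history test information
                  .update(45, 50)           # virtual simulation runs
                  .update(27, 30))          # similar-product data
print(f"estimated detection probability: {posterior.mean:.3f}")
```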
A model is proposed to describe the competition between two kinds of information among N random-walking individuals in an L × L square, starting from a half-and-half mixture of the two kinds of information. Individuals retain or change their information according to their neighbors' information. When the moving speed of individuals v is zero, the two kinds of information typically coexist, and the ratio between them increases with L and decreases with N. In the dynamic case (v > 0), only one kind of information eventually remains, and the time required for one kind of information to prevail scales as T_d ~ v^α L^β N^γ.
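A bare-bones sketch of the described model follows: N random walkers in an L × L square start half-and-half with information A or B and adopt the local majority within an interaction radius. The radius, step size, and synchronous update rule are illustrative assumptions.

```python
# Random walkers on a periodic L x L square; each step, every walker moves
# and then adopts the majority information within its interaction radius
# (its own state is included, so the neighbourhood is never empty).
import random

def simulate(N=100, L=20.0, v=0.5, radius=1.0, steps=200, seed=0):
    rng = random.Random(seed)
    pos = [(rng.uniform(0, L), rng.uniform(0, L)) for _ in range(N)]
    info = ["A"] * (N // 2) + ["B"] * (N - N // 2)
    for _ in range(steps):
        pos = [((x + rng.uniform(-v, v)) % L, (y + rng.uniform(-v, v)) % L)
               for x, y in pos]
        new_info = []
        for x, y in pos:
            nbr = [info[j] for j, (a, b) in enumerate(pos)
                   if (a - x) ** 2 + (b - y) ** 2 < radius ** 2]
            new_info.append(max(set(nbr), key=nbr.count))
        info = new_info
    return info.count("A") / N

print(f"fraction holding A: {simulate():.2f}")
```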
This paper considers a decomposition framework as a mechanism for information hiding for secure communication via open network channels. Two varieties of this framework are provided: one is based on Gaussian arithmetic with a complex modulus and the other on an elliptic curve modular equation. The proposed algorithm is illustrated with a numerical example.
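The Gaussian-arithmetic variety can be illustrated with the toy decomposition below: a secret encoded as a Gaussian integer z is split as z = q·m + r against a shared complex modulus m, and the parts q and r travel separately. This shows the decomposition idea only, not the paper's full protocol or its elliptic-curve variant.

```python
# Division with remainder in the Gaussian integers: round z/m to the
# nearest Gaussian integer q, so that z = q*m + r with a small remainder r.
def gaussian_divmod(z, m):
    w = z / m
    q = complex(round(w.real), round(w.imag))   # nearest Gaussian integer
    return q, z - q * m

secret = complex(1234, 567)
modulus = complex(15, 8)            # shared secret modulus
q, r = gaussian_divmod(secret, modulus)
assert q * modulus + r == secret    # receiver recombines the two parts
print(q, r)
```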
We develop a series of mathematical models to describe the flow of information in different periods of time and the relationship between information flow and inherent value. We optimize and improve the diffusion mechanism of information based on the SEIR model. To explore how the inherent value of information affects its flow, we simulate the model using Matlab. We also use data on the number of people connected to the Internet in Canada from 2009 to 2014 to analyze the model's reliability. We then use the model to predict communication networks' relationships and capacities around the year 2050. Finally, we perform sensitivity analysis by making small changes to the parameters of the simulation experiment. The results of the experiment are helpful for modeling how public interest and opinion can change in a complex network.
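A standard SEIR integration sketch, of the kind these models build on, is shown below; the parameter values are illustrative, and the coupling to the information's inherent value is not modelled here.

```python
# Forward-Euler integration of the SEIR compartments: susceptible nodes
# become exposed at rate beta, exposed become informed at rate sigma,
# and informed nodes stop spreading at rate gamma.
import numpy as np

def seir(beta=0.4, sigma=0.2, gamma=0.1, days=160, dt=0.1, n=1.0, i0=1e-3):
    s, e, i, r = n - i0, 0.0, i0, 0.0
    history = []
    for _ in range(int(days / dt)):
        ds = -beta * s * i / n
        de = beta * s * i / n - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
        history.append((s, e, i, r))
    return np.array(history)

traj = seir()
print(f"peak informed fraction: {traj[:, 2].max():.3f}")
```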
We advance here a novel methodology for robust intelligent biometric information management, with inferences and predictions made using randomness and complexity concepts. Intelligence refers to learning, adaptation, and functionality, and robustness refers to the ability to handle incomplete and/or corrupt adversarial information, on one side, and image and/or device variability, on the other side. The proposed methodology is model-free and non-parametric. It draws support from discriminative methods using likelihood ratios to link biometrics and forensics at the conceptual level. It further links, at the modeling and implementation level, the Bayesian framework, statistical learning theory (SLT) using transduction and semi-supervised learning, and information theory (IT) using mutual information. The key concepts supporting the proposed methodology are a) local estimation to facilitate learning and prediction using both labeled and unlabeled data; b) similarity metrics using regularity of patterns, randomness deficiency, and Kolmogorov complexity (similar to MDL) using strangeness/typicality and ranking p-values; and c) the Cover-Hart theorem on the asymptotic performance of k-nearest neighbors approaching the optimal Bayes error. Several topics on biometric inference and prediction related to 1) multi-level and multi-layer data fusion, including quality and multi-modal biometrics; 2) score normalization and revision theory; 3) face selection and tracking; and 4) identity management, are described here using an integrated approach that includes transduction and boosting for ranking and sequential fusion/aggregation, respectively, on one side, and active learning and change/outlier/intrusion detection realized using information gain and martingale, respectively, on the other side. The proposed methodology can be mapped to additional types of information beyond biometrics.
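The strangeness/p-value ranking idea can be sketched as follows, in the conformal-prediction style the methodology draws on: a sample's strangeness is the ratio of within-class to between-class nearest-neighbor distances, and its p-value is its rank among calibration scores. The data and the simple alpha measure below are illustrative.

```python
# Strangeness (alpha) of a sample and its conformal p-value against a set
# of calibration alphas; larger p-values indicate more typical samples.
import numpy as np

def strangeness(x, same_class, other_class, k=3):
    d_same = np.sort(np.linalg.norm(same_class - x, axis=1))[:k].sum()
    d_other = np.sort(np.linalg.norm(other_class - x, axis=1))[:k].sum()
    return d_same / (d_other + 1e-12)

def p_value(alpha_new, calibration_alphas):
    return (np.sum(calibration_alphas >= alpha_new) + 1) / (len(calibration_alphas) + 1)

rng = np.random.default_rng(0)
genuine = rng.normal(0, 1, (30, 8))
impostor = rng.normal(2, 1, (30, 8))
cal = np.array([strangeness(g, np.delete(genuine, i, 0), impostor)
                for i, g in enumerate(genuine)])
probe = rng.normal(0, 1, 8)
print(f"p-value: {p_value(strangeness(probe, genuine, impostor), cal):.3f}")
```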
The capability of a system to fulfill its mission promptly in the presence of attacks, failures, or accidents is one of the qualitative definitions of survivability. In this paper, we propose a model for survivability quantification that is suitable for networks carrying complex traffic flows. Complex network traffic is considered as general multi-rate, heterogeneous traffic, where the individual bandwidth demands may aggregate in complex, nonlinear ways. Blocking probability is the chosen measure for survivability analysis. We study an arbitrary topology and several other known topologies for the network. Independent and dependent failure scenarios, as well as deterministic and random traffic models, are investigated. Finally, we provide survivability evaluation results for different network configurations. The results show that by using about 50% of the link capacity in networks with a relatively high number of links, the blocking probability remains near zero in the case of a limited number of failures.
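Per-class blocking on a single link under multi-rate traffic is classically computed with the Kaufman-Roberts recursion, sketched below; the paper's survivability analysis layers failure scenarios and topology on top of this kind of computation, and the loads and bandwidth demands here are made-up inputs.

```python
# Kaufman-Roberts recursion: q(j) is the (unnormalised) probability that
# j bandwidth units are occupied; class i is blocked when fewer than b_i
# units remain free.
def kaufman_roberts(capacity, classes):
    """classes: list of (offered_load_erlangs, bandwidth_units)."""
    q = [0.0] * (capacity + 1)
    q[0] = 1.0
    for j in range(1, capacity + 1):
        q[j] = sum(a * b * q[j - b] for a, b in classes if j >= b) / j
    total = sum(q)
    q = [x / total for x in q]
    return [sum(q[capacity - b + 1:]) for _, b in classes]  # per-class blocking

classes = [(20.0, 1), (5.0, 4)]   # (offered load, bandwidth) per class
print(kaufman_roberts(100, classes))
```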