A Generative Adversarial Network (GAN) is designed, based on deep learning, for the Super-Resolution (SR) reconstruction task of temperature fields (comparable to downscaling in the meteorological field), which is limited by the small number of ground stations and the sparse distribution of observations, resulting in a lack of fineness in the data. To improve the network's generalization performance, residual structures and batch normalization are used. Nearest-neighbor interpolation is applied instead of the bicubic interpolation conventional in the computer vision field, to avoid over-smoothing of the climate element values. Sub-pixel convolution is used instead of transposed convolution or interpolation methods for up-sampling, to speed up network inference. The experimental dataset is the European Centre for Medium-Range Weather Forecasts Reanalysis v5 (ERA5), with a resolution of 0.1°×0.1° in both directions. Moreover, the task aims to scale up the size by a factor of 8, which is rare compared with conventional methods. The comparison methods include traditional interpolation methods and a widely used GAN-based network, SRGAN. The final experimental results show that the proposed scheme improves the Root Mean Square Error (RMSE) by 37.25%, the Peak Signal-to-Noise Ratio (PSNR) by 14.4%, and the Structural Similarity (SSIM) by 10.3% compared with bicubic interpolation. A relatively obvious performance improvement over the traditional SRGAN network is also observed experimentally. Meanwhile, the GAN converges stably and reaches an approximate Nash equilibrium for various initialization parameters, empirically illustrating the effectiveness of the method on temperature fields.
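The sub-pixel convolution mentioned in the abstract ends with a "pixel shuffle" rearrangement: the preceding convolution produces r² channels per output channel, and a cheap reshuffle turns them into spatial resolution. A minimal numpy sketch of that rearrangement (the convolution itself is omitted):

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r^2, H, W) feature map into (C, H*r, W*r).

    This is the final step of sub-pixel convolution upsampling: no
    arithmetic, just an index rearrangement, which is why it is faster
    at inference time than transposed convolution or interpolation.
    """
    c_r2, h, w = x.shape
    assert c_r2 % (r * r) == 0, "channel count must be divisible by r^2"
    c = c_r2 // (r * r)
    # (C, r, r, H, W) -> (C, H, r, W, r) -> (C, H*r, W*r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)
    return x.reshape(c, h * r, w * r)

# a 16-channel 4x4 feature map becomes a 1-channel 16x16 field (r = 4);
# the paper's x8 task would use r = 8 (64 channels per output channel)
feat = np.random.rand(16, 4, 4)
up = pixel_shuffle(feat, 4)
```

The shapes here are illustrative, not the paper's actual layer sizes.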
Machine learning tasks such as image classification need to select features that describe the image well. An image has individual features and common features, and they are interdependent. If only the individual features of an image are emphasized, the neural network is prone to overfitting; if only the common features are emphasized, the network cannot adapt to diversified learning environments. To better integrate individual and common features, this paper designs a mixed feature extraction method based on resonance filtering, named the resonance layer, built on skeleton and edge extraction of individual features. The resonance layer sits in front of the neural network input layer: it uses the K3M algorithm to extract the image skeleton, the Canny algorithm to extract image edges, and resonance filtering to reconstruct the training image by filtering out image noise. Through the common features of the images in the training set and the efficient expression of individual characteristics, it improves the efficiency of the network's feature extraction and thus the accuracy of its predictions. Taking fully connected and LeNet-5 neural networks as examples, experiments on a handwritten digit database show that the proposed mixed feature extraction method can improve training accuracy while filtering out part of the image noise.
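The edge-extraction step above uses the Canny algorithm; as a simplified stand-in (gradient magnitude with a single threshold, no non-maximum suppression or hysteresis), a Sobel edge map can illustrate the individual-feature input the resonance layer works with:

```python
import numpy as np

def sobel_edges(img, thresh=0.5):
    """Gradient-magnitude edge map: a simplified stand-in for the
    Canny step, keeping pixels above a fraction of the peak gradient."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    mag = np.hypot(gx, gy)
    if mag.max() == 0:
        return np.zeros_like(mag, dtype=np.uint8)
    return (mag > thresh * mag.max()).astype(np.uint8)

# a vertical step edge is detected along the step
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = sobel_edges(img)
```

The K3M skeleton and resonance-filtering stages are not reproduced here; this only sketches the kind of edge map fed into them.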
In the objective world, how to deal with the complexity and uncertainty of big data efficiently and accurately has become a premise and key question for machine learning. A fuzzy support vector machine (FSVM) not only handles classification problems for training samples with fuzzy information, but also assigns a fuzzy membership degree to each training sample, allowing different samples to contribute differently to predicting an optimal hyperplane that separates two classes with maximum margin, reducing the effect of outliers and noise. Quantum computing has massive parallel computing capability and holds the promise of faster algorithmic processing of data. However, neither FSVM nor quantum computing alone can deal with the complexity and uncertainty of big data efficiently and accurately. This paper proposes an efficient and accurate quantum fuzzy support vector machine (QFSVM) algorithm, based on the facts that quantum computing can efficiently process large amounts of data and that FSVM readily handles complexity and uncertainty. The central idea is to use the quantum algorithm for solving linear systems of equations (the HHL algorithm) together with the least-squares method to solve the quadratic programming problem in the FSVM. The proposed algorithm can determine whether a sample belongs to the positive or negative class while also achieving good generalization performance. Furthermore, this paper applies QFSVM to handwritten character recognition and demonstrates that QFSVM can run on quantum computers and classify handwritten characters accurately. Compared with FSVM, QFSVM's computational complexity decreases exponentially with the number of training samples.
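The least-squares reformulation is what makes the HHL connection work: the FSVM's quadratic program collapses into a single linear system. A classical numpy solve can stand in for HHL to show the structure of that system (kernel, membership-weighted ridge term, and one bias row); the data, memberships, and `gamma` below are illustrative:

```python
import numpy as np

def lssvm_train(X, y, s, gamma=10.0):
    """Least-squares (fuzzy) SVM: the KKT conditions reduce to one
    linear system, solved here classically as a stand-in for HHL.
    s holds the fuzzy membership degree of each sample; a smaller
    membership enlarges that sample's ridge term, shrinking its pull."""
    n = len(y)
    K = X @ X.T                      # linear kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                   # bias constraint row
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * s))
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]           # bias b, dual weights alpha

def lssvm_predict(X, alpha, b, Xq):
    return np.sign(Xq @ X.T @ alpha + b)

X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
s = np.array([1.0, 1.0, 1.0, 0.5])   # lower membership -> less influence
b, alpha = lssvm_train(X, y, s)
```

On a quantum computer the `np.linalg.solve` call is the part HHL replaces; everything else is classical pre/post-processing.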
This paper summarizes the state of the art in quantum communication networks and trust management in recent years. As in classical networks, trust management is the premise and foundation of quantum secure communication and cannot simply be reduced to security issues; therefore the fundamentals and importance of trust management in quantum communication networks should be taken more seriously. Compared with other theories and techniques in quantum communication, trust and trust management models in quantum communication network environments are still at an initial stage. In this paper, the core technologies for establishing secure and reliable quantum communication networks are categorized and summarized, and the trends of each direction in trust management of quantum communication networks are discussed in depth.
Though numerical wave models have been applied widely to significant wave height prediction, they consume massive computing memory and their accuracy needs to be further improved. In this paper, a two-dimensional (2D) significant wave height (SWH) prediction model is established for the South and East China Seas. The proposed model is trained on Wave Watch III (WW3) reanalysis data and is based on a convolutional neural network, the bidirectional long short-term memory, and the attention mechanism (CNN-BiLSTM-Attention). It adopts the convolutional neural network to extract spatial features of the original wave height and reduce the redundant information input into the BiLSTM network. Meanwhile, the BiLSTM model fully extracts the features of the associated information in the time series data. Besides, the attention mechanism assigns probability weights to the output of the BiLSTM layer units, and finally a training model is constructed. Up-to-24-h prediction experiments are conducted under normal and extreme conditions, respectively. Under the normal wave condition, for 3-, 6-, 12- and 24-h forecasting, the mean values of the correlation coefficients on the test set are 0.996, 0.991, 0.980, and 0.945, respectively; the corresponding mean root mean square errors are 0.063 m, 0.105 m, 0.172 m, and 0.281 m. Under the typhoon-forced extreme condition, the CNN-BiLSTM-Attention model is trained on typhoon-induced SWH extracted from the WW3 reanalysis data. For 3-, 6-, 12- and 24-h forecasting, the mean correlation coefficients on the test set are 0.993, 0.983, 0.958, and 0.921, respectively, and the averaged root mean square errors are 0.159 m, 0.257 m, 0.437 m, and 0.555 m. The model performs better than one trained on all the WW3 reanalysis data. The result suggests that the proposed algorithm can be applied to 2D wave forecasting with higher accuracy and efficiency.
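The attention step described above — assigning a probability weight to each time step of the BiLSTM output and pooling accordingly — can be sketched in numpy. The parameter shapes and the tanh-scoring form are common choices, not necessarily the paper's exact layer:

```python
import numpy as np

def softmax(z):
    z = z - z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def attention_pool(H, w, b, u):
    """Score every time step of the BiLSTM output H (T x D), turn the
    scores into probability weights, and return the weighted sum.
    w (D x D), b (D,), u (D,) are hypothetical learned parameters."""
    scores = np.tanh(H @ w + b) @ u      # (T,) unnormalized scores
    alpha = softmax(scores)              # probability weights over steps
    return alpha, alpha @ H              # weights and pooled context

rng = np.random.default_rng(0)
T, D = 24, 8                             # 24 time steps, hidden size 8
H = rng.normal(size=(T, D))
alpha, ctx = attention_pool(H, rng.normal(size=(D, D)),
                            rng.normal(size=D), rng.normal(size=D))
```

The pooled context vector `ctx` is what the final prediction layer would consume.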
In underground mining, the belt is a critical component, as its state directly affects the safe and stable operation of the conveyor. Most existing non-contact detection methods based on machine vision can only detect a single type of damage, and they require pre-processing operations, which tends to cause a large amount of calculation and low detection precision. To solve these problems, this paper designs a belt tear detection method based on a multi-class conditional deep convolutional generative adversarial network (CDCGAN). In the traditional DCGAN, the image generated by the generator has a certain degree of randomness. Here, a small number of labeled belt images are taken as conditions and added to the generator and discriminator, so the generator can generate images with the characteristics of belt damage under those conditions. Moreover, because the discriminator cannot by itself identify multiple types of damage, the multi-class softmax function is used as its output function to produce a vector of class probabilities, so it can accurately classify cracks, scratches, and tears. To avoid incompletely learned features, skip-layer connections are adopted in the generator and discriminator; this not only minimizes the loss of features but also improves the convergence speed. Experimental results show that, compared with other algorithms, the loss values of the generator and discriminator are the lowest and convergence is faster, while the mean average precision of the proposed algorithm reaches 96.2%, at least 6% higher than that of the other algorithms.
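The conditioning idea — feeding the damage class into both generator and discriminator — is commonly realized by concatenating a one-hot label to the input vector. A minimal sketch under that assumption (the noise dimension and three-class labeling are illustrative):

```python
import numpy as np

def one_hot(label, num_classes):
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

def conditioned_input(noise, label, num_classes=3):
    """Condition the generator on the damage class (crack / scratch /
    tear) by concatenating a one-hot label to the noise vector, as in a
    conditional DCGAN; the discriminator receives the label the same
    way alongside the image."""
    return np.concatenate([noise, one_hot(label, num_classes)])

z = np.random.randn(100)
g_in = conditioned_input(z, label=2)   # request a "tear"-class sample
```

With the label fixed, the generator's remaining randomness comes only from the noise part, which is how the class of the generated damage is controlled.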
In existing Electronic Health Records (EHRs), the medical information of patients is completely controlled by various medical institutions, so patients have no dominant power over their own EHRs. These personal data are not only inconvenient to access and share but also prone to privacy disclosure. Blockchain technology provides a new development direction in the medical field: blockchain-based EHRs are characterized by decentralization, openness, and non-tampering of records, which enable patients to better manage their own EHRs. To better protect patient privacy, only designated receivers can access EHRs, and receivers can authenticate the sharer to ensure that the EHRs are real and effective. In this study, we propose an identity-based signcryption scheme with multiple authorities for multiple receivers, which can resist collusion attacks among up to N-1 of the N authorities. In addition, the identity information of receivers is anonymous, so the relationship between them and the sharer is not disclosed. Under the random oracle model, the scheme is proved secure, meeting the unforgeability and confidentiality requirements of signcryption. Moreover, a performance evaluation shows that the scheme has moderate signcryption efficiency and excellent signcryption attributes.
Under the joint promotion of the wave of urbanization and the rise of data science, smart cities have become a new concept and new practice of urban development. Smart cities combine urbanization with information technology represented by the Internet of Things, cloud computing, mobile networks, and big data. How to effectively achieve the long-term preservation of the massive, heterogeneous, multi-source digital electronic records in smart cities is a key issue that must be solved. Digital continuity can ensure the accessibility, integrity, and availability of information. The quality management of electronic records, like the quality management of products, runs through every phase of the urban lifecycle. Based on data quality management, this paper constructs digital continuity for smart city electronic records. Furthermore, the work ensures the authenticity, integrity, availability, and timeliness of electronic documents through electronic record quality management. The paper elaborates on the overall technical architecture of electronic records, as well as the various technical means needed to protect these four characteristics.
Access control is one of the core problems in data management systems. In this paper, the system requirements are described in three settings: the traditional access control model, the access control model in the Internet era, and the access control model in the cloud computing environment. The corresponding major models are listed, and their characteristics and problems are analyzed. Finally, the development trend of each corresponding model is proposed.
The application field of the Internet of Things (IoT) touches all aspects of life, and its application in industry, agriculture, environment, transportation, logistics, security, and other infrastructure has effectively promoted the intelligent development of these areas. Although the IoT has grown steadily in recent years, many problems remain to be overcome in terms of technology, management, cost, policy, and security, and we must constantly weigh the benefits of trusting IoT products against the risk of leaking private data. To avoid the leakage and loss of user data, this paper develops a hybrid algorithm combining a kernel function and a random perturbation method on top of non-negative matrix factorization, which realizes personalized recommendation and solves the problem of protecting users' private data in the recommendation process. Compared with the non-negative matrix factorization privacy-preserving algorithm, the new algorithm does not need detailed information about the data, only the connections between data items, and it can process data points with negative features. Experiments show that the new algorithm can produce recommendation results with reasonable accuracy while preserving users' personal privacy.
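The backbone here is non-negative matrix factorization of a rating matrix; one common way to add a privacy layer is to perturb the ratings before factorizing. A sketch under that assumption — the Laplace noise, `noise_scale`, and multiplicative-update NMF are illustrative stand-ins, not the paper's exact kernel-plus-perturbation scheme:

```python
import numpy as np

def nmf_with_perturbation(R, k=2, noise_scale=0.1, iters=200, seed=0):
    """Privacy-preserving sketch: perturb the rating matrix with random
    noise so exact user ratings are never used, then run standard NMF
    multiplicative updates on the perturbed, clipped matrix."""
    rng = np.random.default_rng(seed)
    Rp = np.clip(R + rng.laplace(scale=noise_scale, size=R.shape), 0, None)
    n, m = Rp.shape
    W = rng.random((n, k)) + 0.1          # user factors, kept non-negative
    H = rng.random((k, m)) + 0.1          # item factors, kept non-negative
    eps = 1e-9
    for _ in range(iters):
        H *= (W.T @ Rp) / (W.T @ W @ H + eps)
        W *= (Rp @ H.T) / (W @ H @ H.T + eps)
    return W, H

# toy ratings: two like-minded users and one with opposite taste
R = np.array([[5.0, 4.0, 0.5], [4.5, 5.0, 1.0], [1.0, 0.5, 5.0]])
W, H = nmf_with_perturbation(R)
approx = W @ H                            # basis for recommendations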
With the growing maturity of blockchain technology, its peer-to-peer model and fully duplicated data storage pattern enable blockchain to act as a distributed ledger in untrustworthy environments. Blockchain storage has also become a research hotspot in industry, finance, and academia due to its security, and its unique data storage management model is gradually becoming a key technology for realizing its value in applications across fields. However, as the amount of data written into the blockchain increases, blockchain systems face many problems in practical applications, such as high storage space occupation, low data flexibility and availability, low retrieval efficiency, and poor scalability. To address these problems, this paper combines off-chain storage technology and deduplication technology to optimize the blockchain storage model. Firstly, a double-chain model is adopted to reduce the data stored on the main chain, which keeps a small amount of primary data and supervises the vice chain through an Application Programming Interface (API). The vice chain stores a large number of data copies as well as non-transactional data. The model divides the vice chain storage system into two layers, a storage layer and a processing layer; in the processing layer, deduplication technology is applied to reduce the redundancy of vice chain data. The resulting double-chain storage model is highly scalable, enhances data flexibility, is well suited as a distributed storage system, and performs well in data retrieval.
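The deduplication in the processing layer is typically content-addressed: identical blocks hash to the same key, so each unique block is stored once and repeated writes only bump a reference count. A minimal sketch (the class and method names are illustrative, not the paper's API):

```python
import hashlib

class DedupStore:
    """Content-addressed block store, sketching the vice chain's
    processing layer: duplicate payloads share one stored block."""

    def __init__(self):
        self.blocks = {}      # digest -> unique payload
        self.refcount = {}    # digest -> number of logical copies

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blocks:
            self.blocks[digest] = data            # first copy: store it
        self.refcount[digest] = self.refcount.get(digest, 0) + 1
        return digest                             # key kept on-chain

    def get(self, digest: str) -> bytes:
        return self.blocks[digest]

store = DedupStore()
keys = [store.put(b"tx-batch-001") for _ in range(3)]
keys.append(store.put(b"tx-batch-002"))
```

Four logical writes land as two physical blocks; the main chain would retain only the digests, which is where the storage saving comes from.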
In view of the low accuracy of traditional ground nephogram recognition models, the authors put forward a neural network ensemble method based on the k-means algorithm. It takes a BP neural network ensemble model as the basis, uses the k-means algorithm to choose individual neural networks with partial diversity for integration, and builds a cloud form classification model. Simulation experiments on ground nephogram samples show that the proposed algorithm can effectively improve the classification accuracy of ground nephogram recognition in comparison with a single BP neural network and the traditional BP-AdaBoost ensemble algorithm.
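The selection step — using k-means to pick diverse individual networks for the ensemble — can be sketched by clustering the candidates' validation-set output vectors and keeping one network per cluster. The tiny k-means and the toy outputs below are illustrative, not the paper's exact procedure:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: centers start at random data points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def select_diverse(outputs, k):
    """Cluster candidate networks by their validation-output vectors
    and keep the first member of each cluster, so chosen networks
    behave differently from one another."""
    labels = kmeans(outputs, k)
    return [int(np.flatnonzero(labels == j)[0]) for j in range(k)]

# 6 candidate networks: three near-identical pairs of output vectors
outs = np.array([[0.90, 0.10], [0.88, 0.12], [0.20, 0.80],
                 [0.22, 0.78], [0.50, 0.50], [0.52, 0.48]])
chosen = select_diverse(outs, k=3)
```

Averaging or voting over the `chosen` networks then forms the ensemble prediction.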
In recent years, blockchain has gained prominence as a hot topic in academic research. However, the consensus mechanisms of blockchain have been criticized in terms of energy consumption and performance. Although the Proof-of-Authority (PoA) consensus mechanism is lightweight and more efficient than traditional Proof-of-Work (PoW) and Proof-of-Stake (PoS), it suffers from centralization. To this end, after analyzing the shortcomings of existing consensus mechanisms, this paper proposes a dynamic reputation-based consensus mechanism for blockchain. The scheme allows nodes with reputation values above a threshold to apply to become monitoring nodes, which watch the behavior of validators in case validators with excessive power harm the blockchain network. At the same time, a reputation evaluation algorithm is introduced to select high-reputation nodes as validators, thus increasing the cost of malicious behavior. In each consensus cycle, validators and monitoring nodes are dynamically updated according to their reputation values. A security analysis demonstrates that the scheme can resist attacks by malicious nodes in the blockchain network. Simulation experiments and analysis verify that the mechanism can effectively improve the fault tolerance of the consensus mechanism and reduce consensus time while guaranteeing the security of the system.
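The per-cycle role assignment can be sketched directly: rank nodes by reputation, take the top few as validators, and let remaining nodes above the threshold serve as monitors. The quota and threshold values are illustrative assumptions, not the paper's parameters:

```python
def assign_roles(reputation, validator_quota=3, monitor_threshold=80):
    """One consensus cycle of role assignment: the highest-reputation
    nodes become validators; other nodes at or above the threshold
    become monitoring nodes that watch validator behavior."""
    ranked = sorted(reputation, key=reputation.get, reverse=True)
    validators = ranked[:validator_quota]
    monitors = [n for n in ranked[validator_quota:]
                if reputation[n] >= monitor_threshold]
    return validators, monitors

rep = {"n1": 95, "n2": 91, "n3": 88, "n4": 85, "n5": 60, "n6": 82}
validators, monitors = assign_roles(rep)
```

Re-running `assign_roles` each cycle with updated reputation values gives the dynamic rotation the scheme describes: misbehavior lowers a node's reputation and eventually costs it its role.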
In this paper, we propose two new attack algorithms on RSA implementations with CRT (Chinese remainder theorem). To improve attack efficiency considerably, a clustering collision power attack on RSA with CRT via chosen-message pairs is introduced. In this method, the key parameters dp and dq are segmented byte by byte, and the modular multiplication collisions are identified by k-means clustering. The exponents dp and dq were recovered from 12 power traces of six groups of specific message pairs, and the exponent d was then obtained. We also propose a second-order clustering collision power analysis attack against an RSA implementation with CRT that applies double-blinding exponentiation. To reduce noise and manual intervention, we analyze the power points of interest by preprocessing and k-means clustering with horizontal correlation collisions. In this way, we recovered approximately 91% of the secret exponents from a single power curve on RSA-CRT protected by double-blinding countermeasures.
To solve the problem that real-time face recognition is susceptible to illumination changes, this paper proposes a face recognition method that combines Local Binary Patterns (LBP) and the Embedded Hidden Markov Model (EHMM). The method first performs LBP preprocessing on the input face image, then extracts the feature vector, and finally sends the extracted feature observation vector to the EHMM for training or recognition. Experiments on multiple face databases show that the proposed algorithm is robust to illumination and improves the recognition rate.
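The illumination robustness comes from the LBP step itself: each pixel is replaced by an 8-bit code recording only which neighbors are at least as bright as the center, so any monotonic brightness change leaves the codes untouched. A basic 3x3 LBP in numpy:

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 Local Binary Pattern: each interior pixel becomes an
    8-bit code of neighbor-vs-center comparisons, which cancels
    monotonic illumination changes before EHMM feature extraction."""
    h, w = img.shape
    # neighbor offsets, clockwise from top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offs):
                if img[i + di, j + dj] >= c:
                    code |= 1 << bit
            out[i - 1, j - 1] = code
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
codes = lbp_codes(img)
codes2 = lbp_codes(img * 2.0)   # brighter copy of the same image
```

Doubling the brightness changes every pixel value but none of the comparisons, so `codes2` equals `codes` — which is exactly the property that makes LBP-preprocessed features stable under illumination changes.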
Most cloud services are built with multi-tenancy, which enables data and configuration segregation on shared infrastructure and offers tremendous advantages for enterprises and service providers. It is anticipated that this situation will evolve to foster cross-tenant collaboration supported by authorization as a service. To realize access control in a multi-tenant cloud computing environment, this study proposes a multi-tenant cloud computing access control model based on the traditional usage access control model, built by establishing trust relations among tenants. The model consists of three sub-models, which achieve trust relationships between tenants at different granularities and satisfy the requirements of different application scenarios. With an established trust relation in MT-UCON (Multi-tenant Usage Access Control), the trustee can precisely authorize cross-tenant access to the trustor's resources, consistent with constraints over the trust relation and other components designated by the trustor. In addition, the security of the model is analyzed with an information flow method. The model adapts to the characteristics of a dynamic and open multi-tenant cloud computing environment and achieves fine-grained access control within and between tenants.
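The core decision rule — a cross-tenant access succeeds only if a trust relation exists between the tenants and the trustor has granted that specific right — can be sketched as a two-stage check. All names and the flat data structures below are illustrative simplifications of the MT-UCON sub-models:

```python
def can_access(trust, grants, subject_tenant, subject,
               resource_tenant, resource, right):
    """Sketch of an MT-UCON-style decision: a cross-tenant request
    first needs the resource's tenant to trust the subject's tenant,
    then needs an explicit grant of that right on that resource."""
    if subject_tenant != resource_tenant:
        if (resource_tenant, subject_tenant) not in trust:
            return False                      # no trust relation
    return right in grants.get((resource_tenant, resource, subject), set())

trust = {("tenantA", "tenantB")}              # tenantA trusts tenantB
grants = {("tenantA", "report.csv", "bob"): {"read"}}

ok = can_access(trust, grants, "tenantB", "bob",
                "tenantA", "report.csv", "read")
denied = can_access(trust, grants, "tenantC", "eve",
                    "tenantA", "report.csv", "read")
```

The full model would additionally evaluate usage-control conditions and obligations over the trust relation; this sketch only shows the gating order.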
Lung rehabilitation is safe and feasible, and it has positive benefits in weaning children from mechanical ventilation as soon as possible, shortening hospitalization, and improving prognosis. At present, however, the traditional medical mindset is deep-rooted and doctors' understanding of early rehabilitation is inadequate. In-depth exploration of the relevant guidelines and expert consensus is needed to formulate standardized early rehabilitation diagnosis and treatment procedures and standards for mechanically ventilated children. In this paper, a structured, graded lung rehabilitation program is constructed for children with mechanical ventilation, based on principal component analysis of functional pneumonia data, to improve their respiratory function, shorten the duration of mechanical ventilation and of pediatric intensive care unit (PICU) hospitalization, and reduce their anxiety. Scientific evaluation and dynamic monitoring ensure the safety of implementing the program and promote the prognosis of the disease. The proposed lung rehabilitation program provides a reference for formulating lung rehabilitation guidelines for children with mechanical ventilation. It also has important reference significance for clinical pulmonary rehabilitation, alleviating clinicians' concerns and laying a foundation for the large-scale promotion of early lung rehabilitation.
With the diversification of electronic devices, cloud-based services have become the link between different devices. As a cryptosystem with a secure conversion function, proxy re-encryption enables secure sharing of data in a cloud environment. Proxy re-encryption is a public key encryption system with a ciphertext security conversion function: a semi-trusted agent plays the role of ciphertext converter and can transform a user's ciphertext into a ciphertext of the same plaintext under another authorized user's public key. Proxy re-encryption has been a hotspot in the field of information security since it was proposed by Blaze et al. [Blaze, Bleumer and Strauss (1998)]. After 20 years of development, proxy re-encryption has evolved into many forms and has been widely used. This paper elaborates on the definition, characteristics, and development status of proxy re-encryption, and classifies schemes from the perspectives of user identity, conversion condition, conversion hop count, and conversion direction. Existing schemes are compared and briefly reviewed in terms of features, performance, and security. Finally, the paper looks forward to possible future directions for proxy re-encryption.
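The conversion function can be made concrete with the classic Blaze-Bleumer-Strauss construction cited above, shown here as a toy over a small prime field. The modulus, generator, fixed nonce, and key values are purely for demonstration; a real deployment uses large groups and random nonces:

```python
# Toy BBS-style proxy re-encryption over Z_p* (illustration only).
p = 10007                  # small prime modulus; exponent arithmetic mod n
n = p - 1
g = 5

def encrypt(m, pk):        # ciphertext (m * g^r, pk^r) with pk = g^a
    r = 1234               # fixed nonce for the demo; must be random
    return (m * pow(g, r, p)) % p, pow(pk, r, p)

def rekey(a, b):           # proxy key a -> b: rk = b * a^{-1} mod n
    return (b * pow(a, -1, n)) % n

def reencrypt(ct, rk):     # (c1, g^{a r}) -> (c1, g^{b r}); proxy never sees m
    c1, c2 = ct
    return c1, pow(c2, rk, p)

def decrypt(ct, sk):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, n), p)      # strip sk from the exponent
    return (c1 * pow(gr, -1, p)) % p     # m = c1 / g^r

a, b = 123, 457            # secrets of delegator and delegatee (coprime to n)
ct_alice = encrypt(42, pow(g, a, p))
ct_bob = reencrypt(ct_alice, rekey(a, b))
```

The proxy holding `rk` can convert ciphertexts but, knowing neither `a` nor `b` individually, cannot decrypt — which is the "semi-trusted agent" property the survey describes.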
The Apriori algorithm is often used in traditional association rule mining to search for high-frequency patterns; correlation rules are then obtained by detecting the correlation of the item sets, but this tends to ignore low-support, high-correlation association rules. In view of this, some scholars have put forward a positive correlation coefficient based on the Phi correlation to avoid the embarrassment caused by the Apriori algorithm, which can mine item sets with low support but high correlation. Although that algorithm prunes the search space, the reduction in running time on big data sets is not obvious, and the mined correlation pairs can be meaningless. This paper presents an improved mining algorithm with new association rules based on interestingness for correlation pairs, using an upper bound on the interestingness of supersets to prune the search space. It greatly reduces the running time and filters out meaningless correlation pairs according to redundancy constraints. Compared with the algorithm based on the Phi correlation coefficient, the new algorithm significantly reduces the running time and prunes the redundant correlation pairs, improving mining efficiency and accuracy.
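The Phi correlation that motivates this line of work is just the 2x2 contingency-table correlation of item presence/absence, and it is exactly what support-based mining misses: a pair can have tiny support yet Phi = 1. A small sketch with toy transactions:

```python
import math

def phi(transactions, x, y):
    """Phi correlation of items x and y: correlation coefficient of the
    2x2 presence/absence contingency table over the transactions."""
    n = len(transactions)
    n11 = sum(1 for t in transactions if x in t and y in t)
    n1_ = sum(1 for t in transactions if x in t)   # x present
    n_1 = sum(1 for t in transactions if y in t)   # y present
    n10, n01 = n1_ - n11, n_1 - n11
    n00 = n - n11 - n10 - n01
    denom = math.sqrt(n1_ * (n - n1_) * n_1 * (n - n_1))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# {a, b} appear together in only 2 of 10 transactions (support 0.2),
# yet they always co-occur: low support, perfect correlation
T = [{"a", "b"}] * 2 + [{"c"}] * 8
support_ab = 2 / len(T)
phi_ab = phi(T, "a", "b")
```

A frequency-first miner with a support threshold above 0.2 would never surface this pair, which is the gap the interestingness-with-superset-upper-bound pruning is designed to close.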
As an important maritime hub, Bohai Bay provides great convenience for shipping but suffers sea ice disasters of varying severity every winter, which greatly affect the socio-economy and development of the region. Therefore, this paper uses data from FY-4A (a weather satellite) to study sea ice in the Bohai Sea. After processing the data for land removal and cloud detection, it combines a multi-channel threshold method and an adaptive threshold algorithm to recognize Bohai Sea ice under clear-sky conditions. The random forest classification algorithm is introduced into sea ice identification, which achieves a degree of sea ice classification and recognition under cloud cover. Under non-clear-sky conditions, the random-forest-based results of Bohai Sea ice identification are improved; the algorithm can effectively identify Bohai Sea ice and improve the accuracy of identification, laying a foundation for the accuracy and stability of sea ice identification. It realizes sea ice identification in the Bohai Sea and provides data and algorithm support for marine climate forecasting departments.
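The clear-sky multi-channel threshold idea combines per-channel tests into one mask: sea ice is bright in the visible channels yet cold in the thermal ones. A sketch of that combination — the two thresholds below are illustrative placeholders, not FY-4A-calibrated values:

```python
import numpy as np

def sea_ice_mask(reflectance, bt_k, refl_thresh=0.25, bt_thresh=271.0):
    """Multi-channel threshold sketch: flag a pixel as sea ice when its
    visible-channel reflectance is high AND its brightness temperature
    (in kelvin) is below freezing, after land and cloud pixels have
    already been removed upstream."""
    return (reflectance > refl_thresh) & (bt_k < bt_thresh)

# 2x2 toy scene: bright+cold (ice), dark+warm (water),
# bright+warm (no), bright+cold (ice)
refl = np.array([[0.40, 0.05],
                 [0.30, 0.45]])
bt = np.array([[265.0, 280.0],
               [275.0, 268.0]])
ice = sea_ice_mask(refl, bt)
```

The paper's adaptive-threshold step would tune `refl_thresh`/`bt_thresh` per scene, and the random forest takes over where clouds make these channel tests unreliable.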
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 61772280 and 62072249.
Funding: Supported by the National Natural Science Foundation of China (Youth Program, No. 82004499, Youwei Ding, https://www.nsfc.gov.cn/), the Project of Natural Science Research of the Universities of Jiangsu Province (No. 20KJB520030, Yihua Song, http://jyt.jiangsu.gov.cn/), and the Qing Lan Project of Jiangsu Province (Xia Zhang, http://jyt.jiangsu.gov.cn/).
Funding: supported by the National Natural Science Foundation of China (No. 62076042), the Key Research and Development Project of Sichuan Province (No. 2021YFSY0012, No. 2020YFG0307, No. 2021YFG0332), the Science and Technology Innovation Project of Sichuan (No. 2020017), the Key Research and Development Project of Chengdu (No. 2019-YF05-02028-GX), the Innovation Team of Quantum Security Communication of Sichuan Province (No. 17TD0009), and the Academic and Technical Leaders Training Funding Support Projects of Sichuan Province (No. 2016120080102643).
Abstract: In the objective world, handling the complexity and uncertainty of big data efficiently and accurately has become a premise and key to machine learning. The fuzzy support vector machine (FSVM) not only handles classification problems for training samples carrying fuzzy information, but also assigns each training sample a fuzzy membership degree, allowing different samples to contribute differently to predicting an optimal maximum-margin separating hyperplane and thereby reducing the effect of outliers and noise. Quantum computing offers massive parallelism and holds the promise of faster algorithmic processing of data. On its own, however, neither FSVM nor quantum computing deals with the complexity and uncertainty of big data both efficiently and accurately. This paper proposes an efficient and accurate quantum fuzzy support vector machine (QFSVM) algorithm, building on the facts that quantum computing can efficiently process large amounts of data and that FSVM readily handles complexity and uncertainty. The central idea is to use the quantum algorithm for solving linear systems of equations (the HHL algorithm) together with the least-squares method to solve the quadratic programming problem in the FSVM. The proposed algorithm can determine whether a sample belongs to the positive or negative class while also achieving good generalization performance. Furthermore, the paper applies QFSVM to handwritten character recognition and demonstrates that QFSVM can run on quantum computers and classify handwritten characters accurately. Compared with FSVM, QFSVM's computational complexity decreases exponentially with the number of training samples.
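The core reduction can be illustrated classically: the least-squares (fuzzy) SVM turns the quadratic program into a single linear system, which QFSVM would hand to the HHL solver but which numpy can solve directly on toy data. A sketch with a linear kernel, where the membership degrees `s` merely rescale the regularization (all names and data are ours, not the paper's):

```python
import numpy as np

def lssvm_train(X, y, s, gamma=1.0):
    """Least-squares fuzzy SVM: the QP collapses to one linear system
    A z = t (the system QFSVM would pass to HHL), solved classically.
    s holds the fuzzy membership degree of each training sample."""
    n = len(y)
    K = X @ X.T                                   # linear kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = (y[:, None] * y[None, :]) * K + np.diag(1.0 / (gamma * s))
    t = np.zeros(n + 1); t[1:] = 1.0
    z = np.linalg.solve(A, t)
    b, alpha = z[0], z[1:]
    return lambda x: np.sign((alpha * y) @ (X @ x) + b)

X = np.array([[2.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, -1.0])
f = lssvm_train(X, y, s=np.array([1.0, 1.0]))
print(f(np.array([1.0, 0.0])))   # 1.0
```

The claimed quantum speed-up comes entirely from replacing `np.linalg.solve` with HHL; the surrounding structure is the same.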
Funding: This work is supported by the National Natural Science Foundation of China (No. 61572086), the Innovation Team of Quantum Security Communication of Sichuan Province (No. 17TD0009), the Academic and Technical Leaders Training Funding Support Projects of Sichuan Province (No. 2016120080102643), and the Application Foundation Project of Sichuan Province (No. 2017JY0168).
Abstract: This paper summarizes the state of the art in quantum communication networks and trust management in recent years. As in classical networks, trust management is the premise and foundation of quantum secure communication and cannot simply be subsumed under security issues; the fundamentals and importance of trust management in quantum communication networks therefore deserve to be taken more seriously. Compared with other theories and techniques in quantum communication, trust in quantum communication and trust-management models for quantum communication network environments are still at an early stage. This paper categorizes and summarizes the core technologies for establishing secure and reliable quantum communication networks, and discusses in depth the trends in each direction of trust management for such networks.
Funding: This study is supported by the Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai) (SML2020SP007) and the National Natural Science Foundation of China (Nos. 61772280 and 62072249).
Abstract: Though numerical wave models have been widely applied to significant wave height prediction, they consume massive computing memory and their accuracy needs further improvement. In this paper, a two-dimensional (2D) significant wave height (SWH) prediction model is established for the South and East China Seas. The proposed model is trained on WaveWatch III (WW3) reanalysis data and combines a convolutional neural network, bidirectional long short-term memory, and an attention mechanism (CNN-BiLSTM-Attention). The convolutional neural network extracts spatial features of the original wave heights to reduce the redundant information fed into the BiLSTM network; the BiLSTM model fully extracts the temporal features of the time-series data; and the attention mechanism assigns probability weights to the outputs of the BiLSTM layer units, yielding the final training model. Prediction experiments up to 24 h ahead are conducted under normal and extreme conditions, respectively. Under normal wave conditions, for 3-, 6-, 12-, and 24-h forecasting, the mean correlation coefficients on the test set are 0.996, 0.991, 0.980, and 0.945, and the corresponding mean root mean square errors are 0.063 m, 0.105 m, 0.172 m, and 0.281 m, respectively. Under typhoon-forced extreme conditions, the CNN-BiLSTM-Attention model is trained on typhoon-induced SWH extracted from the WW3 reanalysis data. For 3-, 6-, 12-, and 24-h forecasting, the mean correlation coefficients on the test set are 0.993, 0.983, 0.958, and 0.921, and the averaged root mean square errors are 0.159 m, 0.257 m, 0.437 m, and 0.555 m, respectively. This model outperforms the one trained on all the WW3 reanalysis data. The results suggest that the proposed algorithm can deliver 2D wave forecasts with higher accuracy and efficiency.
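The attention step after the BiLSTM can be sketched in isolation: score each time step's output, softmax the scores into probability weights, and take the weighted sum. This is a generic formulation, not the paper's exact parameterization, and the scoring vector here is random rather than learned:

```python
import numpy as np

def attention_pool(H, w):
    """Attention pooling over BiLSTM step outputs H of shape (T, d):
    score each step with vector w, softmax into probability weights,
    and return the weighted sum (the context vector)."""
    scores = H @ w                        # (T,) raw scores
    e = np.exp(scores - scores.max())     # numerically stable softmax
    a = e / e.sum()                       # attention weights, sum to 1
    return a, a @ H                       # weights and context vector

rng = np.random.default_rng(0)
H = rng.normal(size=(24, 8))              # 24 time steps, 8 hidden units
a, ctx = attention_pool(H, rng.normal(size=8))
```

In the trained model the weights `a` tell the forecaster which past time steps contribute most to the predicted wave height.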
Funding: This work was supported by the Shanxi Province Applied Basic Research Project, China (Grant No. 201901D111100). Xiaoli Hao received the grant; the sponsor's website is http://kjt.shanxi.gov.cn/.
Abstract: In underground mining, the belt is a critical component, as its state directly affects the safe and stable operation of the conveyor. Most existing non-contact detection methods based on machine vision can detect only a single type of damage, and they require pre-processing operations, which tends to cause a large amount of computation and low detection precision. To solve these problems, this paper designs a belt tear detection method based on a multi-class conditional deep convolutional generative adversarial network (CDCGAN). In the traditional DCGAN, the images produced by the generator have a certain degree of randomness. Here, a small number of labeled belt images are taken as conditions and added to the generator and discriminator, so the generator can produce images with the characteristics of belt damage under those conditions. Moreover, because the discriminator cannot distinguish multiple types of damage, a multi-class softmax function is used as its output function, producing a vector of class probabilities that accurately separates cracks, scratches, and tears. To avoid incompletely learned features, skip-layer connections are adopted in the generator and discriminator; this minimizes the loss of features and also improves the convergence speed. Experimental results show that, compared with other algorithms, the proposed one attains the lowest generator and discriminator loss, converges faster, and reaches a mean average precision of up to 96.2%, at least 6% higher than the others.
Funding: This work was supported by the National Key Research and Development Project of China (Grant No. 2017YFB0802302), the Science and Technology Support Project of Sichuan Province (Grant Nos. 2016FZ0112, 2017GZ0314, and 2018GZ0204), the Academic and Technical Leaders Training Funding Support Projects of Sichuan Province (Grant No. 2016120080102643), the Application Foundation Project of Sichuan Province (Grant No. 2017JY0168), and the Science and Technology Project of Chengdu (Grant Nos. 2017-RK00-00103-ZF and 2016-HM01-00217-SF).
Abstract: In existing Electronic Health Records (EHRs), patients' medical information is completely controlled by the various medical institutions, so patients have no dominion over their own EHRs. These personal data are not only inconvenient to access and share but also prone to privacy disclosure. Blockchain technology provides a new direction for the medical field: blockchain-based EHRs are characterized by decentralization, openness, and tamper-resistance of records, enabling patients to manage their own EHRs better. To better protect patient privacy, only designated receivers should be able to access the EHRs, and receivers should be able to authenticate the sharer to ensure that the EHRs are genuine and valid. In this study, we propose an identity-based signcryption scheme with multiple authorities for multiple receivers, which can resist collusion attacks by up to N−1 of the N authorities. In addition, the identity information of the receivers is anonymous, so their relationship with the sharer is not disclosed. Under the random oracle model, the scheme is proved secure, meeting the unforgeability and confidentiality requirements of signcryption. Moreover, a performance evaluation shows that it achieves moderate signcryption efficiency and excellent signcryption attributes.
Funding: the NSFC (Nos. 61772280, 62072249), the AI recognition scoring system of weather maps (No. SYCX202011), the national training programs of innovation and entrepreneurship for undergraduates (Nos. 201910300123Y, 202010300200), and the PAPD fund from NUIST. Jinyue Xia is the corresponding author.
Abstract: Under the joint impetus of the wave of urbanization and the rise of data science, smart cities have become a new concept and new practice of urban development. Smart cities combine urbanization with information technologies represented by the Internet of Things, cloud computing, mobile networks, and big data. How to achieve long-term preservation of the massive, heterogeneous, multi-source digital electronic records of smart cities is a key problem that must be solved. Digital continuity can ensure the accessibility, integrity, and availability of information. The quality management of electronic records, like the quality management of products, runs through every phase of the urban lifecycle. Based on data quality management, this paper constructs digital continuity for smart-city electronic records. Furthermore, the work ensures the authenticity, integrity, availability, and timeliness of electronic documents through quality management of electronic records. The paper elaborates the overall technical architecture of electronic records, as well as the various technical means needed to protect these four characteristics.
Abstract: Access control is one of the core problems in data management systems. In this paper, the system requirements are described from three perspectives: the traditional access control model, the access control model of the Internet era, and the access control model for the cloud computing environment. The corresponding major models are listed, and their characteristics and problems are analyzed. Finally, development trends for these models are proposed.
Funding: the National Natural Science Foundation of China under Grant No. 61772280, the China Special Fund for Meteorological Research in the Public Interest under Grant GYHY201306070, and the Jiangsu Province Innovation and Entrepreneurship Training Program for College Students under Grant No. 201910300122Y.
Abstract: The Internet of Things (IoT) touches every field; its application in industry, agriculture, the environment, transportation, logistics, security, and other infrastructure has effectively promoted the intelligent development of these areas. Although the IoT has grown steadily in recent years, many problems remain to be overcome in technology, management, cost, policy, and security. We must constantly weigh the benefits of trusting IoT products against the risk of leaking private data. To avoid the leakage and loss of user data, this paper develops a hybrid algorithm combining a kernel function and a random perturbation method on top of non-negative matrix factorization, which realizes personalized recommendation while protecting users' private data during the recommendation process. Compared with the plain privacy-preserving non-negative matrix factorization algorithm, the new algorithm does not need detailed information about the data, only the connections between data items, and it can process data points with negative features. Experiments show that the new algorithm produces recommendation results of reasonable accuracy while preserving users' personal privacy.
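A rough sketch of the perturbation idea: noise the rating matrix before factorizing it with plain Frobenius-norm NMF (Lee-Seung multiplicative updates). The kernel-function component of the paper's hybrid is omitted, and all parameters and data are illustrative:

```python
import numpy as np

def private_nmf(R, k=2, eps=0.1, iters=200, seed=0):
    """Privacy-preserving recommendation sketch: perturb the rating
    matrix with bounded random noise, then factorize with standard NMF
    multiplicative updates and return the reconstructed score matrix."""
    rng = np.random.default_rng(seed)
    Rp = np.clip(R + rng.uniform(-eps, eps, R.shape), 0, None)  # perturb
    m, n = Rp.shape
    W = rng.random((m, k)); H = rng.random((k, n))
    for _ in range(iters):                # Lee-Seung multiplicative updates
        H *= (W.T @ Rp) / (W.T @ W @ H + 1e-9)
        W *= (Rp @ H.T) / (W @ H @ H.T + 1e-9)
    return W @ H                          # predicted preference scores

# two users share tastes on items 0-1; a third likes only item 2
R = np.array([[5.0, 4.0, 0.0], [4.0, 5.0, 0.0], [0.0, 0.0, 5.0]])
pred = private_nmf(R)
```

The recommender only ever sees the perturbed matrix `Rp`, while the low-rank structure that drives the recommendations survives the noise.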
Funding: This work is supported by the Key Research and Development Project of Sichuan Province (No. 2021YFSY0012, No. 2020YFG0307, No. 2021YFG0332), the Key Research and Development Project of Chengdu (No. 2019-YF05-02028-GX), the Innovation Team of Quantum Security Communication of Sichuan Province (No. 17TD0009), and the Academic and Technical Leaders Training Funding Support Projects of Sichuan Province (No. 2016120080102643).
Abstract: With the growing maturity of blockchain technology, its peer-to-peer model and fully replicated data storage enable blockchain to act as a distributed ledger in untrustworthy environments. Blockchain storage has become a research hotspot in industry, finance, and academia thanks to its security, and its unique data storage management model is gradually becoming a key technology for delivering value across application fields. However, as the amount of data written into the blockchain increases, blockchain systems face many practical problems, such as high storage space occupation, low data flexibility and availability, low retrieval efficiency, and poor scalability. To alleviate these problems, this paper combines off-chain storage technology and deduplication technology to optimize the blockchain storage model. It adopts a double-chain model: the major chain stores a small amount of primary data and supervises the vice chain through an Application Programming Interface (API), which reduces the major chain's storage load, while the vice chain stores the large number of data copies as well as non-transactional data. The vice-chain storage system is divided into two layers, a storage layer and a processing layer; in the processing layer, deduplication reduces the redundancy of vice-chain data. The resulting highly scalable double-chain storage model enhances data flexibility, is better suited as a distributed storage system, and performs well in data retrieval.
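The processing-layer deduplication can be sketched as a content-addressed store: identical data blocks hash to the same digest, so only one copy is physically kept and the chain records digests. A toy illustration with stdlib hashing, not the paper's actual design:

```python
import hashlib

class VicechainStore:
    """Toy deduplicating store for a vice-chain processing layer: data is
    addressed by its SHA-256 digest, so duplicate writes store nothing new
    and simply bump a reference count."""
    def __init__(self):
        self.chunks = {}     # digest -> stored bytes (one copy each)
        self.refs = {}       # digest -> reference count

    def put(self, data: bytes) -> str:
        h = hashlib.sha256(data).hexdigest()
        if h not in self.chunks:         # first copy: actually store it
            self.chunks[h] = data
        self.refs[h] = self.refs.get(h, 0) + 1
        return h                         # the chain records this digest

    def get(self, h: str) -> bytes:
        return self.chunks[h]

store = VicechainStore()
a = store.put(b"replica-of-tx-batch-001")
b = store.put(b"replica-of-tx-batch-001")   # duplicate: deduplicated
print(a == b, len(store.chunks))            # True 1
```

Retrieval stays cheap because the digest acts as a direct index, which is one reason content addressing suits a distributed storage layer.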
Funding: This research was supported by the National Natural Science Foundation of China under Grant No. 61772280, by the China Special Fund for Meteorological Research in the Public Interest under Grant GYHY201306070, and by the Jiangsu Province Innovation and Entrepreneurship Training Program for College Students under Grant No. 201810300079X.
Abstract: In view of the low accuracy of traditional ground nephogram recognition models, the authors put forward a neural network ensemble method based on the k-means algorithm. Taking a BP neural network ensemble as the basis, it uses k-means to select individual neural networks with sufficient diversity for integration, and builds a cloud-form classification model. Simulation experiments on ground nephogram samples show that the proposed algorithm effectively improves classification accuracy on ground nephogram recognition compared with a single BP neural network and the traditional BP-AdaBoost ensemble algorithm.
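The selection idea can be sketched by clustering candidate networks on their validation-set outputs and keeping one representative per cluster, so the ensemble members err in different ways. The data, the first-k initialization, and the representative-picking rule below are our illustrative choices, not the paper's:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means (first k rows as initial centres, ties broken low).
    Rows of X are the validation outputs of candidate BP networks, so a
    cluster groups networks that make similar predictions."""
    C = X[:k].astype(float).copy()
    lab = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        lab = d.argmin(1)                      # assign to nearest centre
        for j in range(k):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)     # recompute centres
    return lab

# 6 candidate networks' 0/1 outputs on 4 validation cases (hypothetical);
# one representative per cluster joins the diverse ensemble
preds = np.array([[1, 1, 0, 0], [1, 1, 0, 1], [1, 1, 0, 0],
                  [0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]], dtype=float)
labels = kmeans(preds, k=2)
chosen = [int(np.flatnonzero(labels == j)[0]) for j in range(2)]
print(labels, chosen)   # [0 0 0 1 1 1] [0 3]
```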
Funding: This work is supported by the Key Research and Development Project of Sichuan Province (No. 2021YFSY0012, No. 2020YFG0307, No. 2021YFG0332), the Key Research and Development Project of Chengdu (No. 2019-YF05-02028-GX), the Innovation Team of Quantum Security Communication of Sichuan Province (No. 17TD0009), and the Academic and Technical Leaders Training Funding Support Projects of Sichuan Province (No. 2016120080102643).
Abstract: In recent years, blockchain has gained prominence as a hot topic in academic research, but its consensus mechanisms have been criticized for their energy consumption and performance. Although Proof-of-Authority (PoA), as a lightweight consensus mechanism, is more efficient than traditional Proof-of-Work (PoW) and Proof-of-Stake (PoS), it suffers from centralization. To this end, after analyzing the shortcomings of existing consensus mechanisms, this paper proposes a dynamic reputation-based consensus mechanism for blockchain. The scheme allows nodes whose reputation value exceeds a threshold to apply to become monitoring nodes, which watch the behavior of validators in case validators with excessive power harm the blockchain network. At the same time, a reputation evaluation algorithm selects high-reputation nodes to become validators, increasing the cost of malicious behavior. In each consensus cycle, validators and monitoring nodes are dynamically updated according to their reputation values. A security analysis demonstrates that the scheme can resist attacks by malicious nodes in the blockchain network, and simulation experiments verify that the mechanism effectively improves the fault tolerance of the consensus mechanism and reduces consensus time while guaranteeing system security.
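A toy sketch of the role assignment: reputations are updated each cycle and thresholds decide who may validate or monitor. The update rule, thresholds, and node names are our illustrative choices, not the paper's formulas:

```python
def update_reputation(rep, node, honest, alpha=0.1, penalty=0.5):
    """Illustrative reputation update: honest behavior nudges the score
    toward 1; detected misbehavior is punished multiplicatively, making
    malicious behavior costly to recover from."""
    r = rep[node]
    rep[node] = r + alpha * (1 - r) if honest else r * penalty
    return rep[node]

def pick_roles(rep, v_thresh=0.8, m_thresh=0.6):
    """Each consensus cycle: nodes above v_thresh act as validators, and
    the next reputation tier is eligible to apply as monitoring nodes."""
    validators = sorted(n for n, r in rep.items() if r >= v_thresh)
    monitors = sorted(n for n, r in rep.items() if m_thresh <= r < v_thresh)
    return validators, monitors

rep = {"A": 0.9, "B": 0.7, "C": 0.5}
update_reputation(rep, "A", honest=False)   # A misbehaves: 0.9 -> 0.45
v, m = pick_roles(rep)
print(v, m)   # [] ['B']
```

After one misbehavior, former validator A drops below both thresholds, illustrating how dynamic updating strips power from misbehaving nodes.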
Funding: supported by the National Key R&D Program of China (No. 2017YFB0802300), the Key Research and Development Project of Sichuan Province (No. 2020YFG0307, No. 2018TJPT0012), and the Key Research and Development Project of Chengdu (No. 2019-YF05-02028-GX).
Abstract: In this paper, we propose two new attack algorithms against RSA implementations with CRT (Chinese remainder theorem). To improve attack efficiency considerably, a clustering collision power attack on RSA with CRT is introduced via chosen-message pairs. In this attack, the key parameters dp and dq are segmented by byte, and modular multiplication collisions are identified by k-means clustering. The exponents dp and dq were recovered from 12 power traces of six groups of specific message pairs, yielding the exponent d. We also propose a second-order clustering collision power analysis attack against an RSA-CRT implementation that applies double-blinding exponentiation. To reduce noise and manual intervention, we preprocess the power points of interest and apply k-means clustering with horizontal correlation collisions. In this way, we recovered approximately 91% of the secret exponents using a single power curve on RSA-CRT with double-blinding countermeasures.
Abstract: To address the susceptibility of real-time face recognition to illumination changes, this paper proposes a face recognition method that combines Local Binary Patterns (LBP) and an Embedded Hidden Markov Model (EHMM). The method first performs LBP preprocessing on the input face image, then extracts the feature vectors, and finally feeds the extracted observation vectors to the EHMM for training or recognition. Experiments on multiple face databases show that the proposed algorithm is robust to illumination and improves the recognition rate.
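The LBP preprocessing is standard enough to sketch directly; the basic 8-neighbour operator below also shows why it is robust to illumination: a uniform brightness shift leaves every code unchanged, because only the ordering of neighbour versus centre matters (a minimal sketch, not the paper's exact variant):

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbour Local Binary Pattern: each interior pixel becomes
    a code whose bits record which neighbours are >= the centre pixel."""
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    h, w = img.shape
    centre = img[1:-1, 1:-1]
    out = np.zeros((h - 2, w - 2), dtype=int)
    for bit, (di, dj) in enumerate(offs):
        # shifted view of the image aligned with the centre region
        neigh = img[1 + di:h - 1 + di, 1 + dj:w - 1 + dj]
        out |= (neigh >= centre).astype(int) << bit
    return out

# a uniform illumination shift leaves the LBP codes unchanged
img = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]], dtype=float)
print(lbp8(img), (lbp8(img) == lbp8(img + 50.0)).all())   # [[120]] True
```

In the full method, histograms of these codes over image blocks form the observation vectors handed to the EHMM.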
Abstract: Most cloud services are built with multi-tenancy, which enables data and configuration segregation on shared infrastructure and offers tremendous advantages for enterprises and service providers. This situation is expected to evolve toward cross-tenant collaboration supported by Authorization as a Service. To realize access control in a multi-tenant cloud computing environment, this study proposes a multi-tenant cloud computing access control model that extends the traditional usage access control model by building trust relations among tenants. The model consists of three sub-models, which establish trust relationships between tenants at different granularities and satisfy the requirements of different application scenarios. With a trust relation established in MT-UCON (Multi-Tenant Usage Access Control), the trustee can precisely authorize cross-tenant access to the trustor's resources, consistent with constraints over the trust relation and other components designated by the trustor. In addition, the security of the model is analyzed by an information flow method. The model adapts to the dynamic and open characteristics of multi-tenant cloud computing environments and achieves fine-grained access control within and between tenants.
Funding: This work is supported by the Science and Technology Development Fund of Nanjing Medical University (No. NJMUB2019188).
Abstract: Lung rehabilitation is safe and feasible, and it has positive benefits in weaning children from mechanical ventilation as early as possible, shortening hospitalization, and improving prognosis. At present, however, the traditional medical mindset is deep-rooted and doctors' understanding of early rehabilitation is inadequate; in-depth exploration of the relevant guidelines and expert consensus is needed to formulate standardized early rehabilitation diagnosis and treatment procedures and standards for mechanically ventilated children. In this paper, a structured, graded lung rehabilitation program is constructed for children on mechanical ventilation, based on principal component analysis of functional pneumonia data, to improve their respiratory function, shorten the duration of mechanical ventilation and pediatric intensive care unit (PICU) stay, and reduce their anxiety. Scientific evaluation and dynamic monitoring ensure the safety of the program's implementation and promote recovery from the disease. The proposed lung rehabilitation program provides a reference for formulating lung rehabilitation guidelines for mechanically ventilated children; it offers an important clinical reference for pulmonary rehabilitation, helps allay clinicians' concerns, and lays the foundation for large-scale promotion of early lung rehabilitation.
Funding: This work is supported by the NSFC (Nos. 61772280, 61702236), the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST.
Abstract: With the diversification of electronic devices, cloud-based services have become the link between different devices. As a cryptosystem with a secure conversion function, proxy re-encryption enables secure sharing of data in a cloud environment. Proxy re-encryption is a public-key encryption system with a ciphertext security conversion function: a semi-trusted agent plays the role of ciphertext converter and can convert a ciphertext under one user's public key into a ciphertext of the same plaintext under the principal's public key. Proxy re-encryption has been a hotspot in information security since it was proposed by Blaze et al. [Blaze, Bleumer and Strauss (1998)]. After 20 years of development, it has evolved into many forms and been widely used. This paper elaborates the definition, characteristics, and development status of proxy re-encryption, and classifies it from the perspectives of user identity, conversion condition, conversion hop count, and conversion direction. Existing schemes are compared and briefly reviewed in terms of features, performance, and security. Finally, the paper looks ahead to possible future directions of proxy re-encryption.
Funding: This research was supported by the National Natural Science Foundation of China under Grant No. 61772280, by the China Special Fund for Meteorological Research in the Public Interest under Grant GYHY201306070, and by the Jiangsu Province Innovation and Entrepreneurship Training Program for College Students under Grant No. 201810300079X.
Abstract: The Apriori algorithm is often used in traditional association rule mining to search for the most frequent patterns; correlation rules are then obtained by testing the correlation of the item sets, but this tends to miss low-support, high-correlation association rules. In view of this problem, some scholars have put forward a positive correlation measure based on the Phi coefficient to avoid the limitation of the Apriori algorithm: it can mine item sets with low support but high correlation. Although that algorithm prunes the search space, its running time on big data sets is not noticeably reduced, and some of the correlation pairs it finds are meaningless. This paper presents an improved mining algorithm with new interestingness-based association rules for correlation pairs, using an upper bound on the interestingness of supersets to prune the search space. It greatly reduces the running time and filters out meaningless correlation pairs according to redundancy constraints. Compared with the algorithm based on the Phi correlation coefficient, the new algorithm significantly reduces running time and prunes redundant correlation pairs, improving mining efficiency and accuracy.
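The pruning idea rests on two small formulas, sketched below: the Phi coefficient of an item pair, and an upper bound on it computable from the two supports alone, which lets the miner skip counting co-occurrences whose correlation cannot reach the threshold (the counts and threshold are illustrative; the paper's interestingness measure and superset bound may differ):

```python
import math

def phi(n, na, nb, nab):
    """Phi correlation of items a and b from transaction counts:
    n total transactions, na/nb containing a/b, nab containing both."""
    pa, pb, pab = na / n, nb / n, nab / n
    return (pab - pa * pb) / math.sqrt(pa * pb * (1 - pa) * (1 - pb))

def phi_upper_bound(n, na, nb):
    """Upper bound on phi from the supports alone (assumes na <= nb),
    attained when every transaction with a also contains b.  If this is
    already below the threshold, nab need never be counted."""
    pa, pb = na / n, nb / n
    return math.sqrt((pa / pb) * ((1 - pb) / (1 - pa)))

# a low-support but perfectly correlated pair still scores phi = 1 ...
assert abs(phi(1000, 10, 10, 10) - 1.0) < 1e-12
# ... while a mismatched-support pair can be pruned without counting nab
print(phi_upper_bound(1000, 10, 500) < 0.2)   # True
```

The bound follows by substituting the maximum possible co-occurrence, pab = min(pa, pb), into the Phi formula.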
Funding: This research was supported by the National Natural Science Foundation of China under Grant Nos. 61772280 and 62072249.
Abstract: As an important maritime hub, Bohai Bay provides great convenience for shipping but suffers sea ice disasters of varying severity every winter, which greatly affect the socio-economic development of the region. This paper therefore uses FY-4A (a weather satellite) data to study sea ice in the Bohai Sea. After land removal and cloud detection, a multi-channel threshold method is combined with an adaptive threshold algorithm to recognize Bohai Sea ice under clear-sky conditions. A random forest classification algorithm is introduced for sea ice identification, achieving a degree of sea ice classification and recognition under cloud cover. Under non-clear-sky conditions, the random-forest-based results improve Bohai Sea ice identification; the algorithm can effectively identify Bohai Sea ice and improve identification accuracy, laying a foundation for accurate and stable sea ice identification. The work realizes sea ice identification in the Bohai Sea and provides data and algorithm support for marine climate forecasting departments.
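The clear-sky multi-channel threshold test can be sketched with two illustrative channels, visible reflectance and thermal-infrared brightness temperature: ice is bright and cold, open water is dark and warmer. The channel choice and threshold values below are placeholders, not FY-4A's tuned parameters:

```python
import numpy as np

def sea_ice_mask(vis_ref, tir_bt, ref_min=0.3, bt_max=271.0):
    """Simplified multi-channel threshold test for clear-sky sea pixels:
    flag ice where visible reflectance is high AND brightness temperature
    is below the freezing point of seawater (thresholds illustrative)."""
    return (vis_ref > ref_min) & (tir_bt < bt_max)

vis = np.array([[0.05, 0.55], [0.60, 0.08]])   # open water is dark, ice bright
bt  = np.array([[275.0, 265.0], [268.0, 276.0]])  # Kelvin
print(sea_ice_mask(vis, bt))   # [[False  True] [ True False]]
```

The adaptive-threshold and random-forest stages of the paper replace these fixed cut-offs where illumination or cloud cover makes them unreliable.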