In recent decades, the importance of surface acoustic waves as a biocompatible tool to integrate with microfluidics has been proven in various medical and biological applications. The numerical modeling of acoustic streaming caused by surface acoustic waves in microchannels requires the effect of viscosity to be considered in the equations, which complicates the solution. In this paper, it is shown that the major contribution of viscosity and the horizontal component of actuation is concentrated in a narrow region alongside the actuation boundary. Since the inviscid equations are considerably easier to solve, a division into viscous and inviscid domains would alleviate the computational load significantly. The particle traces calculated by this approximation are in excellent agreement with their counterparts from the fully viscous model. It is also shown that the optimum thickness of the viscous strip is about 9 times the acoustic boundary layer thickness for various flow patterns and actuation amplitudes.
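For orientation, the acoustic (Stokes) boundary layer thickness that sets the scale of the viscous strip can be estimated as δ = √(2ν/ω); a minimal sketch, assuming water-like viscosity and a 20 MHz actuation frequency (both illustrative values, not taken from the paper):

```python
import math

def boundary_layer_thickness(nu, freq_hz):
    """Stokes acoustic boundary layer thickness: delta = sqrt(2*nu/omega)."""
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 * nu / omega)

nu_water = 1.0e-6   # kinematic viscosity of water, m^2/s (illustrative)
freq = 20.0e6       # SAW actuation frequency, Hz (illustrative)

delta = boundary_layer_thickness(nu_water, freq)
strip = 9.0 * delta  # suggested viscous-strip thickness from the paper
print(f"delta = {delta:.3e} m, 9*delta = {strip:.3e} m")
```

At these values the viscous strip is on the order of a micrometre, a small fraction of a typical microchannel height, which is why restricting the viscous equations to this strip pays off computationally.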
Data stream clustering is integral to contemporary big data applications. However, addressing the ongoing influx of data streams efficiently and accurately remains a primary challenge in current research. This paper aims to elevate the efficiency and precision of data stream clustering. Taking the TEDA (Typicality and Eccentricity Data Analysis) algorithm as a foundation, we introduce improvements by integrating a nearest neighbor search algorithm to enhance both the efficiency and accuracy of the algorithm. The original TEDA algorithm, grounded in the concept of typicality and eccentricity data analytics, is an evolving and recursive method that requires no prior knowledge. While the algorithm autonomously creates and merges clusters as new data arrive, its efficiency is significantly hindered by the need to traverse all existing clusters whenever new data arrive. This work presents the NS-TEDA (Neighbor Search Based Typicality and Eccentricity Data Analysis) algorithm, which incorporates a KD-Tree (K-Dimensional Tree) integrated with a Scapegoat Tree. This ensures that new data points interact only with clusters in very close proximity, which significantly enhances efficiency while preventing a single data point from joining too many clusters and mitigating, to some extent, the merging of clusters with high overlap. We apply the NS-TEDA algorithm to several well-known datasets, comparing its performance with other data stream clustering algorithms and the original TEDA algorithm. The results demonstrate that the proposed algorithm achieves higher accuracy, and its runtime exhibits almost linear dependence on the volume of data, making it more suitable for large-scale data stream analysis.
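The recursive typicality/eccentricity statistics at the core of TEDA can be sketched in one dimension as follows (an illustration of the published recursions, not the NS-TEDA implementation; the mean and variance are updated incrementally, so no data need be stored):

```python
def teda_stream(xs):
    """Recursively track mean/variance over a stream, then score eccentricity."""
    mean, var = 0.0, 0.0
    for k, x in enumerate(xs, start=1):
        mean = ((k - 1) * mean + x) / k
        if k > 1:
            var = ((k - 1) / k) * var + ((x - mean) ** 2) / (k - 1)
    k = len(xs)
    # eccentricity: 1/k plus squared distance to the mean, normalized by k*var;
    # typicality is 1 minus eccentricity
    return [1.0 / k + (mean - x) ** 2 / (k * var) for x in xs]

scores = teda_stream([1.0, 1.1, 0.9, 5.0])
# the outlier (5.0) receives the highest eccentricity, and the eccentricities
# over the whole sample set sum to 2 (a known property of the TEDA framework)
```

NS-TEDA's contribution is to avoid evaluating such scores against every existing cluster by first filtering candidates with a KD-Tree/Scapegoat-Tree neighbor search.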
Orthogonal frequency division multiplexing passive optical network (OFDM-PON) has superior anti-dispersion properties for operating in the C-band of fiber with an increased optical power budget. However, the downlink broadcast leaves the physical layer vulnerable to illegal eavesdropping. Quantum noise stream cipher (QNSC) is a classic physical layer encryption method and is well compatible with OFDM-PON. Meanwhile, it is indispensable to exploit forward error correction (FEC) to control errors in data transmission. However, when QNSC and FEC are jointly coded, the redundant information becomes heavier, and the code rate of the transmitted signal is largely reduced. In this work, we propose a physical layer encryption scheme based on polar-code-assisted QNSC. To improve the code rate and security of the transmitted signal, we exploit chaotic sequences to yield the redundant bits and utilize the redundant information of the polar code to generate the higher-order encrypted signal in the QNSC scheme with the operation of the interleaver. We experimentally demonstrate encrypted 16/64-QAM, 16/256-QAM, 16/1024-QAM, and 16/4096-QAM QNSC signals transmitted over 30-km standard single mode fiber. For the transmitted 16/4096-QAM QNSC signal, compared with the conventional QNSC method, the proposed method increases the code rate from 0.1 to 0.32 with enhanced security.
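The core QNSC idea of hiding a few data bits among key-driven basis bits in a dense constellation can be sketched at the bit level (an illustrative 16/4096-QAM mapping under assumed parameters, not the authors' polar-code-assisted scheme):

```python
def qnsc_encrypt(data_bits4, key_bits8):
    """Embed 4 data bits under 8 key-driven basis bits -> 12-bit 4096-QAM index."""
    assert len(data_bits4) == 4 and len(key_bits8) == 8
    symbol = 0
    # the key selects one of 256 basis cells; the data picks a point inside it
    for b in key_bits8 + data_bits4:
        symbol = (symbol << 1) | b
    return symbol  # 0 .. 4095

def qnsc_decrypt(symbol, key_bits8):
    """A receiver sharing the key strips the basis bits to recover the data."""
    key = 0
    for b in key_bits8:
        key = (key << 1) | b
    assert (symbol >> 4) == key, "wrong basis key"
    return [(symbol >> i) & 1 for i in range(3, -1, -1)]

data = [1, 0, 1, 1]
key = [0, 1, 1, 0, 1, 0, 0, 1]
assert qnsc_decrypt(qnsc_encrypt(data, key), key) == data
```

An eavesdropper without the key must resolve the symbol among all 256 basis cells masked by quantum noise; the paper's contribution is generating such redundant/basis bits from chaotic sequences and polar-code redundancy rather than wasting code rate on them.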
This paper analyzes the compatibility between cosmetics and live streaming e-commerce in terms of the product's nature, marketing means, and supply chain characteristics. Based on the prominent problems, it sorts out the relationships among all parties in the cosmetics live streaming e-commerce industry chain. Combined with the latest regulatory policies on live streaming e-commerce and cosmetics, the responsibilities of the different subjects in cosmetics live streaming e-commerce are summarized, and suggestions and countermeasures are put forward for the standardization and development of live streaming e-commerce. Cosmetics brand owners are the parties primarily responsible for product quality. Anchors, whose identity mixes intermediary, advertising spokesperson, and operator, should bear stricter joint and several liability when recommending products related to consumers' health. If anchors fail to clearly identify themselves during the recommendation process, thus causing consumers to mistake them for the operator of the cosmetics, they should assume the obligations of the operator.
Clustering high-dimensional data is challenging, as increasing dimensionality increases the distance between data points, resulting in sparse regions that degrade clustering performance. Subspace clustering is a common approach for processing high-dimensional data by finding the relevant features for each cluster in the data space. Subspace clustering methods extend traditional clustering to account for the constraints imposed by data streams, which are not only high-dimensional but also unbounded and evolving. This necessitates subspace clustering algorithms that can handle high dimensionality and adapt to the unique characteristics of data streams. Although many articles have contributed literature reviews on data stream clustering, there is currently no specific review of subspace clustering algorithms for high-dimensional data streams. Therefore, this article systematically reviews the existing literature on subspace clustering of data streams in high-dimensional streaming environments. The review follows a systematic methodological approach and includes 18 articles in the final analysis. The analysis focused on two research questions related to the general clustering process and to dealing with the unbounded and evolving characteristics of data streams. The main findings relate to six elements: clustering process, cluster search, subspace search, synopsis structure, cluster maintenance, and evaluation measures. Most algorithms use a two-phase clustering approach consisting of an initialization stage, a refinement stage, a cluster maintenance stage, and a final clustering stage. The density-based top-down subspace clustering approach is more widely used than the others because it is able to distinguish true clusters from outliers using projected microclusters. Most algorithms adapt implicitly to the evolving nature of the data stream by using a time fading function that is sensitive to outliers. Future work can focus on the clustering framework, parameter optimization, subspace search techniques, memory-efficient synopsis structures, explicit cluster change detection, and intrinsic performance metrics. This article can serve as a guide for researchers interested in high-dimensional subspace clustering methods for data streams.
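The time fading function referred to above is typically an exponential decay applied to each micro-cluster's weight, so that clusters no longer receiving points gradually lose influence and are pruned; a minimal sketch with an assumed decay parameter λ:

```python
def faded_weight(weight, last_update_t, now_t, lam=0.01):
    """Exponentially decay a micro-cluster's weight: w * 2^(-lam * dt)."""
    return weight * 2.0 ** (-lam * (now_t - last_update_t))

# a cluster untouched for 100 time units with lam=0.01 loses half its weight
w = faded_weight(8.0, last_update_t=0, now_t=100, lam=0.01)
# clusters whose faded weight drops below a threshold are pruned
stale = w < 5.0
```

The choice of λ trades responsiveness to drift against robustness to noise: larger values forget old structure faster, which is exactly the sensitivity to outliers the review notes.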
Due to the significant correlation and redundancy in multimedia data, conventional block cipher cryptosystems are not efficient in encrypting it. Stream ciphers based on Cellular Automata (CA) can provide a more effective solution. CA have recently gained recognition as a robust cryptographic primitive, being used as pseudorandom number generators in hash functions, block ciphers, and stream ciphers. CA can perform parallel transformations, resulting in high throughput, and they exhibit a natural tendency to resist fault attacks. Few stream cipher schemes based on CA have been proposed in the literature, and their encryption/decryption throughput is relatively low, which makes them unsuitable for multimedia communication. Trivium and Grain are efficient stream ciphers that were selected as finalists in the eSTREAM project, but they have proven to be vulnerable to differential fault attacks. This work introduces a novel and scalable stream cipher named CeTrivium, whose design is based on CA. CeTrivium is a 5-neighborhood CA-based stream cipher inspired by the designs of Trivium and Grain. It is constructed from three building blocks: the Trivium (Tr) block, the Nonlinear-CA (NCA) block, and the Nonlinear Mixing (NM) block. The NCA block is a 64-bit nonlinear hybrid 5-neighborhood CA, while the Tr block has the same structure as the Trivium stream cipher. The NM block is a nonlinear, balanced, and reversible Boolean function that mixes the outputs of the Tr and NCA blocks to produce a keystream. Cryptanalysis indicates that CeTrivium can resist various attacks, including correlation, algebraic, fault, cube, Meier-Staffelbach, and side-channel attacks. Moreover, the scheme is evaluated using histogram and spectrogram analysis, as well as several other measurements, including the correlation coefficient, number of samples change rate, signal-to-noise ratio, entropy, and peak signal-to-noise ratio. The performance of CeTrivium is evaluated and compared with other state-of-the-art techniques, which it outperforms in terms of encryption throughput while maintaining high security. CeTrivium has high encryption and decryption speeds, is scalable, and resists various attacks, making it suitable for multimedia communication.
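As a much-simplified illustration of how a cellular automaton can serve as a keystream generator (an elementary rule-30 CA with a toy seed, not the 5-neighborhood hybrid CA used in CeTrivium):

```python
def rule30_keystream(seed_cells, nbits):
    """Collect the centre cell of a rule-30 elementary CA as keystream bits."""
    cells = list(seed_cells)
    n, mid = len(cells), len(cells) // 2
    out = []
    for _ in range(nbits):
        out.append(cells[mid])
        # rule 30: new = left XOR (centre OR right), periodic boundary
        cells = [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                 for i in range(n)]
    return out

def xor_cipher(bits, key_bits):
    return [b ^ k for b, k in zip(bits, key_bits)]

seed = [0] * 31
seed[15] = 1                           # assumed toy seed; a secret key in practice
plaintext = [1, 0, 1, 1, 0, 0, 1, 0]
ks = rule30_keystream(seed, len(plaintext))
ciphertext = xor_cipher(plaintext, ks)
assert xor_cipher(ciphertext, ks) == plaintext  # stream ciphers are symmetric
```

Each new cell depends only on its local neighborhood, so all cells can be updated in parallel in hardware, which is the throughput advantage the abstract cites.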
The identification of anomalies within stream sediment geochemical data is one of the fastest developing areas in mineral exploration. The various means used to achieve this objective rely on either continuous or discrete field models of stream sediment geochemical data. To map anomalies in a discrete field model of such data, two corrections are required: background correction and downstream dilution correction. Topography and geomorphology are important factors in variations of element content in stream sediments. However, few studies have considered, through the use of digital terrain analysis, the influence of geomorphic features in the downstream dilution correction of stream sediment geochemical data. This study proposes and demonstrates an improvement to the traditional downstream dilution correction equation, based on digital terrain analysis, to map single-element anomalies in stream sediment geochemical landscapes. Moreover, this study compares the results of analyses using discrete and continuous field models of stream sediment geochemical data from the Xincang area, Tibet. The efficiency of the proposed methodology was validated against known mineral occurrences. The results indicate that catchment-based analysis outperforms interpolation-based analysis of stream sediment geochemical data for anomaly mapping. Meanwhile, the modified downstream dilution correction equation proved more effective than the original, although it requires further testing in other areas to investigate its efficiency.
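For orientation, the traditional downstream dilution correction that the paper modifies is commonly written in the idealized form attributed to Hawkes (stated here from the general literature, as the abstract does not reproduce it):

```latex
Y_a = \frac{Y_s \, A_c - Y_b \left( A_c - A_a \right)}{A_a}
```

where $Y_s$ is the element content measured in the stream sediment sample, $Y_b$ the regional background content, $A_c$ the total catchment area upstream of the sample site, $A_a$ the area of the anomalous source, and $Y_a$ the dilution-corrected content of that source. The study's modification weights this balance with geomorphic attributes derived from digital terrain analysis.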
The intention of this fundamental work is to explore the control of the separation of a benzene, toluene, and o-xylene mixture in a liquid-only transfer divided-wall column (LTS-DWC). First, two control structures are proposed: one with seven composition control loops (CS1) and one with seven temperature control loops (CS2). However, these two structures can handle only ±10% feed disturbances, not larger ones. Subsequently, an equivalent four-column model with a withdraw ratio is developed to discuss the effect of the two liquid-only side-streams on the overall reboiler duty. It is shown that the second liquid-only side-stream withdraw ratio strongly affects the overall energy consumption. Hence, a six-loop composition control structure with a fixed second liquid-only side-stream withdraw ratio (CS3) is proposed, with which the product purities return to their set values even under ±20% feed disturbances. Finally, based on the above results, a temperature control structure (CS4) with a fixed second liquid-only side-stream withdraw ratio is established, which can cope with ±15% disturbances. These findings offer insight into the dynamic control of the LTS-DWC and promote the industrial implementation of DWC through new liquid-only side-stream configurations.
The unanticipated sabotage of two underwater pipelines in the Baltic Sea (Nord Stream 1 and 2) occurred on 26 September 2022. Massive quantities of natural gas, primarily methane, were released into the atmosphere over about one week. As a more powerful greenhouse gas than CO2, methane's potential climatic impact is a global concern. Using multiple methods and datasets, a recent study reported a relatively accurate magnitude of the leaked methane at 0.22 ± 0.03 million tons (Mt), lower than the initial estimates made in the immediate aftermath of the event. Under the energy conservation framework used in IPCC AR6, we derived a negligible increase in global surface air temperature of 1.8 × 10⁻⁵ °C over a 20-year time horizon caused by the methane leaks, assuming an upper limit of 0.25 Mt. Although the resultant warming from this methane leak incident was minor, future carbon release from additional Earth system feedbacks, such as thawing permafrost, and its impact on the methane mitigation pathways of the Paris Agreement, warrants investigation.
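The order of magnitude of the quoted temperature response can be reproduced with a crude GWP/TCRE back-of-envelope (illustrative only; the study itself uses the IPCC AR6 energy-conservation framework, and GWP-based CO2 equivalence over a fixed horizon is only a rough proxy):

```python
leak_mt_ch4 = 0.25        # upper-limit methane release, Mt
gwp20_ch4 = 82.5          # AR6 20-year GWP of fossil methane (approximate)
tcre_per_gtco2 = 0.45e-3  # ~0.45 degC per 1000 GtCO2 (AR6 TCRE, approximate)

co2_eq_gt = leak_mt_ch4 * gwp20_ch4 / 1000.0  # Mt CH4 -> Gt CO2-equivalent
delta_t = co2_eq_gt * tcre_per_gtco2          # degC
print(f"{delta_t:.1e} degC")  # same order of magnitude as the reported value
```

This lands around 10⁻⁵ °C, consistent with the paper's conclusion that the climatic effect of the leak itself was negligible.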
Ultrasonic melt treatment (UMT) is widely used in casting and metallurgy. However, the conventional single-source ultrasonic (SSU) treatment process has certain drawbacks, such as fast attenuation of energy and a limited range of effectiveness. In this study, propagation models of SSU and four-source ultrasonic (FSU) treatment in an Al melt were established, and the distributions of the acoustic and streaming fields during ultrasonic treatment were investigated by numerical simulation and physical experiments. The simulation results show that during SSU treatment the effective cavitation zone is mainly located in a small spherical region surrounding the end of the ultrasonic horn. When FSU is applied, the effective cavitation zone expands markedly in the melt. It first increases and then decreases as the vibration-source spacing (Lv) grows from 30 mm to 100 mm. In particular, when Lv is 80 mm, the effective cavitation zone reaches its largest area, indicating the best cavitation effect. Moreover, the acoustic streaming level and flow pattern in the melt also change with increasing Lv. When Lv is 80 mm, both the average and maximum flow rates of the melt reach their highest values, and the flow structure is more stable and uniform, with the typical morphological characteristics of angular vortices, thus significantly expanding the range of acoustic streaming. The accuracy of the simulation results was verified by physical experiments with a glycerol aqueous solution and tracer particles.
Data encryption is essential in securing data exchanged between connected parties. Encryption is the process of transforming readable text into scrambled, unreadable text using secure keys. Stream ciphers are a type of encryption algorithm that relies on a single key for both encryption and decryption. Many existing encryption algorithms are developed based on either a mathematical foundation or on other biological, social, or physical behaviours. One technique is to utilise the behavioural aspects of game theory in a stream cipher. In this paper, we introduce an enhanced Deoxyribonucleic acid (DNA)-coded stream cipher based on an iterated n-player prisoner's dilemma paradigm. Our main goal is to add more layers of randomness to the keystream generation process; these layers are inspired by the behaviour of multiple players playing a prisoner's dilemma game. We implement parallelism to compensate for the additional processing time that may result from adding these extra layers of randomness. The results show that our enhanced design passes the statistical tests and achieves an encryption throughput of about 1,877 Mbit/s, which makes it a feasible secure stream cipher.
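The game-theoretic layer can be pictured as strategies playing an iterated prisoner's dilemma, with the cooperation/defection history feeding the keystream schedule; a two-player sketch with the standard payoff matrix (illustrative only — the paper uses an n-player variant combined with DNA coding):

```python
# standard PD payoffs: (my_move, their_move) -> my score; C = 0, D = 1
PAYOFF = {(0, 0): 3, (0, 1): 0, (1, 0): 5, (1, 1): 1}

def tit_for_tat(history):
    return history[-1][1] if history else 0  # copy the opponent's last move

def always_defect(history):
    return 1

def play(strat_a, strat_b, rounds):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        hist_a.append((a, b))
        hist_b.append((b, a))
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
    return score_a, score_b, [a for a, _ in hist_a]

sa, sb, moves = play(tit_for_tat, always_defect, 5)
# the per-round moves form a bit pattern that can be layered into the
# keystream generation as an extra source of hard-to-predict behaviour
```

Because the move sequence depends on the interaction of all strategies, not on any single generator, it adds the behavioural randomness layer the abstract describes.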
Recently, the combination of video services and 5G networks has been gaining attention in the wireless communication realm. With the brisk advancement in 5G network usage and the massive popularity of three-dimensional video streaming, the quality of experience (QoE) of video in 5G systems has been receiving overwhelming significance from both customers and service providers. Therefore, effectively categorizing QoE-aware video streaming is imperative for achieving greater client satisfaction. This work makes the following contributions. First, a simulation platform based on NS-3 is introduced to analyze and improve the performance of video services; the simulation offers real-time measurements, saving the considerable expense associated with real-world equipment. Second, a framework for QoE-aware video streaming categorization in 5G networks is introduced based on machine learning (ML), incorporating the hyperparameter tuning (HPT) principle. It implements an enhanced hyperparameter tuning (EHPT) ensemble and a decision tree (DT) classifier for video streaming categorization. The performance of the ML approach is assessed using precision, accuracy, recall, and computation time metrics to demonstrate the superiority of these classifiers for video streaming categorization. This paper demonstrates that our ML classifiers achieve QoE prediction accuracies of 92.59% for the EHPT ensemble and 87.037% for the DT classifier.
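The hyperparameter-tuning principle behind EHPT can be illustrated with a minimal grid search over a one-parameter threshold classifier (a toy stand-in with assumed data, not the paper's ensemble/DT pipeline):

```python
def accuracy(threshold, samples):
    """Classify 'good QoE' when the feature exceeds the threshold."""
    hits = sum((x > threshold) == bool(y) for x, y in samples)
    return hits / len(samples)

# toy (throughput-like feature, QoE label) pairs -- assumed data
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]

grid = [i / 10 for i in range(1, 10)]                # candidate hyperparameters
best_t = max(grid, key=lambda t: accuracy(t, data))  # exhaustive grid search
best_acc = accuracy(best_t, data)                    # 1.0 on this toy set
```

Real HPT works the same way in principle: enumerate (or sample) candidate settings, score each on validation data, and keep the best, with the "enhanced" variants differing mainly in how the candidate space is searched.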
Every application in a smart city environment, such as the smart grid, health monitoring, security, and surveillance, generates non-stationary data streams. As a result, the statistical properties of the data change over time, leading to class imbalance and concept drift, both of which degrade model performance. Most current work has focused on developing an ensemble strategy that trains a new classifier on the latest data to resolve the issue. These techniques struggle to train the new classifier when the data is imbalanced. Also, the class imbalance ratio may change greatly from one input stream to another, making the problem more complex. The existing solutions proposed for addressing the combined issue of class imbalance and concept drift lack an understanding of how the two problems correlate. This work studies the association between concept drift and the class imbalance ratio and then demonstrates how changes in the class imbalance ratio, along with concept drift, affect classifier performance. We analyzed the effect of both issues on the minority and majority classes individually. To do this, we conducted experiments on benchmark datasets using state-of-the-art classifiers especially designed for data stream classification. Precision, recall, F1 score, and geometric mean were used to measure performance. Our findings show that when class imbalance and concept drift occur together, performance can decrease by up to 15%. Our results also show that an increase in the imbalance ratio can cause a 10% to 15% decrease in the precision scores of both minority and majority classes. These findings may help in designing intelligent and adaptive solutions that can cope with the challenges of non-stationary data streams such as concept drift and class imbalance.
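The geometric mean used in the evaluation rewards balanced per-class recall, which is exactly what collapses under combined drift and imbalance; a minimal binary-class sketch:

```python
import math

def g_mean(y_true, y_pred):
    """Geometric mean of per-class recalls for a binary stream classifier."""
    recalls = []
    for cls in (0, 1):
        total = sum(1 for t in y_true if t == cls)
        correct = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        recalls.append(correct / total)
    return math.sqrt(recalls[0] * recalls[1])

# majority class predicted perfectly, minority class only half recovered
y_true = [0] * 8 + [1] * 2
y_pred = [0] * 8 + [1, 0]
score = g_mean(y_true, y_pred)  # sqrt(1.0 * 0.5), roughly 0.707
```

Plain accuracy on this example would be 0.9 despite half the minority class being missed, which is why G-mean (alongside per-class precision and recall) is the preferred lens for imbalanced streams.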
Handling sentiment drifts in real-time Twitter data streams is a challenging task when performing sentiment classification, because the sentiments of Twitter users change over time. The growing volume of tweets with sentiment drifts has led to the need for an adaptive approach to detect and handle such drift in real time. This work proposes an adaptive-learning-algorithm-based framework, Twitter Sentiment Drift Analysis - Bidirectional Encoder Representations from Transformers (TSDA-BERT), which introduces a sentiment drift measure to detect drifts and a domain impact score to adaptively retrain the classification model with domain-relevant data in real time. The framework also works on static data by converting it to data streams using the Kafka tool. Experiments conducted on real-time and simulated tweets on sports, health care, and financial topics show that the proposed system is able to detect sentiment drifts and maintain the performance of the classification model, with accuracies of 91%, 87%, and 90%, respectively. Though results are provided only for a few topics as a proof of concept, this framework can be applied to detect sentiment drifts and perform sentiment classification on real-time data streams of any topic.
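A sentiment-drift measure of the kind described can be sketched as a window-to-window shift in the positive-sentiment rate that triggers retraining (a simplified illustration with assumed window size and threshold, not the TSDA-BERT measure itself):

```python
def detect_drift(labels, window=50, threshold=0.3):
    """Flag drift when the positive rate shifts between consecutive windows."""
    if len(labels) < 2 * window:
        return False
    ref = labels[-2 * window:-window]
    cur = labels[-window:]
    return abs(sum(cur) / window - sum(ref) / window) > threshold

# mostly-positive tweets followed by a mostly-negative burst -> drift
stream = [1] * 45 + [0] * 5 + [0] * 45 + [1] * 5
assert detect_drift(stream)         # positive rate fell from 0.90 to 0.10
assert not detect_drift([1] * 100)  # stable sentiment, no retraining needed
```

In the full framework, a detected drift would trigger retraining the classifier on recent, domain-relevant tweets rather than on the stale reference window.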
Information security has emerged as a key problem in encryption because of the rapid evolution of the internet and networks, and the progress of image encryption techniques has become an increasingly serious issue. The small key space, low confidentiality, low key sensitivity, and exploitability of existing image encryption techniques that integrate chaotic systems and DNA computing are the main problems motivating the new encryption technique proposed in this study. In our proposed scheme, a three-dimensional Chen's map and a one-dimensional logistic map are employed to construct a double-layer image encryption scheme. In the confusion stage, different scrambling operations related to the original plain image pixels are designed using Chen's map. A stream pixel scrambling operation related to the plain image is constructed, and a block scrambling operation on the stream-pixel-scrambled image is then designed. In the diffusion stage, two rounds of pixel diffusion are applied to the confused image for intra-image diffusion. Chen's map, the logistic map, and DNA computing are employed to construct the diffusion operations. A reverse complementary rule is applied to obtain a new form of DNA. Chen's map is used to produce a pseudorandom DNA sequence, and another DNA form is then constructed from the reversed pseudorandom DNA sequence. Finally, the XOR operation is performed multiple times to obtain the encrypted image. According to the simulation experiments and security analysis, this approach extends the key space, has great sensitivity, and is able to withstand various typical attacks. An adequate encryption effect is achieved by the proposed algorithm, which simultaneously decreases the correlation between adjacent pixels, making it near zero, and increases the information entropy. The number of pixels change rate (NPCR) and the unified average changing intensity (UACI) are both very near to their optimal values.
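The diffusion stage's reliance on a chaotic keystream can be illustrated with a logistic-map byte generator XORed over pixel values (a minimal sketch with assumed parameters; the paper additionally uses Chen's map and DNA coding rules):

```python
def logistic_keystream(x0, n, r=3.99):
    """Derive n pseudo-random bytes from logistic-map iterates x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_pixels(pixels, key):
    return [p ^ k for p, k in zip(pixels, key)]

pixels = [12, 255, 0, 77, 200, 31]  # toy "image" pixel values
ks = logistic_keystream(x0=0.3141, n=len(pixels))  # x0 acts as the secret key
cipher = xor_pixels(pixels, ks)
assert xor_pixels(cipher, ks) == pixels  # decryption reverses encryption
```

Key sensitivity comes from the chaotic map: an infinitesimally different x0 diverges within a few iterations and yields an unrelated keystream, which is what drives NPCR and UACI toward their optimal values.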
Funding: This research was funded by the National Natural Science Foundation of China (Grant No. 72001190), by the Ministry of Education's Humanities and Social Science Project (Grant No. 20YJC630173), and by Zhejiang A&F University (Grant No. 2022LFR062).
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62075147, and by the Suzhou Industry Technological Innovation Projects under Grant SYG202348.
文摘Orthogonal frequency division multiplexing passive optical network(OFDM-PON) has superior anti-dispersion property to operate in the C-band of fiber for increased optical power budget. However,the downlink broadcast exposes the physical layer vulnerable to the threat of illegal eavesdropping. Quantum noise stream cipher(QNSC) is a classic physical layer encryption method and well compatible with the OFDM-PON. Meanwhile, it is indispensable to exploit forward error correction(FEC) to control errors in data transmission. However, when QNSC and FEC are jointly coded, the redundant information becomes heavier and thus the code rate of the transmitted signal will be largely reduced. In this work, we propose a physical layer encryption scheme based on polar-code-assisted QNSC. In order to improve the code rate and security of the transmitted signal, we exploit chaotic sequences to yield the redundant bits and utilize the redundant information of the polar code to generate the higher-order encrypted signal in the QNSC scheme with the operation of the interleaver.We experimentally demonstrate the encrypted 16/64-QAM, 16/256-QAM, 16/1024-QAM, 16/4096-QAM QNSC signals transmitted over 30-km standard single mode fiber. For the transmitted 16/4096-QAM QNSC signal, compared with the conventional QNSC method, the proposed method increases the code rate from 0.1 to 0.32 with enhanced security.
文摘Analyze the compatibility between cosmetics and live streaming e-commerce from its own nature,marketing means and supply chain characteristics.According to the prominent problems,sort out the relationship between all parties in the cosmetics live e-commerce industry chain.Combined with the latest regulatory policies of live streaming e-commerce and cosmetics,the responsibilities of different subjects in cosmetics live streaming e-commerce are summarized,and relevant suggestions and countermeasures are put forward for the standardization and development of live streaming e-commerce.Cosmetics brand owners are the first responsible persons for product quality.Anchors,as a mixed identity between intermediary,advertising spokesperson and operator,should bear stricter joint and several liability when recommending products related to consumers’health.If anchors fail to clearly identify themselves in the recommendation process,thus causing consumers to mistake them for the operator of the cosmetics,they should assume the obligations of the operator.
Abstract: Clustering high-dimensional data is challenging because increased dimensionality inflates the distances between data points, creating sparse regions that degrade clustering performance. Subspace clustering is a common approach for processing high-dimensional data that finds the relevant features for each cluster in the data space. Subspace clustering methods extend traditional clustering to account for the constraints imposed by data streams, which are not only high-dimensional but also unbounded and evolving. This necessitates subspace clustering algorithms that can handle high dimensionality and adapt to the unique characteristics of data streams. Although many articles have reviewed data stream clustering, there is currently no review specific to subspace clustering algorithms for high-dimensional data streams. This article therefore systematically reviews the existing literature on subspace clustering of data streams in high-dimensional streaming environments. The review follows a systematic methodological approach and includes 18 articles in the final analysis. The analysis focuses on two research questions, covering the general clustering process and the handling of the unbounded and evolving characteristics of data streams. The main findings relate to six elements: the clustering process, cluster search, subspace search, synopsis structure, cluster maintenance, and evaluation measures. Most algorithms use a two-phase clustering approach consisting of an initialization stage, a refinement stage, a cluster maintenance stage, and a final clustering stage. The density-based top-down subspace clustering approach is the most widely used because it can distinguish true clusters from outliers using projected microclusters. Most algorithms adapt implicitly to the evolving nature of the data stream through a time-fading function that is sensitive to outliers. Future work can focus on the clustering framework, parameter optimization, subspace search techniques, memory-efficient synopsis structures, explicit cluster change detection, and intrinsic performance metrics. This article can serve as a guide for researchers interested in high-dimensional subspace clustering methods for data streams.
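The time fading function mentioned above is typically an exponential decay of a point's weight with its age, a form used by many density-based stream clustering algorithms (e.g., DenStream-style methods); the decay rate below is an assumed parameter:

```python
# Sketch of the exponential time-fading weight common in density-based stream
# clustering: a point's influence halves every 1/lambda_ time units.

def fading_weight(age, lambda_=0.25):
    """Weight of a point that arrived `age` time units ago: 2^(-lambda_*age)."""
    return 2.0 ** (-lambda_ * age)

print(fading_weight(0))   # a fresh point carries full weight (1.0)
print(fading_weight(4))   # with lambda_=0.25, weight has halved to 0.5
```

Summing these weights over a microcluster's points gives the decayed cluster weight used to decide when an old microcluster should be pruned.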
Abstract: Due to the significant correlation and redundancy in multimedia data, conventional block cipher cryptosystems are not efficient at encrypting it. Stream ciphers based on cellular automata (CA) can provide a more effective solution. CA have recently gained recognition as a robust cryptographic primitive, serving as pseudorandom number generators in hash functions, block ciphers, and stream ciphers. CA can perform transformations in parallel, yielding high throughput, and they exhibit a natural tendency to resist fault attacks. Few CA-based stream cipher schemes have been proposed in the literature, and their encryption/decryption throughput is relatively low, which makes them unsuitable for multimedia communication. Trivium and Grain are efficient stream ciphers that were selected as finalists in the eSTREAM project, but they have proven vulnerable to differential fault attacks. This work introduces a novel and scalable stream cipher named CeTrivium, whose design is based on CA. CeTrivium is a 5-neighborhood CA-based stream cipher inspired by the designs of Trivium and Grain. It is constructed from three building blocks: the Trivium (Tr) block, the Nonlinear-CA (NCA) block, and the Nonlinear Mixing (NM) block. The NCA block is a 64-bit nonlinear hybrid 5-neighborhood CA, while the Tr block has the same structure as the Trivium stream cipher. The NM block is a nonlinear, balanced, and reversible Boolean function that mixes the outputs of the Tr and NCA blocks to produce a keystream. Cryptanalysis indicates that CeTrivium can resist various attacks, including correlation, algebraic, fault, cube, Meier-Staffelbach, and side-channel attacks. The scheme is also evaluated using histogram and spectrogram analysis, as well as several other measurements, including the correlation coefficient, number of samples change rate, signal-to-noise ratio, entropy, and peak signal-to-noise ratio. The performance of CeTrivium is evaluated and compared with other state-of-the-art techniques, which it outperforms in encryption throughput while maintaining high security. CeTrivium has high encryption and decryption speeds, is scalable, and resists various attacks, making it suitable for multimedia communication.
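The abstract does not give the NCA block's update rules; the following is a generic sketch of a 64-bit hybrid 5-neighborhood CA step, where alternating cells apply two different nonlinear rules. Both rules and the seeding are illustrative assumptions, not CeTrivium's actual specification:

```python
# Hypothetical sketch of a 64-bit hybrid 5-neighborhood cellular automaton,
# illustrating the kind of NCA block the abstract describes. The two update
# rules and the seed are illustrative, not CeTrivium's specification.

N = 64  # number of CA cells

def ca_step(state):
    """One synchronous update; each cell sees neighbors at offsets -2..+2."""
    nxt = []
    for i in range(N):
        a, b, c, d, e = (state[(i + k) % N] for k in (-2, -1, 0, 1, 2))
        if i % 2 == 0:  # "hybrid" CA: even cells use one nonlinear rule...
            nxt.append(a ^ (b & c) ^ d ^ e)
        else:           # ...and odd cells use another
            nxt.append(a ^ b ^ (c | d) ^ e)
    return nxt

def keystream(seed, nbits):
    """Clock the CA and tap cell 0 as the keystream bit each step."""
    state = list(seed)
    out = []
    for _ in range(nbits):
        state = ca_step(state)
        out.append(state[0])
    return out

bits = keystream([i % 2 for i in range(N)], 8)
print(bits)
```

Because every cell updates from a fixed local neighborhood, all 64 cells can be computed in parallel in hardware, which is the throughput advantage the abstract attributes to CA.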
Funding: financially supported by the National Natural Science Foundation of China (NNSFC, Project No. 42002298), the Chinese Geological Survey (Project Nos. DD20201181 and DD20211403), the National Key Research and Development Program of China (NKRDPC, Project No. 2017YFC0601501), and the project "Big Data Analysis and Major Project Evaluation of Strategic Mineral Resources" of the Chinese Geological Survey.
Abstract: The identification of anomalies within stream sediment geochemical data is one of the fastest-developing areas in mineral exploration. The various means used to achieve this objective rely on either continuous or discrete field models of stream sediment geochemical data. Mapping anomalies in a discrete field model of such data requires two corrections: background correction and downstream dilution correction. Topography and geomorphology are important factors in the variation of element content in stream sediments. However, few studies have used digital terrain analysis to account for the influence of geomorphic features in the downstream dilution correction of stream sediment geochemical data. This study proposes and demonstrates an improvement to the traditional downstream dilution correction equation, based on digital terrain analysis, for mapping single-element anomalies in stream sediment geochemical landscapes. Moreover, this study compares the results of analyses using discrete and continuous field models of stream sediment geochemical data from the Xincang area, Tibet. The efficiency of the proposed methodology was validated against known mineral occurrences. The results indicate that catchment-based analysis outperforms interpolation-based analysis of stream sediment geochemical data for anomaly mapping, and the modified downstream dilution correction equation proved more effective than the original. Further testing of the modified correction in other areas is nevertheless needed to investigate its efficiency.
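For context, the traditional downstream dilution correction that the paper modifies is commonly written, following the idealized mass-balance model of Hawkes (1976), as

$$Y_a A_a = Y_m A_m - Y_b\,(A_m - A_a),$$

where $Y_m$ is the measured element content in sediment at the sample site, $A_m$ the catchment area draining to that site, $Y_b$ the regional background content, and $Y_a$, $A_a$ the content and exposed area of the anomalous source. The paper's modified form incorporates terrain attributes derived from digital terrain analysis; its exact expression is not given in the abstract.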
Funding: supported by the National Natural Science Foundation of China (21908056), the Shanghai Sailing Program (19YF1410800), and the Science and Technology Commission of Shanghai Municipality (19DZ2271100).
Abstract: This fundamental work explores the control of a benzene-toluene-o-xylene mixture separated in a liquid-only transfer divided-wall column (LTS-DWC). First, two control structures are proposed: one with seven composition control loops (CS1) and one with seven temperature control loops (CS2). Both structures can handle ±10% feed disturbances, but not larger ones. Subsequently, an equivalent four-column model introducing a withdraw ratio is developed to examine the effect of the two liquid-only side-streams on the overall reboiler duty. The analysis indicates that the withdraw ratio of the second liquid-only side-stream strongly affects the overall energy consumption. Hence, a structure with six composition control loops and a fixed second side-stream withdraw ratio (CS3) is proposed, which returns product purities to their set values even under ±20% feed disturbances. Finally, based on these results, a temperature control structure (CS4) that holds the second side-stream withdraw ratio fixed is established, which can cope with ±15% disturbances. These findings provide insight into the dynamic control of the LTS-DWC and promote the industrial implementation of DWC through new liquid-only side-stream configurations.
Funding: supported by the National Key Research and Development Program (Grant No. 2017YFA0603503) and the National Natural Science Foundation of China (Grant No. 41605057).
Abstract: The unanticipated sabotage of two underwater pipelines in the Baltic Sea (Nord Stream 1 and 2) occurred on 26 September 2022. Massive quantities of natural gas, primarily methane, were released into the atmosphere over about one week. As a more powerful greenhouse gas than CO2, methane's potential climatic impact is a global concern. Using multiple methods and datasets, a recent study reported a relatively accurate magnitude for the leaked methane of 0.22±0.03 million tons (Mt), lower than the initial estimates made in the immediate aftermath of the event. Under the energy conservation framework used in IPCC AR6, we derive a negligible increase in global surface air temperature of 1.8×10^(-5) °C over a 20-year time horizon caused by the methane leaks, assuming an upper limit of 0.25 Mt. Although the warming from this methane leak incident is minor, future carbon release from additional Earth system feedbacks, such as thawing permafrost, and its impact on the methane mitigation pathways of the Paris Agreement warrant investigation.
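The paper derives its figure with an IPCC AR6 energy-conservation framework. As a much cruder order-of-magnitude cross-check only, one can convert the leak to CO2-equivalent with an assumed 20-year GWP and apply a TCRE-style scaling; all constants below are assumed round numbers, and this is not the paper's method:

```python
# Rough order-of-magnitude check, NOT the paper's energy-conservation method.
# Assumed constants: methane GWP20 ~ 80, TCRE ~ 0.45 degC per 1000 Gt CO2.

leak_mt_ch4 = 0.25           # upper-limit leak estimate from the abstract, Mt CH4
gwp20 = 80.0                 # assumed 20-year global warming potential of CH4
tcre = 0.45 / 1000.0         # assumed degC of warming per Gt CO2 emitted

co2eq_gt = leak_mt_ch4 * gwp20 / 1000.0   # Mt CH4 -> Gt CO2-equivalent
delta_t = co2eq_gt * tcre                 # implied warming, degC

print(f"{delta_t:.1e} degC")  # on the order of 1e-5 degC: negligible
```

The result lands at the same order of magnitude as the paper's 1.8×10^(-5) °C, which is what such a back-of-envelope estimate can reasonably show.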
Funding: financially supported by the National Natural Science Foundation of China (Grant No. 52071123), the Natural Science Foundation of Anhui Province (Grant No. 2308085ME167), and the Fundamental Research Funds for the Central Universities of China (Grant No. PA2022GDGP0029).
Abstract: Ultrasonic melt treatment (UMT) is widely used in casting and metallurgy. However, the conventional single-source ultrasonic (SSU) treatment has drawbacks such as fast energy attenuation and a limited range of effectiveness. In this study, propagation models of SSU and four-source ultrasonic (FSU) treatment in an Al melt were established, and the distributions of the acoustic and streaming fields during ultrasonic treatment were investigated by numerical simulation and physical experiments. The simulations show that during SSU treatment, the effective cavitation zone is mainly confined to a small spherical region around the end of the ultrasonic horn. When FSU is applied, the effective cavitation zone in the melt expands markedly; it first increases and then decreases as the vibration-source spacing (Lv) grows from 30 mm to 100 mm. At Lv = 80 mm, the effective cavitation zone is largest, indicating the best cavitation effect. Moreover, the acoustic streaming level and flow pattern in the melt also change with increasing Lv. At Lv = 80 mm, both the average and maximum flow rates of the melt reach their highest values, and the flow structure is more stable and uniform, with the typical morphology of angular vortices, thus significantly expanding the range of acoustic streaming. The accuracy of the simulation results was verified by physical experiments using a glycerol aqueous solution and tracer particles.
Abstract: Data encryption is essential for securing data exchanged between connected parties. Encryption transforms readable text into scrambled, unreadable text using secure keys. Stream ciphers are a type of encryption algorithm that relies on a single key for both encryption and decryption. Many existing encryption algorithms are built on a mathematical foundation or on biological, social, or physical behaviours. One technique is to utilise behavioural aspects of game theory in a stream cipher. In this paper, we introduce an enhanced deoxyribonucleic acid (DNA)-coded stream cipher based on an iterated n-player prisoner's dilemma paradigm. Our main goal is to add further layers of randomness, inspired by the behaviour of multiple players in a prisoner's dilemma game, to the keystream generation process. We implement parallelism to compensate for the additional processing time these extra layers introduce. The results show that our enhanced design passes the statistical tests and achieves an encryption throughput of about 1,877 Mbit/s, making it a feasible secure stream cipher.
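The abstract does not detail how game outcomes feed the keystream. As a purely structural sketch, each round of an iterated n-player game can emit one bit derived from the players' cooperate/defect moves; the ring tit-for-tat strategy and the bit extraction below are illustrative assumptions, and a real cipher would use cryptographically stronger dynamics:

```python
# Hypothetical sketch: an iterated n-player prisoner's dilemma whose moves
# feed keystream bits. The strategy and bit extraction are assumptions; the
# paper's actual design, DNA coding, and parallelism are not in the abstract.

def play_round(history, n):
    """Each player copies the previous move of the next player in a ring."""
    if not history:
        return [i % 2 for i in range(n)]  # assumed mixed initial moves
    last = history[-1]
    return [last[(i + 1) % n] for i in range(n)]

def keystream_bits(n_players, rounds):
    """Collect player 0's cooperate(0)/defect(1) decision each round as a bit."""
    history, bits = [], []
    for _ in range(rounds):
        moves = play_round(history, n_players)
        history.append(moves)
        bits.append(moves[0])
    return bits

ks = keystream_bits(5, 8)
print(ks)
```

The point of the structure is that the keystream depends on the evolving joint state of all players, so adding players adds state; in the paper this extra state is what contributes the additional "layers of randomness."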
Abstract: Recently, the combination of video services and 5G networks has been gaining attention in wireless communication. With the brisk advancement of 5G network usage and the massive popularity of three-dimensional video streaming, the quality of experience (QoE) of video in 5G systems has received overwhelming attention from both customers and service providers. Effectively categorizing QoE-aware video streaming is therefore imperative for achieving greater client satisfaction. This work makes the following contributions. First, a simulation platform based on NS-3 is introduced to analyze and improve the performance of video services; the simulation offers real-time measurements, avoiding the high costs associated with real-world equipment. Second, a framework for QoE-aware video streaming categorization in 5G networks based on machine learning (ML) is introduced, incorporating the hyperparameter tuning (HPT) principle. It implements an enhanced hyperparameter tuning (EHPT) ensemble and a decision tree (DT) classifier for video streaming categorization. The performance of the ML approach is assessed using precision, accuracy, recall, and computation time metrics to demonstrate the superiority of these classifiers for video streaming categorization. This paper shows that our ML classifiers achieve QoE prediction accuracies of 92.59% for the EHPT ensemble and 87.037% for the DT classifier.
Funding: The authors would like to extend their gratitude to Universiti Teknologi PETRONAS (Malaysia) for funding this research through grant number 015LA0-037.
Abstract: Every application in a smart city environment, such as the smart grid, health monitoring, security, and surveillance, generates non-stationary data streams. As a result, the statistical properties of the data change over time, leading to class imbalance and concept drift, both of which degrade model performance. Most current work has focused on developing ensemble strategies that train a new classifier on the latest data to resolve the issue. These techniques suffer while training the new classifier if the data is imbalanced. Moreover, the class imbalance ratio may change greatly from one input stream to another, making the problem more complex. Existing solutions for the combined issue of class imbalance and concept drift lack an understanding of how the two problems correlate. This work studies the association between concept drift and the class imbalance ratio, and demonstrates how changes in the imbalance ratio together with concept drift affect classifier performance. We analyzed the effect of both issues on the minority and majority classes individually, conducting experiments on benchmark datasets using state-of-the-art classifiers specifically designed for data stream classification. Precision, recall, F1 score, and geometric mean were used to measure performance. Our findings show that when class imbalance and concept drift occur together, performance can decrease by up to 15%. Our results also show that an increase in the imbalance ratio can cause a 10% to 15% decrease in the precision scores of both the minority and majority classes. These findings may help in designing intelligent and adaptive solutions that can cope with the challenges of non-stationary data streams such as concept drift and class imbalance.
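The geometric mean metric used above summarizes per-class recall in a way that penalizes neglect of the minority class; a minimal computation on a made-up 9:1 imbalanced example looks like this:

```python
# Minimal sketch of the geometric mean (G-mean) metric used in the study's
# evaluation, computed from per-class recall on a toy imbalanced example.
import math

def recall(y_true, y_pred, cls):
    """Fraction of true `cls` instances that were predicted as `cls`."""
    idx = [i for i, y in enumerate(y_true) if y == cls]
    return sum(y_pred[i] == cls for i in idx) / len(idx)

def g_mean(y_true, y_pred):
    """Geometric mean of majority-class (0) and minority-class (1) recall."""
    return math.sqrt(recall(y_true, y_pred, 0) * recall(y_true, y_pred, 1))

# Toy stream chunk with a 9:1 imbalance ratio: 90 majority and 10 minority labels.
y_true = [0] * 90 + [1] * 10
y_pred = [0] * 85 + [1] * 5 + [1] * 8 + [0] * 2  # 5 majority and 2 minority errors

print(round(g_mean(y_true, y_pred), 3))  # -> 0.869
```

Unlike plain accuracy (which would be 93% here), the G-mean drops sharply if either class's recall collapses, which is why it is standard for imbalanced streams.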
Abstract: Handling sentiment drifts in real-time Twitter data streams is a challenging task in sentiment classification, because the sentiments of Twitter users change over time. The growing volume of tweets exhibiting sentiment drift has created the need for an adaptive approach that detects and handles drift in real time. This work proposes an adaptive learning framework, Twitter Sentiment Drift Analysis-Bidirectional Encoder Representations from Transformers (TSDA-BERT), which introduces a sentiment drift measure to detect drifts and a domain impact score to adaptively retrain the classification model with domain-relevant data in real time. The framework also works on static data by converting it to data streams using the Kafka tool. Experiments conducted on real-time and simulated tweets on sports, health care, and financial topics show that the proposed system can detect sentiment drifts and maintain the performance of the classification model, with accuracies of 91%, 87%, and 90%, respectively. Although results are provided for only a few topics as a proof of concept, the framework can be applied to detect sentiment drifts and perform sentiment classification on real-time data streams of any topic.
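The abstract does not define TSDA-BERT's sentiment drift measure. A generic illustration of the idea compares the positive-sentiment proportion between two consecutive windows of the stream and flags a drift when the change exceeds a threshold; the window size and threshold below are assumptions:

```python
# Hypothetical sketch of a sentiment drift check over a stream of 0/1 sentiment
# labels. Window size and threshold are illustrative; the paper's actual drift
# measure is not specified in the abstract.

def drift_detected(labels, window=100, threshold=0.2):
    """Compare the positive rate of the last two full windows of the stream."""
    if len(labels) < 2 * window:
        return False  # not enough history to compare two windows yet
    prev = labels[-2 * window:-window]
    curr = labels[-window:]
    return abs(sum(curr) / window - sum(prev) / window) > threshold

stream = [1] * 100 + [0] * 100   # sentiment flips between the two windows
print(drift_detected(stream))    # True: positive rate fell from 1.0 to 0.0
```

In a full system, a detected drift would trigger the adaptive retraining step the abstract describes, using recently collected domain-relevant tweets.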
Funding: The authors thank the Deanship for Research & Innovation, Ministry of Education in Saudi Arabia, for funding this research work through Project Number IFP22UQU4400257DSR031.
Abstract: Information security has emerged as a key problem in encryption because of the rapid evolution of the internet and networks, making the progress of image encryption techniques an increasingly serious issue. The small key space, low confidentiality, low key sensitivity, and easy exploitability of existing image encryption techniques that integrate chaotic systems and DNA computing motivate the new encryption technique proposed in this study. In our proposed scheme, a three-dimensional Chen's map and a one-dimensional logistic map are employed to construct a double-layer image encryption scheme. In the confusion stage, scrambling operations related to the original plain-image pixels are designed using Chen's map: a stream pixel scrambling operation related to the plain image is constructed, and a block scrambling of the stream-pixel-scrambled image is then designed. In the diffusion stage, two rounds of pixel diffusion are applied to the confused image for intra-image diffusion, with the diffusion operations constructed from Chen's map, the logistic map, and DNA computing. A reverse complementary rule is applied to obtain a new form of DNA; Chen's map is used to produce a pseudorandom DNA sequence, and another DNA form is constructed from the reversed pseudorandom DNA sequence. Finally, the XOR operation is performed multiple times to obtain the encrypted image. Simulation experiments and security analysis show that this approach extends the key space, has great key sensitivity, and can withstand various typical attacks. The proposed algorithm achieves an adequate encryption effect, reducing the correlation between adjacent pixels to near zero while increasing the information entropy. The number of pixels change rate (NPCR) and the unified average change intensity (UACI) are both very near their optimal values.
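The NPCR and UACI scores cited above have standard definitions over two cipher images C1 and C2 that differ by a single plain-image pixel; a direct computation on made-up flat pixel arrays is:

```python
# Standard NPCR/UACI computations for two equally sized 8-bit cipher images,
# given here as flat pixel lists. The toy images below are illustrative only.

def npcr(c1, c2):
    """Percentage of pixel positions where the two cipher images differ."""
    return 100.0 * sum(a != b for a, b in zip(c1, c2)) / len(c1)

def uaci(c1, c2):
    """Mean absolute intensity change, as a percentage of the full 255 range."""
    return 100.0 * sum(abs(a - b) for a, b in zip(c1, c2)) / (255 * len(c1))

c1 = [0, 255, 128, 64]
c2 = [255, 0, 128, 64]
print(npcr(c1, c2))  # 50.0: half the pixel positions changed
print(uaci(c1, c2))  # 50.0: the average change is half the intensity range
```

For two independent random 8-bit images, the expected values are roughly 99.61% for NPCR and 33.46% for UACI, which is what "very near their optimal values" refers to.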