Journal Articles
112 articles found
A New Dynamics Analysis Model for Five-Axis Machining of Curved Surface Based on Dimension Reduction and Mapping
1
Authors: Minglong Guo, Zhaocheng Wei, Minjie Wang, Zhiwei Zhao, Shengxian Liu. Chinese Journal of Mechanical Engineering, SCIE EI CAS CSCD, 2023, Issue 6, pp. 172-184 (13 pages)
The equipment used in various fields contains an increasing number of parts with curved surfaces of increasing size. Five-axis computer numerical control (CNC) milling is the main machining method for such parts, while dynamics analysis has always been a research hotspot. The cutting conditions determined by the cutter axis, tool path, and workpiece geometry are complex and changeable, which has made dynamics research a major challenge. For this reason, this paper introduces the innovative idea of applying dimension reduction and mapping to the five-axis machining of curved surfaces, and proposes an efficient dynamics analysis model. To simplify the research object, the cutter position points along the tool path were discretized into inclined-plane five-axis machining. The cutter dip angle and feed deflection angle were used to define the spatial position relationship in five-axis machining. These were then taken as the new base variables to construct an abstract two-dimensional space and establish the mapping relationship between the cutter position point and space point sets, further simplifying the dimensions of the research object. Based on the in-cut cutting edge solved by the space limitation method, the dynamics of the inclined-plane five-axis machining unit were studied, and the results were uniformly stored in the abstract space to produce a database. Finally, the prediction of the milling force and vibration state along the tool path became a data extraction process that significantly improved efficiency. Two experiments were also conducted which proved the accuracy and efficiency of the proposed dynamics analysis model. This study has great potential for the online synchronization of intelligent machining of large surfaces.
Keywords: Curved surface; Five-axis machining; Dimension reduction and mapping; Milling force; Dynamics
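As an illustration of the database-lookup idea sketched in this abstract, a minimal Python example follows; the grid resolution, the placeholder dynamics function, and all numbers are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical database: dynamics results precomputed on a grid over the
# abstract 2-D space spanned by cutter dip angle and feed deflection angle.
dip_grid = np.linspace(0.0, 60.0, 61)        # degrees, illustrative resolution
defl_grid = np.linspace(-90.0, 90.0, 181)    # degrees, illustrative resolution

def inclined_plane_unit_dynamics(dip, defl):
    """Placeholder for the inclined-plane machining-unit analysis (peak force, N)."""
    return 120.0 + 0.8 * dip + 0.2 * abs(defl)   # toy model, not the paper's

database = {(i, j): inclined_plane_unit_dynamics(d, f)
            for i, d in enumerate(dip_grid)
            for j, f in enumerate(defl_grid)}

def predict_along_toolpath(cutter_positions):
    """Map each cutter position (dip, deflection) to the nearest grid point and
    read the stored result instead of re-running the dynamics analysis."""
    forces = []
    for dip, defl in cutter_positions:
        i = int(np.argmin(np.abs(dip_grid - dip)))
        j = int(np.argmin(np.abs(defl_grid - defl)))
        forces.append(database[(i, j)])
    return np.array(forces)

print(predict_along_toolpath([(12.5, 30.0), (45.0, -10.0)]))
```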
Review of Dimension Reduction Methods
2
Authors: Salifu Nanga, Ahmed Tijani Bawah, Benjamin Ansah Acquaye, Mac-Issaka Billa, Francis Delali Baeta, Nii Afotey Odai, Samuel Kwaku Obeng, Ampem Darko Nsiah. Journal of Data Analysis and Information Processing, 2021, Issue 3, pp. 189-231 (43 pages)
Purpose: This study sought to review the characteristics, strengths, weaknesses, variants, application areas and data types applied on the various Dimension Reduction techniques. Methodology: The most commonly used databases employed to search for the papers were ScienceDirect, Scopus, Google Scholar, IEEE Xplore and Mendeley. An integrative review was used for the study, in which 341 papers were reviewed. Results: The linear techniques considered were Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Singular Value Decomposition (SVD), Latent Semantic Analysis (LSA), Locality Preserving Projections (LPP), Independent Component Analysis (ICA) and Projection Pursuit (PP). The non-linear techniques, developed to work with applications that have complex non-linear structures, were Kernel Principal Component Analysis (KPCA), Multi-dimensional Scaling (MDS), Isomap, Locally Linear Embedding (LLE), Self-Organizing Map (SOM), Learning Vector Quantization (LVQ), t-Stochastic Neighbor Embedding (t-SNE) and Uniform Manifold Approximation and Projection (UMAP). DR techniques can further be categorized into supervised, unsupervised and, more recently, semi-supervised learning methods. The supervised versions are LDA and LVQ; all the other techniques are unsupervised. Supervised variants of PCA, LPP, KPCA and MDS have been developed, supervised and semi-supervised variants of PP and t-SNE have also been developed, and a semi-supervised version of LDA has been developed. Conclusion: The various application areas, strengths, weaknesses and variants of the DR techniques were explored, as were the different data types that have been applied on the various DR techniques.
Keywords: Dimension reduction; Machine learning; Linear dimension reduction techniques; Non-linear reduction techniques
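The linear/non-linear split described in this review can be illustrated with a short scikit-learn sketch; the digits dataset and parameter choices below are illustrative only, not drawn from the reviewed papers.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)      # 64-dimensional toy data

# Linear DR: project onto the top two principal components.
X_pca = PCA(n_components=2).fit_transform(X)

# Non-linear DR: embed into two dimensions with t-SNE.
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

print("PCA embedding shape:", X_pca.shape)
print("t-SNE embedding shape:", X_tsne.shape)
```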
Optimizing progress variable definition in flamelet-based dimension reduction in combustion (Cited by 2)
3
Authors: Jing CHEN, Minghou LIU, Yiliang CHEN. Applied Mathematics and Mechanics (English Edition), SCIE EI CSCD, 2015, Issue 11, pp. 1481-1498 (18 pages)
An automated method to optimize the definition of the progress variables in flamelet-based dimension reduction is proposed. The performance of these optimized progress variables in coupling the flamelets and the flow solver is presented. In the proposed method, the progress variables are defined according to the first two principal components (PCs) from the principal component analysis (PCA) or kernel-density-weighted PCA (KEDPCA) of a set of flamelets. These flamelets can then be mapped to these new progress variables instead of the mixture fraction/conventional progress variables, and a new chemistry look-up table is constructed. A priori validation of the optimized progress variables and the new chemistry table is implemented in a CH4/N2/air lift-off flame. The reconstruction of the lift-off flame shows that the optimized progress variables perform better than the conventional ones, especially in the high-temperature area. The coefficients of determination (R2 statistics) show that the KEDPCA performs slightly better than the PCA except for some minor species. The main advantage of the KEDPCA is that it is less sensitive to the database. Meanwhile, the criteria for the optimization are proposed and discussed, and the constraint that the progress variables should evolve monotonically from fresh gas to burnt gas is analyzed in detail.
Keywords: Principal component analysis (PCA); Progress variable; Flamelet-based model; Dimension reduction
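A minimal sketch of the core step the abstract describes, defining progress variables from the first two principal components of a flamelet data set; the random placeholder matrix and component count stand in for real flamelet solutions.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical flamelet table: rows are flamelet states, columns are species
# mass fractions (plus temperature); values below are random placeholders.
rng = np.random.default_rng(0)
flamelet_states = rng.random((5000, 12))

# Define optimized progress variables as the first two principal components,
# i.e. fixed linear combinations of the thermochemical state.
pca = PCA(n_components=2).fit(flamelet_states)
progress_vars = pca.transform(flamelet_states)   # new look-up coordinates

# Each flamelet state can now be indexed by (PC1, PC2) instead of the
# mixture fraction / conventional progress variable pair.
print(pca.components_.shape, progress_vars.shape)
```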
Adaptive subspace detection based on two-step dimension reduction in the underwater waveguide
4
Authors: 孔德智, 孙超, 李明杨, 谢磊. Defence Technology (防务技术), SCIE EI CAS CSCD, 2021, Issue 4, pp. 1414-1422 (9 pages)
In the underwater waveguide, the conventional adaptive subspace detector (ASD), derived by using generalized likelihood ratio test (GLRT) theory, suffers from a significant degradation in detection performance when the samplings of training data are deficient. This paper proposes a dimension-reduced approach to alleviate this problem. The dimension reduction includes two steps: first, the full array is divided into several subarrays; second, the test data and the training data at each subarray are transformed from the hydrophone domain into the modal domain. The modal-domain test data and training data at each subarray are then processed to formulate the subarray statistic by using GLRT theory. The final test statistic of the dimension-reduced ASD (DR-ASD) is obtained by summing all the subarray statistics. After the dimension reduction, the unknown parameters can be estimated more accurately, so the DR-ASD achieves a better detection performance than the ASD. In order to achieve the optimal detection performance, the processing gain of the DR-ASD is derived to choose a proper number of subarrays. Simulation experiments verify the improved detection performance of the DR-ASD compared with the ASD.
Keywords: Underwater waveguide; Adaptive subspace detection; Dimension reduction; Processing gain
Multi-state Information Dimension Reduction Based on Particle Swarm Optimization-Kernel Independent Component Analysis
5
Authors: 邓士杰, 苏续军, 唐力伟, 张英波. Journal of Donghua University (English Edition), EI CAS, 2017, Issue 6, pp. 791-795 (5 pages)
The precision of the kernel independent component analysis (KICA) algorithm depends on the type and parameter values of the kernel function. It is therefore of great significance to study how to choose KICA's kernel parameters so as to improve its feature dimension reduction results. In this paper, a fitness function was first established using the idea of the Fisher discrimination function. Then the global optimal solution of the fitness function was searched by the particle swarm optimization (PSO) algorithm, and a multi-state information dimension reduction algorithm based on PSO-KICA was established. Finally, the ability of this algorithm to enhance the precision of feature dimension reduction was demonstrated.
Keywords: Kernel independent component analysis (KICA); Particle swarm optimization (PSO); Feature dimension reduction; Fitness function
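A rough sketch of the PSO-driven kernel-parameter search the abstract describes; since scikit-learn offers no KICA, KernelPCA is used here as a stand-in, and the Fisher-style fitness, swarm settings, and toy data are all assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA

# Toy labelled data standing in for multi-state condition-monitoring features.
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

def fisher_fitness(gamma):
    """Fitness in the spirit of a Fisher discrimination criterion: ratio of
    between-class scatter to within-class scatter after kernel projection."""
    Z = KernelPCA(n_components=3, kernel="rbf", gamma=gamma).fit_transform(X)
    overall = Z.mean(axis=0)
    s_b = s_w = 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        s_b += len(Zc) * np.sum((Zc.mean(axis=0) - overall) ** 2)
        s_w += np.sum((Zc - Zc.mean(axis=0)) ** 2)
    return s_b / s_w

# Minimal particle swarm over the kernel parameter gamma (all settings illustrative).
rng = np.random.default_rng(1)
pos = rng.uniform(1e-3, 1.0, size=8)          # particle positions (gamma values)
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fisher_fitness(g) for g in pos])
gbest = pbest[np.argmax(pbest_fit)]

for _ in range(15):
    r1, r2 = rng.random(8), rng.random(8)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 1e-3, 1.0)
    fit = np.array([fisher_fitness(g) for g in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmax(pbest_fit)]

print("selected kernel parameter gamma ~", round(float(gbest), 4))
```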
A Dimensional Reduction Approach Based on Essential Constraints in Linear Programming
6
Authors: Eirini I. Nikolopoulou, George S. Androulakis. American Journal of Operations Research, 2024, Issue 1, pp. 1-31 (31 pages)
This paper presents a new dimension reduction strategy for medium and large-scale linear programming problems. The proposed method uses a subset of the original constraints and combines two algorithms: the weighted average and the cosine simplex algorithm. The first approach identifies binding constraints by using the weighted average of each constraint, whereas the second algorithm is based on the cosine similarity between the vector of the objective function and the constraints. These two approaches are complementary, and when used together they locate the essential subset of initial constraints required for solving medium and large-scale linear programming problems. After the dimension of the linear programming problem is reduced using the subset of essential constraints, the solution method can be chosen from any suitable method for linear programming. The proposed approach was applied to a set of well-known benchmarks as well as more than 2000 random medium and large-scale linear programming problems. The results are promising, indicating that the new approach contributes to the reduction of both the size of the problems and the total number of iterations required. A tree-based classification model also confirmed the need for combining the two approaches. A detailed numerical example, the general numerical results, and the statistical analysis for the decision tree procedure are presented.
Keywords: Linear programming; Binding constraints; Dimension reduction; Cosine similarity; Decision analysis; Decision trees
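The cosine-similarity part of the strategy can be illustrated roughly as follows; the random LP, the number of retained constraints, and the feasibility check are assumptions for demonstration, not the paper's full weighted-average/cosine-simplex procedure.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Illustrative random LP:  maximize c^T x  subject to  A x <= b, x >= 0.
n_con, n_var = 200, 20
A = rng.random((n_con, n_var))
b = A.sum(axis=1) * rng.uniform(1.0, 2.0, n_con)
c = rng.random(n_var)

# Rank constraints by the cosine similarity between the objective vector and
# each constraint normal; keep only the most "aligned" candidates.
cosine = (A @ c) / (np.linalg.norm(A, axis=1) * np.linalg.norm(c))
keep = np.argsort(cosine)[-40:]            # retained subset (size is illustrative)

res = linprog(-c, A_ub=A[keep], b_ub=b[keep], bounds=[(0, None)] * n_var,
              method="highs")

# Check whether the solution of the reduced problem violates any dropped constraint.
violated = np.sum(A @ res.x > b + 1e-8)
print("reduced-LP objective:", -res.fun, "| violated original constraints:", violated)
```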
Many-objective Optimization Method Based on Dimension Reduction for Operation of Large-scale Cooling Energy Systems
7
Authors: Peng Zhu, Lixiao Wang, Cuiqing Wu, Jinyu Yu, Zhigang Li, Jiehui Zheng, Qing-Hua Wu. CSEE Journal of Power and Energy Systems, SCIE EI CSCD, 2023, Issue 3, pp. 884-895 (12 pages)
Large-scale cooling energy systems have developed well in the past decade. However, their optimization remains a problem to be tackled due to the nonlinearity and large scale of existing systems, and reducing the scale of the problem without oversimplifying the actual system model is a major challenge. This paper proposes a dimension reduction-based many-objective optimization (DRMO) method to solve an accurate nonlinear model of a practical large-scale cooling energy system. In the first stage, the many objectives and many variables of the large system are pre-processed to reduce the overall scale of the optimization problem: the relationships between the many objectives are analyzed to find a few representative objectives, and key control variables are extracted to reduce the dimension of the variables and the number of equality constraints. In the second stage, the many-objective group search optimization (GSO) method is used to solve the low-dimensional nonlinear model, and a Pareto front is obtained. In the final stage, candidate solutions along the Pareto front are graded on many-objective levels of system operators, and the candidate solution with the highest average utility value is selected as the best running mode. Simulations are carried out on a 619-node, 614-branch cooling system, and the results show the ability of the proposed method to solve large-scale system operation problems.
Keywords: Dimension reduction; Group search optimization; Large-scale cooling energy system; Many-objective optimization
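A toy sketch of the first-stage idea of finding a few representative objectives by analyzing the relationships between objectives; the correlation threshold and synthetic objective samples are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical objective samples: rows are candidate operating points of the
# cooling system, columns are the many objective values (all values synthetic).
n_points, n_obj = 500, 8
base = rng.random((n_points, 3))
mix = rng.random((3, n_obj))
objectives = base @ mix + 0.05 * rng.random((n_points, n_obj))

# Keep an objective only if it is not strongly correlated with one already kept,
# mirroring the idea of reducing many objectives to a few representative ones.
corr = np.corrcoef(objectives, rowvar=False)
kept = []
for j in range(n_obj):
    if all(abs(corr[j, k]) < 0.9 for k in kept):   # 0.9 threshold is illustrative
        kept.append(j)

print("representative objectives:", kept)
```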
Dimensionality reduction model based on integer planning for the analysis of key indicators affecting life expectancy
8
Authors: Wei Cui, Zhiqiang Xu, Ren Mu. Journal of Data and Information Science, CSCD, 2023, Issue 4, pp. 102-124 (23 pages)
Purpose: Exploring a dimensionality reduction model that can adeptly eliminate outliers and select the appropriate number of clusters is of profound theoretical and practical importance. Additionally, the interpretability of such models presents a persistent challenge. Design/methodology/approach: This paper proposes two innovative dimensionality reduction models based on integer programming (DRMBIP). These models assess compactness through the correlation of each indicator with its class center, while separation is evaluated by the correlation between different class centers. In contrast to DRMBIP-p, DRMBIP-v treats the threshold parameter as a variable, aiming to optimally balance compactness and separation. Findings: This study, using data from the Global Health Observatory (GHO), investigates 141 indicators that influence life expectancy. The findings reveal that DRMBIP-p effectively reduces the dimensionality of the data while ensuring compactness, and it remains compatible with other models. Additionally, DRMBIP-v finds the optimal result, showing exceptional separation. Visualization of the results reveals that all classes have high compactness. Research limitations: DRMBIP-p requires the input of a correlation threshold parameter, which plays a pivotal role in the effectiveness of the final dimensionality reduction results. In DRMBIP-v, turning the threshold parameter into a variable potentially emphasizes either separation or compactness, which necessitates an artificial adjustment to the overflow component within the objective function. Practical implications: The DRMBIP presented in this paper is adept at uncovering the primary geometric structures within high-dimensional indicators. Validated on life expectancy data, it demonstrates the potential to assist data miners with the reduction of data dimensions. Originality/value: To our knowledge, this is the first time that integer programming has been used to build a dimensionality reduction model with indicator filtering. It not only has applications in life expectancy, but also has obvious advantages in data mining work that requires precise class centers.
Keywords: Integer programming; Multidimensional data; Dimensionality reduction; Life expectancy
Dimensionality Reduction Using Optimized Self-Organized Map Technique for Hyperspectral Image Classification
9
Authors: S. Srinivasan, K. Rajakumar. Computer Systems Science & Engineering, SCIE EI, 2023, Issue 11, pp. 2481-2496 (16 pages)
High-dimensional hyperspectral image classification is a challenging task due to the spectral feature vectors: the high correlation between these features and the noise greatly affects classification performance. To overcome this, dimensionality reduction techniques are widely used. Traditional image processing applications have recently proposed numerous deep learning models; however, in hyperspectral image classification, the features of deep learning models are less explored. Thus, for efficient hyperspectral image classification, a depth-wise convolutional neural network is presented in this research work. To handle the dimensionality issue in the classification process, an optimized self-organized map model is employed using a water strider optimization algorithm. The network parameters of the self-organized map are optimized by the water strider optimization, which reduces the dimensionality issues and enhances the classification performance. Standard datasets such as Indian Pines and the University of Pavia (UP) are considered for experimental analysis. Existing dimensionality reduction methods such as Enhanced Hybrid-Graph Discriminant Learning (EHGDL), Local Geometric Structure Fisher Analysis (LGSFA), Discriminant Hyper-Laplacian Projection (DHLP), the Group-Based Tensor Model (GBTM), and Lower Rank Tensor Approximation (LRTA) are compared with the proposed optimized SOM model. The results confirm the superior performance of the proposed model, with 98.22% accuracy on the Indian Pines dataset and 98.21% accuracy on the University of Pavia dataset, over the existing maximum likelihood classifier and support vector machine (SVM).
Keywords: Hyperspectral image; Dimensionality reduction; Depth-wise separable model; Water strider optimization; Self-organized map
Dimension Reduction Based on Sampling
10
Authors: Zhuping Li, Donghua Yang, Mengmeng Li, Haifeng Guo, Tiansheng Ye, Hongzhi Wang. 《国际计算机前沿大会会议论文集》, EI, 2023, Issue 1, pp. 207-220 (14 pages)
Dimension reduction provides a powerful means of reducing the number of random variables under consideration. However, large datasets contain many similar tuples, so before reducing the dimension of a dataset we remove some similar tuples to retain the main information of the dataset while accelerating the dimension reduction. Accordingly, we propose a dimension reduction technique based on biased sampling, a new procedure that incorporates features of both dimension reduction and biased sampling to obtain a computationally efficient means of reducing the number of random variables under consideration. In this paper, we choose Principal Component Analysis (PCA) as the main dimension reduction algorithm to study, and we show how this approach works.
Keywords: PCA; Dimensional reduction; Biased sampling
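A minimal sketch combining sampling with PCA in the spirit of this abstract; the distance-based sampling weights are a stand-in for the paper's biased-sampling rule, and all data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Illustrative large dataset with many near-duplicate tuples.
core = rng.random((1000, 50))
data = np.vstack([core + 0.01 * rng.random(core.shape) for _ in range(20)])

# Biased-sampling stand-in: weight each tuple by its distance from the column
# means, so "typical" (highly similar) tuples are less likely to be drawn.
weights = np.linalg.norm(data - data.mean(axis=0), axis=1)
probs = weights / weights.sum()
idx = rng.choice(len(data), size=2000, replace=False, p=probs)
sample = data[idx]

# Fit PCA on the reduced sample, then project the full dataset.
pca = PCA(n_components=5).fit(sample)
reduced = pca.transform(data)
print(reduced.shape)
```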
Sufficient dimension reduction in the presence of controlling variables
11
Authors: Guoliang Fan, Liping Zhu. Science China Mathematics, SCIE CSCD, 2022, Issue 9, pp. 1975-1996 (22 pages)
We are concerned with partial dimension reduction for the conditional mean function in the presence of controlling variables. We suggest a profile least squares approach to perform partial dimension reduction for a general class of semi-parametric models. The asymptotic properties of the resulting estimates for the central partial mean subspace and the mean function are provided. In addition, a Wald-type test is proposed to evaluate a linear hypothesis of the central partial mean subspace, and a generalized likelihood ratio test is constructed to check whether the nonparametric mean function has a specific parametric form. These tests can be used to evaluate whether there exist interactions between the covariates and the controlling variables, and if so, in what form. A Bayesian information criterion (BIC)-type criterion is applied to determine the structural dimension of the central partial mean subspace, and its consistency is also established. Numerical studies through simulations and real data examples are conducted to demonstrate the power and utility of the proposed semi-parametric approaches.
Keywords: Central partial mean subspace; Controlling variable; Hypothesis test; Semi-parametric regression; Sufficient dimension reduction
Partial Dynamic Dimension Reduction for Conditional Mean in Regression
12
Authors: GAN Shengjin, YU Zhou. Journal of Systems Science & Complexity, SCIE EI CSCD, 2020, Issue 5, pp. 1585-1601 (17 pages)
In many regression analyses, the interest lies in the regression mean of the response given the predictors, rather than in the full conditional distribution. This paper is concerned with dimension reduction of the predictors in the sense of the mean function of the response conditional on the predictors. The authors introduce the notion of the partial dynamic central mean dimension reduction subspace; different from the central mean dimension reduction subspace, it has a varying subspace over the domain of the predictors, and its structural dimension may not be the same from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimation methods for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are given under regularity conditions. Simulations and real data analysis demonstrate that the methods are effective.
Keywords: Dynamic ordinary least squares estimate; Dynamic principal Hessian directions; Kernel estimate; Partial dimension reduction
Pseudo likelihood and dimension reduction for data with nonignorable nonresponse
13
Authors: Ji Chen, Bingying Xie, Jun Shao. Statistical Theory and Related Fields, 2018, Issue 2, pp. 196-205 (10 pages)
Tang et al. (2003. Analysis of multivariate missing data with nonignorable nonresponse. Biometrika, 90(4), 747-764) and Zhao & Shao (2015. Semiparametric pseudo-likelihoods in generalized linear models with nonignorable missing data. Journal of the American Statistical Association, 110(512), 1577-1590) proposed a pseudo likelihood approach to estimate unknown parameters in a parametric density of a response Y conditioned on a vector of covariates X, where Y is subject to nonignorable nonresponse, X is always observed, and the propensity of whether or not Y is observed conditioned on Y and X is completely unspecified. To identify the parameters, Zhao & Shao (2015) assumed that X can be decomposed into U and Z, where Z can be excluded from the propensity but is related to Y even conditioned on U. The pseudo likelihood involves the estimation of the joint density of U and Z. When this density is estimated nonparametrically, in this paper we apply sufficient dimension reduction to reduce the dimension of U for efficient estimation. Consistency and asymptotic normality of the proposed estimators are established. Simulation results are presented to study the finite sample performance of the proposed estimators.
Keywords: Dimension reduction; Kernel estimation; Nonignorable nonresponse; Nonresponse instrument; Pseudo likelihood
Quantile treatment effect estimation with dimension reduction
14
Authors: Ying Zhang, Lei Wang, Menggang Yu, Jun Shao. Statistical Theory and Related Fields, 2020, Issue 2, pp. 202-213 (12 pages)
Quantile treatment effects can be important causal estimands in the evaluation of biomedical treatments or interventions for health outcomes such as medical cost and utilisation. We consider their estimation in observational studies with many possible covariates, under the assumption that treatment and potential outcomes are independent conditional on all covariates. To obtain valid and efficient treatment effect estimators, we replace the set of all covariates by lower-dimensional sets for estimation of the quantiles of the potential outcomes. These lower-dimensional sets are obtained using sufficient dimension reduction tools and are outcome specific. We justify our choice from the efficiency point of view. We prove the asymptotic normality of our estimators, and our theory is complemented by simulation results and an application to data from the University of Wisconsin Health Accountable Care Organization.
Keywords: Causality; Efficiency bound; Propensity score; Quantile treatment effect; Sufficient dimension reduction
A selective overview of sparse sufficient dimension reduction
15
Authors: Lu Li, Xuerong Meggie Wen, Zhou Yu. Statistical Theory and Related Fields, 2020, Issue 2, pp. 121-133 (13 pages)
High-dimensional data analysis has been a challenging issue in statistics. Sufficient dimension reduction aims to reduce the dimension of the predictors by replacing the original predictors with a minimal set of their linear combinations without loss of information. However, the estimated linear combinations generally involve all of the variables, making them difficult to interpret. To circumvent this difficulty, sparse sufficient dimension reduction methods were proposed to conduct model-free variable selection or screening within the framework of sufficient dimension reduction. We review the current literature on sparse sufficient dimension reduction and provide some further investigation in this paper.
Keywords: Minimax rate; Sparse sufficient dimension reduction; Variable selection; Variable screening
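As background for the sparse methods surveyed here, the following sketch implements plain (non-sparse) sliced inverse regression, a classical sufficient dimension reduction estimator; the toy single-index data and slice settings are illustrative and not taken from this review.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=2):
    """Basic (non-sparse) sliced inverse regression: estimate directions spanning
    the central subspace by eigen-decomposing the covariance of slice means of
    the standardized predictors."""
    n, p = X.shape
    mean, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    # Standardize predictors: Z = (X - mean) @ cov^{-1/2}.
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mean) @ inv_sqrt
    # Slice by the response and average Z within each slice.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for chunk in np.array_split(order, n_slices):
        m = Z[chunk].mean(axis=0)
        M += (len(chunk) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original predictor scale.
    _, vecs = np.linalg.eigh(M)
    return inv_sqrt @ vecs[:, ::-1][:, :n_directions]

# Toy single-index model: y depends on X only through one linear combination.
rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 6))
beta = np.array([1.0, -1.0, 0.5, 0.0, 0.0, 0.0])
y = np.tanh(X @ beta) + 0.1 * rng.normal(size=2000)

d = sliced_inverse_regression(X, y, n_directions=1).ravel()
print((d / np.linalg.norm(d)).round(2))   # should be close to +/- beta / ||beta||
```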
Bridging cognitive gaps between user and model in interactive dimension reduction
16
Authors: Ming Wang, John Wenskovitch, Leanna House, Nicholas Polys, Chris North. Visual Informatics, EI, 2021, Issue 2, pp. 13-25 (13 pages)
Interactive machine learning (ML) systems are difficult to design because of the "Two Black Boxes" problem that exists at the interface between human and machine. Many algorithms used in interactive ML systems are black boxes that are presented to users, while human cognition represents a second black box that can be difficult for the algorithm to interpret. These black boxes create cognitive gaps between the user and the interactive ML model. In this paper, we identify several cognitive gaps that exist in a previously developed interactive visual analytics (VA) system, Andromeda, but are also representative of common problems in other VA systems. Our goal with this work is to open both black boxes and bridge these cognitive gaps by making usability improvements to the original Andromeda system. These include designing new visual features to help people better understand how Andromeda processes and interacts with data, as well as improving the underlying algorithm so that the system can better implement the intent of the user during the data exploration process. We evaluate our designs through both qualitative and quantitative analysis, and the results confirm that the improved Andromeda system outperforms the original version in a series of high-dimensional data analysis tasks.
Keywords: Interactive machine learning; Visual analytics; Dimension reduction; Usability; Cognitive gaps
Comparison of dimension reduction methods for DEA under big data via Monte Carlo simulation
17
Authors: Zikang Chen, Song Han. Journal of Management Science and Engineering, 2021, Issue 4, pp. 363-376 (14 pages)
Data with large dimensions bring various problems to the application of data envelopment analysis (DEA). In this study, we focus on a "big data" problem related to the considerably large dimensions of the input-output data. The four most widely used approaches to guide dimension reduction in DEA are compared via Monte Carlo simulation: principal component analysis (PCA-DEA), which is based on the idea of aggregating inputs and outputs; efficiency contribution measurement (ECM); the average efficiency measure (AEC); and regression-based detection (RB), which is based on the idea of variable selection. We compare the performance of these methods under different scenarios using a brand-new comparison benchmark for the simulation test. In addition, we discuss the effect of initial variable selection in RB for the first time. Based on the results, we offer more reliable guidelines on how to choose an appropriate method.
Keywords: Data envelopment analysis; Big data; Data dimension reduction method
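The PCA-DEA idea of aggregating inputs and outputs can be sketched as follows; the data are synthetic, the 90% variance threshold is an assumption, and the subsequent DEA linear programs are not solved here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# Illustrative DEA data: 50 decision-making units, 8 inputs, 6 outputs.
inputs = rng.random((50, 8)) + 0.5
outputs = rng.random((50, 6)) + 0.5

# PCA-DEA idea: aggregate correlated inputs/outputs into a few principal
# components before running DEA, shrinking the dimension of the data.
pca_in = PCA(n_components=0.90).fit(inputs)     # keep 90% of input variance
pca_out = PCA(n_components=0.90).fit(outputs)   # keep 90% of output variance

agg_inputs = pca_in.transform(inputs)
agg_outputs = pca_out.transform(outputs)
print("inputs:", inputs.shape[1], "->", agg_inputs.shape[1],
      "| outputs:", outputs.shape[1], "->", agg_outputs.shape[1])
# The reduced matrices would then enter a DEA model (solving the DEA LPs is
# omitted here; PCA scores typically need a translation step to stay non-negative).
```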
Dimension reduction graph-based sparse subspace clustering for intelligent fault identification of rolling element bearings
18
Authors: Le Zhao, Shaopu Yang, Yongqiang Liu. International Journal of Mechanical System Dynamics, 2021, Issue 2, pp. 207-219 (13 pages)
Sparse subspace clustering (SSC) is a spectral clustering methodology. Since high-dimensional data are often dispersed over the union of many low-dimensional subspaces, their representation in a suitable dictionary is sparse, which makes SSC an effective technology for diagnosing mechanical system faults. Its main purpose is to create a representation model that can reveal the real subspace structure of high-dimensional data, construct a similarity matrix by using the sparse representation coefficients of the high-dimensional data, and then cluster the obtained representation coefficients and similarity matrix in subspace. However, the SSC algorithm is based on a global expression in which each data point is represented by all possible cluster data points. This leads to nonzero terms in the nondiagonal blocks of the similarity matrix, which reduces its recognition performance. To improve the clustering ability of SSC for rolling element bearings and the robustness of the algorithm in the presence of strong background noise, a simultaneous dimensionality reduction subspace clustering technique is provided in this work. Through feature extraction of the envelope signal, the dimension of the feature matrix is reduced by singular value decomposition, and the Euclidean distance between samples is replaced by the correlation distance. A dimension reduction graph-based SSC technique is thereby established. Simulations and bearing data from Case Western Reserve University show that the proposed algorithm can improve the accuracy and compactness of clustering.
Keywords: Correlation distance; Dimension reduction; Sparse subspace clustering
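A simplified stand-in for the pipeline the abstract describes (SVD-based dimension reduction, correlation distance, spectral clustering); the synthetic feature matrix and the affinity construction are assumptions, and the full sparse-representation step of SSC is omitted.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(6)

# Hypothetical envelope-spectrum feature matrix: rows are bearing samples from
# three fault conditions, columns are spectral features (values are synthetic).
features = np.vstack([rng.normal(loc=m, scale=0.3, size=(40, 200))
                      for m in (0.0, 1.0, 2.0)])

# Step 1: reduce dimension with a truncated SVD of the centered feature matrix.
U, s, Vt = np.linalg.svd(features - features.mean(axis=0), full_matrices=False)
reduced = U[:, :10] * s[:10]                      # keep the 10 leading components

# Step 2: replace Euclidean distance by correlation distance between samples,
# and turn it into an affinity matrix for clustering.
corr = np.corrcoef(reduced)
affinity = np.clip((corr + 1.0) / 2.0, 0.0, 1.0)  # map correlation to [0, 1]

# Step 3: spectral clustering on the affinity (a stand-in for the full SSC
# optimization, which builds the affinity from sparse representation coefficients).
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(np.bincount(labels))
```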
Adaptive Metric Learning for Dimensionality Reduction
19
Authors: Lihua Chen, Peiwen Wei, Zhongzhen Long, Yufeng Yu. Journal of Computer and Communications, 2022, Issue 12, pp. 95-112 (18 pages)
Finding a suitable space is one of the most critical problems for dimensionality reduction. Each space corresponds to a distance metric defined on the sample attributes, so finding a suitable space can be converted into developing an effective distance metric. Most existing dimensionality reduction methods use a fixed, pre-specified distance metric. However, this simple treatment has limitations in practice, because the pre-specified metric does not guarantee that the closest samples are the truly similar ones. In this paper, we present an adaptive metric learning method for dimensionality reduction, called AML. The adaptive metric learning model is developed by maximizing the difference between the distances of data pairs in cannot-links and those in must-links. Unlike many existing papers that use the traditional Euclidean distance, we use the more general l2,p-norm distance to reduce sensitivity to noise and outliers, which provides additional flexibility and adaptability through the selection of appropriate p-values for different data sets. Moreover, traditional metric learning methods usually project samples into a linear subspace, which is overly restrictive, so we extend the basic linear method to a more powerful nonlinear kernel case that can capture the complex nonlinear relationships between data. To solve our objective, we derive an efficient iterative algorithm. Extensive experiments on dimensionality reduction are provided to demonstrate the superiority of our method over state-of-the-art approaches.
Keywords: Adaptive learning; Kernel learning; Dimension reduction; Pairwise constraints
Equation governing the probability density evolution of multi-dimensional linear fractional differential systems subject to Gaussian white noise
20
Authors: Yi Luo, Meng-Ze Lyu, Jian-Bing Chen, Pol D. Spanos. Theoretical & Applied Mechanics Letters, CAS CSCD, 2023, Issue 3, pp. 199-208 (10 pages)
Stochastic fractional differential systems are important and useful in mathematics, physics, and engineering. However, the determination of their probabilistic responses is difficult due to their non-Markovian property. The recently developed globally-evolving-based generalized density evolution equation (GE-GDEE), a unified partial differential equation (PDE) governing the transient probability density function (PDF) of a generic path-continuous process, including non-Markovian ones, provides a feasible tool to solve this problem. In this paper, the GE-GDEE for multi-dimensional linear fractional differential systems subject to Gaussian white noise is established. In particular, it is proved that in the GE-GDEE corresponding to the state quantities of interest, the intrinsic drift coefficient is a time-varying linear function that can be determined analytically. In this sense, an alternative low-dimensional equivalent linear integer-order differential system with exact closed-form coefficients can be constructed for the original high-dimensional linear fractional differential system such that their transient PDFs are identical. Specifically, for a multi-dimensional linear fractional differential system, if only one or two quantities are of interest, the GE-GDEE is only one- or two-dimensional, and the surrogate system is a one- or two-dimensional linear integer-order system. Several examples are studied to assess the merit of the proposed method. Although the closed-form intrinsic drift coefficient is presently only available for linear stochastic fractional differential systems, the findings of this paper provide a remarkable demonstration of the existence and eligibility of the GE-GDEE when the original high-dimensional system itself is non-Markovian, and offer insights for the physical-mechanism-informed determination of the intrinsic drift and diffusion coefficients of the GE-GDEE for more general complex nonlinear systems.
Keywords: Globally-evolving-based generalized density evolution equation (GE-GDEE); Linear fractional differential system; Non-Markovian system; Analytical intrinsic drift coefficient; Dimension reduction