An edge coloring of a hypergraph H is a function on its edge set such that any two intersecting edges receive different colors. The minimum number of colors over all edge colorings of H is called the chromatic index of H and is denoted by χ′(H). Erdős, Faber and Lovász proposed the famous conjecture that χ′(H) ≤ n holds for any loopless linear hypergraph H with n vertices. In this paper, we show that the conjecture is true for gap-restricted hypergraphs. Our result extends a result of Alesandroni in 2021.
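As a concrete, illustrative aside (not taken from the paper), the following Python sketch brute-forces the chromatic index of a small hypergraph given as a list of edges: it searches for the fewest colors such that intersecting edges never receive the same color.

from itertools import product

def is_proper(edges, coloring):
    # A coloring is proper if any two intersecting edges get different colors.
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            if edges[i] & edges[j] and coloring[i] == coloring[j]:
                return False
    return True

def chromatic_index(edges):
    # Try k = 1, 2, ... colors until some assignment of colors to edges is proper.
    for k in range(1, len(edges) + 1):
        for coloring in product(range(k), repeat=len(edges)):
            if is_proper(edges, coloring):
                return k
    return 0

# Example: a loopless linear hypergraph on n = 4 vertices (any two edges meet in at most one vertex).
edges = [{1, 2}, {2, 3}, {3, 4}, {1, 4}, {1, 3}]
print(chromatic_index(edges))  # the Erdős-Faber-Lovász conjecture predicts a value of at most n = 4 here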
The structure and function of brain networks are altered in patients with end-stage renal disease (ESRD). Manifold regularization (MR) only considers the pairwise relationship between two brain regions and cannot represent functional interactions or higher-order relationships among multiple brain regions. To solve this issue, we developed a method to construct a dynamic brain functional network (DBFN) based on dynamic hypergraph MR (DHMR) and applied it to the classification of ESRD associated with mild cognitive impairment (ESRDaMCI). The construction of the DBFN with Pearson's correlation (PC) was transformed into an optimization model. Node convolution and hyperedge convolution superposition were adopted to dynamically modify the hypergraph structure, and the resulting dynamic hypergraph was used to form the manifold regularization terms. The DHMR and L_(1) norm regularization were introduced into the PC-based optimization model to obtain the final DHMR-based DBFN (DDBFN). Experimental results demonstrate the validity of the DDBFN method by comparing its classification results with those of several related brain functional network construction methods. Our work not only achieves better classification performance but also reveals the discriminative regions of ESRDaMCI, providing a reference for clinical research and auxiliary diagnosis of concomitant cognitive impairments.
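As background for the Pearson-correlation starting point described in this abstract (an illustrative sketch, not the authors' code), the snippet below builds a plain functional network from simulated regional time series; the DDBFN method replaces this direct estimate with an optimization model that adds dynamic hypergraph manifold regularization and an L_(1) penalty. The sizes and the threshold lam are assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 90, 200                      # hypothetical parcellation and scan length
ts = rng.standard_normal((n_timepoints, n_regions))    # simulated BOLD signals

# Pearson's correlation between every pair of regional time series.
pc_network = np.corrcoef(ts, rowvar=False)             # shape (n_regions, n_regions)
np.fill_diagonal(pc_network, 0.0)

# A simple L1-style sparsification (soft threshold) as a stand-in for the regularized
# optimization used by DDBFN; the value of lam is a hypothetical choice.
lam = 0.2
sparse_network = np.sign(pc_network) * np.maximum(np.abs(pc_network) - lam, 0.0)
print(sparse_network.shape, float(np.mean(sparse_network != 0)))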
Deep matrix factorization (DMF) has been demonstrated to be a powerful tool for capturing the complex hierarchical information of multi-view data. However, existing multi-view DMF methods mainly explore the consistency of multi-view data while neglecting the diversity among different views as well as the high-order relationships of the data, resulting in the loss of valuable complementary information. In this paper, we design a hypergraph regularized diverse deep matrix factorization (HDDMF) model for multi-view data representation, which jointly exploits multi-view diversity and a high-order manifold in a multi-layer factorization framework. A novel diversity enhancement term is designed to exploit the structural complementarity between different views of the data. Hypergraph regularization is utilized to preserve the high-order geometric structure of the data in each view. An efficient iterative optimization algorithm is developed to solve the proposed model, with theoretical convergence analysis. Experimental results on five real-world data sets demonstrate that the proposed method significantly outperforms state-of-the-art multi-view learning approaches.
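To make the hypergraph regularization term concrete (a minimal sketch under our own assumptions, not the HDDMF implementation), the code below builds a k-nearest-neighbor hypergraph over data points and forms the normalized hypergraph Laplacian L = I - Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2); its quadratic form tr(V^T L V) is the kind of high-order smoothness penalty that can be applied to the representation learned in each view.

import numpy as np

def knn_hypergraph_laplacian(X, k=5):
    # One hyperedge per sample: the sample together with its k nearest neighbors.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)     # pairwise squared distances
    H = np.zeros((n, n))                                     # incidence matrix (vertices x hyperedges)
    for e in range(n):
        H[np.argsort(d2[e])[:k + 1], e] = 1.0
    w = np.ones(n)                                           # unit hyperedge weights
    Dv = H @ w                                               # vertex degrees
    De = H.sum(axis=0)                                       # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    Theta = Dv_inv_sqrt @ H @ np.diag(w / De) @ H.T @ Dv_inv_sqrt
    return np.eye(n) - Theta                                 # normalized hypergraph Laplacian

X = np.random.default_rng(1).standard_normal((50, 8))       # toy data from one view
L = knn_hypergraph_laplacian(X, k=5)
V = np.random.default_rng(2).standard_normal((50, 4))        # a toy low-dimensional representation
print(float(np.trace(V.T @ L @ V)))                          # the smoothness penalty tr(V^T L V)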
Traffic prediction is a necessary function in intelligent transportation systems to alleviate traffic congestion. Graph learning methods mainly focus on the spatiotemporal dimension but ignore the nonlinear movement of traffic and the high-order relationships among various kinds of road segments. Two issues remain: 1) deep integration of spatiotemporal information and 2) global spatial dependencies for structural properties. To address these issues, we propose a nonlinear spatiotemporal optimization method that introduces hypergraph convolution networks (HGCN). The method utilizes the higher-order spatial features of the road network captured by HGCN and dynamically integrates them with historical data to weigh the influence of spatiotemporal dependencies. On this basis, an extended Kalman filter is used to improve the accuracy of traffic prediction. In this study, a set of experiments was conducted on a real-world dataset from Chengdu, China. The results show that the proposed method is feasible and accurate at two different time steps. In particular, at the 15-minute time step, the proposed method achieved 3.0%, 11.7%, and 9.0% improvements in RMSE, MAE, and MAPE, respectively, over the second-best method.
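The role of the filtering stage can be illustrated with a simplified, linear one-step Kalman update (our own sketch, not the paper's extended Kalman filter): a model prediction of segment speeds, for example from an HGCN-based predictor, is fused with a noisy observation according to their uncertainties. All numbers are hypothetical.

import numpy as np

def kalman_update(x_pred, P_pred, z, R):
    # x_pred: predicted segment speeds; P_pred: covariance of that prediction
    # z: observed speeds; R: observation noise covariance
    H = np.eye(len(x_pred))                      # the state is observed directly
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)        # corrected estimate
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

x_pred = np.array([42.0, 37.5, 50.1])            # hypothetical predicted speeds (km/h) for 3 segments
P_pred = np.diag([4.0, 4.0, 4.0])
z = np.array([40.0, 39.0, 48.0])                 # sensor observations
R = np.diag([1.0, 1.0, 1.0])
x_new, _ = kalman_update(x_pred, P_pred, z, R)
print(x_new)   # estimates pulled toward the observations, weighted by uncertainty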
Live Virtual Machine (VM) migration is one of the foremost techniques for improving the proficiency of Cloud Data Centers (CDC), as it leads to better resource usage. Because the workload of a CDC is often dynamic in nature, it is better to forecast the upcoming workload for early detection of overload and underload status and to trigger migration at an appropriate point when enough resources are available. Though various statistical and machine learning approaches are widely applied for resource usage prediction, they often fail to handle the increase of non-linear CDC data. To overcome this issue, a novel Hypergraph based Convolutional Deep Bi-Directional Long Short Term Memory (CDB-LSTM) model is proposed. The CDB-LSTM adopts the Helly property of hypergraphs and a Savitzky-Golay (SG) filter to select informative samples and exclude noisy inferences and outliers. The proposed approach optimizes resource usage prediction and reduces the number of migrations with minimal computational complexity during live VM migration. Further, the proposed prediction approach uses the correlation coefficient measure to select the appropriate destination server for VM migration. The Hypergraph based CDB-LSTM was validated using the Google cluster dataset and compared with state-of-the-art approaches in terms of various evaluation metrics.
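As a hedged illustration of the Savitzky-Golay smoothing step mentioned above (not the authors' pipeline), the snippet below smooths a synthetic CPU-usage trace of the kind that would be fed to a predictor such as CDB-LSTM; the window length and polynomial order are assumptions.

import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.arange(288)                                    # e.g., one day of 5-minute samples
cpu_usage = 50 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size)  # synthetic trace

# Savitzky-Golay filter: fit a low-order polynomial in a sliding window to suppress noise
# while preserving the shape of the workload curve.
smoothed = savgol_filter(cpu_usage, window_length=21, polyorder=3)
print(float(np.std(cpu_usage - smoothed)))            # rough measure of the removed noise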
Deep learning (DL) has shown superior performance in dealing with various computer vision tasks in recent years. As a simple and effective DL model, the autoencoder (AE) is popularly used to decompose hyperspectral images (HSIs) due to its powerful ability for feature extraction and data reconstruction. However, most existing AE-based unmixing algorithms usually ignore the spatial information of HSIs. To solve this problem, a hypergraph regularized deep autoencoder (HGAE) is proposed for unmixing. Firstly, the traditional AE architecture is specifically improved as an unsupervised unmixing framework. Secondly, hypergraph learning is employed to reformulate the loss function, which facilitates the expression of high-order similarity among locally neighboring pixels and promotes the consistency of their abundances. Moreover, the L_(1/2) norm is further used to enhance abundance sparsity. Finally, experiments on simulated data, real hyperspectral remote sensing images, and textile cloth images verify that the proposed method performs better than several state-of-the-art unmixing algorithms.
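The effect of the L_(1/2) sparsity term can be seen in a tiny numerical example (our own sketch, independent of the HGAE code): for abundance vectors that sum to one, the L_(1/2) penalty is smallest when the abundance mass concentrates on few endmembers.

import numpy as np

def l_half_penalty(A):
    # Sum of square roots of the (non-negative) abundances; smaller means sparser.
    return float(np.sum(np.sqrt(np.clip(A, 0.0, None))))

sparse_abundance = np.array([0.9, 0.1, 0.0, 0.0])     # pixel dominated by one endmember
mixed_abundance = np.array([0.25, 0.25, 0.25, 0.25])  # fully mixed pixel
print(l_half_penalty(sparse_abundance), l_half_penalty(mixed_abundance))
# The sparse vector incurs the lower penalty, which is why the L_(1/2) term promotes sparsity.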
A new branch of hypergraph theory, directed hypergraph theory, and a kind of new methods, the decomposition contraction (DCP, PDCP and GDC) methods, are presented for solving hypernetwork problems. Their computing time is lower than that of the ECP method by several orders of magnitude.
We employ a graph parameter, the rupture degree, to measure the vulnerability of a k-uniform hypergraph G^k. For the k-uniform hypergraph G^k underlying a non-complete graph G = (V, E), its rupture degree r(G^k) is defined as r(G^k) = max{ω(G^k - X) - |X| - m(G^k - X) : X ⊂ V(G^k), ω(G^k - X) > 1}, where X is a cut set (or destruction strategy) of G^k, and ω(G^k - X) and m(G^k - X) denote the number of components and the order of a largest component in G^k - X, respectively. It is shown that this parameter can be used to measure the vulnerability of networks. In this paper, the rupture degrees of several specific classes of k-uniform hypergraphs are determined.
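The definition of r(G^k) lends itself to a direct brute-force computation on very small instances. The sketch below (illustrative only, not from the paper) evaluates the rupture degree of a 2-uniform hypergraph, i.e. an ordinary graph, given by its vertex and edge lists.

from itertools import combinations

def components(vertices, edges):
    # Connected components of the hypergraph induced on `vertices`.
    comps, seen = [], set()
    adj = {v: set() for v in vertices}
    for e in edges:
        e = set(e) & set(vertices)
        for u in e:
            adj[u] |= e - {u}
    for v in vertices:
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def rupture_degree(vertices, edges):
    best = None
    for r in range(1, len(vertices)):                       # try every proper cut set X
        for X in combinations(vertices, r):
            rest = [v for v in vertices if v not in X]
            comps = components(rest, edges)
            if len(comps) > 1:                              # X must disconnect the hypergraph
                val = len(comps) - len(X) - max(len(c) for c in comps)
                best = val if best is None else max(best, val)
    return best

# A path on 5 vertices, viewed as a 2-uniform hypergraph.
V = [1, 2, 3, 4, 5]
E = [{1, 2}, {2, 3}, {3, 4}, {4, 5}]
print(rupture_degree(V, E))   # removing {2, 4} gives 3 components of order 1 each: 3 - 2 - 1 = 0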
The product functional configuration (PFC) is typically used by firms to satisfy the individual requirements of customers and is realized based on market analysis. This study aims to help firms analyze functions and realize functional configurations using patent data. The study first proposes a patent-data-driven PFC method based on a hypergraph network. It then constructs a weighted network model to optimize the combination of product function quantity and object from the perspective of big data, as follows: (1) the functional knowledge contained in the patents is extracted; (2) the functional hypergraph is constructed based on the co-occurrence relationship between patents and applicants; (3) the function and patent weights are calculated from the perspective of the patent applicant and the patent value; (4) a weight calculation model of the PFC is developed; (5) the weighted frequent subgraph algorithm is used to obtain the optimal function combination list. This method is applied to the innovative design process of a bathroom shower. The results indicate that this method can help firms identify optimal function candidates and develop a multifunctional product.
This paper discusses the features and relevant theories of a GIS spatial data model based on hypergraphs. The integrated concept model based on hypergraphs and the object-oriented model (HOOM) is proposed by the authors. The principal contribution of this paper is the study of the K-section and other theories of hypergraphs. An application example using HOOM is given at the end of the paper.
To overcome the limitation of traditional clustering algorithms, which fail to produce meaningful clusters in high-dimensional, sparse, and binary-valued data sets, a new method based on a hypergraph model is proposed. The hypergraph model maps the relationships present in the original high-dimensional data into a hypergraph. A hyperedge represents the similarity of the attribute-value distribution between two points. A hypergraph partitioning algorithm is used to find a partitioning of the vertices such that the corresponding data items in each partition are highly related and the weight of the hyperedges cut by the partitioning is minimized. The quality of the clustering result can be evaluated by applying the intra-cluster singularity value. Analysis and experimental results demonstrate that this approach is applicable and effective across a wide range of settings.
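A minimal sketch of the modeling step (our own illustration, not the paper's system): each attribute value becomes a hyperedge containing the data points that share it, and a candidate 2-way partition can be scored by the total weight of the hyperedges it cuts. The support-based hyperedge weights are an assumption.

import numpy as np

# Binary data: rows are points, columns are attributes (1 = attribute present).
X = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]])

# One hyperedge per attribute: the set of points possessing that attribute.
hyperedges = [set(np.flatnonzero(X[:, j])) for j in range(X.shape[1])]
weights = [len(e) for e in hyperedges]          # simple support-based weights

def cut_weight(partition, hyperedges, weights):
    # A hyperedge is cut if its points do not all fall on one side of the partition.
    side = {p: s for s, part in enumerate(partition) for p in part}
    return sum(w for e, w in zip(hyperedges, weights)
               if len({side[p] for p in e}) > 1)

good = [{0, 1, 2}, {3, 4}]                      # groups points with similar attribute profiles
bad = [{0, 3}, {1, 2, 4}]
print(cut_weight(good, hyperedges, weights), cut_weight(bad, hyperedges, weights))
# The partition that keeps similar points together cuts fewer (and lighter) hyperedges.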
In order to guarantee wireless multicast throughput at a minimum cost, we propose a layered hypergraph high-dimension clustering algorithm (LayerHC) that considers the channels and statistical locations of mobile members. The algorithm can achieve a minimum multicast spanning tree, obtaining a minimum number of relays and effective cooperative areas with low computational complexity.
Cloud storage is distributed and virtual in nature, which separates the ownership and management rights of users' data. The master-slave architecture of cloud storage suffers from a single point of failure. In this paper, we provide a cloud storage architecture model based on semantic equivalence. According to the degree of semantic matching, this architecture divides the nodes into node clusters by creating a semantic tree and maintains system routing through a semantic hypergraph. Simulation experiments show that dividing the network by semantics can enhance the scalability and flexibility of the system, improve the efficiency of network organization and the security of the cloud storage system, and at the same time reduce cloud data storage and reading delay.
The relations among the dominating number, independence number and covering number of hypergraphs are investigated. The main results are as follows: Dv(H) ≤ min{α(H), p(H), T(H)}; De(H) ≤ min{v(H), T(H), p(H)}; DT(H) ≤ αT(H); S(H) ≤ Dv(H) + α(H) ≤ n; 2 ≤ Dv(H) + T(H) ≤ n; 2 < Dv(H) + v(H) ≤ n/2 + [n/r]; Dv(H) + p(H) ≤ n; 2 ≤ De(H) + Dv(H) ≤ n/2 + [n/r]; α(H) + De(H) ≤ n; 2 ≤ De(H) + v(H) ≤ 2[n/r]; 2 ≤ De(H) + p(H) ≤ n - r + 2.
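For readers unfamiliar with these parameters, the sketch below brute-forces the vertex dominating number Dv(H) of a small hypergraph. It is illustrative only and uses a common definition (a vertex set D dominates H if every vertex outside D lies in an edge that meets D), which may differ in detail from the paper's conventions.

from itertools import combinations

def dominating_number(vertices, edges):
    # Smallest |D| such that every vertex outside D lies in an edge that meets D.
    for size in range(1, len(vertices) + 1):
        for D in combinations(vertices, size):
            D = set(D)
            if all(any(v in e and (e & D) for e in edges) for v in set(vertices) - D):
                return size
    return len(vertices)

V = [1, 2, 3, 4, 5, 6]
E = [{1, 2, 3}, {3, 4, 5}, {5, 6, 1}]     # a small 3-uniform hypergraph
print(dominating_number(V, E))            # 2; for example, {3, 5} dominates every other vertex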
Fog computing is a new paradigm supporting the stringent requirements of mobility applications by bridging cloud computing and smart devices. Since the smart devices may be deployed in dynamic areas that are out of strict monitoring and protection, fog computing requires security protections to ensure confidentiality and integrity. In this article, to deal with these security requirements and considering the distinctive features of fog computing, a key management scheme based on hypergraphs is designed. Firstly, based on the key hypergraph, the three-level hierarchical architecture of fog computing is divided into two subnetworks. Furthermore, the key management process of each subnetwork is designed to satisfy the operational and security requirements of fog computing. Finally, performance evaluation and numerical simulation are provided to validate the proposed scheme.
Hyperspectral unmixing aims to acquire the pure spectra of distinct substances (endmembers) and their fractional abundances from highly mixed pixels. In this paper, a deep unmixing network framework is designed to deal with noise disturbance. It contains two parts: a three-dimensional convolutional autoencoder (denoising 3D CAE), which recovers data from noisy input, and a restrictive non-negative sparse autoencoder (NNSAE), which incorporates a hypergraph regularizer as well as an l2,1-norm sparsity constraint to improve the unmixing performance. The deep denoising 3D CAE network was constructed for noisy data retrieval and has a strong capacity for efficiently extracting the principal and robust local features in the spatial and spectral domains by training with corrupted data. Furthermore, a part-based non-negative sparse autoencoder with an l2,1-norm penalty is concatenated, and a hypergraph regularizer is designed elaborately to represent the similarity of neighboring pixels in the spatial dimensions. Comparative experiments were conducted on synthetic and real-world data, both of which demonstrate the effectiveness and robustness of the proposed network.
Suppose an independent coin with equal probability of success and failure is tossed for each subset of [n] = {1, 2, ..., n}, and form the random hypergraph H(n) by taking as hyperedges the subsets with successful coin tosses. It is proved that H(n) is almost surely connected. By defining a graph G(S) according to a subset system S, it is shown that the intersecting problem is NP-complete.
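A quick Monte Carlo sketch of the model (ours, for intuition only; it restricts attention to subsets of size at least two, since smaller subsets cannot join vertices): every such subset of [n] is kept as a hyperedge with probability 1/2, and we estimate how often the resulting hypergraph H(n) is connected.

import random
from itertools import combinations

def random_hypergraph(n):
    # Each subset of [n] of size >= 2 becomes a hyperedge independently with probability 1/2.
    subsets = [s for r in range(2, n + 1) for s in combinations(range(1, n + 1), r)]
    return [set(s) for s in subsets if random.random() < 0.5]

def is_connected(n, edges):
    if n <= 1:
        return True
    reached, frontier = {1}, {1}
    while frontier:
        frontier = {v for e in edges if e & frontier for v in e} - reached
        reached |= frontier
    return len(reached) == n

n, trials = 6, 2000
hits = sum(is_connected(n, random_hypergraph(n)) for _ in range(trials))
print(hits / trials)   # already close to 1 for small n, consistent with almost-sure connectivity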
semantics information while maintaining spatial detail contexts. Long-range context information plays a crucial role in this scenario. However, the traditional convolution kernel only provides a local and small receptive field. To address the problem, we propose a plug-and-play module aggregating both local and global information (aka the LGIA module) to capture the high-order relationships between nodes that are far apart. We incorporate both local and global correlations into a hypergraph, which is able to capture high-order relationships between nodes via the concept of a hyperedge connecting a subset of nodes. The local correlation considers neighborhood nodes that are spatially adjacent and similar in the same CNN feature maps of a magnetic resonance (MR) image; the global correlation is searched from a batch of CNN feature maps of MR images in feature space. The influence of these two correlations on semantic segmentation is complementary. We validated our LGIA module on various CNN segmentation models with a cardiac MR image dataset. Experimental results demonstrate that our approach outperformed several baseline models.
The celebrated Erdős-Ko-Rado theorem states that, given n ≥ 2k, every intersecting k-uniform hypergraph G on n vertices has at most C(n-1, k-1) edges. This paper states spectral versions of the Erdős-Ko-Rado theorem: let G be an intersecting k-uniform hypergraph on n vertices with n ≥ 2k. Then sharp upper bounds for the spectral radii of Aα(G) and Q*(G) are presented, where Aα(G) = αD(G) + (1-α)A(G) is a convex linear combination of the degree diagonal tensor D(G) and the adjacency tensor A(G) for 0 ≤ α < 1, and Q*(G) is the incidence Q-tensor. Furthermore, when n > 2k, the extremal hypergraphs that attain the sharp upper bounds are characterized. The proof mainly relies on the Perron-Frobenius theorem for nonnegative tensors and the property of maximizing connected hypergraphs.
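In the ordinary graph case (k = 2), the tensor Aα(G) reduces to the familiar matrix αD(G) + (1-α)A(G). The sketch below (for intuition only, not the paper's tensor computation) evaluates its spectral radius for two intersecting 2-uniform hypergraphs on 5 vertices: the star, which attains the maximum of n - 1 = C(n-1, k-1) edges, and the triangle with isolated vertices.

import numpy as np

def a_alpha_spectral_radius(n, edges, alpha):
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    D = np.diag(A.sum(axis=1))
    A_alpha = alpha * D + (1 - alpha) * A        # matrix form of A_alpha(G) for k = 2
    return float(max(abs(np.linalg.eigvalsh(A_alpha))))

n, alpha = 5, 0.25
star = [(0, 1), (0, 2), (0, 3), (0, 4)]          # intersecting family with n - 1 edges
triangle = [(0, 1), (1, 2), (0, 2)]              # the other intersecting pattern for k = 2
print(a_alpha_spectral_radius(n, star, alpha),
      a_alpha_spectral_radius(n, triangle, alpha))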