Abstract: The challenge of transitioning from temporary humanitarian settlements to more sustainable human settlements stems from a significant increase in the number of forcibly displaced people over recent decades, difficulties in providing social services that meet the required standards, and the prolongation of emergencies. Despite this challenging context, short-term considerations continue to guide their planning and management rather than more integrated, longer-term perspectives, thus preventing viable, sustainable development. Over the years, the design of humanitarian settlements has not been adapted to local contexts and perspectives, nor to the dynamics of urbanization, population growth, and available data. In addition, the current approach to temporary settlement harms the environment and can strain limited resources. Inefficient land use and ad hoc development models have compounded difficulties and generated new challenges. As a result, living conditions in settlements have deteriorated over recent decades and continue to pose new challenges. Major shortcomings have emerged along the way, leading to disruption and budget overruns in a context marked by a steady decline in funding. Some attempts have been made to shift towards more sustainable approaches, but these have mainly focused on vague, sector-oriented themes and have failed to take a systematic, integrated view. This study contributes to addressing these shortcomings by designing a model-driven solution that emphasizes an integrated system conceptualized as a system of systems. This paper proposes a new methodology for designing an integrated and sustainable human settlement model, based on Model-Based Systems Engineering and the Systems Modeling Language, to provide valuable insights toward sustainable solutions for displaced populations in line with the United Nations 2030 Agenda for Sustainable Development.
Abstract: Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data that sum to a constant, such as 100%. The linear regression model is the most widely used statistical technique for identifying hidden relationships between underlying random variables of interest, and its estimated parameters are useful for tasks such as prediction and analysis of the partial effects of the independent variables. When estimating these parameters, maximum likelihood estimation (MLE) is the method of choice. However, data quality is a significant challenge in machine learning, and many datasets contain missing observations, whose recovery can be costly and time-consuming. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models that depend on unobserved variables or data. Using the current parameter estimate, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize this expected log-likelihood. This study examined how well the EM algorithm performed on a simulated compositional dataset with missing observations, using both robust least squares and ordinary least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
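To make the E and M steps concrete, here is a minimal sketch of EM for simple linear regression with missing values, under an assumed bivariate-normal working model for (x, y). The function name, toy data, and model choice are illustrative only; the study's compositional treatment (log-ratio transforms, Aitchison geometry) and its robust least squares variant are not reproduced.

```python
import numpy as np

def em_regression_bivariate(x, y, n_iter=200):
    """EM for (x, y) modelled as bivariate normal, with np.nan marking missing
    entries; returns the intercept and slope of the regression of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cc = ~np.isnan(x) & ~np.isnan(y)                     # complete cases
    mu = np.array([np.nanmean(x), np.nanmean(y)])        # initial mean vector
    S = np.cov(np.vstack([x[cc], y[cc]]))                # initial covariance
    Z = np.column_stack([x, y])
    for _ in range(n_iter):
        Ez = np.zeros_like(Z)
        Ezz = np.zeros((len(Z), 2, 2))
        for i, zi in enumerate(Z):
            z, V, miss = zi.copy(), np.zeros((2, 2)), np.isnan(zi)
            if miss.all():                               # both entries missing
                z, V = mu.copy(), S.copy()
            elif miss.any():                             # one entry missing
                m, o = np.where(miss)[0][0], np.where(~miss)[0][0]
                z[m] = mu[m] + S[m, o] / S[o, o] * (z[o] - mu[o])
                V[m, m] = S[m, m] - S[m, o] ** 2 / S[o, o]
            Ez[i] = z                                    # E-step: E[z_i | observed part]
            Ezz[i] = np.outer(z, z) + V                  # E-step: E[z_i z_i^T | observed part]
        mu = Ez.mean(axis=0)                             # M-step: new mean vector
        S = Ezz.mean(axis=0) - np.outer(mu, mu)          # M-step: new covariance matrix
    slope = S[0, 1] / S[0, 0]
    intercept = mu[1] - slope * mu[0]
    return intercept, slope

# toy data: y = 2 + 1.5 x + noise, with 10% of x and of y set to missing
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
y = 2.0 + 1.5 * x + rng.normal(0.0, 0.5, 200)
x[rng.choice(200, 20, replace=False)] = np.nan
y[rng.choice(200, 20, replace=False)] = np.nan
print(em_regression_bivariate(x, y))
```

The E-step fills in expected sufficient statistics for the missing coordinates; the M-step re-estimates the mean vector and covariance matrix, from which the regression slope and intercept follow.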
Abstract: It is common for datasets to contain both categorical and continuous variables. However, many feature screening methods designed for high-dimensional classification assume that the variables are continuous, which limits the applicability of existing methods in this more complex scenario. To address this issue, we propose a model-free feature screening approach for ultra-high-dimensional multi-classification that can handle both categorical and continuous variables. The proposed method uses the Maximal Information Coefficient to assess the predictive power of the variables. Under certain regularity conditions, we prove that the screening procedure possesses the sure screening and ranking consistency properties. To validate the effectiveness of the approach, we conduct simulation studies and provide real data analysis examples demonstrating its performance in finite samples. In summary, the proposed method offers a solution for effectively screening features in ultra-high-dimensional datasets with a mixture of categorical and continuous covariates.
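As a rough illustration of the ranking-and-thresholding step described above, the sketch below scores each covariate against the class label and keeps the top d ≈ n/log n of them. Mutual information (scikit-learn's mutual_info_classif) is used as a stand-in dependence measure because it handles mixed discrete/continuous covariates directly; to match the paper one would swap in a Maximal Information Coefficient implementation. The function name, threshold choice, and toy data are assumptions, not the authors' code.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def screen_features(X, y, discrete_mask, d=None):
    """Rank covariates by a dependence score with the class label and keep the
    top d (a common screening size is ceil(n / log n)); mutual information is
    used here as a stand-in for the MIC statistic."""
    n, p = X.shape
    if d is None:
        d = int(np.ceil(n / np.log(n)))
    scores = mutual_info_classif(X, y, discrete_features=discrete_mask,
                                 random_state=0)
    keep = np.argsort(scores)[::-1][:d]          # indices of retained covariates
    return keep, scores

# toy mixed data: 2 informative covariates among 500, n = 200
rng = np.random.default_rng(1)
n, p = 200, 500
X = rng.normal(size=(n, p))
X[:, 1] = rng.integers(0, 3, n)                  # a categorical covariate
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n) > 1).astype(int)
discrete = np.zeros(p, dtype=bool)
discrete[1] = True
keep, _ = screen_features(X, y, discrete)
print(keep[:10])
```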
Abstract: In ultra-high-dimensional data, it is common for the response variable to be multi-classified. This paper therefore proposes a model-free variable screening method for a multi-class response that uses the Jensen-Shannon divergence to measure the importance of covariates. The idea is to calculate the Jensen-Shannon divergence between the conditional probability distribution of a covariate given the response variable and the unconditional probability distribution of that covariate, and then use the probabilities of the response categories as weights to compute a weighted Jensen-Shannon divergence; a larger weighted Jensen-Shannon divergence indicates a more important covariate. We also investigate an adapted version of the method for the case where the number of categories varies across covariates, in which the weighted Jensen-Shannon divergence is adjusted by a logarithmic factor of the number of categories. Through both theoretical analysis and simulation experiments, the proposed methods are shown to have the sure screening and ranking consistency properties. Finally, results from simulation and real-dataset experiments show that, in feature screening, the proposed methods are robust in performance and faster in computation than an existing method.
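The weighted Jensen-Shannon divergence statistic described above can be written in a few lines for a categorical covariate. The following is a minimal, hedged illustration; the function names and toy data are mine, and the paper's logarithmic adjustment for differing numbers of categories is omitted.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.asarray(p, float) + eps, np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))          # Kullback-Leibler divergence
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def weighted_jsd_score(x, y):
    """Weighted JS divergence between P(X | Y = r) and P(X) for a categorical
    covariate x and class label y, weighted by the class probabilities P(Y = r)."""
    levels = np.unique(x)
    marginal = np.array([(x == v).mean() for v in levels])
    score = 0.0
    for r in np.unique(y):
        w = (y == r).mean()                               # class probability P(Y = r)
        cond = np.array([(x[y == r] == v).mean() for v in levels])
        score += w * js_divergence(cond, marginal)
    return score

# toy example: an informative and a noise covariate, 3 response classes
rng = np.random.default_rng(2)
y = rng.integers(0, 3, 1000)
x_informative = (y + rng.integers(0, 2, 1000)) % 4        # depends on y
x_noise = rng.integers(0, 4, 1000)                        # independent of y
print(weighted_jsd_score(x_informative, y), weighted_jsd_score(x_noise, y))
```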
Abstract: The escalating costs of research and development, coupled with the influx of researchers, have led to a surge in published articles across scientific disciplines. However, concerns have arisen regarding the accuracy, validity, and reproducibility of reported findings. Issues such as replication problems, fraudulent practices, and a lack of expertise in measurement theory and uncertainty analysis have raised doubts about the reliability and credibility of scientific research. Rigorous assessment practices in certain fields highlight the importance of identifying potential errors and understanding the relationship between technical parameters and research outcomes. To address these concerns, a universally applicable criterion called comparative certainty is urgently needed. This criterion, grounded in an analysis of the modeling process and of the transmission, accumulation, and transformation of information in both theoretical and applied research, aims to evaluate the acceptable deviation between a model and the observed phenomenon. It provides a theoretically grounded framework applicable to all scientific disciplines adhering to the International System of Units (SI). Objective evaluations based on this criterion can enhance the reproducibility and reliability of scientific investigations, instilling greater confidence in published findings. Establishing this criterion would be a significant stride towards ensuring the robustness and credibility of scientific research across disciplines.
Abstract: This paper focuses on the use of models for increasing the precision of estimators in large-area forest surveys. It is motivated by the increasing availability of remotely sensed data, which facilitates the development of models predicting the variables of interest in forest surveys. We present, review and compare three estimation frameworks in which models play a core role: model-assisted, model-based, and hybrid estimation. The first two are well known, whereas the third has only recently been introduced in forest surveys. Hybrid inference mixes design-based and model-based inference, since it relies on a probability sample of auxiliary data and a model predicting the target variable from the auxiliary data. We review studies on large-area forest surveys based on model-assisted, model-based, and hybrid estimation, and discuss the advantages and disadvantages of the approaches. We conclude that no general recommendation can be made about whether model-assisted, model-based, or hybrid estimation should be preferred; the choice depends on the objective of the survey and the possibilities for acquiring appropriate field and remotely sensed data. We also conclude that modelling approaches can only be successfully applied for estimating target variables, such as growing stock volume or biomass, that are adequately related to commonly available remotely sensed data, and thus purely field-based surveys remain important for several key forest parameters.
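For context, a textbook form of the model-assisted (generalized difference) estimator of a population total under simple random sampling combines wall-to-wall model predictions with a design-based correction for the sample residuals. The sketch below is a generic illustration with invented toy data, not one of the specific estimators reviewed in the paper.

```python
import numpy as np

def model_assisted_total(y_sample, yhat_sample, yhat_population, n, N):
    """Model-assisted (generalized difference) estimator of a population total
    under simple random sampling: sum of model predictions over the population
    plus a design-based correction for the residuals observed in the sample."""
    correction = (N / n) * np.sum(y_sample - yhat_sample)
    return np.sum(yhat_population) + correction

# toy example: biomass (y) predicted from a remotely sensed covariate (x)
rng = np.random.default_rng(3)
N, n = 10_000, 300
x_pop = rng.uniform(0, 1, N)                      # auxiliary data, wall-to-wall
y_pop = 50 + 120 * x_pop + rng.normal(0, 10, N)   # true (unknown) target variable
s = rng.choice(N, n, replace=False)               # simple random sample of field plots
coef = np.polyfit(x_pop[s], y_pop[s], 1)          # model fitted on the sample
yhat_pop = np.polyval(coef, x_pop)                # predictions for every unit
est = model_assisted_total(y_pop[s], yhat_pop[s], yhat_pop, n, N)
print(est, y_pop.sum())
```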
Abstract: Debugging software code has been a challenge for software developers since the early days of computer programming. Reliable software is a basic need, because the world runs on software, so perhaps the biggest engineering challenge is finding ways to make software more reliable. This review provides an overview of techniques developed over time in the field of software model checking to solve the problem of detecting errors in program code. In addition, the challenges posed by this technology are discussed, and ways to mitigate them in future research and applications are proposed. This comprehensive examination of the various model verification methods used to detect program code errors is intended to lay the foundation for future research in this area.
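As a toy illustration of the core idea behind explicit-state model checking — exhaustively exploring a program's state space and reporting a counterexample trace when a safety property is violated — the sketch below searches a small transition system with breadth-first search. It is a generic illustration under assumed names, not one of the specific techniques surveyed in the review.

```python
from collections import deque

def check_safety(initial, successors, is_error):
    """Minimal explicit-state model checking sketch: breadth-first search of the
    reachable state space, returning a counterexample path to the first state
    that violates the safety property (is_error), or None if none is reachable."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if is_error(s):
            path = []                              # reconstruct the counterexample trace
            while s is not None:
                path.append(s)
                s = parent[s]
            return list(reversed(path))
        for t in successors(s):
            if t not in parent:                    # visit each state at most once
                parent[t] = s
                queue.append(t)
    return None                                    # property holds on all reachable states

# toy model: a counter that transitions may increment by 1 or 2 up to a bound
def successors(state):
    x, = state
    return [(x + 1,), (x + 2,)] if x < 10 else []

print(check_safety((0,), successors, lambda s: s[0] == 7))
```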
Abstract: Although the Model-Driven paradigm is gaining acceptance in the research community as a very useful and powerful option for effective software development, its real application in the enterprise context is still a challenge for software engineering. Several causes can be singled out, one of them being the lack of tool support for the efficient application of this paradigm. This paper presents a set of tools, grouped into a suite named NDT-Suite, which under the Model-Driven paradigm offers a suitable solution for software development. These tools explore different activities that this paradigm can improve, such as development, quality assurance and requirements treatment. In addition, this paper analyses how they are being successfully applied in industry.
Abstract: Objective: To design and develop a website for teaching tongue diagnosis in Traditional Chinese Medicine by applying the Model-View-Controller architecture on the Internet. Methods: The web-based teaching system was developed using a WebSphere 6.0 server, a DB2 8.1.7 database, JavaServer Faces, Structured Query Language, HTML and JavaScript. Eighty-five university nursing students were invited to participate in the testing. Results: The TCM tongue diagnosis website covers eight related sections. Learning outcome tests showed that the web-based teaching approach can increase students' relevant professional knowledge.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 10772061) and the Heilongjiang Provincial Natural Science Foundation of China (Grant No. ZJG0704).
Abstract: The condition of a rotor system must be assessed in order to develop condition-based maintenance for rotating machinery. The condition is determined by multiple variables, such as the degree of unbalance, the degree of misalignment, the amount of bending deformation of the shaft, and the occurrence of shaft cracks. This paper discusses the estimation of the degrees of unbalance and misalignment in a flexible coupling-rotor system, using a model-based approach. The models of the equivalent external loads for unbalance and misalignment are derived and analyzed. The degrees of unbalance and misalignment are then estimated by analyzing the components of the equivalent external loads whose frequencies equal one and two times the running frequency, respectively. The equivalent external loads are calculated from the dynamic equation of the original rotor system and the differences between the dynamic responses in the normal case and the vibrations observed when the degree of unbalance, misalignment, or both change. A denoising method based on a bandpass filter is used to reduce the effect of noise on the estimation accuracy. Numerical examples show that the proposed approach can accurately estimate the degrees of unbalance and misalignment of the flexible coupling-rotor system.
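To illustrate the load-reconstruction step, the following single-degree-of-freedom sketch computes an equivalent external load from the difference between measured and baseline responses and reads its 1x and 2x running-frequency components off an FFT. The parameters, signal, and function names are invented; the paper's multi-degree-of-freedom rotor model and bandpass denoising step are not reproduced.

```python
import numpy as np

def equivalent_load(dx, m, c, k, dt):
    """Equivalent external load reconstructed from the difference dx(t) between
    the measured and the baseline (healthy) responses of a single-DOF model:
    f_eq = m * ddx + c * dx_dot + k * dx."""
    v = np.gradient(dx, dt)            # velocity of the response difference
    a = np.gradient(v, dt)             # acceleration of the response difference
    return m * a + c * v + k * dx

def harmonic_amplitude(f_eq, dt, f_target):
    """Amplitude of the f_eq component at f_target (e.g. 1x or 2x the running
    frequency), read off the FFT spectrum."""
    n = len(f_eq)
    spec = np.fft.rfft(f_eq) / n * 2
    freqs = np.fft.rfftfreq(n, dt)
    return np.abs(spec[np.argmin(np.abs(freqs - f_target))])

# toy signal: running frequency 30 Hz, a 1x term (unbalance-like) and a 2x term
# (misalignment-like) plus a little measurement noise
dt, fr = 1e-3, 30.0
t = np.arange(0.0, 2.0, dt)
dx = (1e-4 * np.sin(2 * np.pi * fr * t)
      + 4e-5 * np.sin(2 * np.pi * 2 * fr * t)
      + 1e-6 * np.random.default_rng(4).normal(size=t.size))
f_eq = equivalent_load(dx, m=10.0, c=50.0, k=1e6, dt=dt)
print(harmonic_amplitude(f_eq, dt, fr), harmonic_amplitude(f_eq, dt, 2 * fr))
```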
Abstract: In previous research on model-based diagnostic systems, the components are assumed to be mutually independent. However, this assumption does not always hold, because information about whether one component is faulty usually influences our knowledge about other components. An expert may conclude, for example, that "if component m1 is faulty, then component m2 may be faulty too". How can such expert knowledge be used to aid diagnosis? Building on Kohlas's probabilistic assumption-based reasoning method, we use Bayesian networks to solve this problem and calculate the posterior fault probability of each component given the observed state. The results are reasonable and reflect the effectiveness of the experts' knowledge.
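A minimal numerical illustration of the idea — encoding the expert statement "if m1 is faulty, m2 may be faulty too" as a conditional probability in a two-node Bayesian network and computing posterior fault probabilities by enumeration — is sketched below. All probabilities are invented for illustration, and Kohlas's assumption-based reasoning machinery used in the paper is not reproduced.

```python
import itertools

# Prior fault probability of component m1, and expert knowledge encoded as the
# conditional fault probability of m2 given the state of m1.
P_m1 = {True: 0.05, False: 0.95}
P_m2_given_m1 = {True: {True: 0.40, False: 0.60},    # m1 faulty
                 False: {True: 0.02, False: 0.98}}   # m1 healthy

# Observation model: probability that the system output is abnormal given the
# (m1, m2) fault states -- illustrative numbers only.
P_obs_given = {(True, True): 0.99, (True, False): 0.90,
               (False, True): 0.85, (False, False): 0.01}

def posterior_faults(observed_abnormal=True):
    """Posterior fault probability of each component given the observation,
    computed by brute-force enumeration of the joint distribution."""
    joint = {}
    for m1, m2 in itertools.product([True, False], repeat=2):
        p = P_m1[m1] * P_m2_given_m1[m1][m2]
        p *= P_obs_given[(m1, m2)] if observed_abnormal else 1 - P_obs_given[(m1, m2)]
        joint[(m1, m2)] = p
    z = sum(joint.values())                          # normalising constant
    post_m1 = sum(p for (m1, _), p in joint.items() if m1) / z
    post_m2 = sum(p for (_, m2), p in joint.items() if m2) / z
    return post_m1, post_m2

print(posterior_faults(observed_abnormal=True))
```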
Funding: Supported by CONICYT-Chile through the Becas Chile Scholarship under Grant No. 72140204, and by Universidad Tecnica Federico Santa Maria (Chile) through the Faculty Development Scholarship under Grant No. 208-13.
Abstract: Real-time hybrid simulation is an efficient and cost-effective dynamic testing technique for performance evaluation of structural systems with rate-dependent behavior subjected to earthquake loading. A loading assembly with multiple actuators is required to impose realistic boundary conditions on physical specimens. However, such a testing system is expected to exhibit significant dynamic coupling of the actuators and to suffer from time lags associated with the dynamics of the servo-hydraulic system, as well as control-structure interaction (CSI). One approach to reducing experimental errors considers a multi-input, multi-output (MIMO) controller design, yielding accurate reference tracking and noise rejection. In this paper, a framework for multi-axial real-time hybrid simulation (maRTHS) testing is presented. The methodology employs a real-time feedback-feedforward controller for multiple actuators commanded in Cartesian coordinates. Kinematic transformations between actuator space and Cartesian space are derived for all six degrees of freedom of the moving platform. A frequency-domain identification technique is then used to develop an accurate MIMO transfer function of the system. Further, a Cartesian-domain model-based feedforward-feedback controller is implemented for time lag compensation and to increase the robustness of reference tracking under model uncertainty. The framework is implemented using the 1/5th-scale Load and Boundary Condition Box (LBCB) located at the University of Illinois at Urbana-Champaign. To demonstrate the efficacy of the proposed methodology, a single-story frame subjected to earthquake loading is tested. One of the columns in the frame is represented physically in the laboratory as a cantilevered steel column. For real-time execution, the numerical substructure, kinematic transformations, and controllers are implemented on a digital signal processor. Results show excellent performance of the maRTHS framework when six degrees of freedom are controlled at the interface between substructures.
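As a loose, single-actuator analogue of the feedforward-feedback compensation described above, the sketch below models the servo-hydraulic dynamics as a first-order lag, inverts that lag for the feedforward term, and adds a PI feedback term on the tracking error. The plant model, gains, and signals are assumptions; the actual maRTHS framework's MIMO identification, kinematic transformations, and Cartesian-domain controller are not reproduced.

```python
import numpy as np

def simulate_ff_fb(r, dt, tau=0.02, kp=2.0, ki=5.0):
    """Closed-loop simulation of a single actuator modelled as a first-order lag
    x_dot = (u - x) / tau, driven by a feedforward term that inverts the lag
    (r + tau * r_dot) plus a PI feedback term on the tracking error."""
    r = np.asarray(r, float)
    r_dot = np.gradient(r, dt)
    x = np.zeros_like(r)                   # measured actuator displacement
    u = np.zeros_like(r)                   # actuator command
    integ = 0.0
    for k in range(len(r) - 1):
        e = r[k] - x[k]                    # tracking error
        integ += e * dt
        u[k] = r[k] + tau * r_dot[k] + kp * e + ki * integ
        x[k + 1] = x[k] + dt * (u[k] - x[k]) / tau   # explicit Euler plant update
    return x

# reference: 2 Hz sinusoidal displacement, 5 s at 1 kHz sampling
dt = 1e-3
t = np.arange(0.0, 5.0, dt)
r = 0.01 * np.sin(2 * np.pi * 2 * t)
x = simulate_ff_fb(r, dt)
print(np.max(np.abs(x - r)))               # peak tracking error
```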