A Neutrosophic Forecasting Model for Time Series Based on First-Order State and Information Entropy of High-Order Fluctuation

In time series forecasting, the way information is represented directly affects forecasting performance. Most existing time series forecasting models derive logical rules from the relationships between neighboring states, without considering the inconsistency of fluctuations over the relevant period. In this paper, we propose a new perspective on the prediction problem, in which inconsistency is quantified and treated as a key characteristic of the prediction rules. First, the time series is converted into a fluctuation time series by comparing each data point with the one immediately preceding it.
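
As a concrete illustration of that first step, the sketch below converts a raw series into a fluctuation series by first-order differencing. The helper name and the differencing choice are assumptions made for illustration, not code from the paper.

```python
# Minimal sketch of the fluctuation-conversion step described above,
# assuming simple first-order differencing (illustrative, not the authors' code).
from typing import List


def to_fluctuation_series(series: List[float]) -> List[float]:
    """Return the fluctuation series f[t] = x[t] - x[t-1] for t >= 1."""
    return [curr - prev for prev, curr in zip(series, series[1:])]


if __name__ == "__main__":
    prices = [102.0, 103.5, 103.1, 104.0]
    print(to_fluctuation_series(prices))  # [1.5, -0.4, 0.9] up to rounding
```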
A Forecasting Model Based on High-Order Fluctuation Trends and Information Entropy

Most existing high-order prediction models abstract logical rules from historical discrete states without considering historical inconsistency and fluctuation trends, even though these two characteristics are important for describing historical fluctuations. This paper proposes a model whose logical rules are abstracted from historical dynamic fluctuation trends and the corresponding inconsistencies. In the rule-training stage, the dynamic trend states of up and down are mapped to the truth-membership and false-membership dimensions of neutrosophic sets, respectively, while information entropy is employed to quantify the inconsistency of a period of history and is mapped to the indeterminacy-membership of the neutrosophic sets. In the forecasting stage, similarities among neutrosophic sets are used to locate the most similar left-hand side of a logical relationship, so that both fluctuation trends and inconsistency inform the forecast. The proposed model extends existing high-order fuzzy logical relationships (FLRs) to neutrosophic logical relationships (NLRs). Compared with traditional discrete high-order FLRs, the proposed NLRs are more general and alleviate the problem caused by a lack of matching rules. The method is then applied to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the Hang Seng Index. The experimental results indicate that the model has stable prediction ability across different data sets, and comparison of its prediction errors with other approaches shows that it achieves outstanding prediction accuracy and good generality.
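
To make the membership mapping more concrete, the sketch below derives a neutrosophic triple from a window of fluctuations: the proportions of up and down moves play the roles of truth- and false-membership, the normalized Shannon entropy of the window plays the role of indeterminacy-membership, and a simple distance-based similarity compares triples. The function names and the particular similarity measure are illustrative assumptions, not the authors' exact definitions.

```python
# Hedged sketch of the training-stage mapping described above; the exact
# membership and similarity definitions used by the authors may differ.
import math
from typing import List, Tuple

Neutrosophic = Tuple[float, float, float]  # (truth, indeterminacy, falsity)


def window_to_neutrosophic(fluctuations: List[float]) -> Neutrosophic:
    """Map a window of fluctuations to a neutrosophic triple."""
    n = len(fluctuations)
    up = sum(1 for f in fluctuations if f > 0) / n      # truth-membership
    down = sum(1 for f in fluctuations if f < 0) / n    # false-membership
    flat = max(0.0, 1.0 - up - down)
    # Information entropy of the up/down/flat distribution quantifies the
    # inconsistency of the window; dividing by log2(3) keeps it in [0, 1].
    probs = [p for p in (up, down, flat) if p > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    indeterminacy = entropy / math.log2(3)               # indeterminacy-membership
    return (up, indeterminacy, down)


def similarity(a: Neutrosophic, b: Neutrosophic) -> float:
    """One simple choice: 1 minus the mean absolute component difference."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / 3.0


if __name__ == "__main__":
    rule_lhs = window_to_neutrosophic([1.2, -0.3, 0.8, 0.5])  # stored rule state
    current = window_to_neutrosophic([0.9, 0.4, -0.1, 0.6])   # current state
    print(similarity(rule_lhs, current))  # higher = more similar rule
```

In a forecasting stage built on this idea, one would compute the triple for the current window, select the stored rule whose left-hand side maximizes this similarity, and use that rule's right-hand side as the prediction.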
Entropy Application for Forecasting

This book shows the potential of entropy and information theory in forecasting, including both theoretical developments and empirical applications. The contents cover a great diversity of topics, such as the aggregation and combination of individual forecasts, the comparison of forecasting performance, and the debate concerning the trade-off between complexity and accuracy. Analyses of forecasting uncertainty, robustness, and inconsistency are also included, as are proposals for new forecasting approaches. The proposed methods encompass a variety of time series techniques (e.g., ARIMA, VAR, state space models) as well as econometric methods and machine learning algorithms. The empirical contents include both simulated experiments and real-world applications focusing on GDP, M4-Competition series, confidence and industrial trend surveys, and stock exchange composite indices, among others. In summary, this collection provides engaging insight into entropy applications for forecasting, offering an overview of the current state of the field and suggesting directions for further research.