A STUDY OF MODERN METHODS OF FORECASTING ENSEMBLES CONSTRUCTION AS APPLIED TO THE INTERVAL FORECASTING PROBLEM

Date received: 14.06.2017
Year: 2017
Journal issue (volume): 3 (55)
UDC: 519.688
DOI: 10.26731/1813-9108.2017.3(55).94-101
Pages: 94–101
Abstract:

At present, scientists pay considerable attention to developing and improving machine learning methods for solving various applied problems. One such important problem is forecasting dynamic indicators in order to improve the effectiveness of decision-making under uncertainty. A key characteristic of any forecasting method is the accuracy of its forecasts, and one of the most promising modern approaches to improving this accuracy is the construction of forecasting ensembles.

In this paper, the authors study existing methods for constructing forecasting ensembles in order to justify their use for the interval forecasting problem. The methods considered are voting, boosting, stacking, bagging, and random subspaces. Taking into account the specifics of constructing and training interval forecasting models, the stacking and bagging methods are recommended for application: they are modern, promising, and well suited to improving the accuracy of the interval forecasting models developed by the authors. A minimal illustration of one recommended method is sketched below.
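To make the recommendation concrete, the following is a minimal sketch (not the authors' models) of bagging applied to interval forecasting: an autoregressive base model of assumed order is fitted by least squares on bootstrap resamples of the series, and the forecast interval is taken from the empirical quantiles of the ensemble's point forecasts. The function name, the AR base model, and the quantile aggregation are illustrative assumptions, not anything specified in the paper.

import numpy as np

def bagged_interval_forecast(y, n_models=200, order=3, alpha=0.10, seed=0):
    # Bagging sketch: each ensemble member is an AR(order) model fitted by
    # least squares on a bootstrap resample of the (lags, next-value) pairs.
    rng = np.random.default_rng(seed)
    # Design matrix: row j holds the `order` values preceding the target y[order + j].
    X = np.column_stack([y[i:len(y) - order + i] for i in range(order)])
    t = y[order:]                                  # one-step-ahead targets
    X = np.column_stack([np.ones(len(t)), X])      # prepend an intercept column
    last = np.concatenate(([1.0], y[-order:]))     # predictors for the next step
    forecasts = np.empty(n_models)
    for m in range(n_models):
        idx = rng.integers(0, len(t), len(t))      # resample rows with replacement
        coef = np.linalg.lstsq(X[idx], t[idx], rcond=None)[0]
        forecasts[m] = last @ coef
    # The (1 - alpha) interval comes from the ensemble's empirical quantiles.
    return np.quantile(forecasts, [alpha / 2.0, 1.0 - alpha / 2.0])

# Example on a synthetic noisy series (hypothetical data):
y = np.sin(np.linspace(0.0, 20.0, 300)) + 0.1 * np.random.default_rng(1).standard_normal(300)
lo, hi = bagged_interval_forecast(y)
print("90%% forecast interval for the next value: [%.3f, %.3f]" % (lo, hi))

A stacking variant would instead combine out-of-sample forecasts of heterogeneous base models through a second-level model; the bootstrap-plus-quantiles scheme above is the simpler of the two recommended directions.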

References:

1. James G., Witten D., Hastie T., Tibshirani R. An Introduction to Statistical Learning with Applications in R. 2013, 426 p.

2. Elliott G., Granger C., Timmermann A. Handbook of Economic Forecasting. 2013, Vol. 2, 1324 p.

3. Krakovskii Yu.M., Luzgin A.N. Prikladnye aspekty primeneniya interval'nogo prognozirovaniya v sistemnom analize [Applied aspects of application of interval forecasting in system analysis]. Sovremennye tekhnologii. Sistemnyi analiz. Modelirovanie [Modern technologies. System analysis. Modeling], 2017, No. 2 (52).

4. Krakovskii Yu.M., Luzgin A.N. Algoritm interval'nogo prognozirovaniya dinamicheskikh pokazatelei na osnove robastnoi veroyatnostnoi klasternoi modeli [Elektronnyi resurs] [Algorithm for interval forecasting of dynamic indicators on the basis of a robust probabilistic cluster model]. Nauka i obrazovanie [Science and education], 2016, No. 11, pp. 113–126. URL: http://technomag.neicon.ru/doc/849839.html (Accessed 17.05.2017).

5. Krakovskii Yu.M., Luzgin A.N. Algoritm interval'nogo prognozirovaniya dinamicheskikh pokazatelei na osnove veroyatnostnoi neirosetevoi modeli [Algorithm for interval forecasting of dynamic indicators based on probabilistic neural network model]. Sovremennye tekhnologii. Sistemnyi analiz. Modelirovanie [Modern technologies. System analysis. Modeling], 2016, No. 4 (50), pp. 126–132.

6. Krakovskii Yu.M., Luzgin A.N. Programmnyi kompleks interval'nogo prognozirovaniya nestatsionarnykh dinamicheskikh pokazatelei [The program complex of interval forecasting of non-stationary dynamic indicators]. Vestnik IrGTU [Proceedings of Irkutsk State Technical University], 2015, No. 4, pp. 12–16.

7. Gorodetskii V.I., Serebryakov S.V. Metody i algoritmy kollektivnogo raspoznavaniya [Methods and algorithms of collective recognition]. Trudy SPIIRAN [SPIIRAS Proceedings], 2006, No. 3 (1), pp. 139–171.

8. Opitz D., Maclin R. Popular ensemble methods: An empirical study. Journal of Artificial Intelligence Research, 1999, Vol. 11, pp. 169–198.

9. Polikar R. Ensemble based systems in decision making. IEEE Circuits and Systems Magazine, 2006, No. 6 (3), pp. 21–45.

10. Rokach L. Ensemble-based classifiers. Artificial Intelligence Review, 2010, No. 33 (1–2), pp. 1–39.

11. Lam L., Suen C.Y. Application of majority voting to pattern recognition: an analysis of its behavior and performance. IEEE Transactions on Systems, Man, and Cybernetics (Part A: Systems and Humans), 1997, Vol. 27, No. 5, pp. 553–568.

12. Clemen R.T. Combining forecasts: A review and annotated bibliography. International Journal of Forecasting, 1989, No. 5, pp. 559–583.

13. Jordan M.I., Jacobs R.A. Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 1994, Vol. 6, No. 2, pp. 181–214.

14. Schapire R.E., Freund Y. Boosting: Foundations and Algorithms (Adaptive Computation and Machine Learning series). 2014, 554 p.

15. Collins M. Linear Classifiers. 2012. URL: http://www.cs.columbia.edu/~mcollins/courses/6998-2012/lectures/lec1.3.pdf (Accessed 16.08.2016).

16. Friedman J., Hastie T., Tibshirani R. Additive logistic regression: a statistical view of boosting (With discussion and a rejoinder by the authors). The Annals of Statistics, 2000, Vol. 28, No. 2, pp. 337–407.

17. García E., Lozano F. Boosting Support Vector Machines. Proceedings - ICMLA 2005: Fourth International Conference on Machine Learning and Applications, 2005, pp. 374–379.

18. Ting K.M., Zheng Z. A Study of AdaBoost with Naive Bayesian Classifiers: Weakness and Improvement. Computational Intelligence, 2003, Vol. 19, No. 2, pp. 186–200.

19. Mease D., Wyner A., Buja A. Boosted Classification Trees and Class Probability/Quantile Estimation. Journal of Machine Learning Research, 2007, No. 8, pp. 409–439.

20. Grim J., Pudil P., Somol P. Boosting in probabilistic neural networks. Object recognition supported by user interaction for service robots, 2002, Vol. 2, pp. 126–139.

21. Wolpert D.H. Stacked generalization. Neural Networks, 1992, Vol. 5, No. 2, pp. 241–259.

22. Ting K.M., Witten I.H. Stacked generalization: when does it work? Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence, 1997, Vol. 2, pp. 866–871.

23. Breiman L. Bagging Predictors. Technical Report No. 421, 1994. URL: https://www.stat.berkeley.edu/~breiman/bagging.pdf (Accessed 14.02.2017).

24. Ho T.K. The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, Vol. 20 (8), pp. 832–844.

25. Bryll R., Gutierrez-Osuna R., Quek F. Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets. Pattern Recognition, 2003, Vol. 36 (6), pp. 1291–1302.

26. Skurichina M., Duin R. Bagging, Boosting and the Random Subspace Method for Linear Classifiers. Pattern Analysis & Applications, 2002, Vol. 5, Issue 2, pp. 121–135.