Performance evaluation of competing forecasting models: A multidimensional framework based on MCDA

Bing Xu, Jamal Ouenniche

    Research output: Contribution to journal › Article

    19 Citations (Scopus)

    Abstract

    So far, competing forecasting models have been compared to each other using a single criterion at a time, which often leads to different rankings under different criteria; in such a situation, one cannot make an informed decision as to which model performs best overall, that is, taking all performance criteria into account. To overcome this methodological problem, we propose a Multi-Criteria Decision Analysis (MCDA)-based framework and discuss how one might adapt it to address the problem of relative performance evaluation of competing forecasting models. Three outranking methods, namely ELECTRE III, PROMETHEE I, and PROMETHEE II, are used in our empirical experiments to rank-order competing forecasting models of crude oil prices. Our empirical results reveal that the multidimensional framework provides a valuable tool for apprehending the true nature of the relative performance of competing forecasting models. In addition, as far as the evaluation of the relative performance of the forecasting models considered in this study is concerned, the rankings of the best and the worst performing models do not seem to be sensitive to the choice of importance weights or outranking methods, which suggests that the ranks of these models are robust.
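
    To illustrate the kind of ranking procedure the abstract refers to, the sketch below shows a minimal PROMETHEE II-style net-flow ranking over a hypothetical performance table. The model names, criteria, values, and importance weights are invented for illustration, and the simple "usual" (strict) preference function is assumed; the paper's own experiments may use different criteria, thresholds, and generalised preference functions.

    # Minimal PROMETHEE II sketch: rank competing forecasting models on several
    # performance criteria at once. All model names, criterion values, weights,
    # and the choice of the "usual" preference function are hypothetical.

    # Hypothetical performance table: error measures (to minimise) and
    # directional accuracy (to maximise) for four illustrative models.
    models = ["RW", "ARIMA", "VAR", "ANN"]
    criteria = {              # name: (weight, direction)
        "RMSE": (0.4, "min"),
        "MAE": (0.3, "min"),
        "DirAcc": (0.3, "max"),
    }
    scores = {                # model -> criterion -> value (illustrative only)
        "RW":    {"RMSE": 2.10, "MAE": 1.60, "DirAcc": 0.48},
        "ARIMA": {"RMSE": 1.95, "MAE": 1.52, "DirAcc": 0.55},
        "VAR":   {"RMSE": 1.88, "MAE": 1.49, "DirAcc": 0.53},
        "ANN":   {"RMSE": 1.92, "MAE": 1.45, "DirAcc": 0.58},
    }

    def preference(a, b, crit):
        """Usual (strict) preference function: 1 if a is strictly better than b."""
        _, direction = criteria[crit]
        diff = scores[a][crit] - scores[b][crit]
        better = diff < 0 if direction == "min" else diff > 0
        return 1.0 if better else 0.0

    def aggregated_preference(a, b):
        """Weighted preference index pi(a, b) summed over all criteria."""
        return sum(w * preference(a, b, c) for c, (w, _) in criteria.items())

    def net_flows():
        """PROMETHEE II net flow phi(a) = phi_plus(a) - phi_minus(a)."""
        n = len(models)
        flows = {}
        for a in models:
            plus = sum(aggregated_preference(a, b) for b in models if b != a) / (n - 1)
            minus = sum(aggregated_preference(b, a) for b in models if b != a) / (n - 1)
            flows[a] = plus - minus
        return flows

    # Rank models by decreasing net flow (a complete ranking, as in PROMETHEE II).
    for model, phi in sorted(net_flows().items(), key=lambda kv: -kv[1]):
        print(f"{model}: net flow = {phi:+.3f}")

    A PROMETHEE I-style analysis would instead compare the positive and negative flows separately and leave some pairs incomparable, while ELECTRE III builds an outranking relation from concordance and discordance indices; sensitivity of the resulting ranks to the weights can be probed by re-running the ranking with perturbed weight vectors.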
    Original language: English
    Pages (from-to): 8312-8324
    Journal: Expert Systems with Applications
    Volume: 39
    Issue number: 9
    DOIs
    Publication status: Published - 1 Jul 2012

