TY - JOUR

T1 - SCORING ALTERNATIVE FORECAST DISTRIBUTIONS: COMPLETING THE KULLBACK DISTANCE COMPLEX

AU - Sanfilippo, Giuseppe

PY - 2018

Y1 - 2018

N2 - We develop two surprising new results regarding the use of proper scoring rules for evaluating the predictive quality of two alternative sequential forecast distributions. Both of the proponents prefer to be awarded a score derived from the other's distribution rather than a score awarded on the basis of their own. A Pareto optimal exchange of their scoring outcomes provides the basis for a comparison of forecast quality that is preferred by both forecasters, and also evades a feature of arbitrariness inherent in using the forecasters' own achieved scores. The well-known Kullback divergence, used as a measure of information, is evaluated via the entropies in the two forecast distributions and the two cross-entropies between them. We show that Kullback's symmetric measure needs to be supplemented by three component measures if it is to characterise completely the information content of the two asserted probability forecasts. Two of these do not involve entropies at all. The resulting "Kullback complex" supported by the 4-dimensional measure is isomorphic to an equivalent vector measure generated by the forecasters' expectations of their scores, each for one's own score and for the other's score. We foreshadow the results of a sophisticated application of the Pareto relative scoring procedure for actual sequential observations, and we propose a standard format for evaluation.

UR - http://hdl.handle.net/10447/369502

UR - http://www.gler.it/archivio/ISSUE/gler_22_2.pdf

M3 - Article

VL - 22

SP - 63

EP - 90

JO - GLOBAL & LOCAL ECONOMIC REVIEW

JF - GLOBAL & LOCAL ECONOMIC REVIEW

SN - 1722-4241

ER -