Avoiding Malpractice in Econometrics: “David Hendry’s Automated Methodology Nesting Theory-Driven and Data-Driven Approaches”

Abstract

This paper focuses on the econometric methodology developed by Professor David Hendry and his associates over several decades. The paper comments on the statistical foundations of the methodology, which rest on the probability approach in econometrics introduced by Haavelmo. The paper proposes eleven main features that summarize the methodology, each of which is discussed in detail. A pivotal point in the methodology is the Local Data Generating Process (LDGP), which is initially unknown; to discover it, it must be nested in a suitable General Unrestricted Model (GUM). The GUM must include variables from all possibly relevant economic theories, together with any other variables needed to represent the economic system under study, including the indicator variables employed in Indicator Saturation Estimation (ISE). Professor Hendry devised ISE to capture as "many contaminating influences as possible" affecting the data. From the GUM, a general-to-specific reduction procedure is carried out to discover a final congruent model. Usually there are more variables than observations, so a sophisticated multiple-path search using segmentation by blocks has been designed. The entire process is automated by a machine-learning program, Autometrics. This methodology has also been incorporated into Climate Econometrics.
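The block-wise strategy the abstract mentions can be illustrated in miniature. The sketch below is not Autometrics; it is a simplified split-half version of impulse indicator saturation, in which an impulse dummy is created for every observation, the dummies are estimated in two blocks (since all of them together would exceed the number of observations), and only dummies significant at a chosen t-ratio are retained. The function name, the fixed split into halves, and the retention rule are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def impulse_saturation(y, X, crit=2.5):
    """Simplified split-half impulse indicator saturation (illustrative sketch).

    One impulse dummy per observation would give more regressors than
    observations, so the dummies are added in two blocks. A dummy is
    retained if its |t-ratio| exceeds `crit` in its block's regression.
    Returns the indices of observations flagged as contaminated.
    """
    T = len(y)
    k = X.shape[1]
    retained = []
    for block in (range(0, T // 2), range(T // 2, T)):
        # Design matrix: original regressors plus this block's impulse dummies.
        D = np.zeros((T, len(block)))
        for j, t in enumerate(block):
            D[t, j] = 1.0
        Z = np.hstack([X, D])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        dof = T - Z.shape[1]
        sigma2 = resid @ resid / dof
        se = np.sqrt(sigma2 * np.diag(np.linalg.pinv(Z.T @ Z)))
        tvals = beta / se
        # Test only this block's dummies (columns k onward).
        for j, t in enumerate(block):
            if abs(tvals[k + j]) > crit:
                retained.append(t)
    return sorted(retained)
```

For example, on a constant-mean series with a single large outlier, the retained set should contain the outlier's index; in a full implementation the retained dummies would then enter the GUM alongside the economic variables before the general-to-specific reduction.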
