
Another ‘Samuelson, 1948’ moment? Evidence from machine learning


Paul Samuelson explained the motivation for his 1948 introductory economics textbook with these words: “Today the non-specialist in physics deserves and expects to learn about atomic energy and nuclear structure in his first year of study, rather than remain bogged down in elementary experiments on falling bodies and heat calorimetry. Why then should teachers of economics withhold from the first-year course the really interesting and vital problems of overall economic policy?” (Samuelson 1948: vi)

At the time, physics students were indeed learning a lot about inclined planes. It wasn’t until 1961 that Richard Feynman took his first-year students at the California Institute of Technology to the frontier of modern physics using plain language, with a minimum of mathematics, to teach them quantum physics and relativity.

‘Samuelson, 1948’: Urgent problems and a teachable new theory

Because it became the industry standard in its many editions, and because the book itself changed over time, it is easy today to miss how radical the first edition of Samuelson’s book was. Samuelson put off the previously conventional starting point, “Determination of price by supply and demand”, until Part Three, which begins on page 447. Exactly ten pages later, we read: “This is all there is to the doctrine of supply and demand. All that is left to do is to point out some of the cases to which it can be applied and some to which it cannot.”

Even within Part Three, Samuelson adopts an unconventional ordering of topics, where the student first encounters the monopolistically competitive firm (“includes most firms and industries”, p. 492) before a section on the perfectly competitive firm (“includes a few agricultural industries”). And barely two pages into that section he introduces “decreasing costs and the breakdown of competition” (p. 505). 

To gauge the novelty of Samuelson’s text, we use a machine-learning method – topic modelling – to contrast Samuelson (1948) with the 1930 edition of a text written by Richard T. Ely and others, which was first published in 1893 and dominated the US market prior to Samuelson.

We ask what themes best characterise the distribution of words found in a large corpus of economics research (27,436 research articles published in major journals between 1900 and 2014).1 The themes, called “topics”, are vectors of words (each weighted by its importance in that particular topic). Having extracted the topics from more than 100 years of published research, we then ask: which of those topics best characterise the textbooks we wish to compare? 

Figure 1 compares Samuelson’s 1948 text with Ely et al.’s 1930 edition. The length of each outline bar measures the importance of that topic in accounting for the words appearing in Ely et al. (bars to the right of the vertical axis) and in Samuelson (bars to the left). Each solid bar shows the between-textbook difference in weight on the topic in question: solid bars to the right indicate a heavier weight on that topic in Ely et al. than in Samuelson, and vice versa.
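The logic behind this comparison can be sketched in a few lines of code: treat each topic as a vector of word weights, score each textbook by how much of its word distribution each topic accounts for, and take the between-book difference. The sketch below is an illustrative toy, not our actual pipeline (which fits a full topic model to the research corpus; see Bowles and Carlin 2020 for the technical details); the topic labels, word lists, and weights here are invented for illustration.

```python
from collections import Counter

# Toy topics: each is a dict of word -> weight (weights sum to 1).
# These words and weights are invented for illustration only.
topics = {
    "aggregate demand":  {"income": 0.4, "employment": 0.35, "saving": 0.25},
    "supply and demand": {"price": 0.5, "supply": 0.25, "demand": 0.25},
}

def topic_shares(text, topics):
    """Crude proxy for a fitted topic model: weight each topic by how
    well its word distribution matches the document's word frequencies,
    then normalise so the shares sum to one."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    raw = {
        name: sum(w * counts[word] / total for word, w in words.items())
        for name, words in topics.items()
    }
    norm = sum(raw.values()) or 1.0
    return {name: score / norm for name, score in raw.items()}

# Stand-in word streams for the two textbooks (invented).
samuelson = "income employment income saving demand price employment"
ely = "price supply demand price supply income price demand"

s, e = topic_shares(samuelson, topics), topic_shares(ely, topics)
# Positive difference = heavier weight in Ely et al., as in Figure 1.
diff = {name: e[name] - s[name] for name in topics}
```

On this toy input the difference is positive for “supply and demand” (heavier in Ely et al.) and negative for “aggregate demand” (heavier in Samuelson), mirroring the qualitative pattern of Figure 1.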

Ely et al. place more weight on topics related to business organisation and regulation, transport and agriculture, the Gold Standard, and income tax than does Samuelson. Samuelson’s novelty is revealed in the importance of topics about aggregate demand (numbered 89 and 33; the topic numbers are arbitrary). 

Samuelson’s own view of the novelty of his text is the following (taken from the 1955 edition of his book):

… I have set forth what I call a “grand neoclassical synthesis.” This is a synthesis of (1) the valid core of modern income determination with (2) the classical economic principles. Its basic tenet is this: Solving the vital problems of [unemployment] by the tools of income analysis will validate and bring back into relevance the classical verities.

As macroeconomic policies informed by Samuelson’s synthesis were adopted around the world and the dread of another Great Depression faded, the “classical verities” did indeed resume their place of honour in the table of contents; later editions of Samuelson’s book inverted the order and logic of his 1948 text and placed microeconomics at the beginning.

A boom in supply and demand figures resulted: they constituted over a third of the analytical figures in Samuelson and Nordhaus (1998), up from a fifth in the initial edition of Samuelson’s book half a century earlier. Exchange under complete contracts by price-taking traders at a competitive market clearing equilibrium had become the benchmark model for undergraduates. 

Figure 1 Comparison of content in Samuelson (1948) and Ely et al. (1930) 


Note: The length of each outline bar measures the topic weight in Ely et al. (bars to the right of the vertical axis) and in Samuelson (bars to the left).

Another ‘Samuelson, 1948’ moment? 

This benchmark made sense to Samuelson and those who followed his return to the “classical verities” because the problem of instability and mass unemployment was no longer the dominant challenge for public policy and social welfare. But today, reliance on that model is an impediment to “an understanding of the economic institutions and problems of American civilization”, to borrow Samuelson’s expression for the purpose of his book. The pressing problems in the US and elsewhere now include climate change, economic disparities, and the future of work and private property in a knowledge- and care-based economy.

Economics too has moved on, as the topic modelling results in Figure 2 show. The macroeconomic topics that rose to prominence in the aftermath of the Great Depression are notably less heavily weighted in the research corpus of the recent period, as are the Marshallian microeconomics topics (“competition and market structure” and “elasticity of demand and supply”).  

Figure 2 A shift in the research corpus: From the mid-20th century to the early 21st century 


Also reflecting a shift away from Marshallian microeconomics, among the most heavily weighted topics in the recent research corpus are strategic interactions under incomplete information (“strategic interactions, asymmetric information,” “game theory and behavioural economics,” and “equilibrium signalling…”).  The topic weights in the recent corpus also reflect the reassertion of economics as an empirical discipline (Angrist et al. 2017). 

What is striking about the comparison in Figure 2 is how little of the mid-century corpus is retained in 21st century research. The only topic with substantial weight in both periods is “equilibrium stability…”. The topics with greater weights at mid-century (“fluctuations in aggregate demand,” “empirical studies of industry,” and “elasticity of demand and supply”) have almost entirely disappeared in the recent corpus. Correspondingly, the newly prominent “strategic interactions…” and “applied econometrics…” had very small weights at mid-century.

Elements of an embryonic new benchmark

For the most part, the newly important concepts were not developed in response to the emergence of new social and economic problems. “Asymmetric information,” for example, rose to prominence with principal-agent models of credit and labour markets that provided missing microeconomic foundations for the Keynesian multiplier and involuntary unemployment. But these, along with game theory and behavioural economics, provide some of the keys to understanding economic inequality: through the exercise of power in non-clearing markets and the importance of social norms and economic rents. 

Can the transformation of the research corpus over the last half century provide the basis for a new benchmark for economics education? Any benchmark model in economics must (wittingly or not) take a position on what people are like, the rules of the game that govern how we interact, how the economy interacts with society and the biosphere, what are the pre-eminent questions to be asked, and the methods by which they may be answered. 

The modern research corpus has all but abandoned the conventional perfectly competitive equilibrium benchmark mentioned above, without articulating a replacement.  But the elements of a new benchmark already constitute the frame of reference that most research economists use. Responding in 1947 to criticism of his (still mimeographed) new text, Samuelson wrote: “The methods of analysis used are those that have been employed by 90 per cent of the active academic economists under the age of 50 over the last decade” (Giraud 2014: 141). 

We think the same may be said for a new benchmark that could be constructed on the following foundations. The economy is part of the social system and the biosphere. People are both self-interested and other-regarding; firms and other actors interact strategically (not only as price-takers) in markets, communities, families, and states; the distinct rules of the game governing economic interactions in these and other institutional settings are the key to understanding and evaluating the efficiency, fairness, and other aspects of the resulting allocations. Both stability and instability are features of the dynamics of this system.  Economics is an empirical science that seeks to evaluate outcomes based not only on a limited conception of efficiency but also distributive justice and bio-physical sustainability. 

A teachable new benchmark model

Although the quantitative analysis of textbooks using the topics discovered from the research corpus cannot adequately capture contrasting benchmarks, it can nevertheless reveal salient differences.  

In our research, we compare a number of contemporary principles textbooks, including those by Mankiw, Krugman and Wells, and Acemoglu, Laibson and List. We also include a new open access text – the CORE team’s The Economy – to which we have both contributed.  Comparisons between the CORE team’s book and Mankiw’s book are shown here, but the results are similar for the other textbooks. As can be seen, all modern textbooks share substantial coverage of standard topics in the economics of “competition and market structure”, “elasticity of demand and supply”, and “fluctuations in aggregate demand”. 

The modern textbooks are in other ways similar to each other, but different from CORE. While they, in contrast to Samuelson (1948), introduce “game theory and behavioural economics” and “comparative international development”, CORE devotes considerably more attention to both as it constructs a new benchmark from the outset. CORE’s topic novelty lies in the introduction of “innovation” and “economic history, history of economic thought”, and in greater coverage of “institutional change” and “democratic political competition”.

Figure 3 A topic comparison of CORE (2017) and Mankiw (2018)  


Like Samuelson (1948), the CORE text alters the order in which topics are introduced. The logic of teaching the new benchmark model as done in CORE is to begin with an actor making a decision ‘against nature’ (which technology to use? how many hours to work?), then to introduce strategic interactions – including non-market ones – among actors and the rules by which these interactions occur, and hence the role of institutions and norms. 

The informational limits on the nature of contracts run through these foundational elements, which are brought together to analyse decisions firms make such as wage- and price-setting. External effects, public goods, and principal-agent problems that are essential to study climate change, inequality, and the knowledge- and care-based economy are in the benchmark students learn in the first seven chapters. Once these foundations are laid, the special case of price-taking actors is presented in Chapter 8 as an illuminating limiting case, rather than as the model of how the economy works to which imperfections and deviations are added. 

The new benchmark brings the added advantage in teaching of enabling a seamless transition to an aggregate economy in which policy is made by purposeful actors, there is involuntary unemployment at equilibrium, and credit constraints are endemic. The heterogeneous agents in this model are principals and agents in labour and credit markets, and in the relationship between a central bank and commercial banks. 

References

Acemoglu, D, D Laibson, and J List (2015), Economics, Pearson.

Angrist, J, P Azoulay, G Ellison, R Hill, and S F Lu (2017), “Economic Research Evolves: Fields and Styles”, American Economic Review 107(5): 293-97.

Bowles, S and W Carlin (2020), “What Students Learn in Economics 101: Time for a Change”, Journal of Economic Literature 58(1): 176-214.

CORE Team (2017), The Economy, OUP.

Ely, R T, T S Adams, M O Lorenz and A A Young (1930), Outlines of Economics, Macmillan (first published in 1893).

Giraud, Y (2014), “Negotiating the ‘Middle of the Road’ Position: Paul Samuelson, MIT, and the Politics of Textbook Writing, 1945-1955”, History of Political Economy 46: 134-52.

Krugman, P and R Wells (2015), Economics, Worth (first published in 2005).

Mankiw, N G (2018), Principles of Economics, South-Western Cengage Learning (first published in 1997).

Samuelson, P (1948), Economics, an Introductory Analysis, McGraw Hill.

Samuelson, P and W Nordhaus (1998), Economics: An Introductory Analysis, McGraw Hill.

Endnotes

1 Details, including a technical description of our use of topic modelling methods, are in Bowles and Carlin (2020).
