
The human mind is in a recession


This article is an on-site version of Free Lunch newsletter. Premium subscribers can sign up here to get the newsletter delivered every Thursday and Sunday. Standard subscribers can upgrade to Premium here, or explore all FT newsletters

Hello readers. For all the talk of artificial intelligence, the most efficient computer on Earth remains the human brain.

It can perform as many operations per second as the world’s supercomputers, yet requires only as much power as a fridge lightbulb.

For this week’s newsletter, I’m switching from the usual counter-consensus macroeconomic analysis to an exploration of an unorthodox idea: the economics of the human mind.

There is an emerging strand of research that emphasises the importance of “brain capital” — a function of brain health, capacity and skills.

It might sound woolly, but it matters for two reasons.

First, since the industrial revolution, machines have been increasingly replacing human brawn and routine cognitive tasks. By 2030, the share of activities expected to be delivered predominantly by people will fall from about 50 per cent today to 33 per cent, according to the World Economic Forum’s latest Future of Jobs Report. That puts humanity’s comparative advantage in narrower areas of thinking.

Second, we now live longer. State-defined retirement ages are less relevant in knowledge-driven economies, and an individual’s cognitive skills are a longer-lasting asset than physical strength.

But “brain capital” is under pressure.

Illnesses that affect brain function — including mental health conditions, substance abuse and neurological disorders — are estimated to cost the global economy $5tn per year (roughly the size of the German economy in nominal terms today). That’s expected to rise to $16tn by 2030.

According to the World Health Organization, depression is the leading cause of disability worldwide. Its prevalence has risen 89 per cent since 1990. Alzheimer’s disease and other dementias have increased by 161 per cent, largely due to ageing populations.

Problems are spread across the age distribution. There’s a bulge in healthy years lost to poor mental wellbeing among the traditional “working age” bracket. But even in retirement, other neurological disorders spike.

Brain capacity is also being squeezed. “Our mental lives are more fragmented and scattered than ever before,” said Dan Nixon, an expert on the “attention economy” — which models attention as a scarce resource in high demand. “Apps, alerts and notifications are locked in a constant battle to capture and monetise our gaze.”

It is estimated that the digital universe doubles in size every two years, with 2.5 quintillion bytes of data created every day. Much of that is now at our fingertips.

Daily screen time across devices — computers, laptops, tablets, mobile phones, televisions and game consoles — rose from 9 hours in 2012 to 11 hours in 2019, with time spent on mobile phones increasing by roughly two hours, according to a global study. (It received a further bump after the pandemic.)

Demand for our attention is rising, but our capacity to supply it is limited. Thales Teixeira, a former Harvard Business School professor, has highlighted this: his research shows how the cost of attention has risen, using the price of 1,000 impressions for TV adverts during the Super Bowl and US prime time as a proxy. Both have surged over time, particularly with the growth of internet usage, as competition for consumer attention has expanded across media.
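To make the proxy concrete, here is a minimal sketch of how a cost-per-thousand-impressions (CPM) figure is calculated. The spot prices and audience sizes below are illustrative assumptions, not figures from Teixeira’s research.

```python
# A minimal sketch of the attention-cost proxy: cost per 1,000 impressions (CPM).
# All prices and audience figures are illustrative assumptions, not real data.

def cpm(ad_price: float, audience: int) -> float:
    """Cost of reaching 1,000 viewers: spot price divided by audience in thousands."""
    return ad_price / (audience / 1_000)

# A rising spot price against a roughly flat audience means advertisers
# pay more for each unit of attention, which is the pattern the proxy captures.
for year, price, viewers in [
    (2000, 2_000_000, 90_000_000),   # hypothetical $2m spot, 90m viewers
    (2020, 5_500_000, 100_000_000),  # hypothetical $5.5m spot, 100m viewers
]:
    print(f"{year}: CPM = ${cpm(price, viewers):,.2f}")
# Output:
# 2000: CPM = $22.22
# 2020: CPM = $55.00
```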

Of course, rising screen time also means spending more time accessing enriching news, research and entertainment. But trying to absorb too much content has negative side-effects.

“Constant exposure to information and notifications can overwhelm our cognitive capacities,” said Mithu Storoni, author of Hyperefficient, a book about optimising our brains. “Flitting between stimuli reduces our attention span, and the overload can contribute to mental fatigue, impaired memory and increased stress.”

Indeed, there is a relationship between overload and brain health. High social media usage has been associated with higher levels of depression, particularly in younger cohorts. High screen time can also worsen symptoms of ADHD, and has been linked to a higher risk of dementia.

Digital technology also has an impact on the third element of “brain capital” — skills.

Aside from technological and AI-linked skills, employers surveyed by the WEF said creativity, resilience and analytical thinking were among the top competencies likely to grow in importance over the next five years.

These are also skills strained by pressures on brain capacity. Digital distractions can thwart creative thinking, and the stress and anxiety from information overload can sap resilience. (Nixon mentioned the “mindless grasping impulse” — reaching for your phone, for no particular reason — triggered by the dopamine hit we experience when accessing digital content.)

What about analytical thinking? Big data, machine learning and wider access to content have supported our research capabilities. Still, even basic skills appear to have atrophied over the past decade.

The OECD’s Survey of Adult Skills shows that, over the past decade, more developed economies have seen literacy proficiency decline than improve (even after controlling for demographic changes, such as immigration). As for numeracy, the picture is more mixed — but still worrying.

David Robson, a science writer and author of The Intelligence Trap, has some theories:

Various studies suggest that, after having risen for most of the 20th century, the average performance on intelligence tests has started to stabilise or even drop in many countries. This may reflect changes in the ways that we use our brains. For instance, we now use our smartphones for most calculations, so we don’t exercise some numerical skills as regularly. Vocabulary is also weakening, perhaps because of changes in people’s reading habits. 

Robson added that skills not captured in IQ tests, such as rationality and critical thinking, tend to be more strongly correlated with overall wellbeing. But these competencies are also under pressure.

Several studies highlight how news feed algorithms and clickbait can amplify bias by creating “online echo chambers” and spreading disinformation. Both have also been linked to rising political polarisation. In America, voters’ sentiment about the economy reliably flips based on their alignment with the president. The Gallup Economic Confidence Index highlights this, but also shows a general rise in polarisation over time. Greater exposure online to news affirming one’s position is a possible explanation.

Then there’s the “Google effect”, where we treat the search engine as a form of random-access memory and remember fewer easily searchable facts as a result.

All of these blunt our critical thinking, in part by exacerbating our innate cognitive biases. This isn’t new; these effects existed before the internet. But their scale and intensity are now inordinately greater. In this environment, it is easier to outsource decisions, consciously or subconsciously, and that has implications that warrant deeper consideration. (Researchers at the University of Cambridge recently warned that conversational AI agents could develop the ability to influence our “intentions” too.)

What’s the upshot? Wider access to information, global improvements in education and better nutrition have all boosted brain capital. But trends in brain health, rising demands on our attention and forces undermining our critical thinking are concerning.

The human mind is a resource that needs strengthening to support long-term wellbeing, growth and innovation, particularly as technology plays a greater role in our lives and economies. As the world focuses on piling trillions into artificial intelligence, it’s wise not to lose sight of the returns that come from investing in real intelligence.

Thoughts? Rebuttals? Message me at freelunch@ft.com or on X @tejparikh90.

Food for thought

Did the second world war help build the foundations for US biomedicine innovation? A new National Bureau of Economic Research working paper reckons so, highlighting the long-run effects that coordinated and application-oriented research can have on science and technology.

Recommended newsletters for you

Trade Secrets — A must-read on the changing face of international trade and globalisation. Sign up here

Unhedged — Robert Armstrong dissects the most important market trends and discusses how Wall Street’s best minds respond to them. Sign up here

