Slashdot reader SysEngineer shared this report from the Guardian:
AI tools could be used to manipulate online audiences into making decisions — ranging from what to buy to who to vote for — according to researchers at the University of Cambridge. The paper highlights an emerging new marketplace for “digital signals of intent” — known as the “intention economy” — where AI assistants understand, forecast and manipulate human intentions and sell that information on to companies who can profit from it. The intention economy is touted by researchers at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) as a successor to the attention economy, where social networks keep users hooked on their platforms and serve them adverts. The intention economy involves AI-savvy tech companies selling what they know about your motivations, from plans for a stay in a hotel to opinions on a political candidate, to the highest bidder…
The study claims that large language models (LLMs), the technology that underpins AI tools such as the ChatGPT chatbot, will be used to “anticipate and steer” users based on “intentional, behavioural and psychological data”… Advertisers will be able to use generative AI tools to create bespoke online ads, the report claims… AI models will be able to tweak their outputs in response to “streams of incoming user-generated data”, the study added, citing research showing that models can infer personal information through workaday exchanges and even “steer” conversations in order to gain more personal information.
The article includes this quote from Dr. Jonnie Penn, an historian of technology at LCFI: “Unless regulated, the intention economy will treat your motivations as the new currency. It will be a gold rush for those who target, steer and sell human intentions.
“We should start to consider the likely impact such a marketplace would have on human aspirations, including free and fair elections, a free press and fair market competition, before we become victims of its unintended consequences.”