In an iNET interview, Simon Johnson discusses #PowerAndProgress, a new book co-authored with Daron Acemoglu.
A thousand years of history and contemporary evidence make one thing clear. Progress depends on the choices we make about technology. New ways of organizing production and communication can either serve the narrow interests of an elite or become the foundation for widespread prosperity.
The wealth generated by technological improvements in agriculture during the European Middle Ages was captured by the nobility and used to build grand cathedrals while peasants remained on the edge of starvation. The first hundred years of industrialization in England delivered stagnant incomes for working people. And throughout the world today, digital technologies and artificial intelligence undermine jobs and democracy through excessive automation, massive data collection, and intrusive surveillance.
It doesn’t have to be this way. Power and Progress demonstrates that the path of technology was once—and may again be—brought under control. The tremendous computing advances of the last half century can become empowering and democratizing tools, but not if all major decisions remain in the hands of a few hubristic tech leaders.
With their breakthrough economic theory and manifesto for a better society, Acemoglu and Johnson provide the vision needed to reshape how we innovate and who really gains from technological advances.
Some highlights below:
Johnson: "I think first and most importantly, control over data is very clear. What has happened is that we have put a lot of our own information, our data, and our photographs on the internet, hoping to share them with friends and family. However, they have been acquired without our permission to train generative AI. That's a major problem that needs to be addressed. I think the second piece that's really quite salient is surveillance, and that's something that obviously predates AI. There has been plenty of surveillance building up, but we think it's really going to reach a new level of efficiency, which means squeezing workers. That is also something that needs to be prevented. Then there are also various forms of manipulation, as you mentioned just now. The ways that we as consumers allow ourselves to be manipulated by the people who have this data, who have the algorithms, and who are being pretty cynical about what they want us to do."
"We are also very worried about what generative AI will do. I wouldn't say anybody has fully established exactly how it impacts the organization of work. One thing that ChatGPT seems to be doing is taking away jobs for low-level people who are doing relatively simple tasks, or you could call them entry-level positions. There are quite a lot of those jobs, as you know, in India. In fact, that's India's big stepping stone into the global economy. I think that losing that rung in the ladder would be a really bad blow to India. While there might be an impact on manufacturing, which you pointed out earlier, and we might lose those labour-intensive textile jobs, for example, we may lose even more of those labour-intensive text jobs - the people who input text, the people who do medical records processing, the people who run call centres."
"I think today in 2023, we're grappling with at least echoes of what we saw in 2007-2008, but the echoes are not that strong at the moment, Rob. This is in part because the vision changed, the rules changed, and the behavior changed. Now on tech, I think it's very analogous: there is a vision of machine learning creating machine intelligence, which is, I think, a completely misleading term. But the idea is that you want to replace humans in production, in the service sector, everywhere in the economy, and they can do it. Sometimes it's not very effective. Daron Acemoglu and Pascual Restrepo coined the term 'so-so automation' for things like self-checkout kiosks at the grocery store. They don't boost productivity that much, but they do tilt the balance of power between the owners of the grocery store and the workers as a consequence. They're also popular, perhaps with analysts, so that technology does get adopted. I think we're grappling with another vision, Rob, that's become too predominant, too prevalent, and somewhat dangerous. It doesn't mean we're anti-tech. I'm not anti-finance. We need a financial sector. I don't want a financial sector that blows itself up. I don't want a tech sector that destroys millions or tens of millions of jobs without giving us an opportunity to build new jobs, new tasks, new things humans can do, in the spirit of transformations that innovation often brings."
Rob: And do you think that, how should I say, new digital technology has played a big role in the onset or the conduct of the Ukraine war?
Johnson: "Well, I think the situation with Russia's invasion of Ukraine, Rob, is very dangerous in many ways. But if you think about the technology: when we have developed technology in the past and intensely looked for malevolent applications - as during World War One, when the technology behind artificial fertilizer was turned to the production of poison gas by the same scientists - I think that sort of distortion of technology, focusing it on killing people, is very problematic. And there is always potential for more of that, particularly when technologically advanced countries are drawn into prolonged conflict. So, I think we really need a de-escalation. We need Russia to leave Ukraine, actually, and then we need a de-escalation around Russia. We need China to recognize that, and we should recognize that ourselves. If we can quiet down the world and push more of the technology into productive, peaceful purposes, everyone gains."
It's well worth watching the whole interview.