Is AI still going to be totalitarian?

That is the topic of my latest Bloomberg column, here is part of the argument:

One of the fears with GPT-3 is that students will use it to generate realistic-sounding term papers. That may well be a problem (could the oral exam make a comeback?). But it also shows how the technology can encourage decentralized idea production and the subversion of authority. It is the opposite of centralized control of everything.

Perhaps the biggest political fear is that AI supports vast amounts of surveillance. Governments use facial and gait surveillance to trace people’s movements in public, for example. That is a valid concern, but it is not clear that AI has given today’s major autocratic governments such a big boost.

Russia, for one, was supposed to be such an impressive cyberpower, able to cripple entire societies with its cyberattacks. Maybe Russia has yet to show what it is capable of, but as the Ukraine war proceeds, its cybercapabilities seem less scary. (Cyber is not synonymous with AI, but both are advanced and interrelated technologies that Russia seems to be failing at.)

Russia has proved it can use a lot of heavy artillery in a very destructive fashion. It has not shown it can mobilize AI technologies to deploy very effective forms of targeted warfare. It seems once again that brute force, not advanced technology, is the friend of autocracy.

The No. 1 autocratic AI power, of course, is China, but here too the course of events is uncertain. The Chinese government uses an impressive array of AI technologies to monitor its population, but to what end have the Chinese turned these technologies?

China has been doubling down on its Covid Zero policy, at great expense to the Chinese economy. These policies are possible only because the Chinese state had such advanced tracking and monitoring capabilities in the first place. At first those technologies were used to limit the spread of Covid, often quite effectively. But the current Covid strains are harder to control and it is difficult to see exactly what the Chinese endgame looks like. China has taken an AI asset and turned it into an AI liability.

That flip should not come as a surprise, considering the benefits and costs of autocracies. Autocracy typically is a “high variance” form of government: It can have major successes, such as the building of Chinese infrastructure, but the relative absence of checks and balances means that major failures are also likely, in this case the persistence of Covid Zero policy.

In essence, the advent of advanced AI raises the stakes. But if autocracy is a high-variance form of government, raising the stakes is risky.

Here is Henry Farrell on similar issues.
