JARON LANIER, GLEN WEYL. AI is an Ideology, Not a Technology: At its core, "artificial intelligence" is a perilous belief that fails to recognize the agency of humans // Wired, March 15, 2020

AI is an Ideology, Not a Technology: Autonomous intelligence does not exist — neither ours nor that of machines — [Summary and retelling by Sergey Karelov] // Medium, March 23, 2020


Key theses:

[The fetishization of AI can lead to fatal consequences. Two refutations of AI as a worldview: the humanist argument and the pluralist argument.]

Computation is an essential technology, but the AI way of thinking about it can be murky and dysfunctional. <…>

You can reject the AI way of thinking for a variety of reasons. One is that you view people as having a special place in the world and being the ultimate source of value on which AIs ultimately depend. (That might be called a humanist objection.) Another view is that no intelligence, human or machine, is ever truly autonomous: Everything we accomplish depends on the social context established by other human beings who give meaning to what we wish to accomplish. (The pluralist objection.) Regardless of how one sees it, an understanding of AI focused on independence from—rather than interdependence with—humans misses most of the potential for software technology.

[The importance of human data labeling for algorithms]

In fact, as recent reporting has shown, China’s greatest advantage in AI is less surveillance than a vast shadow workforce actively labeling data fed into algorithms. Just as was the case with the relative failures of past hidden labor forces, these workers would become more productive if they could learn to understand and improve the information systems they feed into, and were recognized for this work, rather than being erased to maintain the “ignore the man behind the curtain” mirage that AI rests on. 

[Self-training AI is suitable only for a narrow class of problems: those that can be defined precisely rather than statistically or through ongoing measurements of reality.]

AI without human data is only possible for a narrow class of problems, the kind that can be defined precisely, not statistically, or based on ongoing measures of reality. Board games like chess and certain scientific and math problems are the usual examples, though even in these cases human teams using so-called AI resources usually outperform AI by itself. While self-trainable examples can be important, they are rare and not representative of real-world problems.

[AI is not a set of algorithms but a political and social philosophy, anti-humanist at its core]

“AI” is best understood as a political and social ideology rather than as a basket of algorithms. The core of the ideology is that a suite of technologies, designed by a small technical elite, can and should become autonomous from and eventually replace, rather than complement, not just individual humans but much of humanity. Given that any such replacement is a mirage, this ideology has strong resonances with other historical ideologies, such as technocracy and central-planning-based forms of socialism, which viewed as desirable or inevitable the replacement of most human judgement/agency with systems created by a small technical elite. It is thus not all that surprising that the Chinese Communist Party would find AI to be a welcome technological formulation of its own ideology.

[AI can serve as a tool of social creativity and liberation — or as an instrument for enslaving people and centralizing power in the hands of techno-authoritarian elites.]

As authoritarian governments try to compete against pluralistic technologies in the 21st century, they will inevitably face pressures to empower their own citizens to participate in creating technical systems, eroding the grip on power. On the other hand, an AI-driven cold war can only push both sides toward increasing centralization of power in a dysfunctional techno-authoritarian elite that stealthily stifles innovation. To paraphrase Edmund Burke, all that is necessary for the triumph of an AI-driven, automation-based dystopia is that liberal democracy accept it as inevitable.