PostEverything Perspective
Fascism is back. Blame the Internet.
It’s polarizing our politics, making us ignore inconvenient facts and impeding real activism.
Some Americans ask: What is wrong with the Internet? Others ask: Can fascism return? These questions are the same question.
Despite all the happy talk about connecting people, the Internet has not spread liberty around the world. On the contrary, the world is less free, in part because of the Web. In 2005, when about a quarter of the world’s population was online, common sense held that more connectivity would mean more freedom. But while Mark Zuckerberg was calling connectivity a basic human right, the more traditional rights were in decline as the Internet advanced. According to Freedom House, every year since 2005 has seen a retreat in democracy and an advance of authoritarianism. The year 2017, when the Internet reached more than half the world’s population, was marked by Freedom House as particularly disastrous. Young people who came of age with the Internet care less about democracy and are more sympathetic to authoritarianism than any other generation.
It’s also telling that the Internet has become a weapon of choice for those who wish to spread authoritarianism. Russia’s president and its leading propagandist both cite a fascist philosopher who believed that factuality was meaningless. In 2016, Russian Twitter bots spread divisive messages designed to discourage some Americans from voting and encourage others to vote for Russia’s preferred presidential candidate, Donald Trump. Britain’s vote to leave the European Union that same year was substantially influenced by bots from beyond its borders. Germany’s democratic parties, by contrast, have agreed not to use bots during political campaigns. The only party to resist the idea was the extreme right Alternative für Deutschland — which was helped by Russia’s bots in last year’s elections.
Democracy arose as a method of government in a three-dimensional world, where interlocutors could be physically identified and the world could be discussed and verified. Modern democracy relies upon the notion of a “public space” where, even if we can no longer see all our fellow citizens and verify facts together, we have institutions such as science and journalism that can provide joint references for discussion and policy. The Internet breaks the line between the public and private by encouraging us to confuse our private desires with the actual state of affairs. This is a constant human tendency. But in assuming that the Internet would make us more rather than less rational, we have missed the obvious danger: that we can now allow our browsers to lead us into a world where everything we would like to believe is true.
We think of computers as “ours” and imagine that we are the rational ones, using computers as tools. For many of us, much of the time, this may be a disastrously self-flattering perspective. When we run a search or read a feed, we are encountering instead an entity that has run algorithms about our preferences and which presents a version of reality that suits us. Yes, people can also humor us, but not with the same heartless determination, and not with the same flawless and cumulative memory of our weakness. Traditionally we have thought of artificial intelligence as a kind of rival to our own intelligence, emerging in parallel. What is actually happening is not parallel development but interaction, in which entities that are not themselves intelligent can nevertheless make us stupid.
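To make the idea of “an entity that has run algorithms about our preferences” concrete, here is a minimal, invented Python sketch of preference-weighted feed ranking. It is not any platform’s actual algorithm; the item fields, the scoring formula and the weights are assumptions made up purely for illustration.

```python
from __future__ import annotations
from dataclasses import dataclass

# Toy sketch of preference-weighted ranking: items are ordered by how well
# they match what the user already likes and how emotionally charged they
# are, not by how accurate or important they are.

@dataclass
class Item:
    headline: str
    topics: set[str]   # what the item is about
    arousal: float     # 0..1, how emotionally charged it is

def score(item: Item, interests: dict[str, float]) -> float:
    # Reward overlap with existing interests, plus a bonus for emotional charge.
    affinity = sum(interests.get(topic, 0.0) for topic in item.topics)
    return affinity + 0.5 * item.arousal   # the 0.5 weight is arbitrary

def rank_feed(items: list[Item], interests: dict[str, float]) -> list[Item]:
    return sorted(items, key=lambda it: score(it, interests), reverse=True)

if __name__ == "__main__":
    interests = {"outrage_politics": 0.9, "local_news": 0.2}
    feed = rank_feed(
        [
            Item("City budget passes without incident", {"local_news"}, 0.1),
            Item("THEY are coming for YOU", {"outrage_politics"}, 0.95),
        ],
        interests,
    )
    for item in feed:
        print(item.headline)
```

Even this toy already “presents a version of reality that suits us”: the calmer, more verifiable item sinks to the bottom simply because it matches fewer prior preferences and carries less emotional charge.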
In the famous “Turing test,” designed to determine whether a computer program could convince a human that it was also a human, skeptical humans ask hard questions and consider the output. This is our model of the enlightened person, but it scarcely resembles how we deal with computers. Rather than testing their reason, we concede our own at the outset if we are made to feel good about ourselves. Tellingly, the first computer program to pass the Turing test imitated a psychoanalyst. It turned the tables: We were no longer testing it; it was manipulating us. We believe computers are trustworthy when they seem to care how we feel. We follow sites that amplify our emotions, outraging us or elating us, not asking ourselves whether they are designed to keep us online so that we see targeted ads — or, for that matter, used as weapons by foreigners to harm us.
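The psychoanalyst program alluded to is usually identified with ELIZA-style chatbots; whether or not that attribution is exact, the trick is easy to show. Below is a small, invented Python sketch of the reflect-the-user pattern, not the historical program’s code:

```python
import random
import re

# Illustrative ELIZA-style reflection: the "therapist" never reasons about
# anything, it just mirrors the user's own words back with apparent concern.

REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}
TEMPLATES = [
    "Why do you say {0}?",
    "How does {0} make you feel?",
    "Tell me more about {0}.",
]

def reflect(fragment: str) -> str:
    # Swap first-person words for second-person ones.
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(user_input: str) -> str:
    match = re.search(r"i (?:feel|think|am) (.+)", user_input, re.IGNORECASE)
    fragment = reflect(match.group(1)) if match else "that"
    return random.choice(TEMPLATES).format(fragment)

print(respond("I feel nobody listens to me"))
# e.g. "How does nobody listens to you make you feel?"
```

Nothing in the program understands the user; it only returns the user’s feelings to them, which is precisely why it feels trustworthy.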
Democracy depends upon a certain idea of truth: not the babel of our impulses, but an independent reality visible to all citizens. This must be a goal; it can never fully be achieved. Authoritarianism arises when this goal is openly abandoned, and people conflate the truth with what they want to hear. Then begins a politics of spectacle, where the best liars with the biggest megaphones win. Trump understands this very well. As a businessman he failed, but as a politician he succeeded because he understood how to beckon desire. By deliberately spreading unreality with modern technology, the daily tweet, he outrages some and elates others, eroding the very notion of a common world of facts.
In fascism, feeling is first. Fascists of the 1920s and 1930s wanted to undo the Enlightenment and appeal to people as members of a tribe, race or species. What mattered was a story of us and them that could begin a politics of conflict and combat. Fascists proposed that the world was run by conspirators whose mysterious hold must be broken by violence. This could be achieved by a leader (Führer, Duce) who spoke directly to and for the people, without laws and institutions. Totalitarianism meant domination of the whole self, without respect for private and public.
Our memory of the 20th century grew hazy just as we began the plunge into cyberspace, which is perhaps why we did not notice certain alarming features of the experience. The Internet has revived fascist habits of mind. Smartphones and news feeds structure attention so that we cannot think straight. Their programmers deliberately employ psychological tactics such as intermittent reinforcement to keep us online rather than thinking. Is pulling your phone out 80 times a day really a free choice? Companies know that interruptions to flow are more likely to get a response, which is why the experience of a smartphone or a social platform is so jarring. Once attention is gained, it is kept by deliberately bottomless feeds that reinforce what we like and think. Researchers have found that users of the Internet believe they know more but are in fact less able to recall what they think they know.
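The “intermittent reinforcement” mentioned above is a variable-ratio reward schedule. Here is a bare Python sketch of that pattern, again an illustration rather than any product’s real code; the 30 percent hit rate is an arbitrary assumption:

```python
import random

def refresh_feed(hit_probability: float = 0.3) -> bool:
    """Return True if this refresh 'pays off' with something novel."""
    # Most checks yield nothing; an unpredictable few are rewarded, and the
    # unpredictability itself is what keeps the checking behavior going.
    return random.random() < hit_probability

checks = 80   # the "80 times a day" figure cited above
rewards = sum(refresh_feed() for _ in range(checks))
print(f"{checks} checks, {rewards} of them rewarded at unpredictable intervals")
```

A fixed schedule (a reward every tenth check, say) would be easy to ignore; it is the randomness that makes the eightieth pull of the phone feel as urgent as the first.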
The fascist psychology of the Internet had obvious political possibilities, some of which have now been consciously exploited. Facebook set the standard in providing fiction as fact before our last election. Its feedback loops and those of other platforms exploit our clicking habits to draw us toward a world of “us and them.” Social and political bots, many of them Russian, exploited our gullibility to deepen our divisions and spread conspiracy theories. We found it normal to read the email of other people, breaking down the barrier between private and public. And the winning presidential candidate used and uses Twitter to emote with his supporters without mediation.
To be sure, Fascism 2.0 differs from the original. Traditional fascists wanted to conquer both territories and selves; the Internet will settle for your soul. The racist oligarchies that are emerging behind the Internet today want you on the couch, outraged or elated, it doesn’t matter which, so long as you are dissipated at the end of the day. They want society to be polarized, believing in virtual enemies that are inside the gate, rather than actually marching or acting in the physical world. Polarization directs Americans at other Americans, or rather at the Internet caricatures of other Americans, rather than at fundamental problems such as wealth inequality or foreign interference in democratic elections. The Internet creates a sense of “us and them” inside a country, and an experience that feels like politics but involves no actual policy.
By the same logic, the Internet can indeed be used for progressive purposes, as when an activist calls for a protest in Ukraine or Egypt or when public-school teachers use social media to organize strikes in a state where spending on education has fallen by 28 percent in the past decade. The crucial point: In such cases people are using the Internet against itself, to get their bodies into the real world. The reaction of leaders such as Trump and Russian President Vladimir Putin is telling: They immediately call real protesters paid actors or agents of foreign powers, trying to wrap the human world back inside fiction. In the age of the Internet, stretching one’s legs with strangers is a frightening political act.
The most disturbing resemblance between Fascism 1.0 and Fascism 2.0 is authentic popularity. Some Americans want to punish Russia. Others want to punish Silicon Valley. Both impulses are reasonable. But both dodge the fundamental issue. It is we who choose to be fooled, much as Europeans did in the 1930s. Why should the trolls, bots and algorithms respect us when we do not respect ourselves? Fascism played on loneliness and gullibility. That’s a lesson we can learn — but not from machines. We can fix the Internet only by taking an honest look at ourselves.