Why we should be afraid of our online identities: The Filter Bubble (Excerpt)


Google’s filtering systems, for example, rely heavily on Web history and what you click on (click signals) to infer what you like and dislike. These clicks often happen in an entirely private context: The assumption is that searches for “intestinal gas” and celebrity gossip Web sites are between you and your browser. You might behave differently if you thought other people were going to see your searches. But it’s that behavior that determines what content you see in Google News, what ads Google displays—what determines, in other words, Google’s theory of you.

The basis for Facebook’s personalization is entirely different. While Facebook undoubtedly tracks clicks, its primary way of thinking about your identity is to look at what you share and with whom you interact. That’s a whole different kettle of data from Google’s: There are plenty of prurient, vain, and embarrassing things we click on that we’d be reluctant to share with all of our friends in a status update. And the reverse is true, too. I’ll cop to sometimes sharing links I’ve barely read—the long investigative piece on the reconstruction of Haiti, the bold political headline—because I like the way it makes me appear to others. The Google self and the Facebook self, in other words, are pretty different people. There’s a big difference between “you are what you click” and “you are what you share.”
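The split between "you are what you click" and "you are what you share" can be illustrated with a toy sketch. Everything below (the topic names, the event logs, the frequency-counting "profile") is invented for illustration; it is not how either company actually models identity, only a minimal way to see how two signal sources can produce two different selves from the same person:

```python
from collections import Counter

# Invented event logs: each entry is (topic, audience).
# Private clicks include the things we read but would not broadcast.
clicks = [
    ("celebrity-gossip", "private"),
    ("celebrity-gossip", "private"),
    ("health-questions", "private"),
    ("politics", "private"),
]
# Public shares skew toward what we want others to see of us.
shares = [
    ("politics", "public"),
    ("long-form-journalism", "public"),
]

def profile(events):
    """Rank topics by raw frequency: a crude stand-in for a personalization signal."""
    return Counter(topic for topic, _audience in events)

click_self = profile(clicks)   # the "Google self", inferred from clicks
share_self = profile(shares)   # the "Facebook self", inferred from shares

# The two profiles disagree about what this person "is interested in":
# celebrity gossip dominates the click self but is absent from the share self.
print(click_self.most_common(1))
print(share_self.most_common(1))
```

The point of the sketch is that both profiles are built by the same counting logic; the divergence comes entirely from which behavior is observed, which is Pariser's argument in miniature.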

Both are pretty poor representations of who we are, in part because there is no one set of data that describes who we are. “Information about our property, our professions, our purchases, our finances, and our medical history does not tell the whole story,” writes privacy expert Daniel Solove. “We are more than the bits of data we give off as we go about our lives.”

But the one-identity problem illustrates one of the dangers of turning over your most personal details to companies who have a skewed view of what identity is. Maintaining separate identity zones is a ritual that helps us deal with the demands of different roles and communities. And something’s lost when, at the end of the day, everything inside your filter bubble looks roughly the same. Your bacchanalian self comes knocking at work; your work anxieties plague you on a night out.

Excerpt from: Eli Pariser, “The Filter Bubble.”

Why Artificial Intelligence cannot compete with Human Intelligence


It’s become a bit in vogue to pick on the human brain. We’re “predictably irrational,” in the words of behavioral economist Dan Ariely’s bestselling book. Stumbling on Happiness author Dan Gilbert presents volumes of data to demonstrate that we’re terrible at figuring out what makes us happy. Like audience members at a magic show, we’re easily conned, manipulated, and misdirected.

All of this is true. But as Being Wrong author Kathryn Schulz points out, it’s only one part of the story. Human beings may be a walking bundle of miscalculations, contradictions, and irrationalities, but we’re built that way for a reason: The same cognitive processes that lead us down the road to error and tragedy are the root of our intelligence and our ability to cope with and survive in a changing world. We pay attention to our mental processes when they fail, but that distracts us from the fact that most of the time, our brains do amazingly well.

The mechanism for this is a cognitive balancing act. Without our ever thinking about it, our brains tread a tightrope between learning too much from the past and incorporating too much new information from the present. The ability to walk this line—to adjust to the demands of different environments and modalities—is one of human cognition’s most astonishing traits. Artificial intelligence has yet to come anywhere close.

Excerpt from: Eli Pariser, “The Filter Bubble.”

Experts are wrong more often than laypeople

“Philip Tetlock, a political scientist, found similar results when he invited a variety of academics and pundits into his office and asked them to make predictions about the future in their areas of expertise. Would the Soviet Union fall in the next ten years? In what year would the U.S. economy start growing again? For ten years, Tetlock kept asking these questions. He asked them not only of experts, but also of folks he’d brought in off the street—plumbers and schoolteachers with no special expertise in politics or history. When he finally compiled the results, even he was surprised. It wasn’t just that the normal folks’ predictions beat the experts’. The experts’ predictions weren’t even close.

Why? Experts have a lot invested in the theories they’ve developed to explain the world. And after a few years of working on them, they tend to see them everywhere. For example, bullish stock analysts banking on rosy financial scenarios were unable to identify the housing bubble that nearly bankrupted the economy—even though the trends that drove it were pretty clear to anyone looking. It’s not just that experts are vulnerable to confirmation bias—it’s that they’re especially vulnerable to it.”

Excerpt from: Eli Pariser, “The Filter Bubble.”

The Internet makes the educated mis-educated


What’s perplexing is that since the election, the percentage of Americans who hold that belief (Obama is a Muslim) has nearly doubled, and the increase, according to data collected by the Pew Charitable Trusts, has been greatest among people who are college educated. People with some college education were more likely in some cases to believe the story than people with none—a strange state of affairs.

Why? According to the New Republic’s Jon Chait, the answer lies with the media: “Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated.”

Excerpt from: Eli Pariser, “The Filter Bubble.”