After (finally) finishing my entry on more twitter hatin’ and conflatin’, I seem to be in technology/social media mode. As I start to think more about blogs and social media in relation to ethics, moral selfhood and care of the self, here are a few sources that might be helpful:
1. Jonathan Franzen. Liking Is for Cowards. Go for What Hurts (also known as: "Technology Provides an Alternative to Love")
One key argument he makes is that internet technology (ex. the "like" button on Facebook) contributes to our narcissism and our refusal to move outside of ourselves to actually connect with (and love) others. When we "like" something or friend someone, we just invite it into "our private hall of flattering mirrors." I want to come back to Franzen's claims in his essay and really think them through, especially what they mean for the Self. I'm not sure how or if it connects, but I want to revisit Chela Sandoval's discussion of love in Methodology of the Oppressed and read it beside Franzen's assessment of love.
2. Natasha Singer. The Trouble with the Echo Chamber Online
Speaking of insular selves who devote too much energy to reading and thinking about what they already like, Singer discusses the problems with the personalization of the web. Here's a relevant passage:
But, in an effort to single out users for tailored recommendations or advertisements, personalization tends to sort people into categories that may limit their options. It is a system that cocoons users, diminishing the kind of exposure to opposing viewpoints necessary for a healthy democracy, says Jaron Lanier, a computer scientist and the author of “You Are Not a Gadget.”
I was excited to see this article because I have been known, quite frequently, to rail against the streamlining of my experience, especially when it comes to Netflix and how it recommends films based on my daughter's excessive watching of Barney, Horseland, or Suite Life on Deck.
3. Eli Pariser. The Filter Bubble: What the Internet Is Hiding from You
In this book Pariser, the former executive director of MoveOn.org, discusses the dangers of web personalization and the filters that search engines (like Google) and social media sites (like Facebook) use to streamline our internet experience. Here's his description of the filter bubble:
The basic code at the heart of the new Internet is pretty simple. The new generation of Internet filters looks at the things you seem to like—the actual things you’ve done, or the things people like you like—and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information.
Of course, to some extent we’ve always consumed media that appealed to our interests and avocations and ignored much of the rest. But the filter bubble introduces three dynamics we’ve never dealt with before.
First, you’re alone in it. A cable channel that caters to a narrow interest (say, golf) has other viewers with whom you share a frame of reference. But you’re the only person in your bubble. In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart.
Second, the filter bubble is invisible. Most viewers of conservative or liberal news sources know that they’re going to a station curated to serve a particular political viewpoint. But Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong—and you might not even know it’s making assumptions about you in the first place. My friend who got more investment-oriented information about BP still has no idea why that was the case—she’s not a stockbroker. Because you haven’t chosen the criteria by which sites filter information in and out, it’s easy to imagine that the information that comes through a filter bubble is unbiased, objective, true. But it’s not. In fact, from within the bubble, it’s nearly impossible to see how biased it is.
Finally, you don’t choose to enter the bubble. When you turn on Fox News or read The Nation, you’re making a decision about what kind of filter to use to make sense of the world. It’s an active process, and like putting on a pair of tinted glasses, you can guess how the editors’ leaning shapes your perception. You don’t make the same kind of choice with personalized filters. They come to you—and because they drive up profits for the Web sites that use them, they’ll become harder and harder to avoid.
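To make Pariser's "prediction engine" a little more concrete, here is a rough, hypothetical sketch of the logic he describes: score items you haven't seen by what "people like you" have liked. The users, items, and the simple overlap measure are all my own invented stand-ins, not anything drawn from his book or from a real recommender system.

```python
from collections import defaultdict

# Hypothetical browsing history: user -> set of items they have "liked".
# (Invented for illustration; no real system works from a table this small.)
history = {
    "you":   {"golf", "finance"},
    "alice": {"golf", "finance", "bp_investment_news"},
    "bob":   {"barney", "horseland", "suite_life"},
}

def similarity(a, b):
    """Jaccard overlap between two users' liked-item sets."""
    return len(a & b) / len(a | b)

def recommend(user, history):
    """Rank unseen items by how much 'people like you' have liked them."""
    scores = defaultdict(float)
    for other, items in history.items():
        if other == user:
            continue
        sim = similarity(history[user], items)
        if sim == 0:
            continue  # users with no overlap contribute nothing
        for item in items - history[user]:
            scores[item] += sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you", history))
# -> ['bp_investment_news']: the engine extrapolates from "your" overlap
# with "alice", so finance-flavored results surface while "bob"'s never do.
```

Even this toy version makes Pariser's point about invisibility concrete: nothing in the output tells "you" that the ranking was driven by resemblance to someone else, which is exactly why his friend got investment-oriented BP results without knowing why.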
You can read an excerpt of the book here. You can also watch a Democracy Now! interview with Pariser here.