Aside from the fact that he’s the love of my life and I enjoy discussing things I’m working on with him, I thought he could offer a welcome point of view on the filter bubble topic.
Brad has worked as a .NET developer for the past six years and has a lot of insight into this world of coding, data mining and personalization on the Internet.
We decided to conduct a simple experiment.
I went to Google on my laptop computer, and he went to Google on his cell phone.
We typed “Drake” into the search box, hoping to find results about the rapper named Drake.
The following are our top five results:
Gheni:
1. Stereogum story on Drake’s new song “Going Home.”
2. Vampire Weekend’s Ezra Koenig publishes bizarre review of Drake’s “Nothing Was the Same” album
3. Drake | October's Very Own
4. Drake (rapper) - Wikipedia, the free encyclopedia
5. Us Weekly story about Selena Gomez crushing on Drake
Brad:
1. Stereogum story on Drake’s new song “Going Home.”
2. Us Weekly story about Selena Gomez crushing on Drake
3. Drake’s official website
4. Drake University (a university in Des Moines, IA, with no affiliation to the rapper)
5. Drake (rapper) - Wikipedia, the free encyclopedia
I am not sure what algorithms Google used to determine which results to deliver to each of us, but, much like aspects of our individual personalities, our results differed.
Eli Pariser, author of “The Filter Bubble: What the Internet Is Hiding from You,” explains that differing results like ours are not a unique phenomenon.
That is what happens every day in this new world of filter bubbles.
Pariser explained the filter bubble as follows:
“The new generation of Internet filters looks at the things you seem to like—the actual things you’ve done, or the things people like you like—and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information” (Pariser, 2011, p. 9).
These filter bubbles are an extension of the concept of personalization, in which our online activity is personalized to our individual preferences.
Pariser argues the presence of filter bubbles introduces three new dynamics:
1. “You’re alone in it” (Pariser, 2011, p. 9).
2. “The filter bubble is invisible. Most viewers of conservative or liberal news sources know that they’re going to a station curated to serve a particular political viewpoint. But Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing” (Pariser, 2011, p. 10).
3. “You don’t choose to enter the bubble…They come to you—and because they drive up profits for the websites that use them, they’ll become harder and harder to avoid” (Pariser, 2011, p. 10).

“The filter bubble’s costs are both personal and cultural” (Pariser, 2011, p. 14).
While filter bubbles insulate us within the information we already take interest in, they also keep out new information that may or may not interest us.
Unfortunately or fortunately, depending on your personal opinion, these virtual walls could be catastrophic to personal educational growth, to the health of our public spheres and potentially to democracy in the United States and beyond.
Pariser said “the race to know as much as possible about you has become the central battle of the era for Internet giants like Google, Facebook, Apple and Microsoft” (Pariser, 2011).
Really, most websites have joined in on the “fun” (Pariser, 2011, p.7).
The sites all take different approaches to acquiring your information, but most methodologies involve tracking cookies and big data.
Google uses click signals, meaning it records your click history and uses those records to infer what you might like to see or read (Pariser, 2011, p. 35).
Meanwhile, Facebook is less subtle, asking outright for information about your likes and dislikes (Pariser, 2011, p. 35).
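To make the idea of click signals concrete, here is a minimal, hypothetical sketch of how a click history might be turned into a topic-preference profile. The log format, topic labels and simple counting are my own illustration, not Google’s actual (and proprietary) method:

```python
from collections import Counter

# Hypothetical click log of (query, topic of the clicked result).
click_history = [
    ("drake", "music"),
    ("drake", "music"),
    ("drake album", "music"),
    ("drake university", "education"),
]

def infer_preferences(history):
    """Turn a raw click history into a ranked topic-preference profile."""
    topic_counts = Counter(topic for _, topic in history)
    total = sum(topic_counts.values())
    return {topic: count / total for topic, count in topic_counts.items()}

print(infer_preferences(click_history))
# {'music': 0.75, 'education': 0.25}
# A profile like this would nudge "Drake" results toward the rapper,
# while a prospective student's profile would nudge them toward the university.
```

A profile built this way would help explain why Brad and I saw different orderings for the exact same query.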
I found Pariser’s explanation of how Facebook operates its News Feed fascinating, as I’ve always wondered how it decides whose status updates I see and how often I see them. Facebook uses an algorithm called EdgeRank.
EdgeRank operates on three factors (a rough code sketch follows the list):
1. Affinity: “The friendlier you are with someone—as determined by the amount of time you spend interacting and checking out his or her profile” (Pariser, 2011, pp. 37-38).
2. Content weight: “The relative weight of that type of content—relationship status updates are weighted very highly” (Pariser, 2011, pp. 37-38).
3. Time: recently posted items are weighted over older ones (Pariser, 2011, pp. 37-38).
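Pariser’s three factors suggest a simple multiplicative score. Here is a minimal sketch under that assumption; the function name, example weights and 24-hour half-life are invented for illustration, since Facebook’s real formula was never public:

```python
import time

def edgerank_score(affinity, content_weight, created_at,
                   now=None, half_life_hours=24.0):
    """Toy EdgeRank-style score: affinity x content weight x time decay.

    affinity       -- closeness between viewer and poster (0.0 to 1.0)
    content_weight -- weight of the content type (e.g., a relationship
                      status update counts more than a plain link)
    created_at     -- Unix timestamp of the post
    """
    now = time.time() if now is None else now
    age_hours = max(0.0, (now - created_at) / 3600.0)
    time_decay = 0.5 ** (age_hours / half_life_hours)  # newer posts score higher
    return affinity * content_weight * time_decay

now = time.time()
# A close friend's hour-old relationship update...
print(edgerank_score(affinity=0.9, content_weight=2.0,
                     created_at=now - 3600, now=now))   # ~1.75
# ...outranks an acquaintance's day-old link post.
print(edgerank_score(affinity=0.2, content_weight=1.0,
                     created_at=now - 86400, now=now))  # 0.10
```

Under a model like this, the people you interact with most crowd everyone else out of your feed, which is the bubble in miniature.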
Brad knows all too well about how exposure to different posts, based on this ranking system, can evoke a range of emotions.
Brad is Facebook friends with a guy he does not know offline.
The guy added Brad as a friend a couple of years ago after they "met" each other in a Facebook group for people who all drove the same type of car.
The guy, who holds very conservative political ideals, often posts negative and factually incorrect information about President Barack Obama.
“He bashes Obama about stuff that Obama has no control over,” said Brad. “I am neither for nor against Obama. I don’t know enough about his policies.”
He continued, “Until somebody posts something that seems extremely ignorant, then I research the topic so I can address their stupidity.”
When I asked why he does not just delete the guy, he responded, “I tend not to delete people unless they upset me with something really stupid. I like to educate people.”
In his own way, he’s fighting the filter bubble, which provides “less room for the chance encounters that bring insight and learning,” by keeping some diversity of opinion in his social media life (Pariser, 2011, p. 15).
As a .NET developer and coder, Brad said filter bubbles can begin with the coding itself, but it’s rare.
“If a person knows what they’re looking at, they should come to the same conclusion. Like a math problem, they can approach it in different ways but still come to the same answer,” said Brad.
There are instances where factors such as gender can create differences, though.
He gave the example of the new Grand Theft Auto V video game, particularly its strip club scenes.
“Typically, what woman do you know who wants to program the strip club portion of the game?” asked Brad. “They may not have been to a strip club. Those are mostly for men. Nor do I want a woman to do it unless she goes and does research on it.”
Despite his sexist answer, I admit he might have a point.
Our personal differences matter and add to the creation of personalized filter bubbles.
Many media companies have already jumped on the filter bubble bandwagon, enabling our desire for personalization.
“Las Ultimas Noticias, a major newspaper in Chile, began basing its content entirely on what readers clicked on in 2004: Stories with lots of clicks got follow-ups, and stories with no clicks got killed. The reporters don’t have beats anymore—they just try to gin up stories that will get clicks” (Pariser, 2011, p. 71).
Also, Yahoo’s Upshot blog transitioned to an operating procedure where “a team of editors mine the data produced by streams of search queries to see what terms people are interested in, in real time. Then they produce articles responsive to those queries” (Pariser, 2011, p. 71).
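That click-driven workflow boils down to a blunt triage rule. Here is a toy sketch; the threshold, data and function are invented for illustration, not taken from either newsroom:

```python
def triage_stories(stories, follow_up_threshold=1000):
    """Split stories into follow-ups and kills using click counts alone."""
    follow_ups = [s for s in stories if s["clicks"] >= follow_up_threshold]
    killed = [s for s in stories if s["clicks"] < follow_up_threshold]
    return follow_ups, killed

stories = [
    {"headline": "Celebrity scandal", "clicks": 5400},
    {"headline": "City budget hearing", "clicks": 120},
]
follow_ups, killed = triage_stories(stories)
# The scandal earns a follow-up; the budget hearing gets killed,
# no matter how much civic weight it carries.
```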
Pariser said he sees the future going as follows:
1. “The cost of producing and distributing media of all kinds will continue to fall closer and closer to zero” (Pariser, 2011, pp. 51-52).
2. “We’ll be deluged with choices of what to pay attention to and we’ll continue to suffer from attention crash… We’ll rely ever more heavily on human and software curators to determine what news we should consume” (Pariser, 2011, pp. 51-52).
3. “We’ll rely on a mix of nonprofessional editors and software code to figure out what to watch, read and see. The code will draw heavily on the power of personalization and displace professional human editors” (Pariser, 2011, pp. 51-52).
While I agree with Pariser’s forecasts, I find them troubling.
I wonder if people will become more ignorant once they are shut off, willingly or not, from diverse opinions.
Furthermore, the media’s future could prove problematic.
While I am all for giving media consumers stories they want to read, I am equally adamant about giving them stories they don’t think or know they need exposure to.
The media’s job is to innovate, advocate and educate.
This push for personalization endangers this trinity of responsibility.
I was also disturbed to hear Pariser’s story about the advertar.
I do not want bots friending or following me just to gain personal information about me.
Lastly, I was even more disturbed to learn from a new Microsoft television commercial for Outlook that Google mines our personal emails to target us with advertisements and spam.
Outlook, which claims to respect its users’ privacy, says in the commercial that more information can be found at www.scroogled.com.
I offer these questions for discussion:
1. What are your thoughts on the news media’s usage of personalization as a survival method? What do you think is next on the frontier?
2. In this world of personalization and filter bubbles, do you think social media sites like Twitter or Facebook will ever create a feature allowing users to see who views their content, regardless of whether those viewers comment on or otherwise interact with a post? The rationale might be that the only people who show up in your News Feed are people who have viewed or interacted with your posts, and therefore users might want to interact only with them going forward.
3. How can filter bubbles assist a democracy?
I hope you were not turned off by the overload of stories about Brad in this post, but he’s part of my personal filter bubble. Also, he really is a genius about all things related to computer coding. As always, I love healthy discussions, so if you’d like to talk about any of my discussion questions or anything else related to filter bubbles, feel free to contact me! Thanks for reading!