Tuesday, July 1, 2014
A Facebook "social experiment" to manipulate your feelings
It is in our nature to be affected by the emotions of those around us, whether we are cognizant of it or not. Words, attitudes, appearance, and body language offer clues as to whether someone is happy or sad, and the human psyche more often than not leads us to empathize with those around us. Facebook, however, demonstrated "emotional contagion" strictly through text content. Those who saw fewer positive posts themselves wrote fewer positive posts, and those who saw fewer negative posts in turn produced fewer negative posts. Put loosely, users who saw fewer negative posts were slightly happier.
Honestly, I'm not at all surprised, and at first glance I am not terribly disturbed by it. It's a matter of remembering who is the customer. I'm not paying Facebook for a service, so I am not the customer ... I am the product being consumed. Since I understand that, I can keep that in mind when deciding what to share and what not to share. In truth, I think Facebook did the world a service by revealing in a controversial way what all media do. As is stated in the study report, “Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging.” So yes, Facebook manipulates news feeds because the business is best served when users (the product) are actively engaged. It's not all that different from traditional media - no one would deny that all media pick and choose what to report.
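To make the mechanism concrete, here is a minimal sketch of how a news feed could be skewed toward one emotional tone by silently omitting a fraction of posts of the opposite tone. This is purely illustrative and assumes nothing about Facebook's actual ranking code; the Post class, the sentiment score, and the omit_probability parameter are hypothetical names invented for the example.

```python
# Hypothetical illustration only -- NOT Facebook's actual ranking algorithm.
# Shows how a feed could be skewed by randomly dropping posts of one sentiment.
from dataclasses import dataclass
import random

@dataclass
class Post:
    text: str
    sentiment: float  # assumed score: -1.0 (very negative) to +1.0 (very positive)

def filtered_feed(posts, suppress="positive", omit_probability=0.1):
    """Return the feed with a fraction of the targeted-sentiment posts omitted."""
    kept = []
    for post in posts:
        is_target = post.sentiment > 0 if suppress == "positive" else post.sentiment < 0
        if is_target and random.random() < omit_probability:
            continue  # silently drop this post from the user's view
        kept.append(post)
    return kept

# Example: a feed with fewer positive posts, loosely mirroring one experimental condition
feed = [Post("Great day at the beach!", 0.8), Post("Stuck in traffic again", -0.6)]
print([p.text for p in filtered_feed(feed, suppress="positive", omit_probability=0.5)])
```

The point of the sketch is simply that the user never sees what was omitted, which is exactly why this kind of manipulation is invisible from the outside.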
I do not find it in the least bit surprising that Facebook would try such an experiment. What is disturbing, though, is thinking about how this idea could be used in some really unnerving ways. Facebook has somewhere around a billion users from all around the globe. It's not too much of a stretch to think certain three-letter-acronym agencies could compel the site to use this capability with the express goal of inducing dissatisfaction with a particular government or (if you are a conspiracy theorist) with a candidate for office running against an entrenched incumbent.
Before running wild with such ideas though, it is worth looking at the actual results of the experiment. Yes, Facebook was able to demonstrate a change in the content of posts by the manipulated users - but it was on the order of a tenth of a percent. Statistically measurable but hardly overwhelming.
TL;DR? Facebook performed a research experiment that borders on creepy, proved that they could (minutely) manipulate user emotional state, and reminded users once again that if you are not paying, you are not the customer and should keep that in mind.
Do you have something to add? A question you'd like answered? Think I'm out of my mind? Join the conversation below, reach out by email at david (at) securityforrealpeople.com, or hit me up on Twitter at @dnlongen