Is Facebook Using Emotional Data To Target Ads?
Posted by Du-Ann Daniel, Wednesday, 10 May 2017
Facebook has denied the allegations, but let’s delve a little deeper into the issue at hand.
Back in 2014, Facebook came under serious fire for what is now known as its emotional manipulation study. The study was conducted without notifying users or obtaining their consent, and it aimed to find out whether a user's emotional state could be manipulated by changing what content appeared in their timeline. People were understandably annoyed, both because they were never informed and because the research was then published in a paper titled "Experimental Evidence Of Massive-Scale Emotional Contagion Through Social Networks".
The experiment aimed to confirm whether emotional contagion can in fact pass from person to person even without physical contact (i.e. over the internet). Specifically, it tested whether people shown more negative posts went on to produce fewer 'happy' posts, and whether people shown more 'happy' posts produced fewer negative ones.
Now the social network is back in the line of fire over allegations that it claimed it could detect when teen users as young as 14 were feeling emotionally vulnerable, identifying moments when they felt "stressed", "defeated", "overwhelmed", "anxious", "nervous", "stupid", "silly", "useless", or like a "failure". This came to light after a presentation intended for advertisers was leaked.
The issue was originally reported by the Australian newspaper The Australian at the beginning of May, and Facebook has denied the allegations on all fronts. A Facebook spokesperson described The Australian's report as misleading and denied that Facebook offers advertisers any tools to target users based on their emotional state.
But How Could Facebook Know What I’m Feeling?
One method that came to light in 2014 was analysis of the words in post content. Each word in your posts can be given an emotional score, and summing those scores allows the social network to estimate your emotional state.
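To make the idea concrete, here is a minimal illustrative sketch of how lexicon-based word scoring works in general. This is not Facebook's actual system; the word lists and weights below are invented purely for illustration.

```python
# Illustrative lexicon-based sentiment scoring -- NOT Facebook's actual method.
# The words and weights here are invented examples.
LEXICON = {
    "happy": 1.0, "great": 0.8, "love": 1.0,       # 'positive' words
    "stressed": -1.0, "anxious": -0.9, "useless": -1.0,  # 'negative' words
}

def emotional_score(post: str) -> float:
    """Sum the scores of known words; a negative total suggests a negative post."""
    return sum(LEXICON.get(word, 0.0) for word in post.lower().split())

print(emotional_score("feeling stressed and anxious today"))  # -1.9
print(emotional_score("so happy I love this"))                # 2.0
```

Real sentiment-analysis systems are far more sophisticated (handling negation, context, and sarcasm), but the basic principle of scoring words and aggregating the result is the same.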
Another possible method is the information gathered from the reaction icons available on every post. With these reaction buttons we are telling Facebook exactly how we feel about a post, giving it yet more data on our personalities.
So What Do We Believe?
It's hard to say which side of the story we should trust. On the one hand, Facebook has engaged in this type of manipulative research before; on the other hand, it could all be a case of false accusations. Notably, Facebook has not denied collecting data along these lines, only that the data, which it claims is anonymous, was ever used to target ads.
After the initial denial, Facebook did apologise and said studying users’ emotions broke internal rules on how it conducts research, promising an investigation into the matter.
The jury is still out on this one. Stay tuned for any further developments and updates.