
Facebook targets ads based on your political views


Facebook claims to know your political biases — and it's generating ads to exploit them.

The social network rolled out a new "ad preferences" page earlier this month, allowing users to understand more about how the company markets to them based on interests like "video games," "Airbnb" and the curiously labeled "human skin color." As The New York Times reported Tuesday, those interests also include political leanings and how likely a user is "to engage in politics."
If you want to see where you fall according to Facebook, log into the social network on a computer, visit the ad preferences page, then click "Lifestyle and Culture." Your political views should be included there, depending on how much Facebook knows about you. 
This shouldn't come as a surprise, though the presentation may be jarring to some. Facebook has allowed users to self-identify as politically liberal, conservative, or something else entirely on their profiles for years. And the bargain we make to use "free" online services is practically an internet cliche at this point: Companies like Facebook and Google trade on the information we willingly provide them in exchange for use of their products.
There's nothing nefarious about the ad preferences page, in other words, especially given that you can "remove" anything that you're uncomfortable with.



Still, the function raises some questions that aren't easy to answer. "Media bias" has become a major topic of discussion this election season, for example: Does having an easy-to-access measurement of political leanings put political reporters in a compromised position? If a reporter is accused of being liberal, could they tweet a screenshot of their Facebook ad preferences showing they're moderate? If a user chooses not to display their political beliefs on their profile, should Facebook be allowed to serve them ads based on what it's learned of those beliefs?


And months after the social network landed in hot water over a supposed anti-conservative bias in its "Trending Topics" module, how does Facebook decide what's "conservative" and what's "liberal" (or "very liberal")?
Facebook says in its FAQ for ad preferences that it targets marketing partially based on "activity on Facebook apps and services." According to the social network, that "includes things like" what pages you like, "information from your Facebook and Instagram profile" and "places you check in using Facebook." The social network can also follow you around the internet if a website uses the Facebook pixel, which helps businesses build ad campaigns using Facebook's platform.
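For readers unfamiliar with how that pixel works in practice, it's a small script that site owners paste into their pages; once it loads, it reports visits back to Facebook. The sketch below shows the kind of calls a page embedding the pixel typically makes once the script has loaded. The pixel ID and event details are placeholders, not values from any real site.

```typescript
// A minimal sketch of the calls a page embedding the Facebook pixel makes once
// the fbevents.js script has loaded. The pixel ID and event payload here are
// placeholders, not values from any real site.
declare function fbq(command: string, ...args: unknown[]): void;

fbq('init', '1234567890');       // tie this browser to the advertiser's pixel ID
fbq('track', 'PageView');        // report that the visitor loaded the page
fbq('track', 'ViewContent', {    // optionally report what was viewed
  content_name: 'Example article',
});
```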


Facebook determines your political beliefs based on pages or profiles you follow
A spokesperson for Facebook told Mashable that if a user self-identifies their political beliefs, that information will override anything else. If you say you're "liberal," Facebook will supposedly serve you relevant ads — no matter what pages you actually follow. 
But if you don't self-identify, things are a little more complicated. The spokesperson said Facebook determines your political beliefs based on pages or profiles you follow. If you "like" several pages that are generally followed by people who self-identify as "moderate," Facebook will likely determine that you are also moderate. 
In other words, Facebook is not making a judgment call about whether a profile or page leans left or right, according to the spokesperson. You can override Facebook's automatic categorization by self-identifying your political views on your profile, the spokesperson said, and you can also remove the "interest" from your ad preferences.
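To make the spokesperson's description concrete, here is a toy sketch of that inference, with invented page names and follower counts. It illustrates the general idea of borrowing labels from self-identified followers; it is not Facebook's actual model.

```typescript
// A toy sketch (not Facebook's actual code) of the inference described above:
// a user who has not self-identified is assigned the label most common among
// the self-identified followers of the pages they like. Page names and counts
// are invented for illustration.
type Label = 'very liberal' | 'liberal' | 'moderate' | 'conservative' | 'very conservative';

// Hypothetical breakdown of self-identified followers per page.
const followerLabels: Record<string, Partial<Record<Label, number>>> = {
  'Page A': { moderate: 700, liberal: 200, conservative: 100 },
  'Page B': { moderate: 500, conservative: 400, liberal: 100 },
};

function inferLabel(likedPages: string[], selfIdentified?: Label): Label | undefined {
  // Per the spokesperson, self-identification overrides everything else.
  if (selfIdentified) return selfIdentified;

  // Tally labels across the self-identified followers of every liked page.
  const tally = new Map<Label, number>();
  for (const page of likedPages) {
    for (const [label, count] of Object.entries(followerLabels[page] ?? {})) {
      tally.set(label as Label, (tally.get(label as Label) ?? 0) + (count ?? 0));
    }
  }

  // Pick the most common label, if any.
  let best: Label | undefined;
  for (const [label, count] of tally) {
    if (best === undefined || count > (tally.get(best) ?? 0)) best = label;
  }
  return best;
}

// Liking both pages above yields "moderate" under this toy model.
console.log(inferLabel(['Page A', 'Page B']));
```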
Regardless, remember always: Anything you do or share online can follow you around in unexpected ways.
UPDATED Aug. 23 at 3:10 p.m. PT with comment from Facebook.
