Facebook’s parent company Meta has announced it will remove sensitive ad targeting options related to health, race or ethnicity, political affiliation, religion or sexual orientation from January 19.
Currently, advertisers can target people who have expressed an interest in issues, public figures or organizations related to these topics. This information comes from tracking user activity on Facebook, Instagram and other company-owned platforms.
For example, someone who has shown an interest in “same-sex marriage” may see an ad from a non-profit organization supporting same-sex marriage. But such categories can also be misused, and Meta, formerly Facebook, has come under intense pressure from regulators and the public to rid its platforms of abuse and misinformation.
Meta Platforms Inc. said in a blog post on Tuesday that the decision was “not easy and we know this change may have a negative impact on some businesses and organizations.” Shares of the company closed at $335.37 on Tuesday, down nearly 1%.
“Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them,” wrote Graham Mudd, vice president of marketing and advertising. “Like many of our decisions, this was not a simple choice and required balancing competing interests where there was advocacy on both sides.”
The Menlo Park, Calif., company, which took in $86 billion in revenue last year thanks largely to its granular ad targeting options, has had many issues with the way it delivers advertisements to its billions of users.
In 2019, Facebook announced that it would revise its ad targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement. The social network said at the time that it would no longer allow housing, job or credit ads targeting people by age, gender or zip code. It also limited other targeting options so that these ads could not exclude people on the basis of race, ethnicity and other categories legally protected in the United States, including national origin and sexual orientation.
It also allowed outside groups that were part of the lawsuit, including the American Civil Liberties Union, to test its advertising systems to make sure they do not allow discrimination. The company agreed to meet with the groups every six months for the next three years, and it is building a tool that lets anyone search for housing listings in the United States targeted to different regions of the country.
After an outcry over its lack of transparency about the political ads it ran ahead of the 2016 election, in stark contrast to the way ads are regulated on traditional media, the company created an ad archive that includes details such as who paid for an ad and when it ran. But the archive does not share information about who receives the ads.
Outside researchers have tried to remedy this. But in August, Facebook disabled the personal accounts of two New York University researchers, shutting down their investigation into disinformation spread through political ads on the social network.
Facebook said at the time that the researchers had violated its terms of service and were involved in unauthorized data collection from its huge network. Academics countered that the company was trying to exert control over research that portrays it in a negative light.
NYU researchers from the Ad Observatory project have spent several years studying Facebook’s Ad Library, a searchable database of the ads served on Facebook products.
That access was used to “uncover systemic flaws in the Facebook Ad Library, to identify misinformation in political ads, including many that sow distrust in our election system, and to study Facebook’s apparent amplification of partisan misinformation,” said Laura Edelson, the senior researcher behind NYU Cybersecurity for Democracy, in response to the shutdown.