Meta Limits Options for Targeted Advertising to Teenagers

Dan Meier, 11 January 2023

Advertisers will no longer be able to use gender to target ads to teenagers on Facebook and Instagram, under Meta’s new guidelines due to come into effect in February.

The social media giant is overhauling its advertising policy for young users, following concerns over online child safety in the US and Europe.

“We recognise that teens aren’t necessarily as equipped as adults to make decisions about how their online data is used for advertising, particularly when it comes to showing them products available to purchase,” the company said in a blog post. “For that reason, we’re further restricting the options advertisers have to reach teens, as well as the information we use to show ads to teens.”

As well as removing gender as a targeting option, the new policy means user activity can no longer be used to tailor ads for teenagers. Their activity on other websites was already off limits to advertisers, and now their engagement with specific Facebook and Instagram posts will no longer inform the types of ads young users see.

Under the new guidelines, age and location will be the only pieces of information about a teen that Meta will use to serve them ads. “Age and location help us continue to ensure teens see ads that are meant for their age and products and services available where they live,” the company explained.

Meta additionally plans to expand young users’ control over the types of ads they see, adding a “See Less” option to their Ad Preferences settings starting in March. “For example, if a teen wants to see fewer ads about a genre of TV show or an upcoming sports season, they should be able to tell us that,” said the blog post. Users will also be able to hide any or all ads from a specific advertiser.

The child safety issue has recently flared up on both sides of the Atlantic as regulators work to mitigate the risks of early social media adoption. In December, Meta held a “Youth Safety and Well-Being” summit to establish principles for providing age-appropriate online experiences, incorporating guidance from child development experts, UN children’s rights principles and global regulation.

“I think everybody has a role,” said Nick Clegg, President of Global Affairs at Meta. “Social media companies have a role, families have a role, parents have a role, governments have a role, regulators have a role. This is a space where I think it is totally legitimate and normal for regulators to act.”

Protect and serve ads

Meanwhile in the UK, the Online Safety Bill is expected to pass this year after delays and amendments by government ministers in an ongoing row over free speech. The legislation would hand social media firms legal responsibility for content that harms young people, and extend Ofcom’s role to that of online safety regulator. With its new duties, the authority will be able to fine social media companies up to 10 percent of their revenues for failing to protect children’s online safety.

This week the watchdog called for evidence “on risks of harm to children online”, to help protect children from harmful content and prevent access to pornography. “We would like to hear from those with an expertise in protecting children online, and providers of online services,” the regulator said.

While these efforts have been welcomed by child advocacy groups, the biggest challenge remains the prevalence of accounts registered with a false date of birth. Last month the UK’s Advertising Standards Authority (ASA) found that over 17 percent of children are registered on social media platforms with their age given as 18 or over, increasing their risk of being served age-inappropriate advertising by almost 20 percent.

Although Meta’s Advertising Standards prohibit advertising alcohol, gambling and weight-loss products to teenagers, the company is responsible for the bulk of ads served to young people on social media. Last year the company began testing age verification methods on Instagram, but the reliability and privacy of these techniques are uncertain, as VideoWeek previously explored.

Meta has called for an industry body to address these challenges “in the absence of consistent and clear regulation,” a rare plea for stronger legislation by a company regularly hit by regulatory fines.

That could explain why the big tech firm is getting its house in order as new powers emerge this year. Ofcom has already threatened adult sites with legal action over shortcomings in their age assurance protocols, this week launching an enforcement programme within its current jurisdiction, which covers video-sharing platforms in the UK. As the watchdog’s online safety powers come into effect, Meta’s age verification measures will quickly need to mature.



About the Author:

Reporter at VideoWeek.