United States Securities and Exchange Commission

Washington, DC 20549

 

NOTICE OF EXEMPT SOLICITATION

 

NAME OF REGISTRANT: Meta Platforms Inc. (META)

 

NAME OF PERSONS RELYING ON EXEMPTION: Proxy Impact

 

ADDRESS OF PERSON RELYING ON EXEMPTION: 5011 Esmond Ave, Richmond, CA 94805

 

 

WRITTEN MATERIALS: The attached written materials are submitted pursuant to Rule 14a-6(g)(1) (the “Rule”) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

________________________________________________________________________

Meta Platforms Inc. (META)

 

Proposal #11 – Report on Child Safety Impacts and Actual Harm Reduction to Children

 

Annual Meeting May 29, 2024

Contact: Michael Passoff, CEO, Proxy Impact, michael@proxyimpact.com

 

A Message from Lead Filer Lisette Cooper, PhD, Vice Chair, Fiduciary Trust International1

 

Dear fellow Meta shareholders,

 

Technological innovation can be like fire: capable of being used for good as well as for harm. Meta’s social media platforms can be used for good, but unfortunately they have also been shown to cause great harm to the children and teenagers who use them.

Meta’s social media platforms have been linked to many dangers to the mental and physical wellbeing of children and teenagers. These dangers include cyberbullying, harassment, and exposure to sexual or violent content, which can lead to depression, anxiety, self-harm, or a distorted self-image. Children and teenagers using Meta’s platforms, including Facebook, Instagram, WhatsApp, and Messenger, can also be groomed and fall prey to sextortion and human trafficking.

 

According to the National Center for Missing and Exploited Children (NCMEC), the vast majority of reported instances of online exploitation of children take place on Meta platforms. The Center has announced that its CyberTipline received nearly 36 million reports of child sexual abuse material, child sex trafficking, and online enticement in 2023—and almost 31 million of them came from Meta’s platforms.2 This represents an increase of 93% from Meta’s nearly 16 million reports in 2019, when shareholders first raised this issue with the company.3

 

In fact, Meta’s algorithms for Instagram guide pedophiles to sellers of child sexual abuse material, essentially “connecting a vast pedophile network,” according to a Wall Street Journal investigation published in June 2023.4

 

Meta’s internal documents show that it rejected efforts to improve child safety, even as an estimated 100,000 children were sexually harassed on its platforms every day.5 6 NCMEC reports that AI-generated child sexual abuse material is exploding, which raises additional concerns given Meta’s commitment to developing AI products.7

 

_____________________________

1 This statement represents the views of Dr. Cooper, and not those of the Franklin Templeton organization.

2 https://www.missingkids.org/content/dam/missingkids/pdfs/2023-reports-by-esp.pdf

3 https://www.missingkids.org/content/dam/missingkids/pdfs/2019-reports-by-esp.pdf

4 https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189

5 https://www.nytimes.com/2024/01/31/technology/meta-childrens-safety-documents.html

6 https://www.wsj.com/tech/children-on-instagram-and-facebook-were-frequent-targets-of-sexual-harassment-state-says-68401b07

7 https://www.theguardian.com/technology/2024/apr/16/child-sexual-abuse-content-online-ai?ref=morningtech.it

 

In light of these revelations, Meta could be doing far more to work with law enforcement and child safety advocates to mitigate these threats. Unfortunately, against the advice of both groups, Meta has chosen to pursue end-to-end encryption of Facebook Messenger, despite warnings that doing so will mask the actions of predators by hiding millions of records of child sexual abuse material. Privacy is important, but end-to-end encryption of Messenger should have been delayed until more progress on the protection of children had been demonstrated.

 

Shareholders Can Play a Big Role

I have spent many years of my career in asset management helping investors grow and protect their wealth by aligning their money with their values. Meta shareholders can make a huge difference by convincing company leadership to strengthen protections for children and teenagers on Meta’s social media platforms. They can also demand that management do more to cooperate with law enforcement agencies and child safety organizations to make social media a less dangerous place for our nation’s most vulnerable.

 

Meta has a great deal to lose, financially, if it continues to ignore the dangers posed by its social media platforms. For example, following the U.S. Surgeon General’s Advisory on social media and youth mental health, 42 U.S. state attorneys general filed lawsuits against Meta, citing the Advisory’s findings that Facebook and Instagram algorithms are designed to make the platforms addictive, and that they are harmful to the mental health of children and teens.

 

In September 2022, Meta was fined just over $400 million by Ireland’s Data Protection Commission for failing to properly safeguard children’s information on Instagram. And the potential financial penalties will only increase thanks to new legislation in the U.S., the U.K., and the European Union. The U.K. Online Safety Act of 2023 incorporates measures to keep online users, especially children, safe from harmful and fraudulent content. Meanwhile, the E.U.’s Digital Services Act (DSA) and Digital Markets Act, which went into effect in February 2024, require Meta and other companies to identify, report, and remove child sexual abuse material from their platforms. On May 16, 2024, the E.U. opened a formal investigation into whether Meta has breached the DSA’s child safety provisions, which could result in fines of up to $8 billion.8

 

In the United States, the REPORT Act was signed into law by President Biden earlier this month. This new law will enable NCMEC’s national tipline to collect more reports of online exploitation, and will require those reports, along with accompanying evidence, to be preserved longer, giving law enforcement more time to investigate and prosecute.

 

_____________________________

8 https://www.independent.co.uk/tech/facebook-instagram-meta-eu-investigation-b2546150.html

 

New legislation around the world will only increase the legal and financial penalties Meta faces if it does not take tangible steps to improve child safety on its platforms.

 

Our Resolution

In collaboration with Proxy Impact,9 and with the support of other Meta shareholders, I have filed a resolution to be voted on at Meta’s annual meeting in Menlo Park, Calif., on May 29, 2024. The resolution requests that:

 

“within one year, the Board of Directors adopts targets and publishes annually a report (prepared at reasonable expense, excluding proprietary information) that includes quantitative metrics appropriate to assessing whether Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.”

 

Sadly, my family and I are all too aware of how dangerous social media can be. My daughter was groomed by a man who misrepresented himself on Facebook Messenger and later sold her into sexual slavery. She is now a recovering survivor and a leader in the movement to end human trafficking and online sexual exploitation. She has spoken at two previous Meta annual meetings, including the meeting on May 26, 2021.10

 

Building Safety Into Social Media Platforms By Design

While parents do have a role to play in protecting their children online, predators are clever and resourceful, and so are their young targets, who often find ways around parental controls. The reality is that, to truly tackle the issue, tech companies like Meta need to build safety into their social media platforms by design. Short-term, reactive safety measures will not make a significant difference over the long term.

 

If Meta does not build safety into its social media platforms proactively, by choice, then legislation will require it to do so in ways that may not be best for shareholders. In addition, the longer Meta drags its heels, the more the risk to its reputation will grow, further threatening its long-term sustainability and profitability. A consistent effort to keep kids safe from exploitation now will also help ensure that they and their parents remain customers and users well into the future.

 

_____________________________

9 https://www.proxyimpact.com/facebook

10 https://www.proxyimpact.com/_files/ugd/b07274_137f9b5a0d4a454b88cad70a5394c20c.pdf

 

We, the filers of this resolution, recognize and understand the need to protect privacy, and we do not dispute that end-to-end encryption will be a necessary step on this front. All we ask is that, in the immediate term, Meta focus on protecting our children before protecting user privacy, and delay end-to-end encryption until more has been done to remove the dangers lurking on its platforms.

 

As the world has become aware of how the algorithms and features of Meta’s social media platforms are being abused in the worst ways, many people and institutions are mobilizing to end the abuse and protect children and teenagers. I urge Meta shareholders to make their voices heard and unite behind our resolution to hold Meta management accountable. Shareholders now have the opportunity to protect young people online while also protecting their investments.

 

When shareholders align their values with their investments, big corporations listen.

 

 

# # # # # #

 

 

A Message from Michael Passoff, CEO, Proxy Impact

 

Both ISS and Glass Lewis recommend voting FOR this resolution.

· ISS

“the company continues to experience controversies related to child safety on its platforms. There are also concerns that the company's plans to apply end-to-end encryption by default across its messaging platforms will severely hinder investigations of child predators. There is rising concern about the impacts of social media on teen mental health. Given the potential financial and reputational impacts of potential controversies related to child safety on the company's platforms, shareholders would benefit from additional information on how the company is managing the risks related to child safety. Therefore, this proposal merits shareholder support.”

 

· Glass Lewis

“We do not believe that the Company has provided sufficient disclosure to demonstrate to shareholders that these risks will be managed by the Company, nor do we have any reason to be assured that the Company will act proactively rather than reactively, as demonstrated by numerous controversies related to the distribution of high-risk content on its platform and messaging services. As such, we believe that the requested report and the adoption and reporting of targets will provide shareholders with valuable information, so they can better understand this sensitive issue in the context of the Company's efforts to minimize harmful content on its platforms. Moreover, we believe that this precatory proposal provides the Company sufficient latitude in implementation. Accordingly, we believe that adoption of this proposal could serve to mitigate regulatory and reputational risk and provide shareholders with sufficient context to understand how the Company is managing this issue.”

 

A Majority of Shareholders Supported This Resolution in 2023

A similar resolution in 2023 received the support of 817 million shares, valued at $216 billion based on the closing stock price on the day of the annual meeting. This represented 16.27% of the total vote, but a majority (53.8%) of the non-management-controlled vote. CEO Mark Zuckerberg owns approximately 16% of Meta’s stock, but his shares carry 10 votes each, while regular shareholders get one vote per share. As a result, he controls over 60% of the vote, which masks the true level of shareholder concern on this issue.

 

Meta’s Unfulfilled Opposition Statements

Shareholders filed the first child safety resolution in 2020, and Meta’s opposition statements have barely changed since then: they primarily restate existing policies, efforts to expand parental controls and user education, and hoped-for improvements in age verification, content moderation, and content removal. Unfortunately, according to survivors, health and child safety organizations, regulatory agencies, and media investigations that report on these issues, the pace and scale of these efforts have failed to stem the dramatic growth in negative impacts on youth.

 

Even the opposition statements’ highly touted achievements, such as “In August 2023 alone, we disabled more than 500,000 accounts for violating our child sexual exploitation policies,” have come in response to media stories illuminating these problems.

 

If Meta has made any significant, measurable improvements in reducing child safety risks, it is keeping that information to itself. Hence the resolution’s call for an annual report focused on quantitative metrics, so that shareholders can assess whether Meta is actually taking comprehensive and effective steps to reduce child endangerment on its platforms.

 

 

We ask that you vote FOR Proposal #11 – Report on Child Safety Impacts and Actual Harm Reduction to Children

 
