Understanding the Regulatory Threats Facebook Faces

The rise of Facebook over its 17 years of existence has been one of the most exceptional cases of business growth in history. In retrospect, it almost seems obvious that Facebook would achieve such dominance. Yet its rise was never actually predestined: in 2012, the year Facebook went public, its market cap fell to a now astonishingly low ~$46bn (vs. $850bn today, as of 4/2/21; source: Capital IQ). From its launch as an exclusive, college-students-only social directory, through its early monetization period in the late 2000s, to the dominant advertising juggernaut it is today, Facebook has adapted and evolved. Its initial challenge was competition from rival platforms, including social networks such as Myspace as well as other communication services such as iMessage and Twitter. After largely burying its direct competition, it took on the print and analog media world and ultimately came out on top of that battle as well: digital advertising, over half of which goes to Google and Facebook in the US, now makes up over half of advertising spend worldwide. Its largest threat today comes not from external competition but from the myriad regulatory threats it faces.

Facebook today encompasses four distinct brands, which consistently rank as four of the top five apps downloaded in the US:

- Facebook Blue: including both facebook.com and the Messenger app, which Facebook has expanded from the original social-networking and photo-sharing site into a comprehensive social media platform, local marketplace, and messaging service

· Largely built organically over the years

· ~2.8bn monthly active users (“MAUs”)

- Instagram: social media platform for sharing photos and videos

· Acquired for $1bn in 2012

· ~1.0bn MAUs

- WhatsApp: a messaging and calling system with rich features and a focus on privacy, including end-to-end encryption

· Acquired for $16bn in 2014

· ~2.0bn MAUs

- Oculus: a virtual reality headset system

· Acquired for $2bn in 2014

· Oculus sold approximately 1.7mm headsets in 2019, up from 0.9mm in 2018

Misinformation Regulation

Punctuated by the Cambridge Analytica scandal, which broke in 2018, Facebook has come under mounting public pressure to do more to control the spread of misinformation, particularly misinformation of a political nature. Largely in response, Facebook has hired 15,000 content moderators to flag “fake news” and other forms of misinformation, though some observers argue it would need an additional 15,000 to moderate content effectively. Various countries have rolled out significant misinformation regulation. The most significant potential regulatory change, which I expand on further below, is the potential repeal or amendment of Section 230.

Consumer Data Privacy

Across Facebook and Instagram (and to a much lesser extent WhatsApp), Facebook draws significant value from the data it collects on its users. This data includes demographic data (age, sex, etc.), behavioral data (how long a user views content, frequency of posting, etc.), as well as various aggregate data and metadata. The value of the data manifests primarily in two ways: it improves the various forms of content recommendation, including Facebook’s newsfeed and Instagram’s feed, and it improves the ability of advertisers (who are Facebook’s actual customers) to target users with advertising. In recent years, a number of jurisdictions have rolled out consumer data privacy frameworks that fundamentally limit the ability of companies to collect and store data. The most comprehensive framework to date has been the European Union’s GDPR, which puts onerous liability on companies that fail to protect the privacy of consumers. Interestingly, GDPR may in fact have benefited Facebook on a net basis: because its walled-garden approach to customer data means it monetizes data via ad targeting rather than data reselling (the model of many media companies), it was hurt less than its competitors. Other jurisdictions have rolled out similar regulations, including California, whose California Consumer Privacy Act went into effect in January 2020. Further privacy regulation may lead to higher compliance costs for Facebook or potential degradation of Facebook’s data assets.

Anti-Trust Threat

As mentioned earlier, Facebook has grown both organically, via its Facebook Blue apps, and inorganically, via its acquisitions of Instagram, WhatsApp, and Oculus. Facebook faces two distinct threats on the anti-trust front: the possibility of being split up (which I handicap as a low-probability yet high-impact event) and the possibility of no longer being allowed to make WhatsApp / Instagram type acquisitions (which I handicap as nearly certain). As of April 2021, an antitrust investigation is ongoing, with the FTC and the attorneys general of 46 states collaborating on a suit. Perhaps most damning in the investigation are emails drafted by Mark Zuckerberg stating that “Instagram could hurt us”, indicating that part of the business rationale for the acquisition back in 2012 was to limit competition, a clear violation of anti-trust regulation.

Should Facebook be split up, the impact would be severe for Facebook but not necessarily entirely negative. Based on my analysis, including the current revenue split between Instagram and Facebook Blue (Instagram accounts for ~30% of Facebook’s revenue) as well as the potential monetization models for WhatsApp, I believe that as standalone businesses Instagram and WhatsApp could now be worth in the realm of $300bn and $400bn, respectively. Given Facebook’s current market cap, the business may in fact be worth more apart than together. To date, Facebook’s ownership of WhatsApp has enabled WhatsApp to focus on growing its user base rather than revenue (and growing it in a direction that minimizes cannibalization of Facebook Blue and Instagram). Absent Facebook’s subsidies, WhatsApp would have to focus on monetization to pay its employees and server costs. At just $10/user/year (vs. ~$20 for Facebook Blue and ~$7 for WeChat, a Chinese analog), WhatsApp has $20bn+ of potential annual revenue. There is also historical precedent for antitrust breakups being accretive for the targets: both the heirs of Standard Oil, which today include ExxonMobil and Chevron, and the heirs of the Bell System (aka “Ma Bell”), which include AT&T and Verizon, went on to achieve significant success.
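To make the arithmetic behind these estimates explicit, the sketch below works through the sum-of-the-parts and ARPU math using only the figures cited in this section (the ~$850bn market cap, the $300bn and $400bn standalone estimates, ~2.0bn WhatsApp MAUs, and the assumed $10/user/year); it is an illustration of the reasoning, not reported data:

```python
# Back-of-envelope sketch using this section's own estimates (not reported figures).

facebook_market_cap_bn = 850   # total market cap, $bn (as of 4/2/21)
instagram_value_bn = 300       # estimated standalone value, $bn
whatsapp_value_bn = 400        # estimated standalone value, $bn

# Value left over for Facebook Blue if the whole were worth only the sum of its parts
implied_blue_value_bn = facebook_market_cap_bn - instagram_value_bn - whatsapp_value_bn
print(f"Implied Facebook Blue value: ${implied_blue_value_bn}bn")  # ~$150bn

# WhatsApp monetization potential at a modest assumed ARPU
whatsapp_maus_bn = 2.0   # monthly active users, billions
assumed_arpu = 10        # $/user/year (vs. ~$20 for Facebook Blue, ~$7 for WeChat)
potential_whatsapp_revenue_bn = whatsapp_maus_bn * assumed_arpu
print(f"WhatsApp potential revenue: ${potential_whatsapp_revenue_bn:.0f}bn per year")  # ~$20bn
```

Since Facebook Blue still generates roughly 70% of the company’s revenue, an implied standalone value of only ~$150bn for it looks hard to justify, which is the sense in which the parts may be worth more than the whole.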

Anticompetitive Behavior

Related to but somewhat distinct from the anti-trust regulatory threat are allegations that Facebook engages in other forms of anticompetitive behavior. These business practices include its policy of restricting third-party developers from accessing Facebook’s APIs unless they agree to work on an exclusive and non-competitive basis, as well as the various ways in which it limits the interoperability of Facebook’s services with third parties. Potential regulation includes ending the former business practice as well as the federal government mandating an open-source communication protocol for content sharing and messaging between social media services.

Section 230 Repeal or Amendment

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” — Section 230 of the Communications Decency Act.

Now known as the “twenty-six words that created the Internet”, Section 230 effectively created immunity for telecom and internet businesses for the third-party content they serve. Due to the misinformation issues mentioned earlier, as well as the argument that social media companies unfairly benefit relative to their non-social-media competitors, there has been talk at the highest levels of either repealing Section 230 or amending it significantly. Should that happen, Facebook would have to drastically increase its content moderation efforts and potentially even begin limiting various features of user-generated content, including low-latency posting and the ability to cross-post content.

Social Media Addiction and Mental Health Regulatory Threat

Facebook also faces risks stemming from the various mental health effects associated with its products. These effects include diminished attention span (per a 2015 Microsoft study, the average human attention span declined from 12 seconds in 2000 to 8 seconds in 2015), anxiety and depression, and social media addiction. In the US, Senator Josh Hawley has proposed the Social Media Addiction Reduction Technology Act to limit various forms of addictive behavior encouraged by social media sites.

Conclusion

While Facebook has been vilified in recent years, it is important to remember the good it has also done. It was not too long ago that the only way to share photos was to have film duplicated and mailed, a process that was expensive, inefficient, and slow. Just 13 years ago, in 2008, the national carriers in the U.S. charged $0.20 per text. Free messaging is now ubiquitous and a given for any service. These data points are important for both voters and regulators to remember as they envision how Facebook should operate in the future. I believe Facebook’s recent strategy shift from a “town hall” model to one that enables private, secure communication and content sharing demonstrates its leadership’s ability to evolve, not only in ways that immediately increase users or revenue but also in ways that are conscious of the company’s effects on the world. Therefore, while certain specific regulation may be prudent, any form of hostile or otherwise punitive regulation would be ill-advised.