In recent months, we have seen concerted attempts to rein in digital and social media from all along the political spectrum – from Washington, in the states and even internationally.  We thought that we would look at some of those efforts and their motivations today, and we will examine many of these issues in more detail in future articles.

Towards the end of last year, the Trump Administration sought to strip social media platforms of Section 230 protections because of their alleged bias against conservative speakers (see our articles here and here).  A similar perception seems to underlie recently proposed Florida legislation that would create for social media a policy similar to the equal opportunities (or “equal time”) policy that applies to broadcasters – a social media service could not “de-platform” a political candidate if it allows the opposing candidate access to that platform.  The sponsors of that legislation have also announced goals of requiring clear rules for access to, and editing of, political views on such sites.  A press release about the legislation is here, though the actual text does not yet seem to be available for review.

On the other end of the political spectrum, there seems to be a concern that platforms are not doing enough to police the content posted on their services, and legislators are looking for ways to ensure that the platforms combat disinformation.  The House Energy and Commerce Committee held hearings last week on combatting disinformation, principally on traditional media platforms, with additional hearings next week looking at the same issues for social media.  We expect more review of these issues in the near future.

Section 230 is also under review in other contexts.  The SAFE TECH Act introduced in the Senate earlier this year would, among other things, remove Section 230 protections from content that a platform is paid to transmit, make the protections less automatic, and exempt many types of speech from the protections the statute affords – plus it would create a regime where an online service would have to take down offending content once it receives notice or face the loss of any protections.

Internationally, Australia has sought to create what would in effect be a statutory license for online platforms’ carriage of news from traditional media sources (or at least those sources registered with the government).  To carry such content (or even just to provide links to that content), online platforms would have to pay royalties to the media companies that created it.  If an online platform and a content creator could not agree on the amount of the royalty, a government panel would intervene and set one (seemingly akin to the rate-setting function of our Copyright Royalty Board for music licenses).  The idea is to compensate creators for the use of their content and thereby provide a more stable economic base for local news operations, replacing the revenue they have lost to the online services in recent years.  Of course, the online platforms argue that in many ways they are driving traffic to the online services offered by these very same publishers who now want compensation.  It is a debate certain to be repeated around the world in the coming years.

But these high-profile attempts to regulate online platforms are not the only areas where regulation is being sought to make the treatment of online platforms more like that which applies to the traditional media with which they compete.  We wrote yesterday about attempts to bring emergency information to streaming services through the expansion of EAS, which now applies to broadcast, cable and wireless companies.  In the past, the FCC has extended captioning requirements to online platforms that repurpose broadcast video programming so that the hearing impaired can enjoy that content online.

Recent articles suggest that some in Congress are looking to the FCC to intervene in the offerings of virtual MVPDs – cable-like services offered by some online platforms that stream local television stations to local communities without the need for a cable subscription at all.  But, unlike the cable and satellite television services, there is now no regulation over which television stations these virtual MVPDs must carry (see our article here about a past review of whether these online services should receive regulatory treatment more like that applicable to cable and satellite services).

Like the debates over how large the reach of a broadcast company can be (see our article here about the recent Supreme Court argument on one attempt to allow broadcasters to get bigger so that they can compete with online platforms), there is a similar concern about the size of the online platforms.  Senator Klobuchar has already introduced legislation that would increase scrutiny of new mergers and provide for review of ones that have already occurred – with a particular emphasis on tech mergers.  Similar concerns have been expressed by many others across the political spectrum.

Political speech and advertising are implicated as well.  We wrote years ago about an FEC review of the sponsorship identification requirements for political ads delivered through online platforms, a proceeding that has yet to be resolved.  Many states have stepped into the breach, issuing their own laws regulating political advertising.  While some, like the Maryland law we wrote about here, have been found unconstitutional, similar laws are being enforced in other states against political ads carried on all media platforms, including online ones.

Privacy on the Internet is another issue that is likely to factor into debates over the future of tech companies.  As ads for the same products follow us from website to website, we wonder just how much these companies know about us (and how much they should be allowed to know).  Even online copyright issues are being reviewed (see our article here on the review of many aspects of online copyright, including the Digital Millennium Copyright Act’s Section 512 protections for platforms that host user-generated content and the perceived “whack-a-mole” problem with the current take-down notice requirements).

In the coming months, we will be looking at many of these debates and the issues that they raise.  The First Amendment, of course, is a common issue in many of these situations.  Our constitutional protections for free speech present a unique hurdle in the US that any regulation must overcome, a burden that does not exist to the same degree in other countries facing many of these same issues.  The First Amendment led to the demise of the Maryland bill that attempted to regulate online political advertising and will likely forestall many other attempts at regulation.  Economic and policy issues also come into play.  For instance, with respect to Section 230 protections, there are questions as to whether online platforms, like the social media services that most of us check routinely throughout the day, would continue to provide the services that they do without protections from liability for the content posted by their millions of users.  Practical issues also must be weighed.  Do online platforms have the technical capacity to do everything that critics and regulators ask of them (a question that is at the heart of the EAS issues we wrote about yesterday)?  Watch as we tackle many of these issues in coming articles.