Political Broadcasting

Even with the holidays upon us, regulation never stops.  There are numerous regulatory dates in December that broadcasters need to keep in mind.  Furthermore, as the 2024 presidential campaign is already underway, there are political advertising deadlines to watch out for.  Here are some of the upcoming deadlines:

December 1 is the filing deadline for Biennial Ownership Reports by all licensees of commercial and noncommercial full-power TV/AM/FM stations, Class A TV stations, and LPTV stations.  The reports must reflect station ownership as of October 1, 2023 (see our article here on the FCC’s recent reminder about these reports).  The FCC has been pushing for stations to fill these out completely and accurately by the deadline (see this reminder issued by the FCC last week), as the Commission uses these reports to get a snapshot of who owns and controls what broadcast stations, including information about the race and gender of station owners and their other broadcast interests (see our article from 2021 about the importance the FCC attaches to these filings). Continue Reading December Regulatory Dates for Broadcasters – Biennial Ownership Reports, Annual EEO Public File Reports, LPFM Filing Window, LUC Political Windows for 2024 Election, and More

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • The FCC has until December 27th to comply with a court order requiring the agency to conclude its still-pending

Facebook parent Meta announced this week that it will require labeling of ads about elections and political and social issues that use artificial intelligence or other digital tools.  Earlier this week, we wrote about the issues that AI in political ads poses for media companies and about some of the governmental regulations that are being considered (and the limited rules that have thus far been adopted).  These concerns are prompting all media companies to consider how AI will affect them in the coming election, and Meta’s announcement shows how those considerations are being translated into policy.

The Meta announcement sets out the situations in which labeling of digitally altered content will be required.  Disclosure of the digital alteration will be required when digital tools have been used to:

  • Depict a real person as saying or doing something they did not say or do; or
  • Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
  • Depict a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.

The Meta announcement makes clear that using AI or other digital tools to make inconsequential changes that don’t affect the message of the ad (Meta gives the examples of adjusting size, cropping an image, color correction, and image sharpening) will be permitted without disclosure.  But even these changes can trigger disclosure obligations if they are in fact consequential to the message.  In the past, we’ve seen allegations of attack ads using shading or other seemingly minor changes to depict candidates in ways that make them appear more sinister or that otherwise convey a negative message – presumably the kinds of uses that Meta is seeking to address.

This change will apply not just to US elections, but worldwide.  Already, TV pundits asked about the effect of the new policy have suggested that what really matters is what other platforms, including television and cable, do to match this commitment.  So we thought that we would look at the regulatory schemes that, in some ways, limit what traditional electronic media providers can do in censoring political ads.  As detailed below, broadcasters, local cable companies, and direct broadcast satellite television providers are subject to statutory limits under Section 315 of the Communications Act that forbid them from “censoring” the content of candidate advertising.  Section 315 essentially requires that candidate ads (whether from a federal, state, or local candidate) be run as they are delivered to the station – they cannot be rejected based on their content.  The only exception thus far recognized by the FCC has been for ads whose content violates federal criminal law.  There is thus a real question as to whether a broadcaster or cable company could impose a labeling requirement on candidate ads, given their inability to reject a candidate ad based on its content.  Note, however, that the no-censorship requirement applies only to candidate ads, not to those purchased by PACs, political parties, and other non-candidate individuals or groups.  So policies like the one adopted by Meta could be considered for these non-candidate ads even by these traditional platforms. Continue Reading Meta to Require Labeling of Digitally Altered Political Ads (Including Those Generated By AI) – Looking at the Rules that Apply to Various Media Platforms Limiting Such Policies on Broadcast and Cable

In the Washington Post last weekend, an op-ed suggested that political candidates should voluntarily renounce the use of artificial intelligence in their campaigns.  The article seemed to be looking for candidates to take the actions that governments have thus far largely declined to mandate.  As we wrote back in July, despite calls from some for federal regulation of the use of AI-generated content in political ads, little movement in that direction has occurred.

As we noted in July, a bill was introduced in both the Senate and the House of Representatives to require disclaimers on all political ads using images or video generated by artificial intelligence, disclosing that they were artificially generated (see press release here), but there has been little action on that legislation.  The Federal Election Commission released a “Notice of Availability” in August (see our article here) asking for public comment on whether it should start a rulemaking to determine if the use of deepfakes and other synthetic media imitating a candidate violates FEC rules that forbid a candidate or committee from fraudulently misrepresenting that they are “speaking or writing or otherwise acting for or on behalf of any other candidate or political party or employee or agent thereof on a matter which is damaging to such other candidate or political party or employee or agent thereof.”  Comments were filed last month (available here), and several (including those of the Republican National Committee) question the authority of the FEC to adopt any rules in this area, both as a matter of statutory authority and under the First Amendment.  Such comments do not bode well for voluntary limits by candidates, nor for action from an FEC that by law has 3 Republican and 3 Democratic commissioners. Continue Reading Artificial Intelligence in Political Ads – Media Companies Beware

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • The FCC’s Enforcement Bureau released its second EEO audit notice for 2023, which targets 150 radio and television stations for

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • The FCC’s Media Bureau released a Public Notice reminding commercial and noncommercial broadcasters of their upcoming obligation to file biennial

With the 2024 election looming, broadcasters are already receiving requests for political advertising time from PACs and other issue groups, and from both established candidates and newcomers eager to make an early splash to enhance their public standing.  Some of these potential buyers advance unique policy positions and, sometimes, employ unusual ad buying strategies.  How should broadcasters deal with these early political ad buyers?

Each broadcaster needs to discuss the issues that arise with these early political ads, both internally with its business team and with its outside FCC counsel or in-house legal advisor.  The first question to ask is whether a station even wants to run these ads.  Stations are not required to run ads from non-candidate buyers but, if they do, those ads will likely impose some political file obligations to the extent that they discuss candidates, potential candidates, or electoral and political issues (for more on political file issues, see our articles here, here, and here, and this video discussion that I did for the Indiana Broadcasters Association). Continue Reading Broadcaster’s Legal Considerations for Early Season Political Ads

The Senate this week approved Anna Gomez for the open Democratic FCC seat that has been vacant since the start of the Biden Administration.  As we wrote in May when the President first nominated her, Gomez is experienced in government circles, having worked at NTIA (a Department of Commerce agency dealing with federal spectrum use and other communications matters) and recently at the State Department preparing for international meetings about communications issues.  She also has a history in private law firm practice. 

Together with her nomination, the President renominated Commissioners Starks and Carr for new terms, but those nominations remain pending – they were not approved this week with the Gomez nomination.  Democratic Commissioner Starks’s term has already expired, but he continues to serve under the allowable one-year carry-over, which ends at the beginning of January 2024.  Republican Commissioner Carr’s term will expire at the end of this year, but he would be able to serve through the end of 2024 if his renomination is not confirmed.  There is some speculation that these nominations will be packaged with other pending nominations for positions at other government agencies to avoid having the FCC return to a partisan stalemate in January if Starks’s renomination is not approved by then. Continue Reading And Then There Were Five – Senate Approves Anna Gomez as Fifth FCC Commissioner – What Broadcast Issues Could a Full FCC Consider?

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • In the last two license renewal cycles, more fines have been issued for full-power stations violating the requirement that they

The Federal Election Commission last week voted to open for public comment the question of whether to start a rulemaking proceeding to declare that the use of “deepfakes” or other AI technology to generate false images of a candidate doing or saying something, without a disclosure that the image, audio, or video was generated by artificial intelligence and portrays fictitious statements and actions, violates the FEC’s rules.  The FEC rule that is allegedly being violated is one that prohibits a candidate or committee from fraudulently misrepresenting that they are “speaking or writing or otherwise acting for or on behalf of any other candidate or political party or employee or agent thereof on a matter which is damaging to such other candidate or political party or employee or agent thereof.”  In other words, the FEC rule prohibits one candidate or committee from falsely issuing statements in the name of an opposing candidate or committee.  The FEC approved the Draft Notice of Availability to initiate the request for public comment on a second rulemaking petition filed by the group Public Citizen asking for this policy to be adopted.  This Notice of Availability was published in the Federal Register today, initiating the comment period.  The deadline for comments is October 16, 2023.  This is just a preliminary request for comments on the merits of the Public Citizen petition and on whether the FEC should move forward with a more formal proceeding.

As we wrote in an article a few weeks ago, the FEC had a very similar Notice of Availability before it last month but took no action, apparently because of concerns that the FEC does not have statutory authority to regulate deliberately deceptive AI-produced content in campaign ads.  Public Citizen’s second petition apparently addressed that concern.  The Notice published in the Federal Register today at least starts the process, although it may be some time before any formal rules are adopted.  As we noted in our article, a few states have already taken action to require disclosures about AI content used in political ads, particularly in state and local elections.  Thus far, there is no similar federal requirement. Continue Reading FEC Asks for Public Comment on Petition for Rulemaking on the Use of Artificial Intelligence in Political Ads