
Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • Congress passed, and the President signed, a continuing resolution to extend funding for the Federal government, including the FCC, averting

Expecting quiet weeks, we took the holidays off from providing our weekly summary of regulatory actions of interest to broadcasters.  But during that period there were in fact many regulatory developments.  Here are some of those developments, with links to where you can go to find more information as to how these actions may affect your

Earlier this week, we covered the broadcast issues that the FCC may be facing in 2024.  But the FCC is just one of the many branches of government that regulates the activities of broadcasters.  There are numerous federal agencies, the Courts, Congress, and even state legislatures that all are active in adopting rules, making policies, or issuing decisions that can affect the business of broadcasting and the broader media industry.  What are some of the issues we can expect to see addressed in 2024 by these authorities?

For radio, there are music rights issues galore that will be considered.  Early in the year, the Copyright Royalty Board will be initiating the proceeding to set streaming royalties for webcasters (including broadcasters who stream their programming on the Internet) for 2026-2030.  These proceedings, which occur every five years, are lengthy and include extensive discovery and a trial-like hearing to determine what royalty a “willing buyer and a willing seller” would arrive at for the noninteractive use of sound recordings transmitted through internet-based platforms.  Because of the complexity of the process, the CRB starts the proceeding early in the year before the year in which the current royalty rate expires.  So, as the current rates expire at the end of 2025, parties will need to sign up to participate in the proceeding to determine 2026-2030 rates early this year, even though the proceeding is unlikely to be resolved until late 2025 (unless there is an earlier settlement) (the CRB Notice asking for petitions to participate in the proceeding is expected to be published in the Federal Register tomorrow).  Initial stages of the litigation (including the identification of witnesses, the rate proposals, the evidence supporting those proposals, and the initial discovery) will likely take place this year.

Here are some of the regulatory developments of significance to broadcasters from the past two weeks, with links to where you can go to find more information as to how these actions may affect your operations.

  • The FCC adopted a Report and Order establishing rules implementing the January 2023 Low Power Protection Act, which provides

Here are some of the regulatory developments of significance to broadcasters from the past two weeks, with links to where you can go to find more information as to how these actions may affect your operations.

  • The AM for Every Vehicle Act was scheduled for a US Senate vote this week through an expedited process

Another state has joined those that require clear disclosure of the use of artificial intelligence (“AI”) in political ads, addressing concerns about deepfakes corrupting the political process.  Michigan’s Governor Whitmer just signed a bill making Michigan the fifth state, joining Texas, California, Washington, and Minnesota, to enact a law requiring the clear identification of the use of AI in political ads.  As many media companies struggle with their policies on AI, and as the federal government has not acted to impose limits on the use of AI in political ads (see our posts here and here), it has been up to the states to adopt rules that limit these practices.

The Michigan bill, H.B. 5141, applies to “qualified political advertisements” which include any advertising “relating to a candidate for federal, state, or local office in this state, any election to federal, state, or local office in this state, or a ballot question that contains any image, audio, or video that is generated in whole or substantially with the use of artificial intelligence.”  A companion bill, H.B. 5143, defines “artificial intelligence” as “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments, and that uses machine and human-based inputs to do all of the following: (a) Perceive real and virtual environments. (b) Abstract such perceptions into models through analysis in an automated manner. (c) Use model inference to formulate options for information or action.”

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • The NAB and REC Networks, an LPFM advocacy organization, jointly requested an extension of the December 12, 2023 deadline for

Facebook parent Meta announced this week that it will require labeling of ads about elections and political and social issues that use artificial intelligence or other digital tools.  Earlier this week, we wrote about the issues that AI in political ads poses for media companies and about some of the governmental regulations that are being considered (and the limited rules that have thus far been adopted).  These concerns are prompting all media companies to consider how AI will affect them in the coming election, and Meta’s announcement shows how these considerations are being translated into policy.

The Meta announcement sets out the situations in which labeling of digitally altered content will be required.  Disclosure of the digital alteration will be required when digital tools have been used to:

  • Depict a real person as saying or doing something they did not say or do; or
  • Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
  • Depict a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.

The Meta announcement makes clear that using AI or other digital tools to make inconsequential changes that don’t impact the message of the ad (Meta gives as examples resizing, cropping an image, color correction, and image sharpening) will be permitted without disclosure.  But even these changes can trigger disclosure obligations if they are in fact consequential to the message.  In the past, we’ve seen allegations of attack ads using shading or other seemingly minor changes to depict candidates in ways that make them appear more sinister or that otherwise convey some other negative message – presumably the kinds of uses that Meta’s disclosure requirement is meant to capture.

This change will be applicable not just to US elections, but worldwide.  Already, I have seen TV pundits, when asked about the effect that the new policy will have, suggest that what is really important is what other platforms, including television and cable, do to match this commitment.  So we thought that we would look at the regulatory schemes that, in some ways, limit what traditional electronic media providers can do in restricting political ads.  As detailed below, broadcasters, local cable companies, and direct broadcast satellite television providers are subject to statutory limits under Section 315 of the Communications Act that forbid them from “censoring” the content of candidate advertising.  Section 315 essentially requires that candidate ads (whether from a federal, state, or local candidate) be run as they are delivered to the station – they cannot be rejected based on their content.  The only exception thus far recognized by the FCC has been for ads with content that violates federal criminal law.  There is thus a real question as to whether a broadcaster or cable company could impose a labeling requirement on candidate ads given their inability to reject a candidate ad based on its content.  Note, however, that the no-censorship requirement only applies to candidate ads, not those purchased by PACs, political parties, and other non-candidate individuals or groups.  So, policies like that adopted by Meta could be considered for these non-candidate ads even by these traditional platforms.

In the Washington Post last weekend, an op-ed article suggested that political candidates should voluntarily renounce the use of artificial intelligence in their campaigns.  The article seemed to be looking for candidates to take the actions that governments have largely thus far declined to mandate.  As we wrote back in July, despite calls from some for federal regulation of the use of AI-generated content in political ads, little movement in that direction has occurred. 

As we noted in July, a bill was introduced in both the Senate and the House of Representatives to require that there be disclaimers on all political ads using images or video generated by artificial intelligence, in order to disclose that they were artificially generated (see press release here), but there has been little action on that legislation.  The Federal Election Commission released a “Notice of Availability” in August (see our article here) asking for public comment on whether it should start a rulemaking to determine if the use of deepfakes and other synthetic media imitating a candidate violates FEC rules that forbid a candidate or committee from fraudulently misrepresenting that they are “speaking or writing or otherwise acting for or on behalf of any other candidate or political party or employee or agent thereof on a matter which is damaging to such other candidate or political party or employee or agent thereof.”  Comments were filed last month (available here), and include several (including those of the Republican National Committee) that question the authority of the FEC to adopt any rules in this area, both as a matter of statutory authority and under the First Amendment.  Such comments do not bode well for voluntary limits by candidates, nor for actions from an FEC that by law has 3 Republican and 3 Democratic commissioners.

Here are some of the regulatory developments of significance to broadcasters from the past week, with links to where you can go to find more information as to how these actions may affect your operations.

  • The FCC’s Enforcement Bureau released its second EEO audit notice for 2023, which targets 150 radio and television stations for