Use of Technology for Advocacy

Pricing algorithms are nothing new. They are, generally speaking, computer programs intended to help sellers optimize prices in real time, or close to it. These programs can use data on demand, costs, or even competitors’ prices to “learn” how to price products. What is new is the proliferation of these programs across industries and the emergence of artificial intelligence-driven pricing algorithms.
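For readers curious what such a program looks like under the hood, the sketch below is a deliberately simplified, hypothetical example: it nudges a price toward the average competitor price, adjusts for demand, and enforces a cost floor. The function name, inputs, and update rule are our own illustrative assumptions, not a description of any actual pricing engine, which is typically far more sophisticated and increasingly AI-driven.

```python
# Illustrative sketch of a simple rule-based pricing algorithm.
# All names, inputs, and the update rule are hypothetical; real commercial
# pricing engines are far more sophisticated and often machine-learning-driven.

def suggest_price(unit_cost, recent_demand, target_demand, competitor_prices,
                  current_price, min_margin=0.10, step=0.02):
    """Nudge the current price up or down based on demand and competitor prices."""
    # Anchor partway toward the average competitor price.
    competitor_avg = sum(competitor_prices) / len(competitor_prices)
    price = (current_price + competitor_avg) / 2

    # If demand is running hot, raise the price; if it lags, lower it.
    if recent_demand > target_demand:
        price *= (1 + step)
    elif recent_demand < target_demand:
        price *= (1 - step)

    # Never price below cost plus a minimum margin.
    floor = unit_cost * (1 + min_margin)
    return round(max(price, floor), 2)


if __name__ == "__main__":
    print(suggest_price(unit_cost=8.00, recent_demand=120, target_demand=100,
                        competitor_prices=[11.99, 12.49, 12.99],
                        current_price=11.49))
```

Even this toy version shows why regulators are paying attention: the program reacts to rivals' prices automatically, without any human deciding each price.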

In a recent public comment addressed to the United States Copyright Office, the Federal Trade Commission seemingly expanded upon remarks made at the National Advertising Division back in September, signaling that it will aggressively and proactively challenge alleged unfair practices involving artificial intelligence, even if that means stretching the meaning of “unfair” to broaden its jurisdiction over such matters.

Generative AI (“GAI”) has taken the world by storm since OpenAI launched ChatGPT in November 2022. But the buzz and excitement surrounding GAI have come with difficult legal questions that threaten the new technology. Several lawsuits, some of which we have discussed in detail, have been filed against companies whose GAI products were trained on copyrighted materials. Until now, we have only been able to speculate about how courts will handle GAI as the industry has held its collective breath.

ChatGPT may be smart enough to pass the bar exam, but lawyers should exercise caution before relying on the artificial intelligence (“AI”) platform to conduct any legal business.

On June 22, 2023, Judge P. Kevin Castel of the Southern District of New York issued a lengthy order sanctioning two attorneys for submitting a brief drafted by ChatGPT. Judge Castel reprimanded the attorneys, explaining that while “there is nothing inherently improper about using a reliable artificial intelligence tool for assistance,” the attorneys “abandoned their responsibilities” by submitting a brief littered with fake judicial opinions, quotes, and citations.

Antitrust and tech are in the legal news almost daily, often multiple times a day. Here are a few recent developments with notable implications that may have flown under the radar: 1) renewed focus on gig economy issues; 2) potential enforcement efforts regarding director overlaps; and 3) challenges to most-favored-nation (“MFN”) pricing.

The U.S. Patent and Trademark Office has issued guidance on how it will treat applications to register “generic.com” terms in the wake of the Supreme Court’s June 30, 2020 decision in United States Patent and Trademark Office v. Booking.com.

We previously wrote about the Supreme Court’s Booking.com decision, which affirmed the Fourth Circuit’s decision that the mark BOOKING.COM was registrable and not generic. The Supreme Court’s decision tracked the arguments in an amicus brief we submitted on behalf of consumer perception specialists and academics from leading U.S. universities.

In a recent order in Livingston v. City of Chicago, Magistrate Judge Young Kim of the Northern District of Illinois provided useful guidance to litigants on the use of technology-assisted review, or TAR. Importantly, Judge Kim affirmed what is known as “Sedona Principle Six,” the notion that a responding party is in the best position to design and evaluate procedures for preserving and producing its own electronically stored information, or ESI.
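At its core, TAR generally works by training a classification model on a seed set of documents coded by attorneys and then using that model to score or rank the rest of the collection so reviewers can prioritize likely-responsive material. The sketch below is a toy illustration of that idea only; the library choice (scikit-learn), the sample documents, and the workflow are our own assumptions and do not reflect any particular TAR platform or the protocol at issue in Livingston.

```python
# Toy illustration of the core idea behind technology-assisted review (TAR):
# a classifier trained on attorney-coded "seed" documents scores the rest of
# the collection so reviewers can prioritize likely-responsive material.
# Library choice, sample text, and labels are illustrative assumptions only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Attorney-coded seed set: 1 = responsive, 0 = not responsive.
seed_docs = [
    "email re contract termination and penalty clause",
    "meeting notes discussing the disputed invoice",
    "office holiday party planning thread",
    "newsletter about the company softball league",
]
seed_labels = [1, 1, 0, 0]

# Unreviewed collection to be prioritized.
unreviewed = [
    "draft amendment to the termination clause",
    "sign-up sheet for the softball league",
]

vectorizer = TfidfVectorizer()
X_seed = vectorizer.fit_transform(seed_docs)
model = LogisticRegression().fit(X_seed, seed_labels)

# Rank unreviewed documents by predicted probability of responsiveness.
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for doc, score in sorted(zip(unreviewed, scores), key=lambda p: -p[1]):
    print(f"{score:.2f}  {doc}")
```

Because the producing party selects the seed documents, the model, and the validation steps, the workflow itself helps explain why Sedona Principle Six places design decisions with the responding party.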

In the latest piece to come out of the FTC’s new focus on emerging technologies, the FTC Bureau of Consumer Protection issued new guidance on the use of artificial intelligence (“AI”) and algorithms. The guidance follows up on a 2018 hearing at which the FTC explored AI, algorithms, and predictive analytics. As the FTC recognizes, these technologies already pervade the modern economy. They influence consumer decision making – from which video to watch next, to which ad to click on, to which product to purchase. They make investment decisions, credit decisions, and, increasingly, health decisions, which has also sparked the interest of State Attorneys General and the Department of Health & Human Services. But the promise of new technologies also comes with risk. Specifically, the FTC cites an instance in which an algorithm designed to allocate medical interventions ended up funneling resources to healthier, white populations.
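One common way to surface this kind of risk is to audit an algorithm’s outputs for disparities across groups before and after deployment. The sketch below is a minimal, hypothetical illustration of such a check: it compares the rate at which each group receives an intervention and flags a large gap. The data and the 0.80 threshold (a rough rule of thumb borrowed from employment-discrimination practice) are our own illustrative assumptions, not anything prescribed by the FTC guidance.

```python
# Minimal sketch of a disparity check on an allocation algorithm's output:
# compare the rate at which each group receives the intervention.
# The data and the 0.80 threshold are illustrative assumptions only.

from collections import defaultdict

# (group, allocated?) pairs produced by some hypothetical allocation algorithm.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, allocated = defaultdict(int), defaultdict(int)
for group, got_intervention in decisions:
    totals[group] += 1
    allocated[group] += got_intervention

rates = {g: allocated[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())

print("allocation rates:", rates)
print(f"disparity ratio: {ratio:.2f}",
      "(flag for review)" if ratio < 0.80 else "(within threshold)")
```

A check this simple obviously cannot establish or rule out unlawful discrimination, but it illustrates the kind of outcome testing that regulators increasingly expect companies deploying algorithms to perform.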