Courts issued two seemingly conflicting rulings on whether AI-generated materials are protected. Heppner (S.D.N.Y.) found that documents created with a consumer version of Claude AI were not privileged or work product because the tool exposed data to a third-party provider. Warner (E.D. Mich.) reached the opposite result…
Privacy
SDNY Addresses Privilege and Work Product Implications of Using Unsecured Public AI Tools
A recent decision in United States v. Heppner appears to be the first federal ruling to directly address attorney‑client privilege and work‑product issues arising from a non‑lawyer’s use of an unsecured, consumer-grade AI tool for legal research. The court held that materials generated through Anthropic’s consumer version of Claude were…
The Return of the Video Privacy Protection Act (VPPA)
This year has seen a tremendous spike in the number of cases alleging violations of the Video Privacy Protection Act (“VPPA”), 18 U.S.C. § 2710, a statute enacted in 1988 in response to the Washington City Paper’s publication of a list of films that then-Supreme Court nominee Robert Bork had rented from a video store. The statute was originally intended to “allow[] consumers to maintain control over personal information divulged and generated in exchange for receiving services from video tape service providers.”
FTC Continues to Stake Out Role as Key AI Regulator
While speaking at the annual conference of the National Advertising Division on September 19, 2023, the Federal Trade Commission (“FTC”) announced a generative AI (“AI”) policy that is consistent with Chairwoman Khan’s focus on the perceived harms to consumers from large technology companies, fully embracing a plan to regulate AI swiftly, aggressively, and proactively.
The agency began its remarks on AI by observing that its purported policy decision to allow technology companies to self-regulate during the “Web 2.0” era was a mistake. Self-regulation, according to the FTC, was a failure that ultimately resulted in the collection of too much power and too much data by a handful of large technology companies.
Big Tech, Biometrics and BIPA: Meta’s Recent $68.5M Class Action Settlement
In July, Instagram’s parent company Meta Platforms, Inc. (“Meta”) agreed to a $68.5 million class-action biometric privacy settlement in connection with the company’s alleged violation of Illinois’ Biometric Information Privacy Act, 740 ILCS 14/1, et seq. (BIPA).
Factors in Fee-Shifting for Prevailing Defendants
Statutes permitting discretionary attorney fee-shifting for prevailing defendants vary in the circumstances under which fee-shifting is permitted. Two recent cases tackling the question of why and when a lawsuit warrants shifting attorneys’ fees from a prevailing defendant to the plaintiff who brought the claim reflect some of these differences. One case focused on the “frivolousness” of the lawsuit, and the other imposed a “bad faith” requirement despite the absence of such language from the relevant statute. In both cases, the perceived motivation of the respective plaintiffs and the purpose behind the statutes under which the claims were brought proved influential.
Recent Partial Dismissal of Illinois Biometric Privacy Suit May Add Some Weight to the Scale on the Side of Defendants
Class action lawsuits accusing companies of violating the Illinois Biometric Information Privacy Act (“BIPA”) have more than doubled following a February 2023 ruling by the Illinois Supreme Court, which found, based on a plain reading of the statute, that a separate claim accrues each time a person’s biometric identifier is scanned in violation of the statute.
Consumer Data Privacy Laws: What’s Happened and What Comes Next
Increasing oversight of tech companies, particularly in the realm of consumer privacy, has been a rare example of bipartisan agreement. Yet despite data privacy being a growing concern for consumers, there has been relatively little federal policymaking. Some states have stepped in to fill this void, enacting policies that could have large impacts on how businesses operate. The rapid pace of enactment (eleven such laws to date) indicates that states are taking an increasingly protective view of consumers’ data privacy. Businesses need to be prepared to comply with these new mandates or risk costly enforcement measures.