Privacy & Data Security

With great promise comes great scrutiny. As artificial intelligence (“AI”) has become part of industries’ and individuals’ daily repertoire, it has also come under the focus of antitrust regulators. The DOJ, in its so-called “Project Gretzky,” is gearing up with data scientists and others to become a more tech-savvy version of itself.

Making good on its promise to “use every tool” in its arsenal to regulate artificial intelligence (“AI”), the Federal Trade Commission (“FTC”) unanimously approved a resolution on November 21, 2023, authorizing the use of compulsory process in non-public investigations involving AI-related products and services.

In a recent public comment to the United States Copyright Office, the Federal Trade Commission seemingly expanded on remarks it made at the National Advertising Division in September, signaling that it will aggressively and proactively challenge alleged unfair practices involving artificial intelligence, even if that means stretching the meaning of “unfair” to expand its jurisdiction over such matters.

This year has seen a tremendous spike in the number of cases alleging violations of the Video Privacy Protection Act (“VPPA”), 18 U.S.C. § 2710, a statute enacted in 1988 in response to the Washington City Paper’s publication of a list of films that then-Supreme Court nominee Robert Bork had rented from a video store. The statute was originally intended to “allow[] consumers to maintain control over personal information divulged and generated in exchange for receiving services from video tape service providers.”

While speaking at the annual conference of the National Advertising Division on September 19, 2023, the Federal Trade Commission (“FTC”) announced a generative artificial intelligence (“AI”) policy consistent with Chair Khan’s focus on perceived harms to consumers from large technology companies, fully embracing a plan to regulate AI swiftly, aggressively, and proactively.

The agency began its remarks on AI by observing that its purported policy decision to allow technology companies to self-regulate during the “Web 2.0” era was a mistake. Self-regulation, according to the FTC, was a failure that ultimately resulted in the collection of too much power and too much data by a handful of large technology companies. 

Class action lawsuits accusing companies of violating the Illinois Biometric Information Privacy Act (“BIPA”) have more than doubled following a February 2023 ruling by the Illinois Supreme Court, which held, based on a plain reading of the statute, that a separate claim accrues each time a person’s biometric identifier is scanned in violation of the statute.

A parent corporation is typically not held liable for the acts of a subsidiary. As such, disregarding the corporate form (i.e., piercing the corporate veil) and holding the parent liable is an extraordinary remedy. If a parent company exercises enough control over a subsidiary, however, courts may hold the parent liable. Because there is often some degree of overlap between a parent and its subsidiary, courts are frequently faced with the question of just how much control is enough to justify imposing liability on a parent for its subsidiary’s actions.