Generative AI (GAI) has taken the world by storm since OpenAI launched ChatGPT in November 2022. But the buzz and excitement surrounding GAI has come with difficult legal questions that threaten the new technology. Several lawsuits—some of which we have discussed in detail—have been filed against companies whose GAI products were trained on copyrighted materials. Until now, we have only been able to speculate about how courts will handle GAI while the industry holds its collective breath.
Imagine you are an investor and you decide to file a lawsuit after a company you invest in suffers a stock drop. When you get to the courthouse, you find that you are the first person to file a federal securities class action on these facts. However, because of the Private Securities Litigation Reform Act (PSLRA), the district court chooses another party to serve as “lead plaintiff” in the litigation. With that lead plaintiff at the helm, the case is dismissed prior to class certification, and you want to appeal that decision. Do you have standing? Your name is in the case caption for the active complaint. You were, in fact, the very first plaintiff in this action. But you aren’t the lead plaintiff anymore.
The concept of corporate legal separateness has long been a fortress protecting affiliated business entities such as parents, subsidiaries, and sister companies from various kinds of liability and litigation. But how much protection does such legal separateness offer the information that corporations gather and store when faced with vehicles of written discovery such as interrogatories or requests for production? In other words, if an opposing party’s discovery request calls for information or documents held by an affiliated non-party entity, is the responding party required to seek that material from its affiliate?
There is a time and place for everything, or so they say. Eminem and Too $hort are both somewhat polarizing artists. From Eminem’s “Cleanin’ Out My Closet” to Too $hort’s infamous “Blow the Whistle,” some of their more provocative music has been put in the spotlight in the workplace of an apparel manufacturer. Stephanie Sharp and six other employees, including one man, filed a hostile work environment claim under Title VII of the Civil Rights Act against their employer. The plaintiffs alleged that many employees, “mostly women,” complained to the employer about the “obscene and sexually offensive and misogynistic character” of the music being played in the workplace, with various employees going so far as to place speakers on a forklift and drive around the facility blasting the music. Notably, however, “a number of men” were also “offended by the manner in which the music portrayed men, and their relationships with women.” The employer argued that the conduct was not discriminatory on the basis of sex, emphasizing that “both men and women were offended by the work environment allegedly created by the music played in the warehouse.”
We have previously reported on changes the Law Commission was considering to the Arbitration Act 1996 (the Act). The Law Commission has now published its final report (the Final Report, available here).
The Final Report concludes a review of English arbitration legislation that began in January 2022. A draft bill implementing the Commission’s conclusions and recommendations accompanies the report, so it now falls to the UK government to decide whether to bring those changes before Parliament.
This year has seen a tremendous spike in the number of cases alleging violations of the Video Privacy Protection Act (“VPPA”), 18 U.S.C. § 2710, a statute enacted in 1988 in response to the Washington City Paper’s publication of a list of films that then-Supreme Court nominee Robert Bork had rented from a video store. The statute was originally intended to “allow consumers to maintain control over personal information divulged and generated in exchange for receiving services from video tape service providers.”
This year, the federal government’s new health equity regulations began taking effect. The regulations reflect the government’s increased commitment to advancing health equity as a major focus of its regulatory enforcement. As these changes take effect, states and businesses have begun implementing laws and policies to comply with the updated regulatory framework.
Speaking at the annual conference of the National Advertising Division on September 19, 2023, the Federal Trade Commission (“FTC”) announced a generative AI (“AI”) policy consistent with Chairwoman Khan’s focus on the perceived harms to consumers from large technology companies, fully embracing a plan to regulate AI swiftly, aggressively, and proactively.
The agency began its remarks on AI by observing that its purported policy decision to allow technology companies to self-regulate during the “Web 2.0” era was a mistake. Self-regulation, according to the FTC, was a failure that ultimately resulted in a handful of large technology companies amassing too much power and too much data.