Grammarly, the tool meant to assist with spelling, grammar, and plagiarism detection, is being sued over a new AI tool called “Expert Review.” The tool offers editing suggestions from established authors and writers—ostensibly not a bad idea—except that none of those people consented to being involved in the first place.
The tool offers real-time writing tips from celebrities like Stephen King and Neil deGrasse Tyson, as well as journalists like The Markup founder Julia Angwin, who filed a class action lawsuit against Grammarly’s parent company, Superhuman, alleging the tool used her likeness without her permission. “I have worked for decades honing my skills as a writer and editor, and I am distressed to discover that a tech company is selling an imposter version of my hard-earned expertise,” Angwin said in a statement.
From photorealistic deepfakes on Sora to scammers using chatbots to swindle users out of money, AI has already been bending reality and using people’s likenesses at worrying speeds. The Grammarly lawsuit shows how professional writers’ likenesses are also up for grabs—in addition to having that same technology threaten their very careers and livelihoods. This is the latest battle in the war over what legal and ethical boundaries AI should not cross.
Sorry, not sorry
The federal lawsuit, which was filed in the Southern District of New York Wednesday, “challenges Grammarly’s misappropriation of the names and identities of hundreds of journalists, authors, writers, and editors to earn profits for Grammarly and its owner, Superhuman,” per court documents reviewed by Fast Company.
Angwin’s lawsuit comes just as Superhuman announced plans to phase out Expert Review. Shishir Mehrotra, Superhuman’s CEO, addressed the decision to remove the tool in a post on LinkedIn on Wednesday: “This kind of scrutiny improves our products, and we take it seriously.”
He continued: “As context, the agent was designed to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans.”
But commenters on the public post, including a linguistics professor, a New York Times editor, a public library clerk, and others in the writing and editing industry, pushed back, arguing the CEO’s words didn’t take real accountability or capture the gravity of the situation.
In a statement shared with Fast Company, Mehrotra followed up on his apology, but was dismissive of the lawsuit. “We have reviewed the lawsuit, and we believe the legal claims are without merit and will strongly defend against them,” he said in the statement.
The new identity wars
As AI continues to develop at a breakneck pace, the Grammarly lawsuit brings fresh fears for many workers, especially those in fields at high risk of automation, like writing and editing, about how they can protect not just their work, but their identities.
And many professionals are taking action. Actor Matthew McConaughey, for example, filed eight trademark applications earlier this year to protect his likeness and voice as AI deepfakes become scarily realistic and accurate.
Angwin’s attorney Peter Romer-Friedman calls the situation a “very straightforward legal case,” telling Fast Company that “various state laws for a long time have provided that it’s unlawful to use a person’s name, whether they’re famous or not, for commercial purposes or gain without their consent.” He says “that’s exactly what Superhuman did through the Grammarly expert review tool.”
Regardless of how it plays out, legal and AI experts worry about what the incident means for the future of many industries—and that some workplaces may not be ready.
The lawsuit “points to a troubling trend,” says Vered Zlaikha, partner and Head of Cyber & AI practice at Lipa Meir & Co. “In the race to attract users and market share, some AI developers and vendors may be tempted to push legal and ethical boundaries.” She also thinks this could be the first of many such legal battles between companies using AI tools, and the workers they affect.
“We may well see additional lawsuits and class actions brought by various affected parties, including users and individuals referenced or implicated by AI,” she notes.
Dalit Heldenberg, an AI adoption strategist and advisor, agreed, saying “we’re already seeing” it.
“Disney recently sent a cease-and-desist to Google over AI tools generating its characters, which led Gemini to start blocking those prompts,” Heldenberg told Fast Company. “It’s a sign that companies are beginning to draw clear boundaries around how AI products can use their intellectual property.”
In other words, even people or companies who use an AI tool without being aware of its legality may open themselves up to being sued. Fast Company asked Angwin’s lawyer point blank whether the people and organizations who used Expert Review were legally exposed and whether the firm planned to pursue legal action against them. “I can’t speak to that at this time,” he replied.
Look before you leap into AI
Zlaikha advises companies using AI tools to ask questions before jumping in and rolling them out.
“What contractual protections are in place?” she asks. Who bears responsibility in the face of legal action? How does the organization retain control and oversight over how the tool is created and deployed?
While larger companies will likely have the resources to adapt to the shifting AI litigation landscape, they’re also more likely to be targets. As Heldenberg put it, “When you have millions of users and hundreds of millions in revenue, you’re the first call a plaintiff’s lawyer makes. Smaller companies face a different risk: they often adopt AI tools without fully understanding the legal exposure they might create.”
As for Angwin, the lawsuit states she hopes to “stop Grammarly and its owner, Superhuman, from trading on her name and those of hundreds of other journalists, authors, editors, and even lawyers” and to stop the companies from “attributing words to them that they never uttered.”

Meanwhile, Superhuman CEO Mehrotra says “there is a better approach to bringing experts onto our platform and we are working on a version that will provide significantly more benefit to both users and experts,” per his statement to Fast Company.
