
A proposed class action in Manhattan federal court accuses Grammarly, the AI writing tool, of using the reputations of well-known journalists, authors and academics to market an AI editing feature without their permission.

The complaint, filed by investigative journalist Julia Angwin, alleges the company’s “Expert Review” tool presented editing suggestions as if they were drawn from named writers, creating the impression those individuals had agreed to participate.

The case highlights a growing legal question for AI products: whether software companies can invoke real people’s identities or expertise to frame automated advice.

According to the complaint, Grammarly subscribers who activated the feature saw status messages such as “reading your text” and “finding experts to review your piece,” followed by prompts including “Applying ideas from Julia Angwin” alongside short biographies of the cited figures.

Angwin, a Pulitzer Prize-winning reporter and founder of The Markup, says she never consented to be included as an “expert,” never licensed her name or likeness, and never approved the comments attributed to her.

She told the court she learned of the feature only after a March 2026 article reported that the tool relied on real people’s names. In the filing, she said she was “shocked and horrified” that an AI product was using her identity for profit and warned users might hold her responsible if they followed advice attributed to her.

The lawsuit names Grammarly and its parent company, Superhuman Platform, Inc., and seeks damages exceeding $5 million.

The complaint cites New York and California right-of-publicity laws as well as California common law, arguing that displaying living authors’ names in the interface and marketing the feature as applying their ideas amounted to unauthorised commercial use.

It also includes an unjust enrichment claim, alleging the company generated subscription revenue from the reputations of writers whose identities were used without compensation.

Critics say that even when the system described recommendations as “inspired” by a particular expert, the practical effect was to trade on recognisable names to build trust in the product.

Grammarly has since paused the feature. According to reports, the company disabled Expert Review after criticism and said it would reassess the product.

Source: Noah Wire Services

Noah Fact Check Pro

The draft above was created using the information available at the time the story first
emerged. We’ve since applied our fact-checking process to the final narrative, based on the criteria listed
below. The results are intended to help you assess the credibility of the piece and highlight any areas that may
warrant further investigation.

Freshness check

Score: 8

Notes:
The article reports on a class action lawsuit filed on March 11, 2026, against Grammarly’s ‘Expert Review’ feature. ([wired.com](https://www.wired.com/story/grammarly-is-facing-a-class-action-lawsuit-over-its-ai-expert-review-feature/?utm_source=openai)) This is the earliest known publication date for this specific lawsuit, indicating the content is fresh. However, similar reports have appeared in other reputable outlets around the same time, suggesting the narrative is not entirely original. ([law360.com](https://www.law360.com/articles/2451988/grammarly-hit-with-class-action-over-expert-review-ai-tool?utm_source=openai))

Quotes check

Score: 7

Notes:
The article includes direct quotes from Julia Angwin, such as her statement expressing distress over the unauthorised use of her expertise. ([wired.com](https://www.wired.com/story/grammarly-is-facing-a-class-action-lawsuit-over-its-ai-expert-review-feature/?utm_source=openai)) These quotes are consistent with those found in other reputable sources, indicating they are not original to this article. ([law360.com](https://www.law360.com/articles/2451988/grammarly-hit-with-class-action-over-expert-review-ai-tool?utm_source=openai))

Source reliability

Score: 6

Notes:
The article originates from FindLaw, a legal information website. While it provides detailed coverage of the lawsuit, FindLaw is not a major news organisation and may not have the same editorial standards as more established outlets.

Plausibility check

Score: 9

Notes:
The claims in the article align with reports from other reputable sources, such as WIRED and Law360, confirming the plausibility of the events described. ([wired.com](https://www.wired.com/story/grammarly-is-facing-a-class-action-lawsuit-over-its-ai-expert-review-feature/?utm_source=openai))

Overall assessment

Verdict (FAIL, OPEN, PASS): FAIL

Confidence (LOW, MEDIUM, HIGH): MEDIUM

Summary:
The article provides a detailed account of the class action lawsuit against Grammarly’s ‘Expert Review’ feature, with information corroborated by other reputable sources. However, the reliance on FindLaw as the primary source, which is not a major news organisation, raises concerns about the independence and reliability of the information. Additionally, the use of quotes that appear in other reputable sources suggests a lack of originality. Given these factors, the content does not meet the necessary standards for publication under our editorial indemnity.



© 2026 Engage365. All Rights Reserved.