Texas AG settles AI accuracy case with Pieces Technology
Caitlin Anthoney Jan 1, 2025 5:32:59 PM

On September 18, 2024, Texas Attorney General Ken Paxton announced a settlement with Pieces Technology under the Texas Deceptive Trade Practices-Consumer Protection Act (DTPA). The enforcement action centered on allegedly false advertising of the company’s AI capabilities and marked the first generative AI settlement under a state consumer protection act.
What happened
Pieces Technology uses AI to assist hospitals by summarizing and drafting clinical notes. The company advertised its AI as having a "critical hallucination rate" below 0.001%—a metric describing the likelihood of false or misleading outputs.
The Texas AG argued that these claims were “false, misleading, or deceptive” under the DTPA.
Although Pieces denied wrongdoing, the settlement requires the company to:
- Define and explain the metrics it uses to advertise AI accuracy, including calculation methods.
- Clearly disclose harmful or potentially harmful uses of its products to customers.
- Avoid any false or misleading advertising related to its AI tools.
No monetary penalties were imposed, but Pieces must respond to future state requests for information demonstrating its adherence to the settlement terms.
The backstory
The case is part of a broader trend of state attorneys general addressing AI through consumer protection laws. Also in 2024, Massachusetts issued guidance on AI reliability, and Colorado passed a law regulating algorithmic discrimination that takes effect in 2026.
Going deeper
The Texas settlement with Pieces Technology sheds light on AI transparency and accountability, with several notable takeaways:
- AI hallucinations: Instances where AI generates false or misleading outputs, the issue at the center of this case. Pieces advertised extremely low hallucination rates without adequately explaining how the metric was calculated.
- Regulatory scrutiny: The FTC and state attorneys general are increasingly targeting companies that misrepresent AI capabilities or fail to disclose potential risks.
- Transparency requirements: Under the settlement, Pieces must now disclose definitions, methodologies, and benchmarks when advertising AI accuracy metrics.
- Industry implications: AI developers must focus not only on system performance but also on how they present those capabilities to the public. Misleading advertising, even if unintentional, exposes businesses to significant regulatory risk.
What was said
In a June 2024 press release, Attorney General Paxton stated, “Any entity abusing or exploiting Texans’ sensitive data will be met with the full force of the law. Companies that collect and sell data in an unauthorized manner, harm consumers financially, or use artificial intelligence irresponsibly present risks to our citizens that we take very seriously.”
“As many companies seek more and more ways to exploit data they collect about consumers, I am doubling down to protect privacy rights.”
Why it matters
As AI becomes more integrated into the healthcare sector, organizations must prioritize accuracy, transparency, and consumer safety when deploying AI systems.
The bottom line
The Texas settlement sets a precedent for how state attorneys general will address AI-related claims. Businesses should anticipate heightened scrutiny and adopt compliance measures now to mitigate future regulatory and legal risks.
Related: The future of AI regulation
FAQs
Can AI be integrated into HIPAA compliant emails?
Yes, AI-powered features can be integrated with HIPAA compliant emailing platforms, like Paubox, to automate processes like patient consent management and sending personalized emails while maintaining HIPAA compliance.
Are there any limitations when using AI in HIPAA compliant emails?
Yes, healthcare providers must ensure that AI-powered features comply with HIPAA regulations and industry best practices for data security and privacy. Additionally, providers should evaluate the reliability of AI algorithms to avoid potential risks or compliance issues.
How can providers make Google Workspace email HIPAA compliant?
Providers must use a Business or Enterprise plan, sign a business associate agreement (BAA) with Google, and use a HIPAA compliant platform to protect patient information.
Learn more: How to set up HIPAA compliant emails on Google