Heriot-Watt Research Warns: Layering Generative AI Into ML Systems Creates Hidden Business Risk

Businesses rushing to embed generative AI into their machine learning systems may be trading visible efficiency gains for risks that are far harder to detect. That’s the finding of new research from Heriot-Watt University, led by Professor Michael Lones of the School of Mathematical and Computer Sciences, which examined four major use cases of generative AI in ML workflows — and found compounding risk in each.
The core concern isn’t that AI makes mistakes — it’s that when large language models are stacked inside existing ML pipelines, their errors become opaque and difficult to audit. In regulated industries like finance, insurance, and healthcare, that opacity creates real legal and operational exposure.
Key Takeaways
- Four high-risk use cases: The study reviewed generative AI as a pipeline component, a system design tool, a synthetic data generator, and an output analyzer. All four carry risks — and those risks multiply when LLMs are used repeatedly within the same workflow.
- Agentic models introduce unpredictability: Agentic AI systems — those that autonomously use external tools to complete tasks — create interactions that become increasingly hard to monitor or predict at scale.
- Compliance risk is real: In medicine and finance, regulators expect explainable automated decisions. As Professor Lones noted: “As soon as you start using LLMs, that gets really hard.”
- Cost savings may carry hidden costs: The study warns that efficiency gains from generative AI may arrive bundled with technical debt, legal liability, and bias risks for underrepresented groups.
The research doesn’t argue against generative AI — it argues for restraint and governance. Before adding another AI layer, organizations should ask: can we explain what this system does, and can we detect when it’s wrong?
Read the full article on SecurityBrief UK