NHS England has reportedly begun withholding some publicly funded software amid mounting fears that fast-improving AI tools could turn open code into a cyberweapon.

The shift cuts against a long-standing rule: software built with public money should, in principle, remain publicly available. That approach aimed to promote transparency, reuse and better value for taxpayers. Now, according to reports, officials worry that increasingly capable hacking-focused AI models, including a system cited in reports as Mythos, could scan open repositories, spot weaknesses and help attackers move faster than defenders can react.

The NHS appears to be weighing a difficult trade-off: public transparency and shared innovation on one side, and a rapidly changing cyber threat on the other.

The change lands at the center of a broader argument over how institutions should handle openness in the age of generative AI. Openly published software can help hospitals, researchers and contractors improve tools without rebuilding them from scratch. But the same visibility can also expose flaws, outdated components or weak configurations. Reports indicate NHS England now sees that balance differently as AI lowers the skill needed to probe systems for vulnerabilities.

Key Facts

  • NHS rules have stated that software created with public money should be publicly available.
  • Reports suggest NHS England is now restricting access to some of that software.
  • The policy change appears linked to fears about AI-assisted hacking.
  • Hacking-oriented AI models, including Mythos, have heightened concern over exposed code.

The implications reach beyond one health system. The NHS runs vast digital infrastructure tied to patient care, administration and research, making security failures especially costly. A retreat from open software could influence other public bodies that embraced open-source principles for accountability and efficiency. It also raises a deeper question: when AI changes the threat landscape, how much openness can critical institutions still afford?

What happens next matters because this looks less like a one-off exception and more like an early test of public-sector policy in the AI era. NHS England may face pressure to define clearer rules for which code stays public, which code gets shielded and how those decisions get reviewed. If officials cannot draw that line convincingly, the debate will spread quickly from health care to every government system that depends on software and public trust.