
Security

NCSC chief warns security must come first with AI designs

Lindy Cameron, CEO of the UK's National Cyber Security Centre (NCSC), today warned that security must be the primary consideration for developers of artificial intelligence (AI), to avoid designing systems that are vulnerable to attack.

Image courtesy NCSC

In a major speech, Lindy Cameron (above) highlighted the importance of security being baked into AI systems as they are developed, not bolted on as an afterthought. She also set out the actions developers need to take to protect individuals, businesses and the wider economy from inadequately secure products.


Her comments were delivered to an audience at the influential Chatham House Cyber 2023 conference, which sees leading experts gather to discuss the role of cyber security in the global economy and the collaboration required to deliver an open and secure internet.

She said: “We cannot rely on our ability to retro-fit security into the technology in the years to come nor expect individual users to solely carry the burden of risk. We have to build in security as a core requirement as we develop the technology.

“Like our US counterparts and all of the Five Eyes security alliance, we advocate a ‘secure by design’ approach where vendors take more responsibility for embedding cyber security into their technologies, and their supply chains, from the outset. This will help society and organisations realise the benefits of AI advances but also help to build trust that AI is safe and secure to use.

“We know, from experience, that security can often be a secondary consideration when the pace of development is high.

“AI developers must predict possible attacks and identify ways to mitigate them. Failure to do so will risk designing vulnerabilities into future AI systems.”

The UK is a global leader in AI, with a sector that contributes £3.7 billion to the economy and employs 50,000 people. Later this year it will host the first global summit on AI safety, to drive targeted, rapid international action on the guardrails needed for safe and responsible development of AI.


Reflecting on the National Cyber Security Centre’s role in helping to secure advancements in AI, she highlighted three key themes that her organisation is focused on. The first is supporting organisations to understand the associated threats and how to mitigate them. She said: “It’s vital that people and organisations using these technologies understand the cyber security risks – many of which are novel.

“For example, machine learning creates an entirely new category of attack: adversarial attacks. As machine learning is so heavily reliant on the data used for the training, if that data is manipulated, it creates potential for certain inputs to result in unintended behaviour, which adversaries can then exploit.

“And LLMs pose entirely different challenges. For example - an organisation's intellectual property or sensitive data may be at risk if their staff start submitting confidential information into LLM prompts.”
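The training-data manipulation Ms Cameron describes is often called data poisoning. A minimal, purely illustrative sketch (not drawn from the speech, using a toy nearest-centroid classifier and made-up data) shows how an attacker who flips labels in the training set can make a chosen "trigger" input come out as benign:

```python
# Illustrative sketch of a data-poisoning attack on a toy
# nearest-centroid classifier. All data and labels are invented
# for the example; real attacks target far larger models.

def centroid(points):
    """Component-wise mean of a list of feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def train(data):
    """data: list of (features, label) pairs -> {label: centroid}."""
    by_label = {}
    for x, y in data:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(xs) for y, xs in by_label.items()}

def predict(x, centroids):
    """Return the label whose centroid is closest to x."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(x, centroids[label]))

clean = [([0.0, 0.0], "benign"), ([0.2, 0.1], "benign"),
         ([5.0, 5.0], "malicious"), ([5.2, 4.9], "malicious"),
         ([9.0, 9.0], "malicious"), ([9.1, 8.9], "malicious")]

trigger = [4.0, 4.0]  # the input the attacker wants misclassified

# The attacker flips labels on training points near the trigger region.
poisoned = [(x, "benign" if y == "malicious" and x[0] < 7 else y)
            for x, y in clean]

print(predict(trigger, train(clean)))     # -> malicious
print(predict(trigger, train(poisoned)))  # -> benign
```

The model's behaviour on the trigger input changes even though the model code itself is untouched, which is why provenance and integrity of training data matter as much as the code.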

The second key theme Ms Cameron discussed was the need to maximise the benefits of AI to the cyber defence community. On the third, she emphasised the importance of understanding how our adversaries – whether they are hostile states or cyber criminals – are using AI and how they can be disrupted. She said: “We can be in no doubt that our adversaries will be seeking to exploit this new technology to enhance and advance their existing tradecraft.

“LLMs also present a significant opportunity for states and cyber criminals too. They lower barriers to entry for some attacks. For example, they make writing convincing spear-phishing emails much easier for foreign nationals without strong linguistic skills.”
 
