How to Address the AI Skills Challenge in Cybersecurity
Advancements in AI technology have provided significant opportunities to improve performance in all areas of organisations, including cybersecurity teams.
Vast numbers of security vendors offer AI-based security tools or have integrated the technology into existing solutions. The functions performed range from alert analysis to automated vulnerability remediation.
Additionally, large language models (LLMs) can rapidly speed up routine tasks such as report writing and creating security policies.
A PwC report recently found that AI is the top investment priority in cybersecurity budgets over the next 12 months.
As a result of these offerings, experts have frequently pointed to the potential of AI to help address the widening cyber skills gap by lifting some of the burden on security professionals.
However, it is becoming apparent that AI is creating a new set of skills and employment challenges in the cybersecurity sector.
In this article we dive into what new AI skills and challenges he industry must understand and get to grips with quickly.
AI Usage Requires New Skills
While it is true that AI can help reduce the workload on security professionals in certain areas, specific skills are needed to interact with the tools to ensure they are working according to teams’ needs.
David Ramirez, CISO, at fintech firm Broadridge, told Infosecurity that cyber professionals now require experience in AI to carry out tasks like scripting and coding effectively.
“Security professionals have got to ensure that AI tools are aligned to have more mobility and flexibility so they can provide prompts and basic information to the analysts to do things faster,” he noted.
AI governance is an area that organisations must ensure they are able to cover properly. This function is responsible for ensuring AI is used appropriately in the workplace, addressing issues such as shadow AI, AI regulatory requirements and privacy and security risks such as data leakage.
These tasks require a range of specialised expertise and skills. It is an area that cannot be ignored, with many organisations failing to address AI risks.
Making AI Training a Priority for Security Professionals
Despite the urgent need for AI skills within security teams, such expertise appears to be severely lacking in the cyber industry.
O’Reilly’s 2024 State of Security Survey found that 33.9% of tech professionals reported a shortage of AI security skills, particularly around emerging vulnerabilities like prompt injection.
This is an issue that is impacting Ramirez in his recruitment strategy.
“It’s not common to find analysts with security and AI work experience,” he noted.
Ramirez said the AI challenge is analogous to cloud adoption, where analysts had to be trained to learn about specific cloud vendors and their controls.
Organisations will need to have an expectation of AI expertise in security roles going forward.
“With AI, we’re going to have to go through a journey of training. Maybe two or three years from now it’s going to be an expectation that they have these skills,” Ramirez explained.
This means organisations and the wider industry must prioritise training AI skills for new and existing professionals.
This issue has been recognised by training and certification bodies. For example, in August 2025, ISACA announced its Advanced in AI Security Management (AAISM) certification.
The certification will focus on training security professionals to implement enterprise AI solutions while identifying, assessing, monitoring and mitigating risk in these tools.
Register interest for Europe's leading cybersecurity event
Stay updated with upcoming announcements and registration information ahead of Infosecurity Europe 2026, on 2-4 June at London ExCeL.
How AI Could be a Blocker to the Talent Pipeline
Another skills challenge with growing AI usage in cybersecurity is the impact on entry-level roles in the sector.
This is because AI is increasingly being used for roles traditionally undertaken by entry-level cyber professionals. This includes documentation, reporting and alert analysis.
Diana Kelley, CISO at Noma Security, expressed these concerns to Infosecurity: “I am a little worried about how this will impact the next generation – if entry level is replaced by AI – how will people get their foot in the door and build skills to advance?” she asked.
It is crucial that in this landscape, security teams continue to offer a pathway for cyber professionals into their teams, adapting the type of functions performed by entry-level staff, that enable them to gain experience for more technical tasks.
Conclusion
While there has been significant discussion around the potential of AI to reduce the cyber skills gap and ease the burden on security teams, it is vital the industry also recognises the new skills challenges the technology is creating.
Knowledge of using and implementing AI will be essential to ensure the technology works effectively for security teams. In addition, AI governance roles will be critical to mitigate risks such as data leakage, as well as new compliance requirements.
Security leaders also must act to mitigate the potential impact AI could have on the talent pipeline, ensuring entry-level roles and pathways into the sector are still available.
ADVERTISEMENT
Enjoyed this article? Make sure to share it!
Latest Articles
Keep up to date with the latest infosecurity news and trends in our latest articles.
Stay in the know
Receive updates about key events, news and recent insights from Infosecurity Europe.
Looking for something else?
