We recently looked at how generative AI, such as ChatGPT, is creating new cybersecurity challenges as it makes cybercrime more accessible. However, as with most new technology, it works both ways. Generative AI can also be leveraged by companies to enhance security. Here are three ways it can help.

1. Threat intelligence

Staying on top of the latest cyber threats and cutting-edge developments in security is time-consuming. The rate of innovation is high, producing vast amounts of information to digest. Generative AI can analyse this data and find patterns within it, quickly surfacing emerging threats and helping security teams respond to attacks faster. This lets companies take a proactive approach to cybersecurity, identifying threats before they become a problem.
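To make this a little more concrete, the sketch below shows one way an LLM could be used to triage a stream of threat advisories. It assumes the OpenAI Python SDK and an illustrative model name; the advisory text and priority labels are purely hypothetical, not any vendor's actual pipeline.

```python
# Minimal sketch of LLM-assisted threat triage, assuming the OpenAI Python SDK
# (openai>=1.0). The advisory text and labels below are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def triage_advisory(advisory_text: str) -> str:
    """Ask the model to summarise an advisory and flag how urgent it looks."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; swap for whatever your account offers
        messages=[
            {"role": "system",
             "content": "You are a security analyst. Summarise the advisory in two "
                        "sentences and label it LOW, MEDIUM or HIGH priority."},
            {"role": "user", "content": advisory_text},
        ],
    )
    return response.choices[0].message.content

# Hypothetical feed entry, for illustration only
print(triage_advisory("CVE-2023-XXXX: remote code execution in a widely used VPN appliance..."))
```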

2. Phishing email detection

Whilst criminals are turning to generative AI to create more convincing phishing emails, companies can use it to detect when text has been generated by AI. For example, OpenAI – the creators of ChatGPT – have trained a classifier to distinguish between text written by a human and text written by AI models from a variety of providers. Currently, it correctly identifies around 26% of AI-written text, and OpenAI is working to make it more accurate.
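As a rough illustration of the idea, the sketch below runs an email body through a publicly available AI-text detector from the Hugging Face Hub. The model shown was trained to spot GPT-2 output, so its scores should be treated as a weak signal rather than a verdict, and its availability under this name is an assumption.

```python
# Rough sketch of screening inbound email text with an off-the-shelf AI-text
# detector. The model name is assumed; treat its output as one signal among many.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="openai-community/roberta-base-openai-detector",
)

email_body = "Dear customer, your account has been suspended. Click here immediately..."
print(detector(email_body))  # e.g. [{'label': ..., 'score': ...}]; label names depend on the model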

3. Personalised cybersecurity training

Generative AI can also be used to simulate attacks – such as phishing emails – so staff can practise how to respond. As with a real attack, employees will not know what to expect or when to expect it, making the training far more realistic and engaging. It can also be tailored to each employee much more easily, as the AI can learn individuals’ behaviours.
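Here is a toy sketch of what a tailored simulation might look like in code, again assuming the OpenAI Python SDK. The employee details and prompt are purely illustrative, not any vendor's production system.

```python
# Toy sketch of generating a clearly marked simulated phishing email for
# awareness training, assuming the OpenAI Python SDK. All details are illustrative.
from openai import OpenAI

client = OpenAI()

def simulated_phish(name: str, role: str, recent_topic: str) -> str:
    """Draft a training email tailored to one employee's context."""
    prompt = (
        f"Write a short phishing-style email addressed to {name}, who works in {role} "
        f"and has recently been dealing with {recent_topic}. "
        "Mark the subject line [SIMULATION] - this is for security awareness training only."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(simulated_phish("Alex", "finance", "a supplier invoice dispute"))
```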

AUMINT.IO’s Trident system combines AI and machine learning to simulate real-life social engineering scenarios and offers personalised training to each employee based on their own vulnerabilities.

Generative AI is likely to continue evolving rapidly. This means that, although the threat from bad actors will likely become more complex, the tools to tackle it will also improve and become more accessible.