Could AI Security Prevent Hacks?

  • January 11, 2019
  • Feature

By Graham Smith, Head of Marketing, Curo Talent

Held in Las Vegas, DARPA's Cyber Grand Challenge was the first, and so far only, all-machine hacking competition. Each machine identified software vulnerabilities, exploited them and patched its own systems to protect against threats, all without a human programmer intervening. This article explains the role of automation in IT security and how it could address the skills shortage.

We’ve all heard of the wider IT skills shortage, but the lack of security skills in the industry is even more critical. According to The Life and Times of Cybersecurity Professionals report, IT workers who have specialist cyber security skills are approached with a new job offer at least once a week. In fact, 45 per cent of organisations claim to be severely lacking in this specific area of talent.

This high demand provides an opportunity for IT contractors to cash in. But, if the opportunity is so lucrative, where are these talented recruits hiding? And could AI be the answer?

 

A Threat, or a Blessing?

The rise of automation is one of the most talked-about topics in the realm of IT. Estimates suggest that up to 80 per cent of jobs in the sector could be at risk due to an increase in automated technology and the potential of artificial intelligence (AI). But, in cyber security, which is so severely lacking in talent, is this technology really such a bad thing?  

Ultimately, it depends on how you look at it. It is becoming increasingly difficult for human operators to manage all aspects of cyber security — particularly in areas that generate massive amounts of data, such as security testing.

The advantages of implementing automation for security testing are obvious, but the growth of machine learning also provides an opportunity for technology to proactively bolster an organisation’s cyber security, rather than just support it. An example of this would be data deception.

 

Data Deception

Cyber security specialists have long used deceptive tools to manipulate attackers who are trying to breach a system. For example, by creating false servers containing incorrect data, IT workers can trick hackers into perusing these assets and revealing their tactics. This method, sometimes referred to as the honeypot technique, is often used by security firms to gather intelligence on new hacking techniques.

To be effective, honeypot techniques must be implemented on a grand scale. The real network is littered with many small pieces of bait to entice hackers: seemingly valuable but fake information such as customer details, login credentials and intellectual property. Rather than leading hackers to data they can ransom or sell, this bait instigates a confusing process that directs them away from any real, valuable data.
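As a rough illustration, the sketch below shows what a very simple, low-interaction honeypot might look like: a fake service that accepts connections, offers decoy credentials and logs every interaction for later analysis. The port, banner and decoy values are invented for the example and do not refer to any real product.

```python
# Minimal low-interaction honeypot sketch (illustrative only).
# It pretends to be a service, hands out fake "secrets" and logs every visit.
import socket
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="honeypot.log", level=logging.INFO)

FAKE_BANNER = b"220 backup-fileserver-01 FTP ready\r\n"   # believable but fake
FAKE_SECRET = b"user=svc_backup pass=Winter2019!\r\n"      # decoy credentials

def run_honeypot(host="0.0.0.0", port=2121):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen()
        while True:
            conn, addr = server.accept()
            with conn:
                # Record who connected and what they sent -- this is the intelligence.
                logging.info("%s connection from %s", datetime.now(timezone.utc), addr)
                conn.sendall(FAKE_BANNER)
                data = conn.recv(1024)
                logging.info("payload from %s: %r", addr, data)
                conn.sendall(FAKE_SECRET)   # lead the attacker towards worthless data

if __name__ == "__main__":
    run_honeypot()
```

A real deployment would mimic a specific protocol far more convincingly and feed its logs into central monitoring, but the principle is the same: the only people who ever touch a decoy service are people who should not be there.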

Having only one honeypot is nowhere near as effective or powerful as a system of several working together. While the configuration of a honeypot can be relatively straightforward, usually one simple routine, continually monitoring this maze of deception is a long-winded process for IT workers, making manual data deception impractical.
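The monitoring burden itself is easy to automate in principle. The hypothetical sketch below polls the logs of several honeypots and raises an alert on any activity at all, since nobody legitimate should ever touch a decoy; the file paths and alerting method are placeholders, not a description of any real product.

```python
# Illustrative sketch: automating the tedious part of watching many honeypots.
# Paths and alerting are placeholders; a real deployment would feed a SIEM.
import glob
import time

SEEN = {}   # remembers how far into each log file we have already read

def poll_honeypot_logs(pattern="honeypots/*.log"):
    for path in glob.glob(pattern):
        offset = SEEN.get(path, 0)
        with open(path, "r", encoding="utf-8") as log:
            log.seek(offset)
            for line in log:
                # Any entry at all means someone touched a decoy -- raise it immediately.
                print(f"ALERT [{path}]: {line.strip()}")
            SEEN[path] = log.tell()

if __name__ == "__main__":
    while True:
        poll_honeypot_logs()
        time.sleep(10)   # a person doing this by hand would go mad; a loop does not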

Data deception technologies are beginning to take this responsibility away from workers. Automated products will routinely devise methods of deceiving hackers, using machine learning techniques to change and adapt over time.

Machine learning is a branch of artificial intelligence that enables technology to learn and develop through experience, reducing the need for manual programming. For example, if a system had previously experienced a specific type of cyber threat, it would develop methodologies to deal with similar attacks more efficiently in the future. It’s widely believed that machine learning tools could enable systems to spot and stop the next WannaCry attack, for instance, much faster than legacy tools.
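To make that idea concrete, here is a minimal, hypothetical sketch using scikit-learn's IsolationForest: a model is fitted on features extracted from past network activity so that new traffic resembling previously seen attacks can be flagged automatically. The features and figures are invented for illustration and do not describe any particular commercial tool.

```python
# Illustrative sketch: learning "normal" behaviour from past traffic,
# then flagging new connections that look anomalous.
# Feature choices and data here are invented for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

# Historical records: [bytes_sent, bytes_received, duration_s, failed_logins]
past_traffic = np.array([
    [5_000, 20_000, 12.0, 0],
    [7_500, 18_000,  9.5, 0],
    [6_200, 25_000, 15.0, 1],
    [4_800, 22_000, 11.0, 0],
    # ... many more rows in a real system
])

# Fit a model of what "normal" looks like; anything far outside it is suspicious.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(past_traffic)

# New activity arrives -- e.g. short, noisy connections with many failed logins,
# the kind of pattern a worm such as WannaCry leaves while spreading.
new_events = np.array([
    [6_000, 21_000, 10.0, 0],   # looks like routine traffic
    [300, 150, 0.2, 40],        # short, noisy, many failed logins
])

for event, verdict in zip(new_events, model.predict(new_events)):
    label = "ALERT" if verdict == -1 else "ok"   # -1 marks an outlier
    print(label, event)
```

The value is less in any one model than in the feedback loop: each confirmed incident becomes training data, so the system handles similar attacks more efficiently next time without a programmer rewriting rules by hand.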

Reducing the need for human intervention means that IT workers can dedicate more time to strengthening their own security efforts.

That said, these tools cannot completely replace humans. Instead, this technology should be used to automate the long-winded and repetitive tasks that currently fill the workflows of IT teams, such as testing, basic threat analysis and data deception tactics.

There is a severe shortage of advanced cyber security skills in today’s IT talent pool, and automation will not completely bridge this gap. However, embracing this technology will provide greater opportunities to develop the skills of the existing IT workforce, which is, at least, a step in the right direction.
