Over the past decade, the financial losses from cyberattacks have been devastating to organizations, municipalities, and small businesses. In just three years, business email compromise (BEC) cost organizations more than $26 billion, prompting a global law enforcement effort, "Operation reWired," focused specifically on disrupting and dismantling BEC campaigns. And in October of last year, the FBI warned that ransomware will become more targeted and costly. The 2019 attack on the city of Baltimore bears this out: restoring systems cost an estimated $18 million. Most recently, the December 2019 attack on the city of New Orleans forced officials to declare a state of emergency.
Unfortunately, as our world becomes more interconnected and digitally dependent, these types of attacks will continue to proliferate at a faster clip. Adding to these challenges is the fact that organizations still struggle to recruit the specialized security talent required to defend against these and other threats. In fact, it’s been so difficult to keep pace that the latest Cybersecurity Workforce Study from (ISC)² shows that there is a current estimated need for 4.07 million cybersecurity professionals globally.
This sobering number highlights the cybersecurity industry’s widening workforce skills gap and, more importantly, how steps taken in recent years to close that gap are not enough to adequately prepare us for future attacks. Instead, the cybersecurity industry needs to look beyond our siloed networks and embrace an industry-wide mentality of decentralized threat intelligence sharing in order to relieve the burden on a limited supply of highly trained cybersecurity talent, alleviate recruitment pains, and accelerate response times to a rapidly evolving threat environment.
The Woes of the Cybersecurity Workforce
Recognizing the need for individuals capable of protecting systems against malicious actors, the cybersecurity industry in recent years has taken steps to define new cybersecurity certificates and degrees, promote cybersecurity jobs, provide training, and emphasize the role employees play in their company’s cybersecurity posture. In fact, the elevation of cybersecurity isn’t just happening within organizations but extending down into our nation’s youth. For example, the advocacy of STEM programs focused on computer science and establishment of the Presidential Cybersecurity Education Award to honor educators in the field of cybersecurity are good first steps towards cultivating awareness over the long term.
However, in the near term, it's not enough. According to the same 2019 (ISC)² study, the cybersecurity workforce gap in the U.S. alone is estimated at nearly half a million. And that gap is likely only to widen, as demand for skilled workers shows few signs of slowing any time soon.
To overcome this challenge, the industry must think differently about how we address this critical issue, beyond attracting and training more workers. The promise of artificial intelligence will undoubtedly improve our ability to do more with less, but it's equally imperative that we look at models other industries have already embraced.
Defining Decentralization for Cybersecurity
The concept of a decentralized platform is hardly new in the technology industry. In the most basic terms, it describes a scenario in which critical applications or services are carried out by individual computing devices, or nodes, on a distributed network. This obviates the need for a centralized server and creates greater resiliency across the network, because there is no single point of failure.
Uber decentralized the transportation system by allowing any person who meets certain threshold requirements to generate an income as an on-demand driver. The introduction of Bitcoin and its underlying blockchain technology has enabled parties to securely and transparently share transaction data on a distributed ledger, without the need for an intermediary. Accenture has even documented how the oil and gas industry is applying principles of decentralization to improve the efficiency of well production.
Beyond computing and network resources, decentralization also enables organizations to access talent and knowledge that they may not otherwise have easy access to. For cybersecurity, this means tapping into real-time information and threat intelligence from global SOC analysts and security teams in order to accelerate the detection and remediation of new threats – especially ones that have never been seen before in the wild.
For example, imagine an organization had access to hundreds of thousands of security analysts from around the world and the threat intelligence they see every day. By openly sharing that intelligence with one another, including attacks, permutations, and solutions, analysts could quickly flag a suspicious file or email and have it automatically and instantly distributed to every other analyst on the network, letting them spend less of their limited time researching new threats and improving their ability to respond to new attacks. More importantly, a decentralized threat intelligence sharing model could free up analysts and security teams to focus on higher-priority initiatives instead of constantly putting out fires.
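In code terms, the sharing model described above might look something like the following minimal sketch, in which each analyst node hashes a suspicious artifact and pushes the indicator to every peer, so no node has to re-research a threat another has already seen. The names (`AnalystNode`, `flag_suspicious`) are illustrative assumptions, not a real product's API.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class AnalystNode:
    """One analyst's node in a fully connected intel-sharing mesh (illustrative)."""
    name: str
    known_threats: set = field(default_factory=set)
    peers: list = field(default_factory=list)

    def flag_suspicious(self, payload: bytes) -> str:
        """Hash a suspicious file or email and share the indicator with all peers."""
        indicator = hashlib.sha256(payload).hexdigest()
        self.known_threats.add(indicator)
        for peer in self.peers:
            # Instant distribution to every peer -- no central server involved
            peer.known_threats.add(indicator)
        return indicator

    def has_seen(self, payload: bytes) -> bool:
        """Check an artifact against intelligence shared by the whole network."""
        return hashlib.sha256(payload).hexdigest() in self.known_threats

# Three analysts form a fully connected mesh
nodes = [AnalystNode("alice"), AnalystNode("bob"), AnalystNode("carol")]
for n in nodes:
    n.peers = [p for p in nodes if p is not n]

nodes[0].flag_suspicious(b"malicious-attachment")
print(nodes[2].has_seen(b"malicious-attachment"))  # True: carol inherits alice's finding
```

A production system would of course need authentication, reputation scoring, and richer indicators than bare hashes, but the core property is visible even here: one analyst's research instantly becomes every analyst's defense.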
Ultimately, moving from the theory of true decentralization to its application will require a collective shift in how we currently view threat intelligence sharing. Today, members of government agencies, non-profits, and industry associations, such as the National Cyber Awareness System and the Information Sharing and Analysis Centers, already benefit from sharing information. Additionally, frameworks such as Structured Threat Information eXpression (STIX) and Trusted Automated Exchange of Intelligence Information (TAXII) help security analysts exchange and review cyber threats in a form that makes intelligence more actionable. But for true decentralization to succeed, the industry must take this "community sharing" and extend it beyond silos and across organizations, and include not just threat intelligence but practical solutions usable by the rank-and-file workers on the front lines.
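To make the STIX reference above concrete, here is a rough sketch of the kind of structured record a STIX/TAXII network exchanges: a STIX 2.1 Indicator built as plain JSON with the standard library. The field names follow the STIX 2.1 specification; the hash value, name, and timestamp formatting here are placeholder assumptions for illustration, not a vetted production object.

```python
import hashlib
import json
import uuid
from datetime import datetime, timezone

# Placeholder indicator value: the SHA-256 of a hypothetical suspicious attachment
sample_hash = hashlib.sha256(b"suspicious-attachment").hexdigest()

# STIX timestamps are UTC in RFC 3339 form; millisecond precision used here
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",        # STIX IDs are "type--UUID"
    "created": now,
    "modified": now,
    "name": "Suspicious attachment hash",       # illustrative name
    "pattern": f"[file:hashes.'SHA-256' = '{sample_hash}']",
    "pattern_type": "stix",
    "valid_from": now,
}

print(json.dumps(indicator, indent=2))
```

Because the object is just structured JSON, any analyst or automated tool on the network can parse it the same way, which is precisely what makes the "flag once, distribute everywhere" model workable at scale.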
The cybersecurity industry's growing workforce gap underscores how little progress previous initiatives have made in relieving the current shortage and closing the distance between the supply of skilled workers and the industry's insatiable demand for security practitioners. As a result, the industry as a whole should consider how a decentralized approach to threat intelligence sharing could address this priority. Otherwise, its workforce will remain overburdened as threats grow more sophisticated, attack surfaces expand, and the pressure of large financial losses continues to mount.