
A Cautionary Tale: Sharing Your Riskiest Insider Threats Is a Culture Killer

The 2022 Cost of Insider Threat Global Report from Ponemon Institute states a clear problem: “Insider threats have increased in both frequency and cost over the past two years.”

While insider threat remains a well-known risk among security practitioners, the rising financial repercussions mean it is also beginning to get executive attention. With that attention comes increased pressure for a solution and, as a result, a flood of new ideas. However, I would argue that this rush of new solutions may be straying from the heart of the problem.

It is possible for an organization to adopt an innovative insider threat solution with good intentions, all in the interest of heightening its security posture. It is also possible, however, for that solution to end up misaligned and implemented without proper intent, often through improper incident and event analysis performed at scale. When we cast a large shadow over what we consider an insider threat, the threat itself gains just that: a bigger shadow. Does the actual risk grow with the size of that shadow? I would argue that it does not.

Insider threat platforms seek to solve the problem, "We need to detect insider threats faster and mitigate them better." To consider a detection and monitoring posture elevated, key objectives must already be in place: proper data access monitoring and control, a strict DLP (Data Loss Prevention) policy, IDS/IPS for event prevention, SIEM or log analysis, strict IAM (Identity and Access Management), and a robust vulnerability management program.

If all these things are implemented, maintained, and improved upon, what is still failing to answer that problem? The answer is the short-term pressure for a visible solution. That is an entirely different problem, and it is the one insider threat self-learning and data-visualization platforms really answer: a problem born of an ongoing and encroaching sense of fear, a fear that can infect team and company culture. The potential to wrongly list dedicated employees as top insider threats has real impacts on organizational culture.

Using caution when sharing insider threat risks

Our definition of an insider threat has grown too broad when seeking the answers to problems surrounding the increase in events. Simply put, if you broaden the sample pool too much, you spoil the sample.

If we define insider threats too loosely, counting, say, a failure to update a system, must we then also treat a failed patch and vulnerability management program as an insider threat? One could argue that if an action taken by a user involves another person at any point, say a member of the IT team, it is no longer the negligence of that user alone. By the same reasoning, events such as phishing or the downloading of malicious files that result in the exposure of sensitive data may not be events of pure user negligence.

When we look at the problem of insider threats, it is not that users have become more negligent; it is that a chain of events caused an exposure of data. Insider threat monitoring technology that depends on data visualization of user behavior creates a layer of risk analysis that is not needed to drive action on raising security posture. UEBA (User and Entity Behavior Analytics) is arguably useful from a data enrichment perspective, but it can cross lines quickly and damage company culture irreparably.
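To make the concern concrete, here is a deliberately naive, hypothetical sketch of a UEBA-style risk score (not any vendor's actual algorithm, and all event names and weights are invented for illustration). It simply sums weighted event counts per user, so the busiest, most hands-on employees float to the top of a "riskiest users" list regardless of intent:

```python
from collections import Counter

# Hypothetical event weights; a real platform would use far richer context.
EVENT_WEIGHTS = {
    "file_download": 1,
    "after_hours_login": 2,
    "failed_patch": 2,
    "usb_mount": 3,
}

# Hypothetical event log: (user, event_type) pairs.
events = [
    ("dedicated_admin", "file_download"),
    ("dedicated_admin", "after_hours_login"),
    ("dedicated_admin", "file_download"),
    ("dedicated_admin", "failed_patch"),
    ("casual_user", "usb_mount"),
]

def naive_risk_scores(events):
    """Sum weighted event counts per user, highest 'risk' first."""
    scores = Counter()
    for user, event_type in events:
        scores[user] += EVENT_WEIGHTS.get(event_type, 1)
    return scores.most_common()

# The dedicated admin who works late, patches systems, and moves files
# tops the list purely on activity volume, with no evidence of intent.
print(naive_risk_scores(events))
```

Run as written, the hardworking admin scores 6 to the casual user's 3: exactly the kind of context-free ranking that can brand a diligent employee the "riskiest" person in the company.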

When we overreach in what we consider negligence as it relates to user analytics, we blend actionable information with noise from which the security team can derive nothing. Not only could this misalign our efforts to find solutions to insider threat, but it could also create room for real threats to fester.

Imagine that a SOC launches a new insider threat tool internally and gives all analysts access to it. If you were granted permission to view a dashboard called "Riskiest Inside Users," would you feel good about where you stand on that list? Would you be surprised to see yourself on it? How about at the top? If you were a dedicated employee and leadership now saw this tool defining you as the riskiest user, would you still be as motivated to do your best work?

Improper implementation of a tool like this can extinguish an employee's drive just as quickly as it can present a nefarious opportunity whose payoff outweighs its cost. Having witnessed this situation firsthand during my time in cybersecurity, I can confidently say that this reality is crushing for motivation and team culture.

Countless factors can increase the likelihood of an insider threat. It is a risk we cannot forget exists, but it is also one we cannot let damage the trust we must maintain with our organization and the people in it every day. We must ask ourselves, "When you damage that trust, are you solving the problem, or adding to it?"