Security and Risks
Therefore the basic principle: first identify and understand risks, then assess them, and then mitigate them. Prioritization is particularly important in this process. Specifically, it means protecting against the most likely risks first and understanding the top ten risks well.
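As a minimal illustration of this prioritization step, the following Python sketch ranks a small risk register by a simple likelihood-times-impact score. The risk names and figures are purely hypothetical assumptions for the example, not recommendations from any particular framework.

```python
# Minimal sketch of risk prioritization: score each identified risk by
# likelihood x impact and focus on the highest-ranked entries first.
# The example risks and ratings below are purely illustrative.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: float  # estimated probability of occurrence per year (0..1)
    impact: float      # estimated financial impact in EUR if it occurs

    @property
    def score(self) -> float:
        # Expected annual loss as a simple prioritization score
        return self.likelihood * self.impact

register = [
    Risk("Phishing leading to credential theft", 0.60, 150_000),
    Risk("Ransomware outbreak",                  0.15, 900_000),
    Risk("Cloud misconfiguration / data leak",   0.25, 300_000),
]

# Rank risks and work on the top entries first
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.name}: expected annual loss of about {risk.score:,.0f} EUR")
```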
Recognize Risks
Companies often only recognize a risk once it occurs.
Risks are assessed differently in each case. People judge dangers under the influence of many psychological factors, and above all from their own point of view. If you have not yet encountered a certain risk, there is a good chance you will underestimate it.
Conversely, other risks tend to be overestimated, perhaps because they have recently received a lot of press coverage or are generally hyped. A risk is easier to see when it sounds plausible on the one hand and appears frequently in current reporting on the other.
But people generally misjudge risks. We often see the world through rose-colored glasses, and we are very good at arguing risks away, especially when it suits us. Gut feeling is simply fast, effortless and very intuitive.
If you hang around stock market forums (or more likely crypto forums these days), the following pattern will be easy to spot: After an unexpected price rise or fall, the experts invent flimsy retroactive arguments to distract from their ignorance.
So that we do not have to rely on gut feeling alone, there are risk and cybersecurity frameworks. Deliberate thinking occupies a large part of our working memory, whereas intuition delivers an instant answer with little effort. Sloppy handling of danger and risk, however, can have consequences for the entire organization.
«All war presupposes human weakness, and it is against it that it is directed.» - Carl von Clausewitz
Almost as dangerous as post-facto justification, however, is the presumption of accuracy, or rather, false accuracy. If, for example, one reads that temperatures in Europe will rise by 2.6°C by the end of the century, one has to wonder whether we can really calculate that to one decimal place roughly 80 years into the future. Obviously this is an average value, and if you follow the asterisk you will probably find the scientific study behind the statement, which works with statistical models, probability curves and temperature curves. But what do you then find on the website or in the newspaper article? A seemingly exact figure of 2.6°C.
When you look at methodologies for assessing cybersecurity risk, there are often formulas and mathematical models for calculating the financial impact of a risk. This is enormously helpful because it allows you to justify the requested cybersecurity budget. However, one should not fall prey to false precision, but always work deliberately with probabilities and approximations. There is already a great deal of imprecision in risk identification, and a fair degree of imprecision is unavoidable in prioritization and in mitigation scenarios as well.
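One common way to avoid false precision is to report ranges instead of a single figure, for example with a small Monte Carlo simulation. The sketch below assumes a hypothetical scenario probability and loss range; all numbers are illustrative, not taken from any real assessment.

```python
# Sketch of a probabilistic loss estimate instead of a single "precise" number.
# Scenario probability and loss range are illustrative assumptions.

import random
import statistics

def simulate_annual_loss(probability: float, loss_low: float, loss_high: float) -> float:
    """One simulated year: the scenario either occurs or not; if it occurs,
    the loss is drawn from an assumed range rather than being a fixed figure."""
    if random.random() < probability:
        return random.uniform(loss_low, loss_high)
    return 0.0

losses = sorted(simulate_annual_loss(0.3, 50_000, 500_000) for _ in range(10_000))

# Report a range (median and a high percentile), not one decimal-exact figure
print(f"Median annual loss:   {statistics.median(losses):>10,.0f} EUR")
print(f"90th percentile loss: {losses[int(0.9 * len(losses))]:>10,.0f} EUR")
```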
«History doesn't repeat itself, but it does rhyme.» - Mark Twain
In his book «The Black Swan», Nassim Nicholas Taleb criticized the fact that society does not expect extreme events and consequently does not prepare sufficiently for them. People underestimate precisely these outliers, which is why, when we talk about cyber risks, we always try to start from scenarios. These are then evaluated according to their probabilities, but we must not take the past as a template for the future, and should therefore also discuss seemingly improbable scenarios.
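To illustrate why ranking by expected loss alone can hide such outliers, the following sketch compares a frequent everyday scenario with an improbable but extreme one. The probabilities and amounts are, once again, purely hypothetical.

```python
# Sketch: why ranking scenarios only by expected loss can hide "black swan"
# scenarios. All figures are illustrative assumptions, not real data.

scenarios = [
    # (name, annual likelihood, impact in EUR if it occurs)
    ("Recurring phishing incidents",     0.600,    100_000),
    ("Destructive supply-chain attack",  0.001, 20_000_000),
]

for name, likelihood, impact in scenarios:
    expected = likelihood * impact
    print(f"{name}: expected loss {expected:,.0f} EUR, worst case {impact:,.0f} EUR")

# Expected loss ranks the everyday scenario higher (60,000 vs. 20,000 EUR),
# yet the worst case differs by a factor of 200 -- the improbable scenario
# still deserves an explicit mitigation discussion.
```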
«Don't Panic.» - Douglas Adams
In the spirit of Douglas Adams, I would nevertheless like to conclude on a positive note:
If enterprise risks and cyber risks are consistently aligned, continuously prioritized by their likelihood, and turned into concrete, actionable mitigation scenarios, this forms a very strong basis for effective and pragmatic cybersecurity. All that remains is to implement and execute it efficiently, and we are happy to support your company in doing so.