# Brakes Teach Risk Analysis

**Source:** [https://www.youtube.com/watch?v=_c2L4z-v06g](https://www.youtube.com/watch?v=_c2L4z-v06g)
**Duration:** 00:11:12

## Summary

- Brakes aren’t just for stopping; they enable high‑speed performance by providing a way to manage risk, just as risk controls let us take calculated risks safely.
- Effective risk analysis—identifying threats, gauging likelihood, and estimating impact—should be the first step in any system design, informing policy, architecture, implementation, and operation.
- Most organizations skip or postpone risk analysis, leading to ad‑hoc implementations that must later be re‑architected and audited, creating inefficiencies and potential failures.
- Our intuition often misjudges actual danger (e.g., fearing shark attacks over the far more common cow‑related deaths), highlighting the need for data‑driven risk assessments rather than gut feelings.
- A disciplined risk‑first approach ensures policies are grounded in realistic threat evaluations, which in turn produces robust, purpose‑aligned architectures and smoother operational outcomes.

## Sections

- [00:00:00](https://www.youtube.com/watch?v=_c2L4z-v06g&t=0s) **Untitled Section**
- [00:03:03](https://www.youtube.com/watch?v=_c2L4z-v06g&t=183s) **Self‑Knowledge Shapes Risk Tolerance** - The speaker argues that effective risk analysis begins with understanding one’s own risk appetite, illustrating this with analogies ranging from train‑averse travelers to parachuting thrill‑seekers, and notes that organizations mirror individual risk preferences.
- [00:06:08](https://www.youtube.com/watch?v=_c2L4z-v06g&t=368s) **Assessing and Responding to Risk** - The speaker explains how organizations gauge risk by considering tolerance levels, asset value, likelihood, and mitigation costs, and outlines response options like avoidance or acceptance.
- [00:09:11](https://www.youtube.com/watch?v=_c2L4z-v06g&t=551s) **Balancing Quantitative Error and Qualitative Risk** - The speaker cautions against over‑reliance on precise numerical estimates that compound errors, promotes using high/medium/low qualitative risk assessments, and illustrates the principle with a meteor‑proof car example showing how cost influences the decision to mitigate a low‑probability risk.

## Full Transcript
Why do you put brakes on a car?
So you can stop, right?
No. It's so you can go really fast, and if you don't believe me, how likely are you to get in a car that has no brakes?
Not at all, right?
And if you think about it this way:
the fastest cars in the world are the ones with the best brakes.
They have to have that so that they can take risks.
So that they can manage risk.
We all have to take risks.
And brakes are essentially a mechanism for managing risk.
Ok. I want to take a look in this video at what risk analysis is about,
and how it relates to cybersecurity,
and in particular, stay through to the end because I've got two more trick questions for you.
So, pay attention.
Risk analysis is the process of identifying potential threats,
evaluating how likely they are to happen,
and estimating their impact,
so you can make informed decisions and reduce negative outcomes.
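Those three steps, identifying threats, gauging likelihood, and estimating impact, can be sketched as a simple scoring exercise. This is my own illustration, not from the talk, and the threat names and numbers are invented:

```python
# A minimal, illustrative risk-scoring sketch: risk = likelihood x impact.
# The threats and values below are invented examples, not real data.
threats = [
    # (name, likelihood per year, impact in dollars)
    ("phishing",       0.9,  50_000),
    ("ransomware",     0.3, 500_000),
    ("insider misuse", 0.1, 200_000),
]

# Rank threats by expected annual loss so decisions address the biggest risks first.
ranked = sorted(threats, key=lambda t: t[1] * t[2], reverse=True)
for name, likelihood, impact in ranked:
    print(f"{name}: expected loss ${likelihood * impact:,.0f}/year")
```

Even this toy ranking shows why the analysis has to come first: the threat people fear most is not necessarily the one with the largest expected loss.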
When should you do it?
Well, unfortunately, most people do it last, or they don't do it at all.
They start with implementation,
and then from there
they move into operation.
So that sounds fine if all you need to do is just run a system,
but in reality, you need to also take a look back and see if your system is doing what you intended it to do.
So that's an audit step,
where we then do a look back.
What you will find during that step, though, is that since you started with no plan,
you probably don't have a very operational system.
So, in fact, what we need to do
is go back and re-architect this system:
come back with a larger overarching plan
that makes the thing make sense.
Then we move to implementation and operation.
What should architecture be based on?
Well, it should be based on policy.
And what should policy be based on?
Now we're back to risk analysis.
Risk analysis is where all of this discussion should begin.
It shouldn't be the last thing we do.
In a lot of cases, people don't do it when they should.
So start with risk analysis.
And that should inform our policy, which should inform our architecture, and therefore all the rest of the system.
If that's what we need to do.
Then the question is, how good are we at doing this kind of risk analysis?
Intuitively, if we just go with our gut, where are our instincts leading us?
Let's take a look at that.
What do you think kills more people in the U.S. each year?
Cows or sharks?
Well, would you be surprised to know
that roughly 20 people are killed each year by cow-related causes?
Mostly tramplings and things like that.
So if it's 20 for cows.
How many do you think it would be for sharks?
Would you be surprised to hear
that, on average,
it's one person each year?
So which one of these are we far more afraid of?
Probably sharks.
If you go by the numbers.
It ought to be cows.
If you go by the numbers, it shouldn't be shark week.
We should be focusing on cow week,
and now you'll never look at cows the same way again, will you?
Ok So how should we go about doing a risk analysis?
Let's start with some classic advice,
all the way back from 500 BCE, when Sun Tzu wrote in The Art of War:
"If you know the enemy and know yourself,
you need not fear the result of a hundred battles."
Let's focus on that second part.
The know yourself part.
If you know yourself,
then you have a better idea of the risks you can handle.
And that's also true when it comes to risk tolerance.
Or risk appetite if you want to think of it in those terms.
So let me give you an example of that.
Some people will not get in an airplane.
They will only take a train.
So if they want to go from the east coast to the west coast.
Then they may be looking at four or five days in the train and then four or five days back.
That's really slow.
That wouldn't work for me.
But that's their risk tolerance.
Very low.
In my case...
I've flown more than 4 million miles.
So I don't mind getting in an airplane.
What I do mind is getting out of the plane before it's on the ground.
And my neighbor has a different risk tolerance.
He likes to get out of airplanes while they're still in the air.
With a parachute, of course.
I don't want to do that.
So there you have three different risk tolerance models.
There's the train, a plane, and the parachute.
Now who's right?
It depends on who you ask.
Obviously each one of us thinks that we're the ones that are right,
but those are different tolerances for danger.
And different tolerances for risk.
Organizations are no different than individuals.
They can have different tolerance for risk,
and if you don't understand that,
you're not gonna know what level of risk your organization is willing to take on,
and you're not going to design the appropriate cyber security defenses for that organization.
So can't we just use some sort of industry solution that fits whatever industry you're working in?
Well, it turns out that one size really doesn't fit all,
but there can be some common perspectives.
For instance,
a manufacturing organization.
If you think about the CIA triad, confidentiality, integrity, and availability, which is what we deal with in cybersecurity,
they're probably going to be leaning into the availability side more.
When they look at risk.
They're concerned about availability.
They want to keep the manufacturing lines moving and operational.
That's their bigger risk.
In the financial industry, it's probably much more about confidentiality.
They've got a lot of numbers that really matter to them,
and matter to you.
And they're going to look at very precise risk models.
They're going to use lots of numbers and spreadsheets and actuarial tables and things like that in order to make sure.
That they've got the risk managed to a point that they can tolerate,
and they tend to be very risk intolerant as an organization and as an industry.
And then another industry that you could take a look at is healthcare.
Where their number one concern, and I'm glad it is, is patient safety. They're also going to be concerned about confidentiality,
since there are laws that protect your personal information,
but they're predominantly concerned with your safety if you're a patient.
So they're looking at different kinds of things.
And they're gonna be very risk averse on certain aspects that involve patient safety.
Maybe not as risk averse.
In some other areas.
So you can see.
There's different perspectives by industry,
but I can tell you, you can line up three banks and they may have different tolerances for risk.
One is a train, one's a plane, and one's parachute.
Next, let's take a look at the value of what we're putting at risk,
because not everything is of equal value.
Think about this as a spectrum.
We've got things like the lunchroom menu.
Doesn't matter if anybody really sees that or not, does it?
And then we've got other stuff that's the keys to the kingdom.
That's the existential threat.
If somebody gets a hold of that, we're out of business.
I've got to consider the value,
because not all things are created equal.
Then we consider: what's the likelihood it's going to be lost or compromised?
Then what's the cost if that in fact occurs?
What's the cost of protection if I'm gonna try to do some sort of mitigation?
In fact, there are a lot of different things that I could factor into this, and a lot of different responses.
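One common way to combine those factors, asset value, likelihood of loss, and cost of protection, is the classic annualized loss expectancy (ALE) calculation. The talk doesn't name this formula, and every number below is hypothetical:

```python
# Annualized Loss Expectancy (ALE) sketch with hypothetical numbers.
# SLE: single loss expectancy = asset value x exposure factor.
# ARO: annualized rate of occurrence (expected incidents per year).
asset_value     = 1_000_000   # value of the asset at risk
exposure_factor = 0.4         # fraction of value lost per incident
aro             = 0.25        # one incident expected every four years

sle = asset_value * exposure_factor   # loss per incident
ale = sle * aro                       # expected loss per year

# A mitigation is worth considering when it costs less than the loss it prevents.
mitigation_cost = 50_000
worth_it = mitigation_cost < ale
print(f"ALE = ${ale:,.0f}/year; mitigate? {worth_it}")
```

The point is not the precision of the numbers but the comparison: the cost of protection only makes sense relative to the value at risk and how often you expect to lose it.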
What could our responses be to risk?
Well one of the things that we could do.
First off, is we could just avoid a particular risk.
Just say, you know what, that's too risky.
I'm not gonna do it.
I'm not gonna get into that parachute.
I'm not going to put myself in that position to begin with.
Another thing you could do is accept a certain risk.
That's what I do when I get into an airplane.
I accept that I can't control that,
but I'm willing to accept it.
And an organization can simply choose to do that and realize that everything we do has a certain amount of inherent risk in it.
Another option is transfer risk.
An organization may document it, and they may even get a legal contract that says,
if this happens,
it's not our fault,
and the risk is borne by someone else.
Another thing we can do is indemnify,
which is a big fancy word that basically just means buy insurance.
So that way, if something does occur then the insurance company pays us back,
and we're made whole again.
So that's another thing.
Now, the thing that
techies like me generally run to first
is maybe the last thing to consider, and that's mitigating the risk.
Block the risk.
Put in some sort of compensating controls, some way, so that we don't have to just avoid or accept or transfer or indemnify or things like that.
In this case we're going to put in some kind of technological control or some type of procedural control
that makes sure that we're not going to see this happen to us,
or at least lessens the likelihood to a level that I can tolerate.
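The five responses above can be lined up in a small decision sketch. The thresholds and the simplified ordering here are my own invention for illustration, not a rule from the talk:

```python
# The five risk responses from the talk, as a simple decision sketch.
# The thresholds and selection logic are invented for illustration only.
from enum import Enum

class Response(Enum):
    AVOID = "avoid"          # don't take the risk at all
    ACCEPT = "accept"        # live with it
    TRANSFER = "transfer"    # contractually shift it to someone else
    INDEMNIFY = "indemnify"  # buy insurance
    MITIGATE = "mitigate"    # add controls to reduce likelihood or impact

def choose(expected_loss: float, tolerance: float, control_cost: float) -> Response:
    if expected_loss <= tolerance:
        return Response.ACCEPT         # within our risk appetite
    if control_cost < expected_loss:
        return Response.MITIGATE       # controls pay for themselves
    return Response.INDEMNIFY          # cheaper to insure than to control

print(choose(expected_loss=5_000, tolerance=10_000, control_cost=1_000))
```

A real organization would weigh avoidance and transfer too; the sketch just shows that the choice among responses is driven by the same numbers the analysis produced.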
Another thing to consider in all of this is this debate about quantitative versus qualitative risk assessment.
A quantitative risk assessment would take a lot of this kind of information,
use those numbers, put them into spreadsheets and let those things guide all of our decisions.
It's not a bad way to go, but don't be a slave to your spreadsheet.
Don't let the spreadsheet make the decisions.
You should make the decisions.
So another way to look at this is, instead of
going with all of these numbers, which have a certain amount of error included in them... sometimes we get lulled into a sense of complacency
because we have all these numbers and we think, well, we've got all this level of precision,
but if each one of those numbers was based on an estimate,
then we're just compounding the error as we add all of those things together.
So sometimes we do need to use our gut and our instinct on this to some extent,
and a qualitative risk assessment
involves things like assigning high, medium, and low levels of risk to things.
So we put all of these things into the soup
and out of that comes our risk analysis.
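The compounding-error point can be made concrete: if three multiplied estimates are each off by 30%, the result can range from roughly a third to more than double the nominal figure. All inputs below are made up for illustration:

```python
# How estimation error compounds when estimates are multiplied.
# If each of three inputs can be off by +/-30%, the combined result
# can range over roughly a third to more than double the nominal value.
nominal = 0.5 * 100_000 * 2.0   # likelihood x impact x frequency (made-up inputs)
err = 0.30

low  = nominal * (1 - err) ** 3   # every estimate low by 30%
high = nominal * (1 + err) ** 3   # every estimate high by 30%

print(f"nominal ${nominal:,.0f}, range ${low:,.0f} .. ${high:,.0f}")
# A spread this wide argues for coarse high/medium/low buckets
# rather than trusting the false precision of the point estimate.
```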
Now we understand
what it is we need to do.
Okay. Now for the third and final trick question.
What if I could sell you a capability that would meteor-proof your car?
Let's assume I'm honest and let's assume this actually works.
That if you buy this from me,
your car will never be hit by a meteor.
Would you buy it?
Most people are going to say no,
and they say that because they think the likelihood of that is extremely low,
and they'd be right,
but I'm gonna tell you, a better answer, instead of just saying no,
would be to ask me a second question,
and that is:
How much does it cost?
Because you assume that it might be expensive and it's not worth that level of risk,
but what if I told you the cost was just a penny?
Well, I think you probably would be smart to buy that defense.
Because it's a low risk, but it's low cost.
So it might work out for you.
If I was at the car dealership,
I might lay down a dollar and say,
give it to me and the next 99 people that come by as well,
because I'm a big spender.
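The "how much does it cost?" question can be reframed as a little expected-value arithmetic. The probability and car value below are invented purely for illustration:

```python
# The meteor-proofing decision as expected-value arithmetic.
# Both numbers below are invented purely for illustration.
p_hit     = 1e-9        # assumed lifetime chance a meteor hits your car
car_value = 30_000      # assumed value of the car
price     = 0.01        # the one-penny defense from the talk

expected_loss = p_hit * car_value   # expected loss the defense would prevent
print(f"expected loss avoided: ${expected_loss:.5f}; price: ${price}")
# Strict expected-value reasoning says pay only when the price is below the
# expected loss, which is why the price, not the likelihood alone, is the
# missing input: saying no before asking the cost skips half the analysis.
```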
So there you go.
That's a discussion of risk analysis,
and you see that cost matters when it comes to these things.
This is not done in the abstract,
in the end, it's all about understanding risk and picking the appropriate response to it.
The result, if you do it right...
Better defenses,
and a tougher time for the bad guys.