
Dr Shaun Davis

Belron


David Burrett Reid

Risk Communication Expert

Shaun: Welcome to this latest edition of the British Safety Council Health and Safety Uncut podcast with me, Doctor Shaun Davis. 

I'm delighted today to be speaking to David Burrett Reid, an expert in strategic communications. 

David is a visiting Fellow in Risk Communication at the Institute for the Public Understanding of Risk at the National University of Singapore. 

He was a member of the executive leadership team of the global safety charity Lloyd's Register Foundation for seven years. 

A former physicist, David is also passionate about communicating science and the need to inspire the next generation of scientists and engineers. 

He was the Director of the Times Cheltenham Science Festival and led public engagement for various organisations, including the Engineering and Physical Sciences Research Council and the Institute of Physics. 

He's the founder and executive chairman of Matter PR, one of the world's only communications agencies specialising in helping science and engineering organisations. 

He is a board Trustee of the BRE Trust, an independent charity dedicated to improving the built environment for the benefit of all, and he's also on the board of the Global Initiative for the Future of Industrial Safety, which he helped to establish in 2021. 

Earlier this year, David published his first book, Running the Risk: From Shark Attacks to Nuclear Disaster, Understanding Life's Biggest Risks and How We Build a Safer Future. 

Welcome, David. 

David: Thanks, Shaun. 

Shaun: Let's start by finding out a bit more about you. What does health and safety, or risk more broadly, mean to you? 

David: Thanks, Shaun. Yeah, so what is risk? 

You know, that's really why I wrote this book, because I started doing some work in the risk space. 

And the first questions I was asking were, you know, what is risk, what are the biggest risks around us, and what do we need to know about them? 

You know, and, and I found the story of risk really quite interesting. 

When you look at the sort of history of the concept of risk, it tells you quite a lot about it. 

So, you know, we tend to think of risk now as the possibility of something bad happening, or that something bad might happen. 

So, you know, risk is, I guess, a way that human beings have evolved of, you know, understanding the world around them and, and making decisions to keep themselves safe. 

You know, it's a way of visualizing the future and, and responding to it. 

You know, it's sort of associated with danger or hazard. And, and I think of risk and safety as two sides of the same coin. You know, so, you know, risk is about the future. It's about what might happen to us. And so it's not something definite, it's just a possibility. 

Whereas safety is, you know, the things we do here and now to keep us safe in response to those risks. So they're two sides of the same coin. And in sort of managing health and safety and risk, both are really important. And the other thing I realized is that risks are everywhere. 

You know, there are all sorts of risks all around us, and they're constantly changing. The way we think about risk and our approach to it also change over time. 

I looked back in history at how the ancient Greeks and ancient Romans approached risk. 

What did risk mean to them? 

You know, and the biggest risks at the time were things like, you know, warfare, pestilence, disease, natural events like volcanic eruptions or, or floods. So mainly sort of what we think of as natural disasters. 

But the ancient Greeks didn't really worry much about risks. You know, they felt it was just the gods interfering in their daily lives, so they believed there was nothing really they could do about it. They were Stoics. 

And so their philosophy was sort of keep calm and carry on: nothing you can do about it. 

So why worry about things? 

You know, so they had this belief in, you know, divine intervention. So don't worry about it.  

Shaun: In fact, they did. They did, and I'm really interested in this: they must also have had some perception of risk management, because of how they went to the temple, celebrated feast days and left offerings. 

So even then they were thinking, presumably, about a form of risk management and mitigation. 

Is that what you'd say? And it spans thousands of years. 

David: Yeah, so risk management for the ancient Greeks. Well, they'd visit an oracle. 

The only way they felt they could manage the gods interfering in their daily lives was to go and see an oracle, who had divine powers: they could communicate with the gods, and they knew what the gods were thinking. 

So they'd ask the oracle questions. 

And actually there's been some research, done a few years ago, looking at what questions the ancient Greeks asked the oracles. 

And actually they were very similar to the kind of questions people ask now. 

It's like, you know, should I get married and, you know, should I take this job? 

Should I go on a long journey? 

You know, it was a kind of risk management. 

And the oracle would give them the verdict: no, don't marry that person because there's danger here, or don't go on that long journey. 

So one of their ways, I guess, of reducing their anxiety about things was to ask the oracles, the wise women who knew the minds of the gods. 

The Romans were different. 

So the ancient Romans had a completely different kind of approach to risk. You know, they were gamblers. 

They liked risk. They had a huge risk appetite, and they loved games of chance. You know, they played dice games in taverns. They were huge gamblers. 

Gambling was a big problem in Roman society. They tried to ban it, but it was too popular, so they couldn't. And, and actually, they sort of celebrated risk. 

The ancient Romans, they didn't understand it. They didn't think it was just the gods playing around, but they were comfortable with it. They liked it. 

They felt it was heroic to take risks. 

So they sort of embraced risk, and they faced risk head on. When Caesar crossed the Rubicon, saying "the die is cast", he was taking a gamble, and he used that language, "the die is cast", from dice games, exactly to suggest: we're going to take a big gamble, we're going to face it head on, and we're going to go and fight the Gauls. So they had this huge risk appetite, the ancient Romans. 

And I just thought that historical approach to risk was fascinating. 

Shaun: I suppose also the cyclical or developmental kind of nature of that. 

And also, we'll touch later on perception, both societal and individual perception. So there are two examples there of almost adjacent timescales with really different perceptions. 

Yeah, interesting. 

David: Yeah, I think and, and today, you know, we think of risk very differently. 

So, you know, the ancient Greeks might have been Stoics, you know, and just kept calm and carried on. 

The Romans might have really embraced risk. 

You know, in the modern world we have become quite risk averse, you know, we have, you know, a bit of a risk society where we like to analyse risk and manage risk and, you know, have risk registers and things. 

But also the nature of the risks has changed a lot. You know, for the ancient Greeks and Romans, the biggest dangers were environmental risks. They were natural risks. 

Whereas since the beginning of the 20th century, most of the biggest dangers we face are actually man made, you know, nuclear bombs, runaway climate change, pollution, the threat of misuse of new technologies like robotics and artificial intelligence. 

You know, we now live in an era of global, man-made risk. 

Shaun: Digital risk. 

David: Yeah. 

And also, risks are more interconnected. Whereas at one time most risks were local, they only affected one area, one region. 

You know, the risks that we tend to face now, you know, in the world, the biggest ones anyway, you know, endanger the whole world, like pandemics, like COVID. 

Shaun: Interesting, interesting. So, going slightly off on a tangent, but on that theme, the Institute for the Public Understanding of Risk, what does that tell us? What does that tell us that there's a need for an institute? 

Does that tell us that there's a need for people to understand more? Why is it specific to Singapore? Are there things over there that we could be learning from here? 

That's, that's interesting for me, both in terms of content and language and perception. 

David: Yeah, no, interesting question. 

So, five or six years ago, I was working at Lloyd's Register Foundation. We're a global safety charity, and our mission is to make the world a safer place. 

But we were asking a lot of questions about, well, you know, why is the world not safe? 

You know, where are people dying or having accidents that we can do something about, you know, what are the risk hotspots in the world? 

And we were trying to focus our strategy at the time on those risk hotspots, looking for maybe opportunities where other charities, other UN organizations or others weren't really tackling those problems. 

And one of the problems we discovered was this issue of risk perception: the idea that, in order to keep people safe, we have to understand what they think and feel about risk, not just what the risks are, but what real people in local communities think and feel about those risks. 

And we realized there was no good source of data on that. 

There was nowhere we could look that said: this is how people think and feel about risk, this is how they perceive risk, this is what they feel the risks are. 

And so we did two things. 

We partnered with Gallup and started something called the World Risk Poll, which is the world's biggest survey of how people think and feel about risk. It surveys over 140,000 people in 144 different countries. 

So it's a huge data set and it's been going for six years now. And the poll happens once every two years. 

So we did that to try and get some real data on understanding two things. One is people's perception of risk, and the other is their experience of risk: their own particular experience of risk in their families or in their communities. 

And the second thing we did was start an institute: the Institute for the Public Understanding of Risk in Singapore. 

And one of the things we noticed is that there are a lot of risk hotspots in Southeast Asia, in different areas: in construction, in fishing, in ferries, in all sorts of different areas. 

And we thought Southeast Asia was a good place to focus. 

And what that institute does is, it's a specialist institute that really tries to understand risk perception: why people have beliefs, judgments and attitudes towards risks, how they form those beliefs, judgments and attitudes, and what influences their behaviour to make them safer or less safe. 

So the institute is an interdisciplinary research centre at NUS. It's composed of sociologists, psychologists, people from the environmental sciences looking at environmental risk, and people from the School of Public Policy looking at the public policy aspects of risk and risk perception. 

So it's an interesting centre that tries to approach risk from that human angle. 

Shaun: And for anyone listening to this, that's interesting. Is that open-source information? Are there resources that people can use? 

Can they access the survey, for example, to look at where they sit in the perception range? 

David: Yeah. So, the World Risk Poll is all online. All the data is publicly available. It's open source. 

So yeah, if you just Google World Risk Poll, you can find all that data. It's all out there. The whole data set is publicly available for the last six years, and that's three polls over six years. 

Shaun: So you've recently released this fabulous book, Running the Risk, and it starts off with the question, did you risk your life today? 

And the answer is almost certainly yes. 

As health and safety professionals in the industry, we tend to have a fairly good understanding about the risks that we come up against in the workplace. 

And we put in place risk assessments, as I've mentioned before, and strategies to mitigate and manage that. What about hidden risks? 

What can you tell us about hidden or underappreciated risks, and how you might get people to think about them? 

David: Yes, I guess I didn't really mean hidden. I suppose in the book they're more ignored. They're more things that we don't pay attention to than hidden. 

So, one of the examples I use in the book to try and illustrate this is, you know, in 1975, when the film Jaws came out, there was huge widespread panic. 

You know, the public were absolutely terrified of sharks all around the world. There was this incredible reaction to the film. Something about Jaws, for some weird reason, about the sharks in it, caused this huge public panic, this public fear response. 

You know, and this happened in countries like Britain. People wouldn't go to the beach. 

You know, nobody had ever seen a shark in their life, but they wouldn't go to the beach because they'd watch Jaws. 

You can kind of understand it a little bit in places like South America or Florida, where there are sharks, that people wouldn't go to the beach. People started forming hunting parties to go out and kill sharks. 

In response to the film, Steven Spielberg had to apologize to environmental groups, because the shark population (and these are highly endangered species of shark) took a massive nosedive: people were going out hunting them in response to the film, mimicking the guys in the film that did that. 

And that tells us something quite interesting about how people respond to certain types of risk and what drives their worries and their fears. So people were hugely fearful of sharks after Jaws. 

And yet actually the number one risk in everybody's life is driving a car or crossing the road. Road traffic accidents are the number one source of risk for every single person in every single country. 

So if you look at the world risk poll, it tells you for 144 different countries what the number one risk is and it's road traffic accidents in every single country. 

You know, around 3,300 people die every day in a road traffic accident; that's 1.2 million people a year. 

Shaun: Yeah. So, not being too flippant, my mind went straight to this:

I wish people would pay as much attention to road safety and driver behaviour as they do to singling out sharks. Focus your efforts on the things where you really can make a difference, things that are going to be proportionate and beneficial. 

I thought that was really interesting. 

Obviously I didn't know that, but I hadn't fully appreciated the kind of groupthink, media and film influence that it could have. 

So that's a fascinating perspective to think about. 

So sorry, no, go on. 

David: I was going to say, I guess that's exactly the issue: why do people worry so much about shark attacks when they've never even seen a shark and an attack is highly unlikely? 

And yet they have this huge response to it, where they really panic about it. They worry about it. They want to know how to keep themselves safe. 

And yet the number one risk in most people's life is road traffic accidents. 

And we get in our car, we go to work every day, we don't think about it, and we cross the road, we don't panic about it. 

So, we're not really... 

And that's what I meant by the hidden risks, is that there are lots of risks around us and risks are changing all the time. 

And they're constantly emerging, they're constantly getting higher or lower. But we don't understand, you know, what are the risks I face and how does one little risk relate to another? 

You know, do I need to do something about it? Should I not do something about it? 

So, it's, so that's what I meant in the book about hidden risks. It's just that we live in this risk world. How do we know the things that we should pay attention to? 

Shaun: Yeah, fascinating. Again, that's yeah. 

And then if we take that down to the operational level: I noticed that "workers", in quotes, is a common theme throughout the book. 

So, do you think there's more that employers should be considering when it comes to safety at work, or is it more simple? 

Is it really that simple? 

Can employers do more? Is it a collective responsibility? 

How do you think we can take risk perception and management down to the operational front line, where we're going to make the difference? 

David: Yeah, I mean, I guess there are new and emerging risks all the time. 

So, you know, in the world of work, new technology is being introduced all the time. One of the things we think about in the Global Initiative for the Future of Industrial Safety is the future of the workplace. 

You know, particularly in manufacturing, we're introducing AI, we're introducing robotics, quite quickly. 

You know, the sort of manufacturing processes and supply chain is changing very, very fast. 

And what tends to happen when new technology is introduced is, and you will know this, that regulation always lags behind. Technology changes very quickly. 

But our understanding of how human beings respond to that technology, how they work with it, and what we need to do to regulate it and keep people safe tends to come later. 

You know, we tend to worry about that when something goes wrong. 

And, and I think what we need to do is we need to do more foresight, do more scenario planning, look into the future and think about, you know, what are all the possible risks that could happen in a workplace. 

That could happen when we introduce new technology, or when we introduce other new things that surround a workplace. 

And we always need to design the workplace for all the possible things that could go wrong in order to keep people safe and make sure that people know what to do when something does go wrong. 

You know, I think one of the examples I use in the book is the Fukushima nuclear disaster in Japan that happened after, after a very large earthquake there. 

And one of the things I found really fascinating, you know, looking into that is, is that there were two nuclear power plants only 60 kilometers apart, the Fukushima nuclear power plant and the Onagawa nuclear power plant. And nobody's ever heard of Onagawa because it didn't fail. 

It didn't melt down. 

It actually experienced higher tsunami waves than Fukushima, but it was built very differently. Fukushima had been built in a very traditional way. 

You know, engineers just thinking about how do you get a nuclear power station to work? What's, what does it need to do? 

Very functional and very traditional. 

Yes, they knew that there was some tsunami risk and so they'd built some protections in, but they didn't do a huge amount of work on thinking about all the possible things that could go wrong. 

The one at Onagawa had been built with a completely different design philosophy, a much more, I guess, modern design philosophy, one that that effectively designed it to fail safely. 

So they decided, OK, we're going to look at everything that could go wrong with this nuclear power plant because a nuclear power station is such a high risk thing, you know, yes, it's highly unlikely to go wrong, but if it does go wrong, it could be catastrophic. 

So they did a lot of scenario planning, a lot of foresight work. 

One of the things they modelled was tsunamis, because they knew that the area was very prone to earthquakes and tsunamis. So they looked at all the possible tsunami heights, and they built in a huge margin of safety. 

And as a result, they were safe, they were protected when that tsunami came, and they didn't melt down, because they had this approach to design and to safety. And they carried that through not just into the design of the plant itself, but into the culture and the ways of working inside the plant. 

So they had done a lot of scenario-based emergency response planning with the team in the plant, so that if a tsunami came, everybody knew exactly what to do and who should do what. Decision-making was devolved to the local plant managers. 

Whereas at Fukushima, they had to wait for a decision to come from Tokyo, from their bosses, before they could do anything. 

And so they had just thought about everything that could go wrong and what they needed to do, both in designing the plant to fail safely and in training people to be able to respond in the event of that disaster. And as a result... 

Shaun: I think if I play that back what I heard then you can correct me if I'm wrong. 

I heard process safety, cultural safety, leadership, management engagement; so all the things... 

David: All those things. 

Shaun: That makes perfect sense. And actually, if I link this back to the British Safety Council and, for example, the International Safety Awards, the Sword of Honour, Globe of Honour, Shield of Honour: they are the threads that run through those awards, and I can say that because of my involvement with those schemes. 

So, I think, to have that compare and contrast between those two plants,

to have that academic overview that you gave, and to have that kind of operational aspect...

I mean, that was a big light bulb moment for me. 

So that's very helpful. 

Sorry to cut you off; I got over-excited then, as you can tell. 

David: No, I think Onagawa is just a great example of how we should build this kind of critical infrastructure; anything that is high risk needs to be built like that. 

And I think it's just an exemplar of good practice. 

Shaun: Yeah, yeah, top to bottom right, yeah, yeah. 

Now if we talk about, if we talk about the health of health and safety. 

So a lot of work has been done recently in our industry, in fact in many industries, to minimise stress and anxiety at work. 

And how can we ensure that we continue to work together, across organisations and across our teams, to understand and perhaps keep in proportion the risks that we face, and how we can redefine people's views of safety, risk and risk resilience? 

And I suppose one area I'm really interested in is this analysis paralysis, where risk issues seem so big or so cumbersome that people either do nothing or think: just ignore it all and let's hope for the best. 

Which I've seen to varying degrees throughout my career, where people say, oh, you're slowing us down too much, so I'm not doing it, or just put your head down, run forwards and hope that nothing goes wrong. 

So how would you fit your, your publication, your experience, your perspectives into that? 

David: Interesting. You know, nobody wants to be risk anxious; nobody wants to go around worried all the time about everything that might go wrong. 

So it's a difficult balance to get right: between understanding risk, being able to put it in perspective and being able to take the right decisions to keep ourselves safe, in the workplace or in our lives, versus not worrying too much about risks, and not focusing on them so much that we become paranoid and anxious about them. 

You know, my view is that one of the things we need to do is just put human beings more at the centre of our understanding of risk. 

I think too much risk management and risk analysis is about, and it comes back to that statistics thing. 

It's about the process. 

It's about, you know, the exposure, the severity: it's often about numbers. 

And yet we don't often take much account of how human beings respond to some of those things, and what we need to do for real human beings, who sometimes behave irrationally, in a surprising way, in a counter-intuitive way, in an emotional way, not always in a scientific, logical way. People are weird sometimes. 

And, you know, if you really want to keep people safe, you have to really understand human beings, and you have to put them at the heart of your risk analysis and risk management processes. 

And I think that's, that's where, you know, the public understanding of risk becomes really important because it's about understanding not the risk itself and the details of the risk, but about how people respond to it, whether in logical ways or in strange, surprising ways, and trying to understand what can we do about that? 

You know, what are the pathways for doing something around that? How do we make people a bit more risk empowered? 

You know, how do we give them the tools they need to be able to make good decisions about risk and keep themselves safe, as well as creating environments that keep people safe? 

Shaun: Yeah, there's a few things there, so let me just unpack a couple of bits. 

The educational aspect, I think, is important: giving people the right information to make the right decisions. 

David: Just going back to your point about, you know, the bureaucracy of risk, you know, I think I think one of the things that doesn't happen, well, one of the things that that should happen at a lot of organizations is, yes, you've got a great risk register and you've identified all the risks beautifully and you've scored them, you know, and everybody knows what it is and it's been well communicated. 

But if the culture of the organization doesn't change, you know, if people actually on a daily basis in the way that they work within an organization aren't doing the right things as a result of the the risk register, the risk analysis that you've done, then there's a disconnect between those two things, You know, and in my mind, that's a very similar to the sort of disconnect we experience, you know, in the risk perception world between, you know, you can give people all the risk information, you know, in the world, you know, the statistics, probabilities, You can that there's a there's a movement to improve people's risk literacy, for example, and make people risk literate, but they just don't know what to do, You know, they don't know how to keep themselves safe. 

They don't know what to do to help keep their families safe or their communities or how to build greater resilience in their community for, you know, disasters or for other risks. 

Shaun: There's a few things you've said now that I would encourage listeners to take away, and that I'm going to take away myself: phrases I've not heard before but which really resonated, risk anxiety and risk literacy, which are short words for big ideas, as it were. 

But I think there is a lot for listeners to take away and think about there. And I'm doing that myself now as I'm speaking; I'm thinking, what could we be doing better on managing risk anxiety and empowering people to own and manage risk, and what could we be doing more of to improve risk literacy? 

So those, to me, are some real nuggets for people to take away. 

OK, if we talk about another area that I'm interested in, and that is the privacy paradox. So, we talked on the podcast previously about employers accessing employees' private data. We looked at hidden disabilities, for example. 

And in in your book, you talk about the privacy paradox. 

How serious an issue do you think that might become between employers and employees, and what can be done to ensure that worker well-being isn't affected by this element of risk? 

David: Yeah, I think data privacy is one of the biggest risks that we face now. 

I think what I was trying to do when I was researching the book was to look at, you know, what are the new risks? 

What are the emerging risks that we now face in, you know, in the modern age? 

So, as opposed to risks that have been around for a long time, like food-borne disease or crossing the road, I was just thinking, well, we live more and more of our lives online, digitally; we work in a very different way. 

So, what are the risks we now face, the things that could really cause harm in this digital world that we live in? 

And, and the misuse of our data is one of those big things. 

You know, we saw that with the Facebook and Cambridge Analytica scandal, where they were mining people's private data without permission in order to sway people's votes in a presidential election. 

You know, so you have manipulation, you have misinformation, you have disinformation, you have the use of people's data to manipulate them and to get them to do certain things. 

You know, salespeople are constantly using people's data to get people to buy products. And actually, in a place like Singapore, if you do a survey locally of what people's perception of the biggest risks is, number one in Singapore comes out as digital fraud and scams. 

People are really worried about losing money, about financial scams happening digitally. 

So, we have this whole new set of risks that have emerged from the way we now work and how much more digital everything is. 

And, you know, I think data privacy is a huge issue. 

The whole privacy paradox thing that I kind of mentioned in the book was another illustration of where what people say and what people do can be very different when it comes to a risk. 

So, you know, with data privacy, if you survey people and ask them, overwhelmingly people say they are very worried about this. 

It's a very important issue for them. 

You know, every survey that's done, people say, yes, we're very worried about our data privacy. 

We want to keep it safe. We don't know what to do. And yet, you know, people use their birthdays as their passwords. You know, they write things down in the wrong place. 

They save their passwords on their computer, or they share them with a friend. Or they'll give away their data for money; you know, we'll give away our data if somebody offers us... 

Shaun: A 10% discount, yeah. 

David: Absolutely. 

And when we buy an app, we don't read the instructions; we just agree and consent to everything. You get those privacy pop-ups now. 

So, you know, there's a disconnect between our behaviour, which says we don't value our data privacy online, and what we say we care about in surveys. 

So there's a difference there, and that might just be confusion: it's complicated, and people don't understand the small print of data security and data privacy. 

And so again, that's a sort of behavioural difference between how people perceive risk, how people behave in response to risk, and also, I think, what people know about risks. 

Shaun: One other element that came to mind then was fast-changing issues, for example, COVID. 

I don't particularly like to go back to that time a lot, because I think it's over-exploited at times, personally. 

But I do think it's a good area to unpick from a risk perspective: the risk agenda developing quickly, and how organisations and societies and governments responded to that. 

And I guess, in researching and writing the book, were there any elements that came out for you in that space around the importance of owning risk, managing risk, communicating risk? 

David: Yeah, so I had lots of thoughts while you were talking. Just to come back to the first part of your question, which was what's driving people's perception of risk when it comes to data privacy. 

One of the big issues is lack of control. 

You know, people perceive a risk to be higher if they feel they have no control over it, and people feel they have no control over their data. 

So I think that's one of the things that drives people's concerns about data privacy. 

Going back to the example of Jaws, you know, the sharks, well, it's a really good example of what drives people's risk perception. 

So the reason people are so afraid of sharks in response to that film tells us a little bit about what's controlling people's risk response. 

And there are sort of three big factors. 

One is the unknown. You know, people really are afraid of the unknown. And sharks are mysterious. We don't understand them. We don't understand their world. 

I suppose the digital world like data privacy, nobody really understands what's happening inside their computers or phones. 

So that's a bit unknown, a bit mysterious. So anything that's unknown, mysterious produces a much higher risk response. 

The second thing is around the dread factor. 

It's been called the dread factor. You know, being eaten by a shark is such a horrific death that it provokes a sense of dread, a very powerful emotion. But dread is also connected to this control thing. 

When we feel we can control something, we worry less about it than if we can't control it. 

Shaun: Well, the dread, yeah. So someone hacks your bank account, you log in and find you've been completely wiped out, right. 

So it's that kind of digital-world fear, dread: I'm going to have everything I've got taken. 

So there's a read-across from the shark to the... 

David: Yeah, and that brings us on to the third thing, which is intent. 

You know, in the film Jaws... this doesn't happen in the real world. Sharks are not schemers; they don't have it out for human beings. But in Jaws, the shark clearly has it in for the protagonists in the film. So the idea that there's intent behind it, that the shark is evil, makes the risk response even higher. 

And, and the same is true with, you know, scammers or fraudsters. 

You know, they have got intent. They are out to, you know, defraud us. 

And that's the third factor in risk response. So the unknown, the level of dread, and intent are the three big things that really control our risk response. 

Shaun: Interesting. And in the book, in Running the Risk, you talk about a broad range of risks and dangers. 

As you said, we've talked already about shark attacks, but you also talk about Hurricane Katrina, volcanic eruptions and then, as you said, digital risk. 

If we look to the last chapter in the book, which you call Surviving the Future, how can we as individuals, workers, teams, survive the future? 

What are you hoping we will think, feel and do now? 

David: Yeah, I think, I mean, you know, in one word, it's resilience. But I think a lot of people talk about resilience these days. 

I'm not convinced everybody really understands what they mean by resilience. And I think it's worth unpacking resilience a little bit. 

You know, what is resilience and how do we build it? Because what matters is not necessarily what it is, but actually how we generate more of it, either as individuals or as communities or as workplaces. 

And how do we make individuals more resilient? How do we make workplaces more resilient? How do we make communities in countries more, more resilient? 

And I talk a little bit about resilience in the book, but I try to get under the skin a little bit of, of what resilience actually is. 

And, you know, how we build it. And I think there are three big things in resilience that are important. 

You know, one is social capital. Our neighbours, our fellow workers, the people around us are often the people we turn to when something goes wrong, when there's a disaster or an accident. 

And the strength of that social capital is really important in making people and places able to cope with things when they go wrong. 

And so I think that that social capital is really important. And one of the key drivers of that is trust. You know, trust is incredibly important. 

And I think, in recent years, coming back to your question about COVID, the public in some countries really lost trust in their government and in experts to keep them safe during COVID. 

You know, people in Japan after the Fukushima nuclear disaster completely lost faith in their government. 

You know, the government told them that nuclear power stations were perfectly safe and nothing would ever go wrong. And it did go wrong and people felt betrayed by their government. 

And, you know, 10, 12, 13 years after Fukushima, people still don't trust the government in Japan because of that. 

You know, the social contract between the government and the public was completely shattered in Japan because of Fukushima. So trust is absolutely critical when things go wrong, and it's so important for that social capital. 

There are a couple of other things, I think in resilience. 

One, particularly when disaster strikes in countries, is your financial capacity to absorb it. And obviously poorer countries get hit harder when disaster strikes, because they haven't got the financial capacity, personally, as families or as a country, to cope when things go wrong. 

And then the last thing is psychological, individual resilience: how do you build greater individual psychological resilience to be able to cope with things when they go wrong? 

You know, that's, that's a really interesting topic. 

And one of the things I discovered when I was doing the research for the book was, and I really liked this, was how much optimism helps, you know, the idea of growth mindsets. 

You know, actually there's a lot of research out there that shows that when people have a growth mindset, when they're optimistic and positive, they respond better to risks and they overcome risks quicker. They go back to normal quicker, or their communities or families even become more resilient after something goes wrong. 

Shaun: I'm seeing that a lot in the positive mental health space, in terms of the work that's being done on a positive mindset, gratitude, developing personal resilience. 

That's yeah, really, really interesting. 

Well, in your book, and I've got a copy here in front of me. 

So it's Running the Risk, but the subtitle is Understanding Life's Biggest Risks and How We Build a Safer Future. 

Understanding has been a theme that we have talked about throughout this episode. 

What more do you think people could do to improve, develop, grow their understanding of risk, be that professionally or personally? 

What are your thoughts on that? 

David: So we have this phrase, and there's a project that I'm involved with called the Risk Know-How project. 

I talked about risk literacy earlier. The idea of risk know-how goes beyond just knowing what the risks are and what the likelihoods are, to actually having good know-how about the relative risks you might face in your daily life: what you can do about them, how other people have faced those risks in the past, where you can get help or information, who to listen to or not listen to, who you can trust when it comes to information or experts. 

All of that combined, for me, is risk know-how. And my one big message, the message I leave the book with, is that risk know-how is really important. 

You know, one of the difficult things with this kind of book is that people don't always want to know about risk. 

You know, one of the common responses I get is: don't tell me. Plane crashes, shark attacks, nuclear disaster... I'd rather not know. 

You know, I just don't want to know because if I know about it, I'm going to worry about it. 

Shaun: OK, the risk anxiety piece from earlier on, yeah. 

David: Absolutely. 

But actually, you know, my view is the more you know about risks, the less you worry about them, the less anxious you're going to be because when something goes wrong, you're going to know what to do, which will.... 

Shaun: Going back to your control point. 

David: Yeah, again. 

And, and you can keep your family safe. 

And also, the more you know about the different risks in your life, the more you can make trade-offs. 

You can say, well, actually, look, you know, the, the risk that's really important to me and my family is this thing. 

So, you know, I'm going to do this for my family because that's more important than that thing. And I guess there's no right or wrong for me in terms of risk perception. 

You know, people's risk perceptions are what they are. Even if the statistics say they're worrying about things that are not very likely, that's still how somebody feels and how they think. So there's no right or wrong in risk perception. 

You know, the scientists might have a different view to the public in certain areas. 

You know, people might worry about nuclear power stations or foodborne disease or getting struck by lightning, which are really unlikely. 

And, and scientists may say, well, actually you should be worrying about this, this and this, but you can't engage the public in that kind of way. 

You've got to understand why the public think those things. 

You know, what's driving those attitudes and beliefs, and what's happening in their families or their lives or their communities to make them think like that. 

Shaun: So reflecting back on the book now it's published, if you were to come at it again, has your thinking changed at all, has it deepened? Is there anything you would think, write or do differently, or is it still a bit too early, with it being recently published? 

David: No, that's a great question. 

It takes quite a while for a book to come out. 

And, and one thing I noticed actually is how quickly some of the information dates, you know, because risks change all the time, the statistics change, you know, things people are worrying about change. 

You know, there are all these global risk reports that look at different risks and analyse different risks and every year they highlight different risks. 

So, so one of the things I notice is how fast-paced the whole risk environment is. 

The whole risk landscape is constantly changing. 

So, you know, if I wrote this book today... I started this book two years ago, and the risks right now are slightly different. 

You know, some of them are constant, but they change so fast, and the pace of change of risks is incredible. 

And I think that's another reason why foresight and scenario planning are quite important too. You know, we can't wait for a risk to be really important before we respond to it. 

So we have to be thinking, well, you know, what are the risks in 10 years and 20 years and 30 years and... 

Shaun: That's why horizon scanning and knowing the environment you're in is so important. That's great. 

David: And which one should we focus on? 

Because we haven't got unlimited resources. 

So, you know, we had a great debate in my team recently about how important space-based events are as a risk. 

They're an important risk, but, you know, how much do we want to worry about being hit by an asteroid or a coronal mass ejection from the sun knocking out all of our communication systems? 

It is a risk, but should we be focusing on that or should we be focusing on the likelihood of the next pandemic or the next earthquake or, and where do we put our limited resources? 

You know, these are kind of, I guess, big questions for, you know, governments and UN agencies and others to be asking who are distributing money and resources.  

At a more human level, we just need to know what the risks are in our daily lives. 

You know, and what we can do about them. 

Shaun: So just as we bring this to a close, what's come to mind for me is the positive element of risk. From your perspective, does risk have a positive element? 

Can risk be ‘good’ in inverted commas? 

David: Yeah. So, I mean, earlier I kind of said that all risk is about something negative happening, about harm. 

But, you know, there are different types of risk. 

You know, there are risks we just want to eliminate. 

So, you know, the risk of heart disease, or the risk of getting knocked over by a car, or contracting botulism. 

You know, there are some risks we just want to minimize or eliminate. 

There are other risks where, climbing a mountain, for example, yes, there might be a lot of risk involved, but there's a benefit: getting to the top, getting a beautiful view, conquering something that nobody else has conquered before. 

You know, risk is really important in business. If you want to start a company or grow a company, you can't do that without taking a huge amount of personal, financial and reputational risk. 

Risk is actually really important in scientific research, you know, for innovation, you know, a lot of the biggest scientific discoveries were high risk research. 

You know, so risk can actually be good. It can be really important. 

And also, different cultures and different countries view risk very differently. 

If you ask people what they think of the word risk in different countries, most places in the world associate risk with something bad, with danger, with hazard, with something negative. 

But there are a couple of countries in the world, in particular in the Middle East, and places in Southeast Asia like Singapore, where more people associate risk with positive thoughts than negative thoughts. 

They think risk is a good thing and they want to take risk. 

And I think that probably explains a little bit why some of those countries are some of the best places in the world to set up a new business or a start-up. 

Shaun: Yeah, fine. So, each episode I like to ask our guests what one takeaway they'd like the listener to have from this conversation. 

So, one thing you'd like the listener to think or do differently as a result of this conversation and your research. 

So what's your one big takeaway? 

David: So I guess my big takeaway is that everybody needs to have risk know-how. 

You know, I end the book really talking about the importance of understanding risks, knowing what your personal risks are, understanding your risk environment, your risk landscape. 

It might be you personally, it might be in your workplace, it might be in your family or your local community. Just knowing what those risks are, and what you can do about them, is really important. 

And not being afraid of risks and not being risk averse. 

Well, risk-averse is the wrong word, but not being risk-ignorant. You know, we almost need risk intelligence to be able to cope with risks and to become more resilient. 

Shaun: Thank you so much. 

Thank you for your time. That was a really fascinating walk through all those different perspectives on risk. 

And I think, certainly for me, you've reinforced that risk and risk management is necessary, engaging and, when done the right way, exciting, and gives us all an opportunity to keep getting better and better at what we're doing. 

So thank you very much and good luck with all your work in the future. 

David: Thanks, Shaun. 

Shaun: Running the Risk by David Burrett Reid is available from all good bookshops now. 

 

Links will be in the episode description.