111 - Deep Dive with Shayoni Lynn, Part Two: Misinformation, Disinformation and the PR Challenge

I am thrilled to have Shayoni Lynn on the podcast, in a special double-bill all about behavioural science and its application in PR. Shayoni is founder and CEO of Lynn Group, a fast-growing communications consultancy powered by behavioural science.

Shayoni is a CIPR and PRCA Fellow, and Chair of PRCA Cymru. With over 15 years’ experience, Shayoni is an industry leader in data-driven strategic communications and applied behavioural science. She sits on the UK PR Council and is Vice-Chair of CIPR’s Behavioural Insights Interest Group. Shayoni sits on CIPR and GCS committees and is an Associate Lecturer at Cardiff University.

In this ‘Deep Dive’ double-bill we will cover:

Behavioural Science and its application in PR campaigns

Misinformation, disinformation and conspiracy theory, and how they relate to modern PR

This episode explains the differences between misinformation, disinformation and conspiracy theory, and how we as PR practitioners can spot and react to them.

Let’s dive in!

Links mentioned in the episode:

Misinformation frameworks

APEASE

INCASE

Wall of Beliefs 

Lynn Group report

BS Monitor

 

Liked listening today? What to do next:

Get my FREE roadmap to get more strategic with communication activity in your business.

Listen to more episodes, take some training, or download a resource: Find out more here.

Hire my expertise

Whether that’s support with a one-off comms project or an entire strategy for your business, drop me a line if you want to explore this further. You can also work with me 1:1 as a trainer and mentor – emma@henbe.co.uk

Work with me closely

If you’d like to work with me to develop and implement your communication strategy through 1:1 work, podcasts, workbooks, sharing ideas, and lots of accountability and up-skilling, then email me at emma@henbe.co.uk to register your interest for you or your entire team.

Leave me a voicemail on my Speakpipe page. I would love to hear your feedback on this episode, and your thoughts on any topics I could include in future ones too.

Full transcript (unedited)

Emma Drake 00:06
Hello, and welcome to this episode of Communication Strategy That Works with me, Emma Drake. Hi, everyone. How are we all doing today? I hope you’re doing okay. Welcome to this second part of the two-part series with Shayoni Lynn, founder of Lynn Group. We’re going to talk all about misinformation, disinformation and a little bit about conspiracy theory, and Shayoni is sharing her insights on why this is such an important topic for PR practitioners. If you don’t know Shayoni, she’s a CIPR and PRCA Fellow, and Chair of PRCA Cymru. With more than 15 years of experience, she’s an industry leader in data-driven strategic communications and applied behavioural science. She sits on the UK PR Council and is Vice-Chair of the CIPR’s Behavioural Insights Interest Group. She also sits on CIPR and GCS committees and is an Associate Lecturer at Cardiff University. I’m absolutely thrilled she’s made the time to come on the show today. So without further ado, let’s crack on with the second in a two-part series. Let’s dive in.

I’d like to transition neatly into talking a bit about misinformation, disinformation, and conspiracy theory. I know that behaviour change is linked, and I know it’s a big part of Lynn Group in terms of your practice. So do you want to perhaps explain what those three things are, first of all? That would be a good starting point.

Shayoni Lynn 01:47
Yeah, sure. Misinformation, disinformation and conspiracy theories. In some ways, the difference is academic: it’s about the intent of the individual spreading it. A piece of false information spread by an individual without an intent to harm is misinformation. Where false information is being spread deliberately, to mislead, that is disinformation. And conspiracy thinking is a way of seeing the world that supports the spread of false information. It’s a belief system that connects unconnected dots, explaining complex events with simple stories involving a secret cabal – individuals, police, etc – and it’s the foundation for everything from climate change denial to white supremacy. What I think is more useful, and certainly the way we look at mis- and disinformation at Lynn, is through the lens of “harm”, or harmology: what is the real-world harm that these damaging narratives and misleading content can do to our publics, whether in the space of health or climate, or democracy, or safety? It’s about understanding what these narratives can create in terms of both online and offline harm.

Emma Drake 03:15
Okay. I can imagine that’s really useful to know. And there is lots of information on the Lynn Group website about these three things, so do go on and have a look at those. Let me add a bit of context from my own experience: I have a teenager, and social media is a huge part of our lives, in terms of everyday communication and also keeping her safe. We get a lot of information from her school, and so on. (This is my little corner of the world – I know there are other topics, but it makes this a bit more realistic.) In terms of changing behaviour, am I right in thinking that a lot of misinformation and disinformation, with this belief system wrapped around it, is actually changing young people’s behaviours? Whether it’s a far-right group, or a particular individual who I don’t want to mention because I don’t want to give him any airtime. Is this the big worry? The issue with these things is that, on the receiving end, we don’t really know what’s true and what’s not.

Shayoni Lynn 04:41
Well, that’s right. If you think about it, we are curating our own media diets, right? In the world of social media, we each go onto our platforms and we follow people we like, we connect with people we like or agree with, which in itself is a bias – it’s a confirmation of our own beliefs. And by doing this, whether intentionally or unintentionally, we’ve created a bubble around us. Through the lens of disinformation, radicalisation happens at all ages – young people get radicalised and so do old people. I don’t think it’s that some segments are inherently more susceptible, but I wouldn’t put a generational lens on it; I would put more contextual lenses on it, if that makes sense. I think where the challenge and the worry lies – and something I’ve been talking a lot about recently – is how sophisticated disinformation actors are in their weaponisation of behavioural science. They have made an art of creating these narratives, which start in fringe communities and take a while to bubble into the mainstream, but then very easily radicalise communities, audiences, segments, groups and so on around these false narratives. We saw this during the Trump era of alternative facts, and maybe that will return. Who knows? We’ll find out later. But they are really sophisticated in their tactics, in how they’re using behavioural levers to radicalise individuals and groups who don’t expect that radicalisation. It’s done in such an integrated manner, embedded into that media curation we cultivate for ourselves, that you can’t see it from the surface. When you look at the flip side – the response, and how communicators are responding to this disinformation (and to be very clear, disinformation is where the real threat lies; misinformation is more the proliferation of that disinformation) – we are still seeing it through the lens of crisis. We only respond once it becomes an issue in online communities or hits the media.
But by then, it’s too late. We as communicators have a moral obligation to protect our audiences, so we need to be far more proactive in how we protect and inoculate them from disinformation. To do that, we need to really understand what mis- and disinformation and conspiracy theories are, and be able to create strategies that can prevent those false narratives from encroaching into our communities – proactively and preemptively building trust with an audience so they trust us as the messenger more than the disinformation actors. But I think we’re still being really slow and really tactical in our response, while the other side is absolutely racing ahead in very sophisticated ways, slowly but surely radicalising more people.

Emma Drake 07:58
Yeah. So, in summary, I think what you’re saying is that we’ve not caught up with that. The reaction hasn’t caught up with the sophistication of the problem.

Shayoni Lynn 08:09
We need to be far more proactive. I think we need to stop looking at it through the lens of crisis. We need to think about, specifically in areas like health and climate, that we know there’s going to be disinformation. How do we get ahead of it?

Emma Drake 08:23
That’s really interesting. I think there’s an example in your newsletter around the LGBTQ+ community and some work you were doing getting into those groups. Do you want to talk a bit about that?

Shayoni Lynn 08:39
I’m trying to pull up the report, because I don’t have the exact insights to hand. Let me just see if I can find it quickly. If I remember correctly, we were doing some analysis in far-right groups, we’d noticed these anti-LGBT conspiracy theories, and there was a spillover in Cardiff. This was during Pride month. In the month of June, the misinformation cell embedded itself in fringe radical spaces, tracking six of the most notorious far-right communities in the UK. This particular subset had a combined active membership of over 50,000. The cell analysed 240 pieces of anti-LGBTQ+ content and identified 12 super-spreaders responsible for propagating over 50% of these adversarial narratives, as we call them. The analysis found, for example, that Telegram groups were not being used to actively radicalise or find new recruits – they were being used to reinforce hateful narratives among those already committed to extremist ideologies. The groups were all right-wing echo chambers, two of which we found had visibly drifted further right during the pandemic. Conspiracy theories and other fringe beliefs were also prevalent within those groups. We then found, interestingly, a spillover in Cardiff. I think it was six months in that some of these narratives we had found in very far-right communities in different corners of the world spilled over, and there was an anti-LGBTQ+ demonstration in Cardiff with those very narratives we had identified. This one example shows how something that started in Germany can very quickly proliferate into Africa. And while I was at Anthropy, which is the conference (or it’s not called a conference, it’s the meeting of minds…)

Emma Drake 11:21
Yes, you can’t call it a conference…

Shayoni Lynn 11:25
One of the sessions that I attended was looking at healthcare. This was specifically something we looked at: how, during the pandemic, far-right narratives, or anti-vaccination narratives that were emerging in, say, Germany, were very quickly moving into Africa. Communities that we previously would never have thought would be exposed to this narrative were, and were then refusing the vaccine. We were talking about unintended consequences, and it’s very important to look at unintended consequences through a couple of frameworks in behavioural science. [The BCW (Behaviour Change Wheel) lists criteria to apply when making these judgements under the acronym APEASE: Acceptability, Practicability, Effectiveness, Affordability, Side-effects, and Equity.] There are criteria that fall within the COMBI framework (Communication for Behavioural Impact), and there is the Wall of Beliefs framework by GCS (a toolkit for understanding false beliefs and developing effective counter-disinformation strategies, by the Government Communication Service’s Behavioural Science Team).

Emma Drake 12:20
I haven’t heard of that, actually. Well, tell us a bit about that.

Shayoni Lynn 12:24
So that’s essentially a framework on misinformation. And thankfully, it appears that Lynn has been using a similar methodology for the past year. We’re working very closely with the Cabinet Office, including the team that created the Wall of Beliefs framework, and I would recommend looking at that for misinformation. Going back to the unintended-consequences framework that GCS created: it’s called INCASE, and it’s about anticipating unintended consequences. So you’ve got three resources: APEASE, INCASE, and Wall of Beliefs.

Emma Drake 13:22
Yep. That’s really useful – I can put links in the show notes for those resources. There’s a lot to think about, isn’t there? And depending on what area you work in, there must be practitioners who are coming across this, or mitigating it. I work in the built environment. It’s at a small level compared to the things you’re talking about with far-right extreme groups, although a lot of pressure groups are very well funded. What was the phrase I heard the other day? “Astroturfing” – which I hadn’t come across before: funded groups that present as grassroots organisations, but aren’t. Anyway, we do have a lot of misinformation, and trying to counter it is sometimes very tricky. We are using very traditional methods, now that you’ve pointed it out. But often they’re not very sophisticated groups; it’s really just the core matter of misinformation, and I think a lot of businesses in various sectors probably deal with that a little. What advice would you have for practitioners dealing with this today, on the ground?

Shayoni Lynn 14:57
I was asked this question on a panel, and the logical answer for me to give is: share accurate information. But without sounding like a pessimist, I would say that doesn’t go far enough. It’s absolutely right that we share accurate, credibly sourced information from the right sources. But I think we’re past the point where that alone is effective at changing, as we say in our industry, hearts and minds. So as communicators, we need to be thinking about: what intelligence can we gather? How can we identify these false narratives? How can we counter them? How do we embed inoculation and protection into our campaigns, which are designed to build trust? I appreciate that this is not a common skill, as you say, and I hope that in 10 years misinformation strategy will be a common skill. But until then, sadly, communicators will either have to upskill themselves (reading a lot of the academic literature and thinking about how it applies in practice) or purchase services. And that’s not a sell by any means, but the reality is that very few organisations, whether in-house or agency-side, are providing these solutions. It’s about being proactive and gathering the intelligence before it becomes a crisis.

Emma Drake 16:29
That’s really good advice. I would probably add that it’s helpful to have a slightly informed client. It’s good to skill up anyway, because if you’re going to embark on any of this, we know that the best projects come from the best clients, right? The ones that get what you do.

Shayoni Lynn 16:49
I think more and more people are getting what we’re doing, which is why we are where we are in terms of growth – so yeah, the future looks positive for this approach. And more and more, CEOs and leaders of organisations are going to insist on ROI on spend; as purse strings get tighter, the more we have to demonstrate the effectiveness of budgets. So it’s really important to be able to go beyond traditional metrics like reach, engagement and conversion, and start asking: “What is the attributable change we have achieved?”; “How have we protected the reputation of our business through the lens of mis/disinformation?” These are questions that will very soon be on our desks, and we’d love to answer them.

Emma Drake 17:36
Well, that’s fantastic. I think that’s a great point to end on today. It’s been absolutely fascinating talking to you today, Shayoni. Thank you very much.

Shayoni Lynn 17:47
Thank you, Emma. Thank you so much for having me. I’ve absolutely enjoyed it, and this slot has flown by.

Emma Drake 17:53
I know, it really has – I just looked at the clock. I’m going to put a number of resources and links in the show notes for those of you listening. I’ll just say bye for now, Shayoni, and thanks very much for joining us.

Shayoni Lynn 18:10
Thanks, Emma. Lovely to be here.

Emma Drake 18:13
Finally, thank you for listening to this episode of Communication Strategy That Works. Don’t forget to check my show notes for those links that I mentioned. And I’d really love it if you would subscribe to my podcast and leave me a review. Also, if you think there’s someone that could benefit from listening to this podcast, please share this within your networks. So I’ll just say bye for now, and see you next time.