GCHQ

The Code Breakers

Brian Cox and Robin Ince are joined by comedian Katy Brand as they transport the cage of infinite proportions to the home of modern-day cryptography and codebreaking, GCHQ. They'll be discovering how far we've come from the days of the humble code book and the birth of machines like Enigma, and how the new digital era has turned us all into modern-day code breakers and cryptographers, without us even realising it.

Producer: Alexandra Feachem.

Runtime: 38m

Transcript

Speaker 9 This is the BBC.

Speaker 6 I'm Robin Ince.

Speaker 10 And I'm Brian Cox, and this is the Infinite Monkey Cage.

Speaker 1 [speaks in an improvised, unintelligible code]

Speaker 12 What are you doing?

Speaker 1 I'm speaking in a code that I've made. Did you understand any of it? No.
It's a brilliant code, then, isn't it?

Speaker 12 How does that code work?

Speaker 1 Well, as opposed to saying the words that are in my head, I just kind of make a random selection of noises. That's what I do.

Speaker 10 Yeah, but how might I decipher those sounds?

Speaker 1 You can't. That's why it's a brilliant code.

Speaker 10 And it's brilliant. But no one knows what it means.

Speaker 1 No, even I don't. See, it's impenetrable.
I'm like a young Alan Turing.

Speaker 9 Anyway.

Speaker 10 Well, hopefully, Robin will have understood what he's not understood, or even understood that he's not understood something after the show, because we are at GCHQ in Cheltenham to talk about codes and code breaking.

Speaker 10 What makes a good code? What makes a good code breaker? And how has the art and science of code breaking changed since GCHQ and its predecessor, the Government Code and Cypher School, began in 1919?

Speaker 1 And if you press your red button now, you can hear the entirety of this show in Morse. Now, joining us on our panel today, because we're at GCHQ, are Ms. Smith, Ms. Smith, Ms. Smith, and X.

Speaker 1 We may not hear from X.

Speaker 6 And they are.

Speaker 12 Oh, hi, I'm Ian. I'm the Deputy Director for Counterterrorism here in GCHQ.

Speaker 12 And my favourite code is the genetic code because I think it's amazing that you can just have a big string of letters and that's enough to understand how to make a person. And yeah.

Speaker 11 I'm Tony Comer, I'm GCHQ's departmental historian. And my favourite code is the German High Sea Fleet code book of 1914, Da Da Dit Da.

Speaker 10 Everybody says that.

Speaker 5 Everybody.

Speaker 9 I'm Katy Brand. I shouldn't really be here.

Speaker 9 Won't go into why. No, no, don't worry.

Speaker 5 Don't panic.

Speaker 5 This now sounds like you're a double agent.

Speaker 1 You're not going to be let out.

Speaker 9 I'm really regretting it already. I'm talking myself into a serious small room.

Speaker 9 No, I am a writer and a comedian, and my favourite code is the alphabetical code that me and my friends made up, where we created a whole new alphabet set of symbols when we were at school so we could write rude messages to each other about the teachers.

Speaker 10 And this is our panel.

Speaker 10 Tony, before we begin, I wanted to ask you the German high sea fleet code book.

Speaker 10 What was that?

Speaker 11 This was the code book that the German Navy was using in 1914.

Speaker 11 And the Russians, who for a very short period of time were our allies,

Speaker 11 captured a copy of the code book and sent it to the UK.

Speaker 11 And in November 1914, for the first time, all the different bits of work that go up to make what GCHQ does today were done in the same place at the same time.

Speaker 10 Is that one of the first examples of military codes being important,

Speaker 10 or the breaking of military codes being important?

Speaker 11 The military have been using codes of one sort or another since ancient times, but it's the first time in the modern era, after the invention by Marconi of wireless, that all of the military powers were beginning to use encryption techniques to send their messages to their forces and command and control them in a way that other countries wouldn't be able to know what they were saying.

Speaker 1 So, basically,

Speaker 1 once Marconi's invented the radio, that's when an organisation like this becomes a necessity.

Speaker 11 Yes, but the UK didn't realise it for 16 years.

Speaker 1 It's a grand tradition, and we continue to do that.

Speaker 7 But

Speaker 10 it was 1919 when the predecessor, Ian, to GCHQ, was set up. So what what what how was that functioning at the time?

Speaker 12 So I mean, back then it was a very different world, really, compared to today; a much simpler technology environment, obviously.

Speaker 12 If you think about people like Marconi and folks like that, thinking about how to get the transatlantic wireless and telegraphy systems to work, information around the world was starting to flow.

Speaker 12 Our job really back then was very much more about supporting kind of the military and also thinking about the ways in which information and radio in particular was a thing that actually was carrying information that we needed to know about, right?

Speaker 12 And over the years, as you sort of look at the whole kind of history of GCHQ, and we've got our centenary next year, right?

Speaker 12 So we've got a kind of hundred years of this to look at, and it's kind of unbelievable really to think how we as an organisation have had to transform what we do to keep track of the way that technology has moved.

Speaker 12 And I think as you come to a building like this and you look around and see the kind of people we've got working here and the kind of ways that we go about doing our work, it's frankly completely unrecognizable from the days of 1919.

Speaker 12 And not just because of the technology, really, because of the problem.

Speaker 9 Is a genius always a genius? Do you think? Like, even if Valentine was here now, would he be able to apply his level of innate genius to whatever new technology is around?

Speaker 9 You could apply it in whatever age, anywhere?

Speaker 12 Yeah, I don't know.

Speaker 12 I mean, I think certainly for us, we've had a number of people over the course of our history who I think you could genuinely call a genius, and I think the thing has been the way that they think, not necessarily the context that they're thinking in.

Speaker 12 So Turing was brilliant because he could think around corners really.

Speaker 12 You know, he was looking at what appeared to be a completely intractable problem, so Enigma is the classic thing, and yes, he obviously was deeply steeped in, very expert in, mathematics in particular, but I think one of the things that really characterized him was that he just thought laterally and he thought about a different way into the problem, and in the course of doing so, kind of invented a whole load of technology that had never existed before.

Speaker 12 So, I think it was that sort of ability to think around corners and that ability to imagine a completely different way of solving a problem that most of the world would probably have imagined at the time was just, you know, unbreakable

Speaker 12 in those terms.

Speaker 12 I think we still do that here today. It's really, it's really fundamental.

Speaker 12 And yeah, and I think it characterizes the work that we do. You know, we are, I think genius is,

Speaker 12 for us, it's not just about having one or two geniuses, it's about having a whole bunch of really, really talented and diverse people who come together and sort of create that kind of collective genius that allows us to do what we do.

Speaker 1 Tony, I want to, again, just talking about that idea of the loner, because is this a problem with mathematics as a whole? Which is the ones that are written about most are very often the quirkiest.

Speaker 1 And therefore, people, I think about, I was reading a book about Gödel the other day, you know, most famous for his incompleteness theorem.

Speaker 1 And some of the things about him was that his diet, for instance, was baby food, butter, and laxative.

Speaker 1 And he wouldn't go out any days if another mathematician was in town because he feared he'd be assassinated. And he thought he was going to die from refrigerator gases.

Speaker 1 So, quite often, people think, I'm going to go into maths, and then they read about one of the most famous ones and go, I'm not really sure I've got what it takes or

Speaker 1 whether I even want to have what it takes.

Speaker 9 So, is that something that's the same with touring comedians to some extent?

Speaker 1 It's an interesting thing, isn't it, in history, where sometimes those who are elevated most are not necessarily the most ultimately representative of perhaps what might be required.

Speaker 11 I think already by the middle of the 1940s, GCHQ had this reputation of attracting idiosyncratic, quirky people.

Speaker 11 And I think you need to turn this on its head and say that this was an organisation that, in its time, was almost revolutionary in thinking about saying we don't want stereotypes, we don't just want a load of people who are all the same.

Speaker 11 You know, great minds don't think alike. How do you bring in enough people who think differently

Speaker 11 so you can actually get to the heart of your problems and then add something about not telling them how to live their lives, you know, not demanding a particular set, shall we say, of social skills or of dress sense or anything like that?

Speaker 11 It's not why you're employing people.

Speaker 10 Can I just say for the radio listeners, the entire audience is made up of GCHQ employees at the moment. So this is, I'm not sure whether this is comfortable or uncomfortable in this conversation.

Speaker 7 Carry on, Tony.

Speaker 5 Yes.

Speaker 11 I've been here for a long time.

Speaker 10 And you look magnificently tailored. Thank you.

Speaker 11 You don't look so bad yourself, Frank.

Speaker 1 What is that equation that's tattooed on your naked chest that you're showing us at the moment?

Speaker 12 If I told you, you wouldn't believe this.

Speaker 10 There's an Enigma machine, actually.

Speaker 10 We mentioned Alan Turing there.

Speaker 10 Could you first of all go back to the origin of codes and perhaps describe the simplest of codes and then how those codes have evolved over the years?

Speaker 11 I think the simplest sort of codes are ways of taking, for example, a book, taking a dictionary, and just assigning a value to every word, or maybe make it a bit more complex and throw in some phrases and assign just a simple numerical value to them.

Speaker 10 So you have a code book there, which is just phrases, and on the other side, it's actually sequences of letters in that case, isn't it?

Speaker 11 Next to them, they can be letters or numbers, and they can be more or less random depending on what you want to do with them.

Speaker 10 Now, that would be a Katy-type code, wouldn't it?

Speaker 10 That would be the kind of code that you developed to

Speaker 1 have your code.

Speaker 9 Just do those little squiggles next to each letter.

Speaker 12 Yeah, that's what we did.

Speaker 9 Funnily enough, we became quite fluent in it because we wrote in it several times a day, every day, for a few weeks. And then, because we were 14, or you know, we got bored of it, and I don't know,

Speaker 9 some pop star caught our eye. God, I I sound about 100 years old.

Speaker 5 Someone at the top of the hit parade caught our eye.

Speaker 1 I suddenly saw Tommy Steele, and I thought, my life's changed.

Speaker 10 Do you remember the squiggle? No, not Duran Duran or something, or whatever it was.

Speaker 9 No, I did. It was literally each letter.
I mean, it was quite extraordinary that we had the patience to do that in the end. We must have really wanted to be rude about the teachers.

Speaker 9 But yeah, we learned this whole new alphabet and we became very fluent in it.

Speaker 9 But the interesting thing was that as soon as we stopped, even for a few days, we immediately, I mean, we forgot all of it very, very fast.

Speaker 9 So, you know, if I didn't write anything for a few days, I wouldn't forget the alphabet, but I guess I'd be surrounded by the words all the time.

Speaker 9 But is that a common problem, or is that just because we were particularly stupid? I mean, if you stop using your code regularly, do people tend to just forget very fast?

Speaker 9 Does it go too deep or too shallow?

Speaker 10 I can answer that because what you do is you have a code book.

Speaker 5 Okay.

Speaker 10 That's the point. So is that called a substitution code? Is that what it's called? Where you just.

Speaker 11 sort of this sort of code, yeah, where you're just substitute you're substituting

Speaker 11 each word or each phrase by a small group of letters. But in itself, that's not fantastic.

Speaker 11 It's very a very good way of shortening your messages, for example.

Speaker 11 But what you really want to do is

Speaker 11 add something to those numbers that make them unrecognisable as the numbers that were in the code book, and that's a key. You're going to add something to them.

Speaker 11 It could be, for example, from a pad.

Speaker 11 This is a one-time pad, as it's called. You'd only use it once.
You and the person you're sending a message to share these books of random numbers

Speaker 11 and you add the value of these numbers to the value that you've got out of the code book and come up with a third number. That's what you send to the recipient.

Speaker 10 So if A was very simply, let's say 11, and you looked in there and you had 12, then you'd add them together and you'd write 23 down. Exactly.

Speaker 11 And the person at the distant end would see 23, he'd have the same book, he'd subtract 12, and he'd get back to 11 and know that you're talking about A.
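To make the arithmetic concrete, here is a minimal sketch in Python of the scheme described above. The code-book entries and pad values are invented for illustration, and the sums are taken modulo 100 so every group stays two digits; this is a simplification, not a claim about any particular historical system.

```python
# Toy illustration of a code book plus one-time pad (all values invented for this example).
# Sender: look up each word in the code book, add the pad value, reduce mod 100.
# Recipient: subtract the same pad values mod 100 and look the numbers back up.

CODE_BOOK = {"attack": 11, "dawn": 27, "harbour": 42}   # hypothetical code-book entries
REVERSE_BOOK = {v: k for k, v in CODE_BOOK.items()}

def encrypt(words, pad):
    return [(CODE_BOOK[w] + k) % 100 for w, k in zip(words, pad)]

def decrypt(groups, pad):
    return [REVERSE_BOOK[(g - k) % 100] for g, k in zip(groups, pad)]

pad = [12, 55, 83]                       # one-time key: random, shared in advance, used once
cipher = encrypt(["attack", "dawn", "harbour"], pad)
print(cipher)                            # [23, 82, 25]
print(decrypt(cipher, pad))              # ['attack', 'dawn', 'harbour']
```

Because each pad value is random and used only once, the transmitted groups carry no statistical trace of the code-book values underneath, which is why the method is described below as effectively unbreakable when used properly.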

Speaker 1 Yeah. So, how long would they, I mean, when would that have stopped? When would that numerical approach just no longer be of any use?

Speaker 11 Using these one-time pads, it's not stopped.

Speaker 1 It's not stopped at school.

Speaker 11 These are the safest way of sending a message that you can be pretty certain nobody will ever be able to break if you use them properly.

Speaker 10 And so the main problem, I suppose, is sending the pads around, isn't it?

Speaker 11 The distribution of the pads and the sheer tedium of having to add large numbers of numbers to large numbers of other numbers.

Speaker 10 I should say, we've got a very beautiful thing here, which is one of those

Speaker 10 pads

Speaker 10 printed on silk.

Speaker 11 That's right, that was one that was produced for use by MI6's agents during the war. The people who were being

Speaker 11 parachuted into France, into the occupied countries, they obviously couldn't take code books or one-time pads with them because if they were searched, they'd be discovered.

Speaker 11 But a silk sheet can be sewn into your clothes.

Speaker 11 If you need to get rid of it, it can be just burned. It'll burn much better and more quickly than paper.
You can scrumple it up and just put it in your shoe or something like that.

Speaker 9 You could wear it as a nice pair of tights.

Speaker 11 You could, but it would be perhaps slightly more visible than you'd want it to be. Right.

Speaker 1 Which one's the spy?

Speaker 5 I think we're.

Speaker 1 The woman in the numerical tights.

Speaker 9 I'm starting to get a sense of why I never got a tap on the shoulder.

Speaker 10 So, why are these

Speaker 10 one-time pads, which is a very simple idea,

Speaker 10 why is that still

Speaker 10 effectively?

Speaker 10 Well, I should ask you actually, is that unbreakable? You said it's very, very secure indeed.

Speaker 11 As long as you've generated the numbers in a completely unpredictable manner, so they're effectively random, as long as you only use them once, and as long as you use them properly,

Speaker 11 for practical purposes, yeah, these are completely unbreakable, but they're difficult to use quickly.

Speaker 11 The temptation to reuse numbers is always going to be there because of the difficulty of sending them, sending fresh pads around.

Speaker 11 And it just takes too much time. Nowadays there's far too many messages that people want to send to be able simply to write them all out by hand and then start adding these numbers.

Speaker 10 So all the more complex technologies we have now, like the Enigma machine, for example, that we had in the 30s and 40s, are really

Speaker 10 attempts at making more convenient codes rather than

Speaker 10 safer codes. Yeah.

Speaker 11 Essentially, an Enigma machine, I mean, it looks like a typewriter, but as you press keys,

Speaker 11 it lights up letters. There's a complex

Speaker 11 electrical pathway through which

Speaker 11 the current goes when you press a key to light up a lamp on the lamp board.

Speaker 11 Fantastic machine, 10 to the 138 permutations possible, but there's a fundamental flaw in its design.

Speaker 11 Because it was designed for people to be able to use quickly and simply, it's reciprocal, which means that for any given setting, if you press a P and it gives you a Q,

Speaker 11 at the same setting, if you press a Q,

Speaker 11 it'll give you a P.

Speaker 11 That reciprocity brings another problem along, and that's that a letter can't encrypt as itself.

Speaker 11 And that, fundamentally, is the way in for a cryptanalyst to break a message encrypted on that machine.

Speaker 1 There is that

Speaker 1 the story of how it was broken, is it right that it was one particular phrase that, by repetitive use, gave it away? Is that true? Is that the story that ultimately led to the breaking of it, or

Speaker 11 it's much more complex. It's much more complex because

Speaker 11 there are a whole set of things that you need to be able to understand. And a cliché is

Speaker 11 German messages ended with Heil Hitler, so you'd look for the last ten letters of a message, and if none of the letters match the letters of Heil Hitler in the right order, that must be the message.

Speaker 11 In practice, it's looking for things that say weather forecast, because you know that's a bit more common, or just the way that military bureaucracies send messages.

Speaker 11 They have a very stereotyped format,

Speaker 11 exactly the same as emails do today, because emails derive essentially from the way the military send messages. So they'll begin to somebody, from somebody, they'll have a serial number, a date, a subject,

Speaker 11 you'll have who's copied in, you'll have a classification.

Speaker 11 So, a whole load of stuff at the beginning of a message that you expect to be there, and that's the way in rather than the content of the message.
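A rough sketch, in Python, of the crib test being described here: because an Enigma letter can never encrypt as itself, any alignment of a guessed plaintext phrase against the ciphertext that produces a letter-for-letter match can be ruled out straight away. The ciphertext and crib are invented for illustration.

```python
# Toy crib placement using the Enigma property that no letter ever encrypts as itself.
# A "crib" is a guessed stretch of plaintext, e.g. a stereotyped word such as WETTER ("weather").

def possible_positions(ciphertext, crib):
    """Return the offsets where the crib could sit: at a valid offset,
    no ciphertext letter may equal the crib letter directly beneath it."""
    hits = []
    for i in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[i:i + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            hits.append(i)
    return hits

ciphertext = "TWEZTREWTESRQT"   # invented ciphertext
crib = "WETTER"                 # guessed plaintext word
print(possible_positions(ciphertext, crib))   # [3, 4]: only two alignments survive the test
```

Ruling out most alignments this cheaply is part of what made stereotyped openings and formats such a gift to the cryptanalysts.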

Speaker 10 So, these are clues because I was going to ask, Ian, I suppose a naive

Speaker 10 person would think, well, with the computing power that we have today, what you do with a code is you throw a computer at it and just try it. But

Speaker 10 that clearly is not the case. There's a lot of human intelligent input into breaking a code.

Speaker 12 Yeah, absolutely.

Speaker 12 And you know, certainly kind of modern encryption algorithms are the sorts of things that underpin the way that you know the internet works and the way that you're able to do internet banking safely and things.

Speaker 12 I mean, there's no doubt, you know, I mean, to implement those and

Speaker 12 to use them, you need enormous amounts of computing power and things to get the kind of security. It's all about how hard it is to kind of you know break the keys and things like that.

Speaker 12 But you know, it's it's a really important point.

Speaker 12 I mean, all of these systems at some point rely on people, so people implementing them, people designing them, and actually people using them. And as you said earlier, you know, the thing about the one-time pad which just makes it a nightmare is it's just massively annoying to use to actually do anything useful.

Speaker 12 You know, and if you were trying to send the sort of you know huge amounts of data around the globe that we now all sort of know and love, I mean trying to, you know, obviously you're not going to do that with a one-time pad, right?

Speaker 12 So you have to invent schemes that are that work at scale, that can deal with volume and are very secure.

Speaker 1 In some ways, in terms of observing what is going on in the world, in one way,

Speaker 1 has it become more difficult, but equally, are there also moments when you go, I cannot believe we are able to make this form of observation?

Speaker 12 In terms of just the amount of information that is just kind of out there and kind of.

Speaker 1 Yeah, in terms of what people, you can go, hang on a minute, if that Google search is coming up there, there, there, and there, that may well suggest, oh, you know, and then you can build up a pattern.

Speaker 1 Can that happen? Is that possible? And not necessarily Google, by the way. Other search engines are available. By the way, if I'm ever being covert, I just go and Ask Jeeves.

Speaker 5 But

Speaker 1 that to me, I'm just fascinated sometimes when I read books about big data, what is given away about human behavior and how useful that can be to someone like you.

Speaker 12 Yep, absolutely. So,

Speaker 12 you know, if you work in an organization like GCHQ, it's a slightly different question because what we're about is trying to say, okay, there is all this data out there.

Speaker 12 99.999, a very large number of nines, per cent of it is of no interest whatsoever to us as an intelligence and security organization.

Speaker 12 But a very, very, very small percentage of it absolutely is.

Speaker 12 And that very, very small percentage of it is something that we need to be able to find and we need to be able to use, you know, to save lives or protect children and all the things we do.

Speaker 5 For us,

Speaker 12 the thing that makes our work very technically challenging is that our job is to figure out how to only go and get the bit that's going to make the difference.

Speaker 12 So not only is that a computer science problem, it's a question of policy, it's a question of ethics, it's a question of many other things.

Speaker 12 It's absolutely not the case that the availability of data that we all kind of use is the thing that helps us do our job because so much of that is actually of no interest to us whatsoever.

Speaker 9 And just to continue this, and it's really serious what you're talking about, you know, saving and keeping the country safe and

Speaker 9 the privacy versus surveillance issue. So, I have a really, really serious question that I know troubles me and a lot of my friends, which is: can our phones hear us talking about mattresses

Speaker 9 and then put a targeted ad for mattresses creepily and randomly in the middle of our Facebook feeds?

Speaker 12 So, certainly, if they do, it's nothing to do with us.

Speaker 5 I'll say that.

Speaker 5 To be really clear, it turns out you're actually run by Beds R Us of Cheltenham and Gloucester.

Speaker 1 The whole of GCHQ is actually nothing more than a front for an enormous mattress company and also easy chairs.

Speaker 9 But you know what I mean? Everyone remarks on this. You know, it was really weird.

Speaker 9 I was talking to my friend the other day about maybe going on holiday to Cyprus, and we were both sat there and we didn't put anything about it online.

Speaker 9 And then I got home and I had an advert just popped up that sold me a holiday to Cyprus.

Speaker 9 Now, I know this isn't necessarily in your remit or your day-to-day work, but just from a pure technological point of view, could you just help us to understand

Speaker 10 counterterrorism and targeted marketing? Yeah,

Speaker 5 and so

Speaker 9 this is my only chance to ask an expert because it bothers a lot of us, Ian.

Speaker 5 It bothers a lot of us.

Speaker 12 Well, I'll give it a go. So, so absolutely, you know, you will know if you're, you know,

Speaker 12 you're online, if you're using the internet, and

Speaker 12 you're searching for or buying something on a particular website, you will notice that when you're then looking in your email, there seem to be adverts for the thing that you were looking at there, right?

Speaker 12 There's a whole very interesting kind of

Speaker 12 machine, essentially, that runs the way that advertising works on the internet. It's really interesting.
Some fascinating technology behind it, actually. It's really, really cool.

Speaker 12 The next stage in that that we're starting to see now is, well, what about if it isn't just about what you're clicking on when you're sat on your laptop?

Speaker 12 But if you've got your smart speaker sitting in the corner that you can use the keyword to get it going and

Speaker 12 it makes the curtains open or whatever it might do, well, actually,

Speaker 12 can that, and the things that you are saying and doing in that environment, also be used to feed that kind of advertising ecosystem? So, a really, really important point.

Speaker 12 Again, it makes that point about the link between security and humans and people, right?

Speaker 12 So, you know, a really, really daft way to implement something like that would be to have a system whereby the thing thought you had just asked it to switch on, you know, the keyword, and it was a really common word, right?

Speaker 12 So every time you said hello,

Speaker 12 you were talking to your friend. Actually, your smart speaker thought you were talking to it, and then you start talking, it goes, oh, well, I know what you're on about, right?

Speaker 12 So that would be a really, really poor implementation of a kind of the privacy settings on a device like that. And obviously,

Speaker 12 it's in nobody's interest, I don't think, for any of the kind of people who are building this technology to start putting those sorts of things in people's homes.

Speaker 12 I mean, nobody's going to want to do that. But

Speaker 12 this is where these questions of cybersecurity come in. Because what if somebody

Speaker 12 with a sort of nefarious purpose has figured out how these things work and is attempting to get that speaker to do precisely that?

Speaker 12 Well, actually, a lot of the work that we do here is to work with industry and people that make these things to make sure that actually no, that really isn't a thing.

Speaker 12 And it isn't just that they've told us that it can't be done.

Speaker 12 We try and make sure that those sorts of things where cybersecurity starts to really touch on the fundamental ability for us all to be sort of you know to live and work and be safe online.

Speaker 12 That's what we try and do. But it's not out of the box, right? This is why we have people here working on these things, right?

Speaker 10 One of the pieces of security technology we all use every day is public key encryption; it's how our bank accounts are secured, well, HTTPS, anything on the web.

Speaker 10 So, could you talk a bit about just outline how that works?

Speaker 12 So, there's a couple of sort of basic things that are going on here.

Speaker 12 So, as Tony was saying, with the one-time pad system, one of the real challenges with that is how do you get the pads to the people that need to see it?

Speaker 12 And PKI is almost like the modern version of the infrastructure that means that you can pass the sort of modern versions of pads around in a way that's kind of secure and understood.

Speaker 12 And a lot of this rests on some actually not really difficult to understand concepts that come from maths and number theory. So prime numbers.

Speaker 12 And there's a thing about prime numbers that basically says if you've got two really, really, really big prime numbers and you multiply them together, that's an easy thing to do.

Speaker 12 Figuring out which two prime numbers you multiply together to get that massive number is a really, really, really difficult thing to do.

Speaker 12 And you're talking about the sort of thing that computers would take till the end of the universe to kind of do that factoring of the primes problem. So, how does that work in terms of encryption?

Speaker 12 And so, basically, if you want to exchange a message between two people, there's a couple of ways you can do it. One obvious way would be to say, okay, we've got a box and it's got a padlock on it.

Speaker 12 You have a key, and I have the same key. Let's not worry about how you got it, but we've both got those keys.
I can write my message, I can put it in the box, lock it, send you the box.

Speaker 12 No one else can open the box, so you get it. You've got the key, you open it up.
That's called symmetric.

Speaker 12 When people talk about public and private keys, which is the thing that underpins how the internet works, it's slightly different. So, in that model, I've got a key.

Speaker 12 What I do is I give you the box, but the padlock's open. So, you can write me a message and you can put it in the box and shut it.

Speaker 12 Once you've shut it, you can't open the box again, neither can anyone else. But you can send me the box, I've got the key, open it up.

Speaker 12 So really the whole thing about public key cryptography is a mathematical implementation of that process, really.

Speaker 12 The whole way that this works on the internet is about putting in place infrastructure, what we call the trust infrastructure of the internet, that means that you can trust the fact that the box that you've got is actually the box that you're expecting.

Speaker 12 So you've probably seen this, but certificates and things. You go in your web browser, you see a whole load of things about certificates and things that have been signed and so on.

Speaker 12 So basically that is all about the infrastructure that exists to make sure that those keys and those padlocks and those boxes you can trust where it came from.

Speaker 12 And then the maths behind all that is essentially to do with factoring the primes and those kinds of things.
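For a sense of the maths being gestured at here, below is a toy RSA-style example in Python with deliberately tiny primes. Real systems use primes hundreds of digits long, plus padding and the certificate infrastructure mentioned above, so treat this as an illustration of the idea rather than of any real implementation. The pair (n, e) plays the role of the open padlock anyone can snap shut; the private exponent d is the key, and recovering d from n alone would require factoring n back into its two primes.

```python
# Toy RSA-style key generation and round trip with tiny, insecure primes (illustration only).
p, q = 61, 53                 # the two secret primes; real ones are hundreds of digits long
n = p * q                     # 3233: the public modulus; multiplying the primes is the easy direction
phi = (p - 1) * (q - 1)       # 3120: computing this needs the factorisation of n
e = 17                        # public exponent: the open padlock handed out to everyone
d = pow(e, -1, phi)           # 2753: private exponent, the key only the recipient holds

message = 65                                  # a number standing in for the plaintext
ciphertext = pow(message, e, n)               # anyone can lock the box using (n, e)
recovered = pow(ciphertext, d, n)             # only the holder of d can open it again
print(ciphertext, recovered)                  # 2790 65
```

An eavesdropper who could factor 3233 back into 61 and 53 could recompute d, which is why, at real key sizes, the security rests on factoring being computationally infeasible.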

Speaker 10 So, Tony, it sounds as if codes, encryption, these things are pretty much uncrackable.

Speaker 10 But, as Ian said, it can be the humans that are the weak link in the chain.

Speaker 10 Could you give us some examples or an example historically of

Speaker 10 mistakes that led to the breaking of a code?

Speaker 11 Right at the very beginning, in November 1914, the way in which the Germans added a key to the values from their code book wasn't understood.

Speaker 11 You know, they just saw these messages that even when you compared the values to the captured code book, nothing was coming out. And it was one of the

Speaker 11 people working in the Admiralty, whose father was German, his name was Rotter,

Speaker 11 and he thought the Germans are a methodical people and they send messages, every one of them is going to be numbered sequentially.

Speaker 11 So, if we've seen message number 151 and 152 in plain text, we can guess that the next one is going to be number 153, even though it's enciphered.

Speaker 11 And it was using that as a starting point that he started to look: well, how could this set of letters or numbers represent the number that I think the message is going to have, and then working back from that.

Speaker 10 So it's understanding human behavior as much as the mathematics.

Speaker 11 One of the earliest working aids that was developed at Bletchley Park was a book of German swear words.

Speaker 11 And that's because on variants of the Enigma machine that have the rotors with letters on, when you set up the machine to send a message, you have three rotors that you have to set at any random setting.

Speaker 11 But it's humans and they don't like random settings, and it's 18, 19, 20-year-old young men who've been conscripted into the German army, and they really hate what they're doing.

Speaker 11 So, what are they going to do? They're going to use girls' names, or they're going to use swear words, or they're going to use

Speaker 11 the setting where the rotors got up to at the end of the last message because it's easier than to change anything just to go that way.

Speaker 11 And so, there's a whole set of ways of guessing where things might be starting that are based just on your observation of what people do. People do daft things when you let them.

Speaker 1 Again, I want to ask, because I know The Imitation Game may well not have been filled with veracity, but it's

Speaker 1 this is in terms of the fact that once a code has been broken, the decision of when you risk changing your actions that may well give away that the code has been broken. I mean,

Speaker 1 that morality, that ethics.

Speaker 1 Have we seen that in history?

Speaker 1 I mean, in that film, they talk about not changing, I believe, the direction of a boat because that would give away, and that would be in terms of their doing a whole balance of how many lives saved.

Speaker 1 They're looking at it pragmatically. So, do we see that in history, that ethical dilemma?

Speaker 11 The Imitation Game gets two things absolutely right. There was a Second World War, and Turing's first name was Alan.

Speaker 11 To use it as a guide for the way we work would be sort of slightly odd. That sort of decision on the balance of the use and the protection of intelligence has been in our business.

Speaker 11 Again, if you look at December 1914, when the first indications are that the Germans might be going to shell towns on the North Sea coast, how do you get that information to those towns without letting the Germans know that you're reading their ciphers?

Speaker 11 This is a constant, and there's a whole process of saying how can you get a plausible alternative source for the information so you could make it look as though an agent had got hold of it, or you might have you know a convoy is going in a particular place, so you send a plane up to spot the convoy, even though the plane is in danger then of being shot at itself.

Speaker 11 At least you've got something

Speaker 11 that would give another reason for finding that information. The biggest myth of all is that you'd sacrifice lives, you'd sacrifice British lives just to protect your secret.

Speaker 11 We've never, ever done that. The most potent myth is that Churchill sacrificed Coventry to protect the fact that they'd started reading Enigma.
And it's not true.

Speaker 11 It wasn't true then, and it's never ever been true. Because, yes, there is the dilemma of how do you protect your sources, but there's an absolute bottom line:

Speaker 11 not at the cost of British or allied lives.

Speaker 9 Just before we move on from the film that perhaps I feel awkward even naming now, I won't name it, I won't say it again, I won't say it again.

Speaker 9 But one of the things that came out of that and Bletchley Park, and I hope this isn't a myth, because I'm looking around the room now here at all of these people in here, and actually the gender split is pretty good from what I can see.

Speaker 9 I mean, it could even be approaching 50-50, or 30% women, perhaps. And there was this sort of association with Bletchley that women had a big role there and were recruited.

Speaker 9 And for the first time, women mathematicians

Speaker 9 were given a huge role. And do you think that has continued to what I can see around the room now, which is a massive female presence here?

Speaker 11 At its height, at the end of 1944, the workforce at Bletchley Park was 76.3%

Speaker 11 women.

Speaker 11 The only two jobs that weren't done by women were the armed guarding of the site and the senior management. And that's really the clue as to,

Speaker 11 if you like, the wasted opportunity that nobody conceived of the idea that women could be right at the top of the organization, even though they were able to demonstrate that they could do any of the work that was there.

Speaker 10 That raises an interesting question, actually, Ian, which is: what makes a good code breaker?

Speaker 12 Yeah.

Speaker 12 So, I think we sort of touched on this a bit earlier, but I think to be a good code breaker, or

Speaker 12 to be the sort of person that GCHQ needs to do what it does, because we do a lot of things, right, and code breaking is obviously a big part of it, but it's not all of it.

Speaker 12 I think

Speaker 12 there are a couple of things that I think we have really learned matter. And probably the thing I would call out more than anything else is kind of a problem-solving mindset, really.

Speaker 12 So, there's something about people I think who are interested in solving problems. They don't have to be technical problems, particularly.

Speaker 12 They just have to be, you know, that sort of like tenacity to want to know, to want to solve a problem, to understand it.

Speaker 12 Again, we've mentioned this before, but teamwork. So, teamworking feels like a really, really important thing here at the moment.

Speaker 12 And again,

Speaker 12 the different kinds of ways that people think and being able to work together to solve problems rather than just being on their own, I think, is really, really important.

Speaker 1 It's all very normal here. I went past the cafe and it just said, today's special: baby food, laxative and butter stew.

Speaker 5 So, um,

Speaker 9 can I ask one quick question?

Speaker 9 And I'm sorry if it's a silly question, but I'm dying to ask.

Speaker 9 Right, a friend of mine had high-up security clearance in Number 10 and had to have lots of clearance chats and instructions about what she could and couldn't do in the office.

Speaker 9 And I always wondered, I wasn't ever sure if she was pulling my leg or not about this, but she said that it was possible the technology existed that, if she was talking about something classified in the building, someone could shine a laser onto the window pane and it could interpret the vibrations of the window pane as she was talking, and then that could be turned into words, so that someone could understand what she'd said.

Speaker 9 Is that true?

Speaker 10 He's not going to answer that.

Speaker 12 I can neither confirm nor deny that.

Speaker 12 It's definitely in a lot of movies, right? So it probably is true. Bit like The Imitation Game, right?

Speaker 9 Damn it. Well, you're going to put me through another twenty years of not knowing.

Speaker 1 Your friend who's high up in government, is it actually Judi Dench?

Speaker 6 That's right.

Speaker 1 We asked our audience, a GCHQ audience, a question, and we asked them: who would you like to bamboozle and why?

Speaker 1 And uh

Speaker 1 redacted.

Speaker 10 Redacted.

Speaker 10 Well this is a very good answer. Everyone except the recipient.
That is the point of the secret code.

Speaker 1 My eight-year-old daughter, she thinks she knows everything.

Speaker 10 Codes? None of that nonsense. I'm an engineer.

Speaker 1 The people who wrote the GCHQ puzzle book, because it's too darn difficult. If I was going to ask you one, Brian, you know everything, don't you? Which element is missing?

Speaker 1 Arsenic, astatine, bismuth, carbon, copper, iron, krypton, neon, oganesson, phosphorus, silicon, tennessine, tin, xenon. You have seven seconds.

Speaker 9 i bring him here to you to see his doubtful face it doesn't happen often i was thinking through that book earlier on and i can't do any of it but then i was heartened to find that uh there's one page which says can you identify this person and it's a picture of victoria wood

Speaker 9 so i thought yeah maybe i am the ideal candidate for gchq after all

Speaker 1 does anyone know the answer to that in this book by the way They can't tell you, but you can turn to the answers section. There, Lou.
Now, I'm not saying that you might not be the best spy.

Speaker 5 How on earth can I break break this code index? Well no,

Speaker 12 you've said it on the radio.

Speaker 10 You read it.

Speaker 10 People want to know what the answer is now. I'm just saying helium because it wasn't in the list.
Let's see.

Speaker 6 A reasonable chance.

Speaker 1 Who would you like to actually bamboozle with your secret code and why? Brian, because mathematicians beating physicists is a fun fact of life.

Speaker 10 Thank you very much to our panel and that brings us to the end of this series.

Speaker 1 We're going to leave you with two problems and you have until the next series to come up with a solution, which will be probably January 2019.

Speaker 1 Whoever sends us the answers first will receive one of Brian's out-of-date Mensa membership cards. He knows the sequence of any shapes.
Anyway, what is the next letter in this sequence?

Speaker 6 M-V-E-M-J-S-U?

Speaker 6 N.

Speaker 1 You don't have to wait for the next series.

Speaker 1 It's N.

Speaker 5 And I love that.

Speaker 1 You knew. Here are a group of people who have to hold in secrets.

Speaker 7 I can't hold that in.

Speaker 5 I definitely know what that is.

Speaker 1 So it was a lovely moment when someone's just lost their job.

Speaker 7 Yeah, you need to keep an eye on that.

Speaker 10 And if that's too easy, how about Robin and I went to the cinema, in cryptic crossword clue terms? The film we saw was £6,000.

Speaker 10 First word, six letters, second word, seven letters. What was the film? Four monkeys now.

Speaker 10 We'll give you the answers at the start of the next series.

Speaker 10 So, that is the last Infinite Monkey Cage of this series. I will be joining you again in January.
Robin won't be with us because he wanders off at lunchtime and now he knows too much.

Speaker 5 Goodbye.

Speaker 1 I did look in the secret room.

Speaker 1 In the Infinite Monkey Cage.

Speaker 5 In the Infinite Monkey Cage.

Speaker 11 Turned out nice again.
