My Turing-complete life


IMS: Hello, this is IMS, the author of The Program audio series. This episode was a lot of fun to make. It was recorded two weeks before the 2020 pandemic, so we had a lovely time doing it in person and even had two extra-large pizzas for everyone. In case you’d like to reimburse us for the pizzas, you can find ways to do so at programaudioseries.com/support. Cheers!

ANNOUNCER: Today we present an episode of an old talk show, broadcast a few years before the launch of the Program. It originally aired locally, somewhere deep in the North American continent, and to this day there is no consensus on whether it is authentic or simply an elaborate hoax.

[radio static followed by radio jingle]

MAX: Hello dear listeners, and welcome to our show. I am Max Mustermann.

KARI: And I’m Kari Nordmann, and this is Pascal’s Wager, a show about philosophy, psychology, sociology, and everything that makes us human.

MAX: Which is a fancy way of saying we ramble on about any topic that crosses our minds for 45 minutes.

KARI: [laughs] I don’t know about you, but I’m not rambling!

MAX: I mean, is that really the best pitch? “Anything that makes us human”? You could describe literally anything that way!

KARI: [laughs] We’ve been doing the show for five years, and now you voice your objections over our tagline?

MAX: We have a special guest with us in the studio tonight, Kari! Do you want to introduce him?

KARI: I do, but we first need to make a short community announcement. Mrs. Wheeler, who you may know from the downtown hardware store - her cat Chairman Meow got lost! Chairman Meow is a brown tabby, wearing a red collar. If you see him, give us a shoutout and we’ll alert Mrs. Wheeler… And now on to our guest.

MAX: Following a communist cat.

KARI: The guest on Pascal’s Wager tonight is John Smith, and he was born in our little town but moved to the city for his studies.

JOHN: Hi Kari, hello Max. Actually, I’m not with the University any more.

MAX: Well there you go, first minute of the interview and we already screwed something up!

JOHN: Well if it makes you feel any better, until just a few months ago I was studying at the university! Double major in physics and computer science. But I decided to drop out and devote myself full-time to a venture of my own.

MAX: Hm, a venture, eh? Does this mean you’ve invented something?

JOHN: Well, it’s not an invention really. I prefer the word discovery.

KARI: Why so?

JOHN: I guess it boils down to another question - was mathematics invented, or was it discovered? I for one would say mathematics is something objective that exists irrespective of our awareness of it.

KARI: So you’re saying that the thing you discovered also existed before we became aware of it?

MAX: And apparently it’s in the same category of human achievement as mathematics!

JOHN: [laughs] Potentially correct on both counts!

MAX: So what is it?

JOHN: I discovered an algorithm that foresees the future.

MAX: Oh, for the stock market?

JOHN: What? The stock market?

MAX: Yeah, you created a bot that helps determine which shares to buy and sell.

JOHN: No, that’s not what I meant. What I made is an algorithm that predicts what’s going to come up.

KARI: Oh, like predictive text on our phone?

JOHN: No, nothing like that! The algorithm knows what’s going to happen before it happens.

MAX: Oh so it’s like my mother-in-law then.

JOHN: My algorithm is nothing like your mother-in-law, my algorithm sees into the future!

[pause]

MAX: Are you saying you can see the future?

JOHN: Well, not exactly see it, but in effect that is correct. I can tell you what will happen.

KARI: So you can forecast historical events?

JOHN: Not quite historical events. I can only foresee what is going to happen with individual users.

KARI: Individual users?

JOHN: Yes, individuals like you, me, Max. Anyone really.

MAX: I don’t suppose there’s a way to prove your claims, is there?

JOHN: Of course, I didn’t expect you to take me at my word alone! That’s why I brought my laptop with me… Er, what’s your studio’s wifi password?

MAX: I don’t know, let me see if our producer knows, hang on. Hey Natasha, do you know?

NATASHA: I have no idea.

MAX: Natasha Petrovova, ladies and gentlemen, our producer extraordinaire!

NATASHA: That’s actually our wifi password. IHAVENOIDEA. All caps, no spaces.

JOHN: [laughs] This is why you never do a live demo!

KARI: Or a live show! [laughs]

MAX: So you see the future and you couldn’t have foreseen this?

JOHN: I never said I was clairvoyant! I have to specifically check a particular instance of what’s going to happen. For example, I can vocalize the words you’re going to say and play them back to you before you actually say them.

KARI: What? What do you mean?

JOHN: Give me a moment and I’ll show you. [sound of keystrokes] Okay, so I set a 400-millisecond head start for whatever you’ll be saying next. What this means is that whatever the two of you utter next is going to be heard from the laptop speakers just a bit before you say it. The effect will start after I confirm the command. Are you ready?

KARI: Yes...

MAX: Go for it.

JOHN: Okay, start.

MAX: So, should I go first? Holy-moly. This is... This is crazy! Kari, you say something!

KARI: I don’t know… Oh, my god! This is… This is incredible.

MAX: It’s like the echo you hear on long-distance calls, but kinda reversed!

KARI: Yeah, it’s like my thoughts have a shadow, but one that is cast in front of me so I can see it.

MAX: I feel like I should trick it somehow. Like I should... stop... talking... and then suddenly start again. Please, turn it off, turn this off!

JOHN: Okay I’m turning it off now.

[sound of keystrokes]

MAX: Hello?

KARI: Test, test… Okay, so how did you do that?

JOHN: As I said, I can read what you’re going to do right here in the code.

MAX: May I see?

JOHN: Sure.

[pause]

MAX: This looks like computer code!

JOHN: That’s exactly what I’ve just said! It’s computer code that simulates reality before it actually happens.

KARI: Wait, are you implying the future is set on a strictly defined path that can be... Computed?

JOHN: Of course. I mean, computers follow the same principle as everything else in nature. It all comes down to IF-THEN.

MAX: IF-THEN?

JOHN: If-this-happens-then-that-happens. Meaning if you know how one atom moves, you know how it influences another one. And another one. And another one. All you really need to simulate the future is to simulate enough atoms.

KARI: John, what you're describing is Laplace's demon.

JOHN: Whose demon?

KARI: Pierre-Simon Laplace. He was a French mathematician and philosopher. He imagined an intellect which would know all the forces that set nature in motion and which could - in theory - calculate the movements of all the bodies in the entire universe.

MAX: Yeah, we talked about it in one of our previous shows, right Kari? You said that for such an intellect nothing would be uncertain and the future would be present before its eyes just like the past.

JOHN: Funny you say that - in software, a daemon is a computer program that runs as a background process, which is to say that it is not under the direct control of the user.

MAX: So you're saying Laplace's demon is real?

JOHN: Oh it’s real alright. It's called physics.

KARI: But physics is inherently random! I mean, we cannot even predict whether it’s going to rain tomorrow, let alone foresee events at the quantum level!

JOHN: Well, have you ever considered that we aren’t able to predict such perfectly natural phenomena simply because we don’t yet have the appropriate tools? Or I should say - we didn’t have them.

KARI: John, do you have any notion of the Pandora’s box of philosophical questions this opens?

NATASHA: Actually Kari, we have a caller waiting on the line, so why don’t we hear these philosophical questions directly from our listeners?

MARIA: Hello, can you hear me?

MAX: Indeed we can, you’re on the air! Could we have your name please?

MARIA: This is Maria speaking.

KARI: Maria, the wife of Juan Perez, who runs the bar at Warden Street?

MARIA: Yes, that’s how most people know me. As Juan’s wife.

KARI: Sorry, I didn’t mean anything by it. I just sometimes go for trivia nights there so I remembered.

MARIA: It’s okay. And it is accurate to say that Juan has always been the social one in our relationship. I’m the sort of person who says things like “I like long walks in nature” unironically.

MAX: Hey there’s nothing wrong with long walks!

MARIA: Well that may be true, but an exciting life it does not make. But I didn’t call you today to complain about how boring I am. First of all, I want to congratulate you on a great show and John on his incredible invention.

JOHN: Actually, it’s more of a discovery.

MARIA: Well, whatever you wanna call it, it’s great.

MAX: So Maria, what is your question for John Smith?

MARIA: Well, what I’d like to know is the following. I’ve been having this suspicion for some time now that Juan… Well, that he’s not being honest with me. That he’s been going behind my back.

JOHN: You think your husband is cheating on you?

MARIA: Well, is he? That’s precisely what I’d like to know. Your gizmo can look ahead, and I presume this becomes known in the future - whether my husband is having an affair or not. So I’d like you to take a look and tell me.

JOHN: Sure, I can do that easily… What’s your name?

MARIA: Maria.

JOHN: I need the full name.

MARIA: I was born as Maria Rossi.

JOHN: Alright, Maria, maiden name Rossi. Let me just look you up in the simulation here… [sound of keystrokes] Aaaaand… Ugh.

MARIA: What?

JOHN: I’m afraid your suspicion was warranted. Your husband is having an affair.

MARIA: Oh my! That’s… That’s not easy to hear! Could you tell me when I find out about it?

JOHN: Well… Now.

MARIA: I mean without you!

JOHN: But, I am here. I have always been here. It’s not like I’m interrupting anything.

KARI: What do you mean you’re not interrupting? If it weren’t for you, Maria wouldn’t have known of her husband’s infidelity!

JOHN: But what you’re saying is tautological! It’s like saying if it weren’t for the iceberg, the Titanic wouldn’t have sunk. Yes, you’re technically correct, but you’re not saying any kind of profound truth - the iceberg was there!

KARI: So you’re the iceberg in this scenario?

JOHN: Maybe it’s not the best metaphor.

MAX: Well I think we should stop talking about catastrophes and let Maria go on about her day.

KARI: Agreed! Sorry Maria that we didn’t have better news for you.

MARIA: [wailing] That’s… That’s okay. I think I’ll go on a looooong walk now.

[hangs up]

KARI: Have you ever heard of a kind lie, John?

JOHN: Why would I lie? Lying won’t change anything. I mean, if you’ve found out that Laplace’s demon is real and it exists, wouldn’t you prefer it were honest?

KARI: The definition of wisdom is to know which truths to say and which ones to omit.

JOHN: Doesn’t that negate the whole idea of truth?

MAX: What I’d love to hear is a little bit more about how you came up with this? Where were you when that proverbial lightning struck?

JOHN: Well, it all started with a blind date. My roommate set me up with his friend, and I hadn’t been on a date in - let’s just say - a while. I have to admit, I didn’t socialize a lot at Uni; I spent most of my time developing for the E-wave.

KARI: E-wave?

JOHN: That’s the quantum computer the University got last year. I only had one slot at the machine every 6 weeks! So in between I was devising and refining an algorithm that could calculate probability outcomes.

KARI: Trying to do what exactly?

JOHN: The algorithm’s goal was to simulate throwing dice. If quantum computers are good at anything, it’s at hacking and simulations. I mean, it makes sense to simulate quantum physics with actual quantum physics.

MAX: Sounds reasonable.

JOHN: But then I found out about the blind date. And I had an idea - what if I could simulate the date before it actually happened? In a bolt of inspiration, I completely rewrote in 5 days what I’d been working on for 5 months! Once I had a chance to actually test the redesigned algorithm, not only could it simulate throwing dice, it was able to simulate any event before it actually happened! So I named it the Blue Algorithm - patent pending.

KARI: Why the Blue Algorithm?

JOHN: The name is of no importance.

MAX: Then allow me to ask you something of the utmost importance - I mean all of our listeners are definitely curious: so, how did the date go?

JOHN: [laughs] I never went! I was so exhausted I spent the weekend sleeping!

KARI: And now for the real question: I still don’t understand exactly how this tech works.

JOHN: Well, the main principle is similar to a search engine. Just like a web crawler, the Blue Algorithm indexes all of reality and broadly caches what is going to happen. Then when a specific instance is looked up, it simulates that particular action in full detail.

KARI: Wow, that’s fascinating.

JOHN: Fun fact: the first version was only able to simulate the future up to 3 seconds in advance. By version 4 it was already simulating months. I’m at version 6.2 now and it’s able to simulate up to 50 years into the future!

MAX: So you’re developing this technology alone?

JOHN: Yes. I’ve been a lone hacker ever since I can remember.

MAX: And what’s your goal?

JOHN: Ah yes, I almost forgot! I am proud to announce the exciting next step for the Blue Algorithm. Beginning next week, the Blue Algorithm will become available worldwide! Arriving in the form of a responsive website, it will allow anyone to look up their future, anywhere, anytime!

KARI: Let me check if I understand this correctly - you’re going to commercialise an exploit of reality?

JOHN: I don’t think commercialisation is the right word.

MAX: So it’s going to be free then?

JOHN: Well, not exactly. There are operating costs to cover. Quantum computing is expensive!

KARI: So how are you going to monetize?

JOHN: I plan to sell ads. I mean, it should be easy, knowing what people will buy.

MAX: Speaking of which, John, why are you with us in the studio and not on a yacht made of cocaine?

JOHN: [laughs]

KARI: True, why didn't you simply sell the algorithm?

JOHN: Two reasons. A) I’m not interested in money; and B) I looked into the future and saw that I don’t sell it. But B is probably precisely because I’m the kind of person who thinks like A! [chuckles]

NATASHA: Guys, we have another listener waiting on the line.

MAX: Thank you Natasha, let’s hear from them. Hello, you’re on the air!

PIERRE: Hello, this is Pierre Tremblay.

MAX: Well hello Pierre, you old dog! I haven’t seen you in ages! Pierre and I went to high school together.

PIERRE: Well you went to it. I was mostly hiding behind it smoking cigarettes.

MAX: Yeah, you were a Marlboro man from the very earliest of ages! Still working as a cowboy, are you?

PIERRE: Yeah, still working on the farm. Continuing the family tradition.

MAX: So what do you think of John’s invention?

JOHN: Discovery.

PIERRE: Oh it’s amazing. And to be frank, I was hoping I could make use of it as well. I sure need some advice right now.

MAX: What’s up, Pierre?

PIERRE: Well, I’ve had health problems for some time now. Last month I finally went in for a physical and had a CT scan. Seems being a Marlboro man caught up with me. I have lung cancer.

MAX: Oh no! Pierre, why didn’t you say so?

PIERRE: It’s not something I want to burden others with. I’ve only told Jane. She’s my fiancée now.

MAX: Jane and you got engaged? I had no idea!

PIERRE: I’m telling you, that woman is a saint! She’s been my rock.

MAX: You’re blessed to have somebody by your side.

PIERRE: Oh Jane’s the best. Which is exactly why this cancer business rattled me so much - the thought of our time together being cut short is too much to bear.

MAX: What are the doctors saying?

PIERRE: Last month I started therapy, but it could go either way. My chances are pretty much fifty-fifty. Which brings me to my question for Mr. Smith. Which fifty is going to be mine?

MAX: Now that’s a heavy question. A question like that is probably something you don’t want a categorical… [sound of keystrokes] … Answer to.

JOHN: Okay, so you said Pierre Tremblay, right? I’ve found you in the simulation, and I have good news and bad news. Which do you want first?

PIERRE: I mean, do I even have a choice? I thought that was the whole message here.

JOHN: Right. So I’ll give you the bad news first. Ready?

PIERRE: As ready as I’ll ever be, I suppose.

JOHN: The bad news is that you will die.

PIERRE: Oh no.

JOHN: The good news is that it will be many, many years from now. I mean, we all will. In fact, when you use the Blue Algorithm, that’s the thing that strikes you the most. You realize that all of us are already dead - we’re just waiting for our turn!

PIERRE: Oh. Wow. That’s… That’s such a wonderful thing to hear Mr. Smith! Well, not the all-of-us-being-already-dead part, but the fact that I get better definitely is!

MAX: I’m so incredibly glad for you Pierre!

PIERRE: Thank you! You know, I have to confess, ever since I found out my prognosis, I’ve been engulfed in some pretty black, bleak thoughts. I’ve even caught myself thinking… Wow, this is difficult to even admit now, but I was contemplating the easy way out.

MAX: Oh please don’t do anything rash, Pierre, please!

PIERRE: Don’t worry, I haven’t done anything, obviously! But it’s fascinating, really, how much time I’ve spent staring at my gun locker since I learned about my illness. It had this strange effect on me, pulling me like a… Almost like a magnet.

MAX: In that case, why don’t you get rid of that arsenal of yours for the time being? You know, just to be on the safe side. You’re going through a rollercoaster of emotions; it might be better to keep your distance from weapons until the storm has passed.

PIERRE: Yeah, that’s probably smart. I’ll make sure I do so! Thank you guys. And thank you Mr. Smith! This has been very helpful. Cheers!

KARI: Cheers Pierre! And cheers to your fiancée!

MAX: Yes, send my love to Jane, would you?

PIERRE: Sure thing Max. You’ve got to come to our place for dinner.

[hangs up]

MAX: So tell me John, are you purposely trying to be a dickhead, or does that just come naturally to you?

JOHN: What do you mean? This one went well!

MAX: Yeah maybe so, but did you have to break the news to Pierre like that? “Bad news is that you will die?”

JOHN: Oh that. Sorry about that. You see, when you can tell the future, it’s easy to forget how it feels to be surprised about something.

KARI: But seriously John, let’s talk about this a bit more. What’s this deal about not being able to change one’s future?

MAX: See, that’s what I’m having an issue with. I mean, surely it should be possible to change the future once you hear what’s going to happen? For example, you tell me I’m going to raise my left hand - can't I just do the opposite, and raise my right hand instead?

JOHN: You can’t, since your awareness of the fact that I told you what will happen will simply be one more input in the chain of events. An input that the “demon” - if that’s what we choose to call it - already accounted for when making the prediction.

MAX: So if I asked you if I will raise my left hand or my right hand, what would you tell me?

JOHN: I would tell you that if I say you will raise your right hand, you will raise your left hand, and if I say you will raise your left hand, you will raise your right hand. Or - to borrow the demon metaphor again - the best demon would simply answer “you intend to do the opposite of whatever I tell you.” Either way, your future is set.

KARI: John, I don’t like what the limitation you’re describing implies. Are you aware you are undermining the very concept of free will?

JOHN: Free will?

KARI: Yes, freedom to make choices in life!

JOHN: I don’t understand what you mean by choices. Were you free to choose your height? Your parents? From where I'm standing, the Blue Algorithm changes nothing. The only thing this tech does is uncover what has so far been hidden from us. And how can knowledge ever be worse than the alternative?

KARI: Knowledge is neither inherently good nor bad. It’s what people do with it that counts.

JOHN: But that’s just the thing, people overestimate how much doing they actually do. I mean, when was the last time you made a decision about your life?

NATASHA: If I may jump in for a moment. I’ve made a huge decision - the one to move to Canada. I used to live in Russia ‘til I was 25.

JOHN: And how did you end up here?

NATASHA: Ah well, I applied for PR - permanent residency - in Canada and Australia. Well, guess who processed me first.

JOHN: Which goes to prove my point! I mean, when you think about it, were you the one who made the decision to move to Canada, or did Canada make the decision for you when it admitted you before Australia?

KARI: But if what you’re saying is true, then we don’t have control over our own lives! Then we are just… Just characters in a play!

NATASHA: I guess that would make this a radio drama!

KARI: [chuckles]

JOHN: I honestly don’t understand why you’re so bothered by this limitation of reality, but not bothered that gravity makes it impossible for you to fly.

KARI: Because in the second case, we are talking about a physical limitation.

JOHN: From where I’m standing, in both cases the limitation is physical! I mean, what are thoughts but a bunch of neurons moving in synchronized patterns? Are you suggesting neurons are not bound by the laws of physics?

NATASHA: Guys, as much as I’d love to get ontological again, we have a new caller. So could we hear from another listener now? John?

JOHN: Absolutely, my pleasure.

MAX: And hopefully for the listeners too.

JUAN: Hello, did I get through?

MAX: Yes sir, we hear you loud and clear in the studio!

JUAN: My name is Juan Perez, and I’m a bar owner. I’m also the husband of Maria Rossi.

MAX: Oh… You’re the gentleman with the affair.

JUAN: Yeah, nice work ratting me out.

KARI: I apologize, Mr. Perez - the show is taped live, there’s no way we could have predicted what was going to happen… Well, I guess there is a way, but we didn’t know that at the time.

JUAN: Either way, the horse is out of the barn now. Anyway, I did not call you to lament. I know why what happened happened, and I don’t need to explain myself to you.

MAX: So why are you calling then, Mr. Perez?

JUAN: I’m calling because you have now caused me a great dilemma. Now that Maria knows her future, I need to know my future as well, don’t I? I mean we need to have information parity here, or I’m at a disadvantage.

JOHN: Certainly Juan, I can check your future for you. You just need to tell me which aspect you’d like me to look up and I can run the query.

JUAN: Well, I’m specifically interested to hear how my life with my lover is going to be. Is our relationship gonna be… Is it gonna be, I don’t know, I guess fulfilling is the right word.

JOHN: Well, that is kinda subjective, wouldn’t you agree? All I see here are facts, I don’t have any insights into your emotional state.

JUAN: Just tell me what’s gonna happen!

JOHN: Certainly sir. [sound of keystrokes] Hmm… [sound of keystrokes] Well, what I can tell you is that you and your lover will separate in an antagonistic manner. So I’d venture to say that “fulfilling” isn’t the adjective you’d likely use to describe the relationship?

JUAN: That… That just cannot be! She understands me! We would never split up like that! Especially not now, after I left my wife to be with her!

JOHN: I’m not making a judgement of you or your relationship. I am merely stating the facts.

JUAN: I’m telling you your facts are FALSE. And I will prove you wrong!

KARI: Mr. Perez... Juan, listen. Even if the prediction turns out to be true, you will leave because of one reason and one reason alone: because you are unhappy.

MAX: Kari is right. Breaking up doesn’t mean you failed. In fact, two people staying together when they shouldn’t is a much bigger failure in my book. And I’m sure you understand that, in your current position with Maria?

JUAN: I… I… I guess you’re right. It’s not like I haven’t tried explaining to my wife the difficulties in our relationship. But when your grievances fall on deaf ears, you feel lonelier than if you were actually alone.

KARI: I understand what you are saying Juan.

JUAN: This has been a long time brewing. And you know what, perhaps it’s for the better. Thank you.

KARI: You’re welcome.

[hangs up]

JOHN: Was that better than the last time?

KARI: Well I guess it depends on your definition of “better”. The bar had been set pretty low.

MAX: I for one don’t have much sympathy for the guy. He is cheating on his wife after all. I mean, his ex-wife. I mean future-ex-wife. Yeah.

KARI: It does raise an interesting question of moral responsibility in light of this invention.

JOHN: Discovery.

KARI: I mean think about what this means for morality. If we are not in control of our actions, there can be no moral consequences. After all, it's not my fault that I stole my neighbour’s car if that was my fate all along.

JOHN: Not necessarily. In a way nothing changes - you will still get punished, because that’s what our moral compass tells us is right. You could say we are no more free to not punish the thief than the thief was free to not steal the car.

KARI: But what does this mean for personal liability? Let’s talk about a more extreme case: imagine you learn that your neighbour is going to murder you, so you decide to kill them first. Are we talking about legitimate self-defense in this case?

JOHN: That’s… That’s a good question.

KARI: What we need are good answers. Because these are the situations you will have to deal with once you launch your app.

JOHN: Look, I am aware that in a limited number of cases people might find their future... Disturbing. But luckily there’s an easy fix for that.

MAX: An easy fix?

JOHN: Moderation.

MAX: Oh, you mean using your app in moderation?

JOHN: Oh no, no, no, no, no. Moderation as in moderators. The Blue Algorithm app will employ a team of moderators, ensuring all sensitive content is promptly removed, and creating a safe and welcoming environment for all.

KARI: Honestly, this sounds more like you’re trying to shirk responsibility rather than grapple with it...

JOHN: But that’s just the thing - we’re not responsible. I told you, the Blue Algorithm does not determine your future. The only thing it lets you do is see it. The app is just a platform.

KARI: But how do you think the authorities are gonna react to the fact that you know what your users are going to do? For example, if you know someone will shoot a bunch of kids? Shouldn’t the authorities pre-emptively arrest them?

JOHN: That just won’t come up as a possibility. I can personally assure our users that all their information is going to be encrypted and governmental agencies will have absolutely no access to it.

NATASHA: Okay cats, we have another call. So perhaps we could ask the caller for her opinion on these thorny topics?

KARI: Absolutely Natasha. Hello, who do we have with us tonight?

JANE: Hello, this is Jane.

MAX: Jane? Jane Doe? Pierre’s Jane?

JANE: Yes, Pierre’s Jane. Hi Max.

MAX: Guys, this is Jane, Pierre’s fiancée!

KARI: Well isn’t this a coincidence! But I guess our guest tonight would say there are no coincidences.

JOHN: Good evening Jane.

MAX: So what have you been up to Jane?

JANE: Nothing much… I’ve been working as a waitress for the last year. And taking care of Pierre since the… The prognosis.

MAX: Let me say Jane, I think the way you have stood by Pierre during his illness is admirable. I’m glad the Blue Algorithm had good news for you two.

JANE: Actually, that’s why I am calling you. I have to ask the Blue Algorithm something myself.

JOHN: What is it Jane?

JANE: Oh, wow, this is… This is actually very hard to ask. Okay, so I’m just going to say it. Should I leave Pierre?

MAX: What! Come again?

JANE: Well, I don’t feel I’m in love with him. Don’t get me wrong, I do love him, but more as someone who is very near and dear to me, you know? This is how I felt about him right from the start. He just has that aura of a friendly person. As you can surely attest to yourself Max, with how long you guys have been friends… And then his illness appeared, and I felt like… Like I’d be the worst person in the world if I left him! [snivels]

KARI: Jane, you are being very brave. Thank you for your sincerity and courage.

JOHN: And luckily for you Jane Doe, your dilemmas are going to be resolved in an instant! Let me just run a quick query through the Blue Algorithm. [sound of keystrokes]

KARI: John, may I… John, may I have just a quick word with Jane first?

[sound of keystrokes stops]

JOHN: By all means.

KARI: Jane, you don’t need the Blue Algorithm to tell you whether to stay with Pierre or not. You know what the answer to your question is. And even if you don’t know it, you can feel it. We don't need a program to tell us our future. We are our own future.

JANE: I guess you’re right.

KARI: You know I am. You feel I am.

JANE: I do. And I know what I have to do. I have to swallow the pill and make the tough decision. No matter how painful it’s going to be. I just wanted John’s machine to give me strength, that’s all.

KARI: Of course Jane. And I guess John’s algorithm is useful in that regard. It can give us courage to think of all the possibilities ourselves, and then let us do the one we know is true. No matter how hard it is.

JOHN: Guys, while you were talking, I found Jane Doe and Pierre Tremblay in the simulation, and the answer is no, Jane, you shouldn’t leave Pierre. You stay together.

JANE: … What?

JOHN: I see your entire future with him right here. You decide to settle and marry Pierre. In the months after the wedding you are still restless about whether you’ve made the right decision, but then you have kids - twins, in fact. You die in old age with a devastated Pierre singing to you gently at your bedside. He loved you more than you’ll ever know.

JANE: But… That can’t be… This must be some kind of mistake!

JOHN: I’m afraid the Blue Algorithm doesn’t make mistakes.

JANE: Well I’m telling you, this is just not possible! You see I… [whispers] I’m already with someone else.

MAX: You’re what?

JANE: I’m seeing someone who is not Pierre. He owns a bar where I work. And that person just left his wife for me!

MAX: Oh Jane… Are you… Are you talking about Juan?

NATASHA: Jesus, did I move to Santa Barbara?

JANE: I’m so sorry! Believe me, falling for another man was never my intention! But I have… I mean, you keep talking about all of us having an immutable destiny. Juan and I didn’t choose to fall in love, so please, please don’t blame us for it! [snivels]

JOHN: I’m not applying any moral judgment whatsoever. I am simply obliged to repeat to you what I already told Juan - that the future of your relationship is unfortunately… inauspicious. This will become apparent to you and you will stay with Pierre.

JANE: This is… This is bullsh- [hangs up]

KARI: Before you ask John, that one wasn’t “better”.

JOHN: Well, she did get an answer to her dilemma.

KARI: But how can we know that if we’ll never see what would have happened in an alternative scenario?

JOHN: There you go with alternatives again. I’m telling you, there are no alternatives! This is exactly why this tech has such a potential to do good. It can free us from our illusions. It can do away with the tyranny of choice. Imagine all the anxiety turned into relief when we finally understand that no matter what we do, everything is going to turn out just the way it was always supposed to. The Blue Algorithm will at long last allow us to simply relax and enjoy life!

KARI: I’d say that the psychological effect of knowing your future is fixed would trigger exactly the opposite emotional reaction - it would lead to hopelessness and depression. Especially in young people. Just imagine being a teenager and seeing your classmates having bright futures while yours is not what you had hoped for. Wouldn’t you be dejected?

JOHN: See, I’d argue the reality check provided by the Blue Algorithm would do more good than harm. I mean, already the majority of teens don’t grow up to become influencers and streamers. Are you suggesting all of them are emotionally scarred for it?

KARI: But there’s a qualitative difference between this realisation setting in gradually over time and learning your dreams got cancelled! At least as you age, new dreams take the place of the old - nobody just steals them all overnight!

JOHN: Steal them?!

NATASHA: Ladies, ladies! We’ve got another caller. Well, it’s actually the same caller - Pierre is calling us back again.

MAX: Oh shit - he must have heard Jane saying all of that! Let me take his call, he’s my friend after all... Natasha, let him through.

PIERRE: [tearful] Hello?

MAX: Hello Pierre.

PIERRE: Hi Max.

MAX: So, Pierre, I trust you’ve been listening to the show?

PIERRE: Indeed I have! And let me tell you, I’m glad you told me to get rid of my guns.

MAX: Pierre, don’t say that!

PIERRE: You’re the ones who keep repeating the importance of honesty! And the honest truth is, who knows what I would have done if I hadn’t just tossed my entire gun collection over a bridge.

MAX: But when you think about it, ultimately it’s all good news. First you’ll beat cancer, and then you and your love will grow old together!

PIERRE: Yeah, apparently she doesn’t have an alternative.

MAX: Oh mate, don’t look at it that way! She did have a choice about whether to stay with you while you had cancer!

PIERRE: Well, technically speaking, I still have cancer!

MAX: Well, you know what I mean! Jane decided to stay with you when she didn’t yet know that you’d be cured! Surely that has to count for something?

PIERRE: I guess. But it’s very hard to think about it that way. Sure, the end result is important, but so is how you got to it, you know?

MAX: Pierre. Jane made a mistake. Let her make it right and prove that your fate - your fate together - it’s not a mistake!

PIERRE: I… I guess I owe her a chance.

MAX: Absolutely! You know… How do I put this? Rarely does fate give us the soup and the spoon at the same time. We have to make the best of what we have.

PIERRE: That’s a nice way to think about it. Thanks Max.

MAX: Got any more questions for John and the Blue Algorithm here?

PIERRE: I think I’ll leave Mr. Smith alone and quit while I’m ahead, thank you very much.

MAX: Probably for the best...

[hangs up]

JOHN: See, it’s a happy ending!

KARI: Speaking of which - John, what’s the end goal here?

JOHN: What do you mean?

KARI: Is the goal to one day simulate all of the future?

JOHN: What you’re effectively asking is: is the future finite? Is there an end of time?

KARI: Well, is there?

MAX: And if so, can you compute everything?

JOHN: The best I can answer is... Not yet. But eventually probably yes.

MAX: What happens when you do?

JOHN: I don't know. Maybe nothing. Maybe the universe crashes. Maybe we discover there is no universe.

KARI: There is no universe?

JOHN: Well, I’m sure you considered the logical extension of what the Blue Algorithm indicates.

KARI: The logical extension?

JOHN: Well, there is a possibility that the reason why the future can be simulated, is because it is a simulation to begin with.

MAX: Simulation? Are you saying that we’re living in... inside a program?

JOHN: No, what I'm saying is if that’s true, we are a program.

KARI: Okay, let’s just say that’s the case, and we really live in a computer simulation. How could we even tell?

JOHN: For starters, look for optimization. A computer always optimizes during rendering to save on processing. So for example, it’s almost certainly expensive to meticulously render light particles in our world, which is why, when nobody’s prying, they’re simply rendered as waves. Another giveaway is the existence of absolute values, like maximum cold and maximum speed.

KARI: You mean zero kelvin and the speed of light?

JOHN: Again, characteristics of our known reality that don’t really make sense if you think about it.

MAX: But what if I want to get out of this simulation?

JOHN: Get out? Where?

MAX: Well, outside of it!

JOHN: There is no outside. I mean, maybe there is, but there is certainly no you outside.

MAX: But what if we don't want to live like this?

JOHN: Well I guess the same thing you can do if you don’t want to live now. If you ask me, you’re being too dramatic about it. What difference does it make if underneath it all are atoms and neurons, or bits and bytes?

MAX: It makes a world of difference - I mean literally!

KARI: Before we get all nihilistic here, let’s consider a much more important question: if all around us is indeed just a program, is there a programmer?

JOHN: Depends on what you mean by programmer.

KARI: I mean if this is a simulation, who is simulating us?

JOHN: Most probably, we are.

KARI: What? Why?

JOHN: Well, aren't we on the verge of creating AI? And aren’t we kind of freaking out about whether it will kill us all? Well, one way of averting that would be to give the AI the experience of being human. To have it learn by simulating the history leading up to it. And by that I mean ALL the history. The best way to do that would be to split the AI into billions of tiny pieces. To experience how it was to be a scared and hungry ape in the African savanna. To experience how it was to be Genghis Khan. How it was to be every one of Genghis Khan’s victims. To experience love. To experience loss. To collectively experience EVERYTHING. Simulating ourselves collectively would be a smart way to develop a wise - and forgiving! - artificial intelligence that might just decide not to vaporize us the first chance it gets. So not only are we quite likely in a simulation, it’s likely we are AIs living through the lead-up to our own creation.

MAX: Are you pulling our leg?

JOHN: The honest answer is: I don’t know. With the current state of quantum computing it is just not possible to get to the answer. But to be clear, what I said is the good scenario.

MAX: So what’s the bad scenario then?

JOHN: Maybe people of the future are simulating freakish alternate timelines, for example one in which Great Britain left the EU. Or perhaps our whole world is an environment to test different strains of diseases and vaccines for them. Or maybe we weren’t planned at all. Maybe the goal of simulation is simply to simulate the Universe from the Big Bang forward. Maybe life forms in this simulation are just a side effect.

MAX: Great, so we are left behind in a reality where someone forgot to flick the OFF switch.

NATASHA: Guys, guys, guys, we are running out of time, and we should definitely hear from one more listener.

KARI: Thanks Natasha! John, try not to make this one cry if possible.

MARIA: [crying]

JOHN: Hey, don’t blame this one on me!

KARI: Hello, who do we have calling us? And why are you crying?

MARIA: [in tears] Hello, it’s Maria... Maria Rossi again.

KARI: Maria? Juan’s wife? I thought you went on a walk!

MARIA: I did. But I continued to listen to the show on my phone’s radio. I didn’t even know phones had a radio! [bawls]

KARI: I’m so sorry Maria! I’m sure listening wasn’t easy.

MAX: Perhaps there’s something else you could ask John’s algorithm? Hey I know you’ve been given a lot of pepper, but I am sure there’s some sugar in your future as well!

MARIA: Oh, no no no no no no. I’m not calling to ask what to do. I’m calling to tell you what I’ve done.

KARI: And what is that?

MARIA: I shot the cheating bastard!

MAX: You shot Juan?!

JOHN: Which is why I told him his relationship was going to end badly.

MAX: Wait, you KNEW this and didn’t say anything?

JOHN: Hey, it was you two who told me to be less of a dick! Remember that bit about a wise man omitting stuff?

KARI: So the reason you’ve been telling Jane she doesn’t have an alternative to Pierre is because Juan is dead?!

MAX: Why the hell didn't you warn him?!

JOHN: For the last time, it wouldn’t have changed anything! He'd still get shot, and I’d still get blamed for it!

MAX: Well you ARE getting blamed for it!

JOHN: Which is again what I’ve been telling you! Raising the right hand or raising the left hand, remember? No matter what we do, the outcome is the same!

KARI: We can return to this later - right now I have a more pertinent question for Maria: How the hell did you end up shooting your husband?!

MARIA: [sobbing] Well it’s a funny story actually...

MAX: Is it?

MARIA: I told you I went for a walk, right? So I was walking under this bridge, when all of a sudden a bag full of guns got tossed from the top of it!

MAX: Pierre, you idiot…

MARIA: It was then that I knew what I had to do. I mean, guns falling into my lap from the sky - this was definitely a sign!

MAX: Oh it’s a sign all right - it’s a sign that PIERRE IS AN IDIOT.

KARI: Okay, okay, okay, let’s all take a deep breath here… Maria, shooting Juan I can understand. Not condone it! But definitely understand. What I cannot comprehend is - why are you telling us this?

MARIA: Because I'm in front of your studio and I'm coming to get your little fortuneteller too!

MAX: You’re what?

[sound of keystrokes]

JOHN: She’s telling the truth. She’ll be here in 30 seconds.

MAX: Natasha, lock the door!

JOHN: There is no key.

NATASHA: There is no key!

JOHN: It will take them too long to get here.

KARI: Anybody listening to this, call the police!

JOHN: 15 seconds.

NATASHA: Holy shit. Everyone, stay cool!

[sound of studio doors opening and closing]

MARIA: So, you must be John Smith... See, I don’t have a fancy algorithm but I knew that!

JOHN: Um, how’s it going?

MARIA: Don’t ask me that like you don’t know! I mean, don’t you know everything?

JOHN: Actually, no I don’t. It's not like I've got a too-long-didn't-read version of the future in front of me.

MAX: John, you're making a classic mistake of confusing speaking with thinking.

MARIA: But you know what I’m here to do, right?

JOHN: Um, if it was supposed to be a surprise reveal you kinda spoiled it already...

MARIA: Oh, you’re the right person to talk about spoilers!

KARI: Maria, please…

MARIA: Don’t you dare to try to talk me out of this! Don’t you see that this… Sleazy soothsayer doesn’t give a flying fuck about any of us? He would literally sell our future to make a few bucks! He has to be stopped!

KARI: MARIA WAIT! Wait! [calmly] So, the reason you want to shoot John is because you think the Blue Algorithm really works, correct?

MARIA: Yes.

KARI: Why don’t we then ask the Blue Algorithm if you shoot him or not? I mean, if everything is preordained, the Blue Algorithm will tell you to shoot John if you really have to. Then all you have to do is not shoot him to prove the Blue Algorithm wrong and that we are all in charge of our own destiny!

MAX: Great idea! Go on John, tell her to shoot you!

JOHN: Guys, all of this is irrelevant. It doesn’t matter either way.

MAX: John, this is not the time to channel your inner three-eyed-raven.

MARIA: Well, of course he is cool about all of this! Don’t you get it? He had simulated his future in advance! The bastard knows what’s going to happen!

KARI: But wait, wait, wait, wait, wait. If he indeed did that and he is so cool, wouldn’t that imply he knows he doesn’t die today?

MAX: There you go, that means Maria doesn’t shoot him and this whole situation is resolved.

KARI: Right, right.

MAX: Problem solved!

MARIA: That is true…

MAX: Beautiful!

MARIA: But I bet he doesn’t know if I shoot ONE OF THE REST OF YOU!

MAX: What? No, no, no.

MARIA: Okay Nostradamus, if you are so certain about your fate, TAKE A LOOK IF I SHOOT KARI, MAX, OR NATASHA!

NATASHA: Good Lord! If I wanted to get shot I would have moved to the United States!

MARIA: Quite frankly I don’t give a damn who I shoot, as long as someone gets shot. It just has to happen.

KARI: But now you’re promoting the same fatalistic philosophy as John. Nothing has to happen!

MARIA: That’s what I’m telling you - it’s precisely because this doesn’t have to happen that I’m doing it!

MAX: But wait, didn’t you just say it has to happen?

NATASHA: Pardon my intrusion. Since we are brainstorming here, is Maria shooting herself also a possible solution?

KARI: Yes!

MAX: You’re on to something there.

KARI: I like that.

JOHN: For fuck’s sake, enough bickering already! Look at you, yapping like the universe cares what any of us think! Regardless if I get killed or not, it doesn’t matter. The Blue Algorithm will prosper either way.

MARIA: And why is that?

JOHN: Because I’ve set up a time-bound release of its complete code base in case I get incapacitated.

MAX: Holy fuck, he set up a dead man’s switch!

MARIA: Dead man’s switch? What the hell are you talking about?

JOHN: It means that if I don’t manually extend the timer every 24 hours, the complete code repository of the Blue Algorithm will get uploaded to the Internet, effectively making Blue open source.
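[Editor's aside for the technically curious: the release timer John describes is a standard pattern - the dead man's switch Max names below. Here is a minimal sketch in Python; the 24-hour reset window is the only detail taken from the dialogue, and release_repository() is a hypothetical placeholder for the upload step.]

    import threading

    RESET_INTERVAL = 24 * 60 * 60  # seconds; John states a 24-hour window

    class DeadMansSwitch:
        # Fires on_trigger unless reset() is called before the deadline.
        def __init__(self, on_trigger):
            self._on_trigger = on_trigger
            self._timer = None
            self.reset()  # arm the switch on creation

        def reset(self):
            # Manually extending the deadline cancels the pending release
            # and starts a fresh countdown.
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(RESET_INTERVAL, self._on_trigger)
            self._timer.daemon = True
            self._timer.start()

    def release_repository():
        # Hypothetical placeholder for the upload John describes.
        print("Uploading the complete code repository...")

    switch = DeadMansSwitch(release_repository)
    # While the owner is able, they call switch.reset() at least once
    # every 24 hours; if the resets stop, the release fires automatically.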

MAX: Which means anyone will have access to this technology...

JOHN: Raising the right hand, raising the left hand.

MARIA: Okay... Sure, the code will be released. So what? It doesn’t mean that others will know what to do with it. After all, it is Mr. Prophet here who is the inventor!

MAX: Actually discoverer.

JOHN: You keep calling me a prophet, but the fact of the matter is that I am irrelevant. You see, if this is indeed a prophecy, it is a self-fulfilling one.

MARIA: And what do you mean by that?

JOHN: Think about it. If anyone can access the Blue Algorithm, then anyone can look up the end product. By that I mean the Blue Algorithm as it looks in 5, 10, or even 50 years from now. And then they can simply copy-paste its future code into the present. When you think about it, you could say that Blue is actually building itself.

MAX: Like a computer scientist that was itself a super-intelligent computer!

JOHN: You asked me why I named it the Blue Algorithm. The truth is, I didn’t. I just saw it was already called that.

MARIA: Well this just further proves that you don’t know what the hell you’re doing!

KARI: Maria, I know it sounds scary, and believe me I...

JOHN: Oh Kari, don’t you see? She may talk about decisions, but she had determined what she would do before she even walked in here! It’s not the first time a big idea has encountered small minds.

NATASHA: For fuck’s sake, give me the gun, I’m gonna shoot him myself!

JOHN: Maria is simply playing her historical role. As am I. When you think about it, it’s one of the oldest thoughts that have ever been written down.

KARI: And which one is that?

JOHN: No man is a prophet in his own town.

MAX: So what are you saying? That we should just let her shoot you?

JOHN: And why do you think I came on this show?

MARIA: … Right. So I guess it’s settled then.

KARI: Maria, don’t do this.

JOHN: Leave Maria to do what she has to do! History is a foundry - what does it matter which of us is the anvil and which the hammer?

MAX: Oh I get it, that’s why your surname is Smith!

JOHN: Oh... No, I didn’t even realize that until now. That’s just a fluke.

MARIA: Christ, can we just get on with this before I change my mind?

KARI: I just want to say that just because I’m content with my destiny, it doesn’t mean I’m looking forward to it. But satisfied or not with the role that has been given to us, we must play our part. [exhales] All right. I’m ready… Oh, actually, before you pull the trigger, just one more thing…

KARI: Chairman Meow is hiding in the shed behind Mrs. Wheeler’s house. She’ll find him in two days.

[sound of a gunshot]

[The Program main theme]

ANNOUNCER: This episode of The Program was made by ten people: Michael MacEachern, Barb Sybal, Phil Sampson, Daria Tazbash, Martha Breen, Chris Peterson, Frank Salvino, Marlo K. Shaw, Christien Ledroit, and IMS. Visit programaudioseries.com for more details. The show is completely self-funded, which means we depend on your support to continue. So may we please suggest donating a dollar per episode you’ve listened to? After all, a podcast worth listening to is a podcast worth paying for.

WRITTEN, DIRECTED, EDITED AND PRODUCED BY

Ivan Mirko S.

CAST

JOHN SMITH - Michael MacEachern
MAX MUSTERMANN - Phil Sampson
KARI NORDMANN - Barb Sybal
NATASHA PETROVOVA - Daria Tazbash
MARIA ROSSI - Martha Breen
PIERRE TREMBLAY - Chris Peterson
JUAN PEREZ - Frank Salvino
JANE DOE - Marlo K. Shaw

ORIGINAL MUSIC AND ADDITIONAL SOUND DESIGN BY

Christien Ledroit

REFERENCES:

original art by Carlos Costa
Courtesy of Andrew Hoover