Not Everything Is A Clue – Ch 77-79

Chapters 77-79: Is Joon the DM? Is Batgirl Catwoman?? Are insects videogame characters???



Playing Roy from Rick and Morty

The Time Loop RatFic is “Mother of Learning”

An early TBC episode we touched on



For next week — 80-83

80. The Princess and the Pea
81. Musings on the Elder God
82. Aboard the Lion’s Tail
83. The Familiar and the Foreign

Cakoluchiam’s stellar Character Sheet

Steven’s Predictions – Everything is a Clue

Worth the Candle can be read at AO3 or RoyalRoad.

You can support this podcast at our Patreon. Alexander Wales can be supported at his Patreon. We have a Discord as well.



  1. I loved the discussion about insects, AI, and simulated suffering. AlphaGo and a lot of AI systems nowadays do include negative reinforcement as part of their learning. Of course they’re not literally being zapped or hit; all that happens is that when they make a mistake, the strength of the connections between the neurons that fired during that mistake is weakened.

    By itself that doesn’t seem anything like pain, but these neural networks are inspired by biology: when you are in pain (mental or otherwise), your brain releases chemicals that alter the strength of connections between your neurons to make you less likely to get into that situation in the future.

    AlphaGo reportedly has millions of neurons, while fruit flies and other insects have neuron counts only in the hundreds of thousands. With this in mind, I don’t think it’s outrageous to conclude that AlphaGo feels as much pain as an insect (for however much that means anything).

    That being said, we don’t fully understand how brains work. While artificial neural networks are inspired by brains, they are much simpler. Instead of just simple positive and negative reinforcement, brains have numerous chemicals that work in different ways. Serotonin, dopamine, endorphins, and oxytocin are all chemicals associated with positive reinforcement, and while we have a general understanding of what they do on a large scale, the exact mechanism by which any one of them works is not fully known. It is possible that our versions of neural networks are accurate enough that scaling them up will result in beings just like us, but we could also be missing some unknown key ingredients.

    The other, much more philosophical, factor is whether you believe simulated beings can experience anything in the first place. Personally, I believe that any sufficiently advanced emulation of a feeling is identical to having that feeling. I don’t mean that TLOU’s Ellie has feelings anything like ours. When the writers wrote the game, they had an idea of who Ellie was in their heads, and when they wrote scenarios for her, they imagined how she would act in them. You can of course do the same: take everything you know about Ellie from the game and try to imagine how she would act in a new scenario, perhaps based on how people with similar personality traits and experiences would act. This is a form of emulation, but it is not a very detailed one.

    In real life, how a person acts depends on so many factors. With the way that past memories and experiences compound and build on each other, it’s impossible to perfectly know how a person would act without knowing everything that has happened to them (and what chemicals their body produced in the moment, and numerous other factors). There is a sheer, incomprehensible complexity to human emotions. While I don’t think that feeling is ultimately a binary on or off past some threshold of complexity, saying that Ellie is experiencing some form of feeling is like saying the number of push-ups I can do is comparable to the number of stars in the universe. It is technically true (they are both numbers and can therefore be compared), but it is a bit misleading.
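    The “weakened connections” idea above can be sketched in a few lines of Python. This is a toy illustration of an error-driven weight update (a bare gradient step on a single linear unit), not AlphaGo’s actual training code; the numbers and the `update_weights` helper are made up for the example.

    ```python
    # Toy illustration: "negative reinforcement" in a neural network is just
    # a weight update that weakens the connections which contributed to a
    # mistake. No zapping involved, only arithmetic.

    def update_weights(weights, inputs, error, lr=0.1):
        """Shrink each weight in proportion to how much its input
        contributed to the error (a bare-bones gradient step)."""
        return [w - lr * error * x for w, x in zip(weights, inputs)]

    weights = [0.5, 0.8]
    inputs = [1.0, 1.0]

    prediction = sum(w * x for w, x in zip(weights, inputs))
    target = 0.0                  # the network "made a mistake": output too high
    error = prediction - target

    new_weights = update_weights(weights, inputs, error)
    # Both connections that fired during the mistake are now weaker
    # (roughly [0.37, 0.67]), so the same mistake is less likely next time.
    ```

    That is the entire mechanism: no pain signal, just connection strengths nudged downward after an error, which is what makes the comparison to biological pain interesting rather than obvious.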

  2. Mini Fanfic, just for fun:

    “Oh, the hells aren’t mine, they belong to my Dad; I’m just supposed to give him a great variety of souls to staff them. My Dad’s a doctor; he’s trying to find a cure for dwarves. Dwarves kill half a percent of all Gaias, you know. Dwarves killed one of my great aunts.

    “Uh, how to explain… Oh yeah, demons are like an immune system. They torture pathogens’ souls in order to lure them into invading the afterlives, where they can’t hurt the Gaia.

    “Aerb is a petri dish for making souls for the hells, which are my father’s laboratories for understanding pathogens like humans, dwarves, elves, and so on. They are full of demons harvested from numerous other Gaias. We’re trying to train some really effective demons, maybe some who can cure even Bronzemurder.

    “But lately Aerb’s population went down too much and souls stopped entering the hells. I play with it for fun, but I also want you to fix it so it gets more population growth, so my Dad gets off my case about his experimental subjects.”


    Serious now:

    My personal theory is the following:

    CYOAing Kaleidoscope. The DM didn’t create this world. The Multiverse is infinite; somewhere in it there was bound to be a world containing all the things Joon made up. Knowledge is power, so the DM put him somewhere he would be God.

    Instead of bribing a weathermage, he simply put Joon into a world where there happened to be an archery contest on that day.

    The DM could fix the suffering of Aerb, but it’s more time-efficient to empower nice people who do it for him. As for the emulation stuff: that is just a manifestation of Joon’s Gamer power. Aerb DOESN’T run on the DM’s computer, but Joon’s empowering apparatus does.

    Now, in order to motivate Joon to the max, the DM is lying to him, provoking him, riling him up.

    Joon was acting unheroically in his reluctance to help the Locus; he didn’t care about the Locus, and he was unmotivated. That is why the DM is talking to him now: to give him ambition and drive.

    The DM isn’t playing DnD with Joon; Joon is just one of the DM’s many, MANY pawns.

    • Basically, if the DM told Joon the truth, he wouldn’t give it his all; he would think he’s invincible and do something stupid, etc.


      Oh, and also: I thought ‘Exclusion Zone’ was some kind of Grand Magic that you need a Council of Archwizards to cast, and that getting the Council of Archwizards to come to a decision before Zom… er, the Risen ate half of Anglecynn was the great big politics thing.

      In my theory it could also be something so dangerous that God/the DM has to intervene before it depopulates Aerb; when the DM can’t keep up with that work and sees that the world is doomed, they send in an agent like Uther or Joon.

  3. I looked into the “getting into the bottle” problem, and I figured out how they did it. When they meet Solace and go into the bottle for the first time, they touch the inside of the bottle’s rim, which causes them to warp down into it. It’s implied (if only barely, in the chapters up to this point) that this is a power of the bottle itself, not druid magic. So the real problem is surviving the fall and getting out. We saw how they survive the fall, and I think teleporting out could theoretically work; they aren’t actually shrinking when they go into the bottle, it’s just warped space.

  4. Obligatory hot take: I genuinely believe that thermostats have experiences and that the Linux kernel is probably conscious.

  5. But of course, Joon DIDN’T ask for this. The DM simply retconned in a scene where he did, then fluffed it out as “memory removal” ;)

    (or did he…?)
