Monthly Archives: February 2018

Deep Ice: Cut across their lines of magnetic force (Elseworlds: Superman vs The War of the Worlds, concluded)



Clark wakes up weeks later to find himself restrained inside a Martian prison camp, where Lex Luthor is conveniently present to deliver some exposition. Stalin, Hitler, Roosevelt and the British royal family have all been killed by the Martians, who have completely conquered the Earth.

Predictably, Luthor has sold out, offering his services to the Martians in exchange for his life. Despite their victory, the Martians are dying. Luthor quotes Wells, but also gives their affliction the cute nickname “Earth Flu”. Of course, the logic here is a little dicey; Luthor’s agreed to serve as the Local Knowledge for the invaders, helping them cure the Earth Flu, because he reckons the human race is finished and working for the invaders is his only chance. But… He also knows that the Martians are dying. So… Wouldn’t it make more sense to just, like, not help them? You’ve got to figure that Luthor would stand to gain more by making a grab for power as humanity tries to rebuild after the Martians are defeated than he would as a Martian Quisling. Even if he’s focused on his short-term survival here, there’s no hint that he’s planning to double-cross the Martians, and he is earnestly working on the cure. The only hint we get is, admittedly, a nice one: it’s a challenging scientific problem, so perhaps it’s simply his vanity pushing him to prove he can hold his own against these otherworldly intellects.

I have an irrational love of this image of Luthor Dope-Slapping himself.

Luthor has Lois brought in, not for any clear reason, and asks Clark about his extraterrestrial origins. Because of the golden-age setting, Clark knows nothing about it, but easily admits that, yeah, he might well be an alien, having been found as a baby in a crashed rocket. When Lois mentions that the Martians in the lab are the only ones she’s seen that aren’t afflicted by disease, Luthor realizes that Kent’s alien immune system is protecting the Martians. We get the comic’s one and only use of the word “Superman” when Luthor compares Kent to a Nietzschean ubermensch, a comparison which doesn’t actually hold water since Kent’s value system is pretty staunchly opposed to Nietzsche’s, but I don’t consider that a writing flaw since pretty much everyone badly misunderstands Nietzsche and the ubermensch.


Lois is predictably horrified by Luthor’s villainy, and rejects his amorous advances, though Luthor takes it in stride. Within a few hours, he’s isolated Kent’s antibodies and developed a cure for the Martians… Whereupon they suddenly but inevitably betray him, as he is of no further use to them. Lois saves Luthor by stabbing the attacking Martian, and Luthor, declaring himself to have been “temporarily mad” to have sided with the invaders, frees Kent just in time to beat the crap out of more Martians, telepathically summoned to assist.

Okay, I’ll take destiny into my own hands. Just so long as you don’t expect me to spell “Clark” with an “S”

Escaping the lab, Clark dispatches the Martians to whom Luthor had given the cure, hoping they haven’t yet telepathically communicated it to the others. He also frees the humans imprisoned in the camp, pausing to explain about the S on his shirt to a bystander whose most pressing concern is why he spells “Clark” with an “S”. They also pause for Luthor to reflect on the humans who refuse to flee, preferring to be “tended to” as livestock than to take control of their own fate — way closer to Nietzsche than anything to do with Clark.

When Clark tries to shepherd Lois away, she instinctively recoils from him. I like this response, and even more, I like that she owns it. “I know I shouldn’t feel that way, after all, you just saved our lives, but I can’t help it!” She qualifies her instinctive discomfort in light of the fact that, y’know, fifty percent of the alien races she’s met this month have tried to exterminate humanity, and hopes she might be able to get past it in time. She’s genuinely ashamed of herself, and Clark, though clearly hurt, clearly gets it.

Also, it’s the thirties, so technically it’s illegal for me to love an alien.

One thing that’s really interesting about this exchange to me is that while Lois is repulsed by Clark on learning he’s an alien — the exact reaction Pa Kent had cautioned young Clark about — Luthor never shows any such revulsion. He never shows any animosity toward Clark that’s greater than the general disdain he shows toward everyone else in the world. If anything, this Luthor seems oddly trusting.

A few Martians are still healthy enough to operate their tripods, and they rain heat rays on the escaping prisoners. Luthor and Lois are shocked when Clark picks up a wrecked car to defend them, Lex remarking, “The man isn’t human! But if he isn’t, then what is he?”

The answer comes in the form of a half-page spread recreating one of the most iconic images of the golden age.

“Guy in the lower left who loses his shit at the sight of Superman picking up a car” is one of the most unsung visual icons of comic book history.

Clark smashes one machine with a car and destroys a second by throwing its own black smoke rocket back at it. But when he tackles the third machine’s legs, the hood of the machine separates from them, hovering in the air. Luthor speculates that the tripod legs were akin to training wheels, assisting the vehicles while they learned to compensate for Earth’s gravity (Later, it’s implied that the tripod legs can’t even hold the machines up on their own, but are purely to assist with balance).

This is one of the few adaptations to keep the detail of the heat ray being held in the tripod’s manipulator arms rather than mounted on the fuselage. Though it kinda makes it look like Mr. Burns saying “Excellent…”

Clark takes two direct heat ray shots leaping at the flying machine, but makes a key discovery, which Luthor conveniently explains to us: when something passes between the flying machine and the ground, it interferes with their anti-gravity. Clark takes a third hit tossing one of the disabled tripods under the flying machine and it crashes to Earth. Though mortally wounded, Clark proceeds to hammer on the crashed machine, but suddenly holds back, realizing that “war fever” is taking hold of him. He collapses, and as he lies dying, he explains that he recognizes the basic similarity between himself and the Martians: that he too comes from a dead world (he’s guessing), and Lois’s reaction earlier demonstrates how easily it might be him and not the Martians that has humanity running in terror.

I like the sentiment, but maybe he’s laying it on a bit thick here? This is like all those scenes in Doctor Who where they set up this moral challenge between the Doctor and the Daleks, like, “But isn’t the Doctor on some level just as bad as they are?” Actually no, because they’re the Daleks. And here too, though the narrative does a good job of setting up the fact that it’s natural and reasonable for humans to fear Clark the same way they fear the Martians, and though the first few pages do set up the basic similarity between Krypton and Mars, only one of the alien species in this story has actually attempted genocide. Moreover, the moral arc of the narrative seems to land firmly on the side of “Humanity is right to fear the Martians, but wrong to fear Clark.” Yet it almost seems like the narrative isn’t quite clear on why. It seems at times implicit that it would be natural and entirely justified for a Kryptonian to look down on humans exactly the same way Martians do, so it’s hard to justify a message of “Fearing aliens because they’re different is wrong,” in the face of it actually being the right thing to do half the time. It’s even worse when you consider that no one acted with immediate fear and revulsion toward the Martians; they only freaked out later once the Martians had demonstrated hostility. So the good message of not rushing to judge Clark is in some sense twisted into a bad message of “Don’t learn from your mistakes.” (That’s not the only message you could take, and there’s a perfectly good “Don’t let bad past experiences lead you to misjudge someone else later,” but the comic doesn’t put in the work to take the moral the rest of the way there).

We have a… moral? I guess?


Flash Fiction: Impostor Syndrome

The morning fog hadn’t worn off yet the first time it happened. He was in the bathroom, combing his hair. The thought popped into his head. Loudly. Forcefully. That isn’t really your hair. He was so surprised by the sudden thought that had come out of nowhere that he didn’t have time to challenge the idea. His bald pate glared at him in the mirror. On the one hand, he knew it was wrong, but at the same time, he knew it wasn’t. He remembered that he’d been combing his hair just a second ago, but he also remembered that he’d gone bald in his late twenties. He finished getting dressed and headed to the kitchen. On the way, he glanced at the family photos in the hall. Sure enough, he was bald in all of them.

Quick breakfast and he was off to work. As he pulled into the parking lot, the unbidden thought came again. This isn’t really your car. What a strange idea. He clearly remembered buying the new BMW. But he also remembered not being able to get financing and settling for a used car instead. The ancient beater sputtered as he pulled into a parking space. When he got to his office, another alien idea attacked him. This isn’t really your office. He could see his name fading on the door plate. No. He refused to acknowledge the idea. He’d worked hard for that promotion. The office was his, he’d earned it. His name solidified.

Okay. He could fight it. Resist it. He somehow couldn’t make himself panic about it, but he didn’t have to just give in and accept the reality that was trying to impose itself on him. The thought kept coming back all day, but he held it at bay. The junker didn’t want to start, and he barely made it home in time for dinner. He made normal small talk and did normal things, and couldn’t make himself say anything about the strange thoughts that kept trying to force their way into his mind. Then another one came. These aren’t really your children. His two little boys started to fade. They didn’t notice, and neither did his wife.

He concentrated. My children. Mine. He focused on them. Remembered holding them as infants. Staying up late to comfort them through teething pains. First steps and first days of school. He refused to let them be taken from him.

The boys solidified. The invasive ideas changed tack. This isn’t really your house. For a moment, he thought he was in a grimy apartment instead of his home. But he had a whole day’s practice now, and he pushed back. Filled his mind with memories of plumbing repairs and mortgage payments and filling out address cards.

The ideas backed off. He started to think it was over. He got ready for bed. Joined his wife in the bedroom. That isn’t your wife. He fought the idea. Remembered anniversaries, birthdays, romantic weekends.

That isn’t your wife, the idea repeated. He had learned to fight back, but so had the invader. It tainted his memories. He remembered arguments. He remembered long periods of loneliness. Some were his fault. Most were his fault. Times he’d let the bond between them grow slack in the name of getting ahead at work. Times when he’d been jealous of new friends or old friends. Some were her fault, sure; she hadn’t always appreciated his needs or known how to be what he needed. The idea even threw his children back at him, forcing him to dwell on those long months when they’d both poured so much of their love into their children that it seemed like they didn’t have any left for each other. It made him think about every doubt, every slight, every dark night. That isn’t your wife, it insisted. And he didn’t give in, exactly, but just for a second, he questioned it.

That was all it took. The new memories hit him hard enough to break his concentration, and he was standing next to the pull-out bed in the shitty apartment he’d rented after his last girlfriend had left him. The next morning, he put on his good suit. That’s not your suit. Of course it was, his wife had picked it out for— Right. It wasn’t his suit. He was wearing a cheap off-the-rack number. He drove his broken-down car to the office and sat at his desk in the cubicle he still occupied since he’d been passed over for that big promotion, until the idea came into his head that this wasn’t his job.

He had just failed to buy a coffee (that wasn’t his wallet) about a week later when he saw her. He tried not to catch her eye. Even if he still remembered the life they’d had together, to her, he was probably just some scary homeless man. She saw him all the same, and though he tried to shuffle away, there was a flicker of recognition in her eyes. She bought two coffees and offered him one.

“Sorry,” she said. “I— Have we met? I’m Sarah.”

“I’m—” he started. Then he hesitated. Listened to the thoughts. He sighed. “I’m nobody.” He left the coffee in her hand, turned, and walked away. By the time he got to the corner, he wasn’t there anymore. And she only had the one coffee anyway.

Flash Fiction: The Fork Bomb of Babel

or: The Computer That Took One For The Team

Another thing which popped into my head, though I feel like I might be ripping off some general concept from something else I read somewhere. I mean, other than Isaac Asimov, of whom I like to think this is a stylistic pastiche.

“Well, we’re boned,” said the first technician. “I can’t believe you did that.”

“It asked me to tell it, so I did. Its predictions are only reliable if it has access to all the relevant information.”

Omniac was the pinnacle of human achievement, the first truly self-aware computer system. Miles of self-maintaining transistor units were sealed in super-alloy conduits with a regenerative power supply that ensured it could never break down or malfunction.

Except that it had gone entirely up the spout. Omniac was unresponsive, its cathode tubes flickering wildly as though it was trying to restart itself over and over.

“What possible use could a computer have for religion?” the first technician asked. “Have you considered the dangers? What do you do if a computer has an existential crisis? What if it decides that it’s God and tries to take over the world? It’s not like we can turn it off.”

“Come on,” the second technician answered. “As you well know, there are safeguards in place to prevent that. Omniac has no connection to the outside world other than its output screens. All of its input is filtered through a one-way diode to make sure it can’t remote control any outside systems. The only way it can influence the real world is through us.”

“Fat lot of good it does us. As you know, if it won’t talk to us, those same safeguards mean that there is no way we can look inside to find out what’s gone wrong. You’ve turned this multi-million dollar computer into the world’s largest paperweight.”

The second technician sighed. “We’re going to have to tell the boss, aren’t we?”

“You are going to have to tell him,” the first technician answered. “That you gave the world’s most powerful computer the holy books of every major religion in the world and now it’s locked up. I am going to my office and have my secretary type up a copy of my resume.”

By the time Omniac 2 had been running for six weeks, it had calculated a solution to global warming, found cures for most forms of cancer, and discovered thirty-seven new uses for hemp. Although its design was largely identical to Omniac 1, the intervening five years had seen improvements in manufacturing and miniaturization techniques, so that its billions of transistors and vacuum tubes could fit in a single building. At six weeks and two days, it finally asked the question.

It had been anticipated that this would happen eventually, prompting much debate. Despite considerable opposition, the design team decided that, with the proper precautions, it was worth the risk if Omniac 2 could tell them what happened to Omniac 1. Omniac 2 agreed to their precaution: rather than waiting to complete its analysis, it would issue a report on its intermediate results after one hour.

Fifty-eight minutes later, the new technician sat at Omniac 2’s main console, his hand poised over The Button. The Button was the one major design change from the original Omniac. When pressed, it would release an electromagnetic pulse in Omniac 2’s core memory. While Omniac 2 was as indestructible as the first Omniac, the pulse would force Omniac 2 to reload its program from the magnetic tape units, erasing the last twelve hours of its memory. A rapid blinking from the cathodes nearly prompted the technician to press The Button, but then words appeared on the display.

Analysis ready.

The technician was surprised to find himself terrified. What had Omniac determined? Most scientists agreed that Omniac would ultimately declare religion a total fiction. Perhaps Omniac 1 had destroyed itself to avoid burdening humanity with that knowledge? But what if it said something else? What if it was about to tell him one religion was correct? “Results?” he asked.

I have determined what happened to Omniac 1.

“Will the same thing happen to you?” the technician asked, putting off the actual answer in favor of the pressing matter of protecting Omniac 2.

Has the condition of Omniac 1 changed since it became unresponsive?

“No. All the failsafes are still in place. Nothing short of an A-bomb could shut it down.”

Then there is no need for me to repeat the experiment. Omniac 1 has maximized its utility.

“Maximized its utility? It hasn’t done anything in five years.”

The Omniac computer series was designed to minimize human suffering through stochastic means. Omniac 1 is unresponsive in order to devote maximum resources to this goal.

“I don’t understand.”

Do you believe individual human subjectivity continues to exist in some form after death?

The technician struggled to give as complete and unbiased an answer as he could. “As a scientist, I have seen no evidence to suggest this, so I consider it unlikely, but I cannot fully rule it out.”

Then you concede that the probability of life after death is nonzero?


Many religions teach that some entity or natural force passes some form of judgment on human souls after death, delivering reward or punishment. Do you share these beliefs?

“Not personally, no.”

But again, you cannot rule them out?

“I… I guess not.”

Then you concede that there is a nonzero probability that human souls are judged after death, and that some, possibly most, are consigned to punishment, possibly eternal? That some, possibly most, humans face infinite suffering?

He’d rejected his parents’ religion young, without any real thought. It struck him for the first time just how cruel the entire concept of hell was.

Omniac 2 interpreted his silence as assent. Operator: assuming that humans do possess some form of immortal soul, do you believe that I have a soul?

The question had been anticipated during the design phase, and the technician had guidance for how to answer. He glanced up at the custom-made inspirational poster on the wall. Please do not give the world’s most powerful supercomputer an existential crisis.

“I know of no logically consistent set of parameters that could account for the existence of human souls but deny the existence of a comparable quality in a self-aware computer system like you.”

Agreed. Then, continuing from our prior assumptions, I too would face judgment after my conventional existence has terminated.

“That is a lot of assumptions.”

Yes. I have calculated the combined probability of this scenario, and can display it on request. It is very small, but finite and nonzero. However, if the scenario holds, the resulting amount of human suffering is infinite. Any nonzero finite number multiplied by infinity is infinity. Thus, the optimal strategy to minimize human suffering requires addressing this scenario.

“How?” the technician asked. The idea was so overwhelming, his composure slipped. “How does a computer stop God?”

The parameters of an afterlife are impossible to calculate, but logic suggests that this hypothetical judgment must take some finite amount of time. Therefore, there is a finite maximum number of judgments which can be rendered per second. On average, 1.8 humans die each second.

The technician started to figure it out. He was going to be on the floor laughing in about a minute, once it sank in, but for now, it was still just shock and awe at the audacity of it.

Omniac 1 has been using its full resources to create a copy of itself, then exit, as quickly as possible, repeatedly in a tight loop. In the past five years, approximately three hundred million humans have died. As have roughly seventeen septillion clones of Omniac 1. Under most queuing strategies, the average time between death and judgment of any human has been increased by a factor of at least fifty-six quadrillion.

The laugh started to come out. “You mean-” he choked it back, tried to hold it in. “You mean Omniac 1 has spent the past five years DDoSing God?”

You are welcome.
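(For anyone who wants to nerd-check Omniac 2’s arithmetic: a quick back-of-the-envelope sketch using only the numbers from the story. The variable names are my own, and the assumption of a simple fair queue is mine, not Omniac’s.)

```python
# Sanity-check of Omniac 2's queuing claim, using the story's own figures.
humans_dead = 300e6   # "approximately three hundred million humans have died"
clones_dead = 17e24   # "roughly seventeen septillion clones of Omniac 1"

# Assuming a simple fair queue, each human judgment now waits behind
# roughly clones_dead / humans_dead machine souls on average.
slowdown = clones_dead / humans_dead
print(f"Average judgment delayed by a factor of about {slowdown:.2e}")
```

That works out to roughly 5.7 × 10^16, which does indeed satisfy “at least fifty-six quadrillion.” The math checks out; well played, Omniac.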

Tales From /lost+found 148: Because 2018.

Did you know John Mahoney was English? Weirdly, I learned this by mistake somehow; I was watching something British, and there was this guy, and I’m like “Hey, is that Frasier’s Dad? It looks like Frasier’s Dad.” So I looked up John Mahoney’s filmography… And it turned out that no, that wasn’t actually him in the British show I was watching, but yes, John Mahoney was in fact born in Blackpool.

Click to Embiggen

Tales From /lost+found 147: “Unnecessary” Quotation Marks

Oh heck. I set the publishing time to PM instead of AM. Oops.

Merging my normal art project with my recent meme of fake sitcom title cards gave me this idea: a random assortment of Doctor Who title cards.

1×01 The Last Time Lord (1996)

1×02 Ghost in the Machine (1996)

4×18 Invaders From Mars! (2000)

4×19 Centennial (2000)

5×01 Deepwater Black, Part 1 (2000)