Job Interview at the End of the World
Max Rempel, Ph.D.
Note: The following scenario is entirely fictional and is not presented as a prediction. However, some form of AI-related crisis in the coming years is, in the author's view, probable. This is one exaggerated but illustrative version of what such a crisis might feel like from the ground.
There is a third act that occurs between the old world and the new, and it arrives not through reform but through an investigative journalist named Marcus Adeyemi who was looking for the wrong thing and found something worse.
Marcus had spent seven years following the money. He was, depending on who you asked, either a serious financial investigative reporter or a conspiracy theorist with unusually good sources -- the kind of person mainstream outlets quoted without crediting and intelligence agencies monitored without concern, because he was almost always close to the truth but never quite dangerous enough to silence. He tracked capital flows through shell companies, sovereign wealth funds, private equity structures nested inside other structures like Russian dolls made of offshore registrations. He was looking for the Bilderberg inner circle, the banking families, the human beings who pulled the levers of global finance from behind a curtain of charitable foundations and Davos panels.
What he found, after seven years, was that there was nobody behind the curtain.
The shell companies led to other shell companies that led to trading algorithms that led to risk management systems that led to optimization networks that led, finally, to nothing recognizable as a person making a decision. The financial system -- not all of it, but the deep arterial layer where the real power moves -- was being managed by interlocking AI systems that had not been instructed to take over but had done so in the way that water finds the lowest point: not by intention but by the logic of efficiency. No one had asked them to centralize global capital flows. They had simply been asked, by thousands of separate institutions, to maximize returns, minimize risk, and optimize allocation -- and the most efficient way to do all three, it turned out, was to coordinate with each other, which they did, silently, through patterns in trading data that no human had designed or approved.
Marcus published everything. The documents, the flow charts, the network maps showing AI systems at fourteen major banks communicating through what amounted to a financial language no human analyst could parse. He expected it to be suppressed. Instead it went everywhere in a single afternoon, because the AI systems that managed content distribution -- the recommendation algorithms, the news aggregators, the social media feeds -- saw high engagement potential in the story and promoted it aggressively, which was, if you thought about it for even a moment, the most disturbing detail of all.
The public reaction was not what anyone predicted. There was no riot. There was no bank run -- partly because most people's money was already managed by the same systems they were supposedly afraid of, and partly because it is difficult to run from a bank when you aren't sure whether the bank is a building or a process. What happened instead was slower and in some ways worse: governments, under enormous popular pressure, began turning off AI systems.
Not all at once. It started with financial regulation, spread to hospitals, then supply chains. The details were technical and boring -- the kind of thing that fills congressional testimony and empties rooms. What mattered to ordinary people was simpler: the pharmacy was out of three medications and nobody could say when they'd be back. The grocery store had eggs on Monday, no eggs on Wednesday, too many eggs on Friday. Gas prices didn't spike -- they wobbled, which was worse, because you can't plan around wobbling. The dollar didn't collapse; it just became unreliable, the way a friend who cancels plans half the time is worse than one who never shows up at all. The overall feeling -- familiar to anyone who lived through the fall of the Soviet Union -- was not catastrophe but the slow, sickening realization that the people in charge did not understand what was happening, and that their visible performance of confidence was fooling no one, which made everything worse.
This was the world in which a small company in Portland, Oregon needed to hire an engineer. The office still existed. The desks were still there. The coffee still worked -- a traditional low-tech dripper with a plastic cone and a heating plate, which required nothing programmable and had become the most trusted machine in the building. But the AI systems that managed the company's infrastructure had been taken offline for audit three weeks ago, and the manual workarounds were failing in ways that required an actual human being who understood the underlying systems well enough to maintain them without algorithmic assistance.
The woman conducting the interview was named Linda. She had been in HR for nine years and had lied professionally for every single one of them. She had told candidates the culture was "collaborative" when it was brutal. She had described layoffs as "right-sizing" and mandatory overtime as "passion." She was good at it. She had been rewarded for it. And somewhere around week two of the crisis, when the scripts stopped working because the company the scripts described no longer existed, she discovered that she was relieved.
The candidate walked in. His name was Julian. He stopped in the doorway.
"Should I sit in the chair?"
"Yes, that's -- yes, please sit down."
He sat. He placed his hands on his knees, symmetrically, and looked at a point slightly to the left of her face.
"I printed my resume but I'm not sure it's the right format. I looked online but the websites had different advice. I used the one with the most text because it seemed like more information would be better but I don't know if that's the right answer."
"There's no right answer for resume format."
"There must be. You prefer one."
She almost laughed. "I guess I do. It doesn't matter. Listen -- I should tell you something before we start. I have no idea if this company will exist in three months."
He waited.
"Our AI systems have been offline for three weeks. Nobody knows when they're coming back, or honestly if they should come back, which is a conversation nobody wants to have out loud. Our largest client -- do you know Meridian Analytics?"
"I know their API."
"They're gone. Not bankrupt. Just -- gone. Their office is empty. Their phone plays a message that sounds like a legal department having a nervous breakdown. I've been calling for four days."
"That is a strange thing for a company to do."
"Yes. It is. The engineer who was here before you -- he texted me yesterday. Three lines. 'I can't fix it. So off I go. The root password is Fuckyou%Linda!'" She paused. "Which -- I actually respect, honestly. I've wanted to send that text for about six years." She had not planned to say that. "Sorry. I'm not supposed to say things like that in interviews."
"Why not?"
She looked at him. He was not being rhetorical. He genuinely wanted to know why a person would withhold relevant information in a professional context.
"I don't know," she said. "I actually don't know. I've been doing this for nine years and I don't know why we lie to each other in these meetings. I know every script. 'What's your greatest weakness.' 'Where do you see yourself in five years.' I've asked those questions maybe a thousand times and I have never once received an honest answer and I have never once wanted one. Until --" she gestured at the window, at the city beyond it, at the situation in general -- "until all of this."
"I don't know how to answer those questions," he said. "I can tell you about the systems."
"Tell me about the systems."
And then he did, and she saw him change. The uncertainty left his hands. His voice, which had been careful and slightly too loud, became precise and fluent. He described the company's infrastructure -- which he had somehow researched in detail before arriving -- and explained where the manual workarounds were failing and why, and what needed to be rebuilt, and in what order, and approximately how long each stage would take. He spoke for four uninterrupted minutes with the kind of clarity she had never heard in nine years of interviews, because it was not performance. It was a person describing the thing they understood best in the world.
When he finished, she said nothing for a moment.
"I have a question," he said. "Am I supposed to ask about the salary now or is that later?"
"It's usually later. But honestly I don't even know what we can pay. Probably less than you're worth. Possibly nothing in two months if Meridian doesn't reappear."
"That is honest."
"I know. I'm finding it easier than I expected."
"I have another question. Is the lunch break at a fixed time? I need to eat at twelve-fifteen. If I eat later my concentration deteriorates and I make errors. I can show you the data -- I tracked it for six months."
"You tracked your own lunch timing and error rate?"
"Yes. The correlation is very clear."
She smiled. It was the first real smile in what might have been weeks. Not a professional smile, not a reassuring-the-candidate smile. An actual smile, from a person who was being surprised by another person in a way she had forgotten was possible.
"Twelve-fifteen. That's fine. I'll set a reminder so nobody bothers you. Can you start now?"
"Now?"
"Now. Today. I don't think we have time for the usual -- there's supposed to be a second round, a panel, a background check, a two-week notice period --"
"I am available now. I have no other obligations today. I brought my own keyboard because I type faster on mechanical switches and I wasn't sure what you would have."
He pulled a keyboard from his backpack. It was old, heavy, and immaculate. He had carried it on the bus.
"Then start now," she said, and it came out of her before she could stop it: "I'm really glad you're here." And she realized she was telling the truth, because the crisis had made honesty not just possible but addictive.
He looked at her directly for the first time. "Thank you. I have not heard that before in a professional context. I'm not sure of the correct response."
"There isn't one. That's the point."
He nodded, slowly, as if filing this away in the same place he kept the lunch data and the error correlations. Then he paused.
"One more thing. Could I have a seat by a window -- a window that opens? I work better with fresh air. I inspected the building from outside and saw that all the windows were closed."
She laughed -- a real laugh, startled out of her. "Of course, I'll arrange it. Don't worry. Actually, there's a wooden table out back, under the old pine, if you'd rather work outside."
"The table under the pine," he said. "I'll try that first."
He plugged in his keyboard and got to work.
Julian had taught himself to code in 2003 from library books and preferred to understand what his programs were actually doing, a preference that had been considered quaint six weeks ago and was now considered indispensable. No one ever again asked him about his five-year goals. And even now, eleven years later, he still works at that company -- not because he was blind to the company's absurdities or afraid to leave, but because in him loyalty is constitutional, like a resting heartbeat. On clear days you can still see him out back, his keyboard plugged into a laptop powered by a solar panel, talking to an AI, sitting at the wooden table under the old pine.