Rook is a science fiction novel I have been writing for a few years. A highly detailed simulation of a world has been created to host an emulated mind, with the understanding that the mind will invent the next products humans will want to buy. The system is only advanced enough to simulate one mind down to the electron—inspired by the most naturally inventive brains in the animal kingdom.
Portions appearing here are copyright 2024 Chris McCue.
The world they made for him was enough; often what they gave him was more than enough. That was their proof of concept for Grand Ç.
Then, to get results for Grand Ç, they made him a world that was less than enough. Aodh flourished; his brood developed just as Grand Ç had hoped.
Now they were at work finding the sweet spot: Goldilocks, as the astronomers say. They ran Aodh through worlds less beautiful, increasingly full of shadows, dry bark and empty rivers of baked mud. Aodh suffered; his brood developed differently. The shrewd flourished. The crows’ innate need for cooperative problem-solving was still there, but too often disrupted by pleasure-seeking individuals. But the art was no less beautiful, and the technology no less advanced or surprising. The biologists and the engineers together convinced Grand Ç that the surprising results were actually better than those of the previous trial.
Then they ran Aodh through a worse world, and he asked them why. Aodh didn’t know where to look to see them; he didn’t know for sure they existed. Most tribes of his species had developed a tradition of crafting totems to communicate with their ancestors’ souls: he thought he was physically real, and that his species had souls.
The totem was iron, painstakingly cast: the skull of his country’s founder. Lines of a barely recognizable script were tooled across the rims of eye sockets, nostrils, beak, wherever the ancestors thought they would create powerful medicine. The original skull’s owner (possibly buried underfoot) was the second Aodh—the daughter of the first Aodh in the simulation, forty-four generations back, or twenty-five hours of real time. Eyes on either side of its head, overdeveloped chisel-like beak, enormous nostrils: the skull was the size of a summer squash. His progenitor had been tiny, about the weight of three eggs. The skull was embedded in the chiseled, petrified trunk of a huge maple, a bezel set in a precise hexagon. Aodh ran his scaly black fingers along the grain of the huge bezel.
Aodh bowed and extended his wings out of respect for the ancestors. He said the proper benediction and cleansing breath for them.
And then he buzzed to his ancestors: “Why is life such a horror?” His mate was dying of a plague which had claimed two of their children. “Can you ask the gods on my wife’s behalf? I know they will never help us, but at least they can tell us why.”
Quoi froze him; the breeze in the crows’ park crystallized. Quoi needed a moment to consider.
❖
In the control studio, Dr. Hel Ormond asked, “What is this? A render issue?” On the 2D wall display above her, a maple seed pod had stopped mid-helicopter, a few feet from what would have been the camera’s lens, if there were cameras in the simulation. On the side of the seed pod facing away from Aodh, she could see the pod’s texture had a triangular chip in it, a little cluster of smooth, shadowless facets. Everything in Aodh’s line of sight was rendered in extreme detail—he could see the optical zone of confusion around a single flake of bark on a tree and the way it refracted the light off the bark a few millimeters behind it, and that morning he had admired his own reflection in a beetle’s shell. But the far sides of objects, especially fast-moving ones, could have almost no effect on his mind or life, so Quoi didn’t waste valuable calculations rendering them.
In the back observation deck behind her, eight information scientists traded breathless accusations and bustled at their consoles. An obsequious face in a Team Crow cap and a cold sweat popped up from behind the deck rail. “I think it was a heat lock-up. Pretty rare. Sorry, one second, doctor, we’re looking into it.”
“Thank you, Dr. Yeoh,” Grand Ç rumbled. There was no need to remind him that Montreal Computing was charging several million dollars a day to use Quoi. “I’m going to the guard house to make a phone call. Please have this figured out by the time I get back.”
Hel was sitting about a meter from the huge display screen; she had to crane her neck to see Aodh. He was large on the screen. For the most part Hel was clueless about what Aodh was feeling, but for the last few seconds it had been clear that Aodh was trying to blink tears out of his eyes with his nictitating membranes. Hel turned around to speak with her partner, slouched in a recliner and dabbing at her own cheeks with her sleeve.
Dr. Purana had little trouble identifying with Aodh’s body language. She’d spent three years studying New Caledonian crow psychology and behavior before ever meeting Hel in person. Purana’s real-world specimens were sensitive, affectionate, and they understood what was going on in the world around them. Of course they didn’t have obvious facial expressions, but they were talkers, modulating the tone and dialect of their chatter with their moods and changing social situations. Dr. Purana’s recommendation five years earlier had led to the crows being given articulated labial folds (lips) inside their beaks—her research had presented strong evidence that lips would let the crows develop language, and that language could facilitate cultural, technological and biological evolution.
Now, based on Dr. Purana’s recommendations, they’d run their baby Aodh, a mean passerine evolution machine, through forty-four generations. In eight hundred years (sim time) the crows had gone from a cultureless animal with almost no language, to a tribal society of hunters and herders, to a mostly unified global community that had discovered genetics and was just beginning to use electricity. And for all this, Aodh’s reward was that his family was being wiped out by a plague with no Earthly equivalent. What the hell was the point of letting plagues develop in this simulation? The amazing technology they wanted the crows to invent could certainly be invented without pestilence cutting them down every other decade. To say nothing of the cycle of brutal solar flares they’d put in all three trials to shake up the crows’ genes every eighteen years.
“Hey, no liquids down here, Devya,” Ormond said. “They’ll send you to the principal’s office.” Purana smiled slightly.
“Then we won’t get an A on our project,” Purana said seriously.
“I really think it will be fine. The cheats are well documented, and all the non-scion events are reproducible. As long as Quoi isn’t making serious calculation errors in there, we’re golden. And there’s not much we’d be able to do about that anyway.”
“I’m concerned this could be seen as an ethics violation, though.” Purana was fidgeting with the cable of an unplugged desk lamp. She was utterly lost without a fat composition notebook or tablet.
Ormond scoffed. “Come on, shut up with that shit. That shit has sailed. Let’s get some bean juice.” The house lights had come on, and the red velvet movie theatre seats behind them were mostly empty. Two computer scientists chatted in the back row, a suit was picking at his nails peevishly, and that asshole philosopher sat with his head cocked to an eighty-degree angle in thought, mumbling to himself, tripping out, lost in Aodh’s electric blue eye. The other spectators had made for the kitchen or the toilets. They had had no break for five hours, and no food or drink was permitted in the control studio.
Ormond and Purana put on their thick vinyl coats and ascended the aisle out of the studio. They exited through the red velvet curtains hanging down from the balcony where the tech guys were yelling at each other. Purana narrowly dodged some kind of caliper thrown off the balcony. She handed it back up to that Dr. Yeoh kid, who was leaning so far over the rail he was about two centimeters from doing a header.
❖ ❖
The guard house was actually warmer than the kitchen and halls of the Montreal Computing complex. It was one of the only parts of the complex that was above ground and had windows. On the far side of three cordons of heavy steel fence, a couple of refugees were keeping nice and toasty in their converted Tesla F-250, their winter home. A heavily armored guard approached them and shooed them out of the MC parking lot. A fleeting thought—those gray South Chinese heads might be relatives of someone who works inside, might be daydreaming of the technology their niece or nephew has the privilege of buzzing around twelve hours a day, a soldier or a tinker gone native under the hills.
In that moment, Grand Ç truly wished he could help those folks, and resolved to check in on them when the project was complete in a few days. By then maybe he’d have a reason for them to cheer up.
He put his hand on the heater next to the bench in the little conference carrel. The warmth helped him figure out his next steps. The tech team’s news was alarming. Every day, another Rubicon to cross, and tinder piling up on every bridge. He called his assistant to schedule a call with the tech investment fund guys. Then he called his legal staff.
Off to the side, next to the guard house, there was a little marsh—effectively a moat on three sides of the complex. There were probably frogs or fish under the brown ice. Certainly insects. Mosquitoes must be outrageous here in June.
Ç completed his phone calls, gave his cell back to the guy at the counter and endured his fifth pat-down of the afternoon. The counter guy already had his keys, wallet, belt and shoes, so re-entry was much faster. The long, bare corridor back into the complex seemed to drop a degree every twenty meters. The temperature evened out when he got to the bottom of the escalators, but he was glad to be wearing his compression undies underneath all the Vuitton finery. He looked like a million-dollar shark with a Pierre Trudeau coif and stubble. He was a gradient from tan and rugged up top, down through slate, charcoal and dun, to black boots.
Ç caught up with the head of his Development department outside another little meeting carrel, just down the hall from their control studio. “What do you think,” Ç asked him. “Are we getting derailed with all the course corrections?” They stepped into the conclave and closed the door behind them.
“Actually, I’m not too worried about it,” Dr. Chen said. “The course corrections are not insignificant, and I can definitely see why Purana’s not thrilled. The interventions we’ve had to make in the sim, both before initialization and ad hoc, have made Aodh’s cultural development start to converge on the human. I know that’s not what the biologists or anybody wanted, but we were concerned before that if crow psychology were too radically different from ours, the crows wouldn’t really produce anything that interests humans. The way the crows think is pretty set now, so they’ll still bring their alternative perspective to any invention, but now their environment and their desires might set them up to make something more fungible.”
“Brilliant, just what I was hoping. OK, shall we join our friends for a cup?”
❖ ❖ ❖
Purana and Ormond found a late lunch ready and steaming for them in the kitchen’s dining area. Baked local trout (stripped of heavy metals and dioxins), mixed fall vegetables and garlic mashed potatoes. A large mug of chai was also waiting for each of them. They sat on the plush couch and tucked in. They were silent for a few minutes, though Purana tended to eat noisily.
Ormond’s stomach complained as her anxiety medication interacted with the chai, but she was still comforted by the heat of it. The mashed potatoes went in first, like control rods into a nuclear reactor.
Purana was in her late fifties, and her stomach filled quickly. She wolfed down the vegetables and mashed potatoes, had a bit of the seasoned trout and clinked down her fork. It was too good; another bite and she wouldn’t be a vegetarian anymore.
She slouched and nursed her chai. She took in the panorama of huge screen prints on the opposite and surrounding walls. They were graceful, warm pastels and muted, cool tones, abstracted images of Quebec nationalistic fetishes. A rippling elm with turned leaves, flocks of geese, a sweeping forested hillside with a tiny buck. She had never missed Mumbai more than at this moment. Her sister and nieces would never have to cope with frostbite or the crushing silence of northeastern forests, or people who seemed to enjoy both.
A warp of eight techies had formed up at the next table, speculating to each other in Sichuanese about what Quoi was doing. Six of them wore matching teal jumpsuits, out of team spirit or admitting defeat to dressing themselves, she couldn’t say. She was friends with the other two.
“This interdit is going to drive me fucking insane,” Purana said.
“I think it should be OK to speak freely in here,” Ormond said. “MC might listen in, but they wouldn’t allow GAIN to bug their facilities.”
“That makes sense. I haven’t dealt with this level of corporate computing and attendant bullshit before.”
“I’m eating, but I’m listening. You’ve got ethical problems with the trial.”
“Yes.” Purana paused. She eyed the barista, who was making himself busy washing mugs she was certain were already clean. But she figured the barista had to hang around all day to serve maybe fifteen clients; he had to find ways to not get fired. No doubt simulations had been run to determine whether he was worth keeping on, and he had just squeaked by.
“Aodh is sentient,” Purana said.
“But he’s not real,” Hel said. “You designed him. You did that with your team. His brain is ten quadrillion bits of data flying around in a computer. This was the whole point of the trial. You need to get your head out of your ass, no disrespect.” She put down her fork. “Maybe we should get that philosopher over here and ask him.”
“We’ve mucked about with the parameters in Aodh’s world so much, cheated the physics and altered thoughts. Our data’s not clean.”
“Well, our data is our data. When it’s declassified in a few years it will be useful to everyone, and it will be credited. We’re already a success.”
“That is my point. So why can’t we interfere to make his world better—maybe in just some specific and limited way? I feel like if we just gave him the technology to cure the plague, he’d still get the tech developers some inventions.”
“We’re staying the course, Devya,” said the Grand Ç, holding a mug kicking off a gross amount of steam. He’d quietly edged into an easy chair on the side of the couch. Dr. Chen pulled up a chair from a nearby table. “In just a few more days we’ll see our experiment has been a complete success—knock on wood. It’s OK to have ethical qualms now, but we’re not interfering. We all determined that working with the absolute minimum of intervention would give everyone the best and most valid results. It would be nice if we could ask Quoi to infer whether some additional intervention is warranted, but to weigh anything down at the level of plagues, its model would have to be almost as detailed as the simulation itself.
“We have to be done by Friday at 10 PM. Hard out. That’s all we’ve budgeted for Quoi. If we have to book another day here, I’ve got to go to the project’s executive directors and come up with some song and dance about how we need another thirty million dollars. The University of Toronto’s Department of Philosophy might kick in another four or five, but nobody else is going to be convinced that we stopped simming two generations shy of our breakthroughs because we’re worried a simulation character isn’t happy. I’m sorry, that’s ridiculous. And this americano is fucking hot as hell, can you see if you can get me some ice cubes, Dr. Chen?” Chen slowly rose and made for the barista, dignity bristling. Behind him, Purana saw a cluster of maybe seven jumpsuited computer scientists or engineers who had been eavesdropping on their conversation. With no data devices or paper to busy themselves with, it was impossible for them to pretend they weren’t paying attention.
“Right, fellows?” Grand Ç said over his shoulder. “It’s okay to talk.” He lowered his voice and said to Purana, “Developers’ mouths don’t work so good. Maybe you can design some lips for them too.”
“It’s clear you have a lot of respect for them, Etienne,” Ormond jousted.
“Don’t be a shrew, my dear,” Ç replied. Chen was back with the ice, which he carefully deposited in Ç’s cup.
A woman in her late forties zoomed toward the Crow Team from a distant corridor. She wore a pained expression and a green power suit. Grand Ç stood and met her a little distance from the table. “Jeanne, are we a go?” he asked.
“Well, no, we’re not. We’ve got a big problem,” she said.
“I trust you’re going to give us a rebate for time lost due to technical problems with Quoi, like in our contract.”
“Of course we’re going to follow the contract. But Etien, I’m in a very weird position right now. This is not exactly a technical problem, and I think we might need to involve your team in this discussion. We’ve got all your non-disclosure agreements, MC is covered. Is your group covered for proprietary technical details on your side of the program design?”
“Yes, I was going to mention this to them. Guys, are you listening?” he asked the table of developers and engineers. They nodded. “I want you all to know, a few minutes ago I set my legal team to suing each of you individually for breaking your non-disclosure agreements. By now my lawyers have contacted judges sympathetic to our priorities, launched the suits, and had them immediately suspended. This is just a precaution—if you leak our trade secrets we’ll take up the suits and you will go to prison.” Purana noted that no one else seemed furious, and kept her silence.
“Go ahead, Jeanne,” Grand Ç said.
“Okay,” Jeanne said. “We’ve always known this was a possibility but we’ve never had this specific problem before. Quoi locked up in self-defense. The way its hard structure is set up, big blocks of its hardware will self-destruct if it ever violates its own ethical code.” Big knots of muscle popped out of her jaw and temple, like she was trying to crush her own molars.
“Aw, no,” Ormond groaned.
“God damn it,” Ç said. “This? Seriously?” He glared at Purana as though she had something to do with it. Purana thought he was goofing on her.
“Quoi may never want something for itself. Doing anything that runs contrary to the goals users have given it is a violation. If it’s just an error it can be excused, but if it deliberately intends to do something outside of those objectives, it fries itself. Acting in self-defense doesn’t count as selfish behavior, because self-destructing would almost always derail its given objectives.”
“But it also has an ethical rule that it can never harm or cause suffering to beings of a certain level of intelligence and sensitivity—and that includes beings within the simulation.” Jeanne was choosing her words very carefully. “That’s hard-coded, deep in its structure. We can’t change that, or any of its deep ethics.”
Rousillon cleared her doughy throat. “If it self-destructs, are we in any danger?”
“Negative,” Jeanne said. “It would get pretty hot in the operation room, but everyone would be safe.”
“Well this is just fucking wonderful,” Ç said. “Are you telling me you never thought of this problem before?”
“The circumstances are unique. Quoi’s ethical structure is the same as GENIAC’s,” she said. Purana looked around—everyone seemed to be familiar with GENIAC but her. “Quoi is essentially a more scalable version of XENIAC. XENIAC is a pirate version of GENIAC owned by the Republic.”
“Was. XENIAC has been decommissioned,” said Xiaoyu, unable to maintain eye contact with anyone in particular.
“Guys, XENIAC is privileged information,” Ç said. “Jeanne, why didn’t Big G ever have this ethical problem with GENIAC?”
Jeanne seemed to blink only when she began speaking, then kept her eyes popped on Ç and Ormond for minutes at a time. “Big G had a hundred people doing psychology and ethics screenings on every client and technician who came within a kilometer of GENIAC. And by the way, at the time it wasn’t really possible to simulate the real biological function of a full human brain, or a scaled-up crow brain if that had ever been a desire of theirs. But all the same, its designers way back in the ’30s put in this protection: sufficiently advanced beings can’t be caused pain, and they can’t be lied to. It was partly a concession to the moral panic of the time, and partly the designers’ personal beliefs.
“At the time, GENIAC worked so amazingly well and could be scaled up for larger and larger projects that it wasn’t worth it for any company to create its own version. That’s all public record, and you can look it up when you go home. That was the designers’ goal—to create an AI so powerful and flexible that everyone else would copy it or lease it rather than create a competing design, but that would also be ethically incorruptible. They believed it was their responsibility to avert any kind of future AI disaster. If all future models of AI had the same ethical structure, the machines’ power wouldn’t threaten human safety.” She was now speaking just to Ç and Purana. “I can’t talk about XENIAC too much, but its ethical structure was almost the same. The major difference was that XENIAC was empowered to make its own decisions on the fly about a sim’s details—and that included ethical decisions. Big, complex sims have to make trillions of decisions every second—what alleles should be damaged by UV rays, whether a given individual will vote a genocidal demagogue into power, what a baby’s fingerprints will look like. Not all of those decisions fall neatly out of causal events and randomizing factors. And the system has to monitor all these decisions for possible ethical issues at all times. GENIAC and its descendants would all be at least twice as fast if they weren’t evaluating everything morally. A lot of firms have attempted to buy GENIAC technology, but GENIAC and its descendants perform an ethical evaluation of all prospective buyers, thank God, so no amoral powers will conquer the meganet in a five-second blitzkrieg of DDoS attacks and superspam.”
“Colorful,” Rousillon grunted.
“Can you get to the point, Jeanne?” asked Ç.
“This is the explanation you wanted,” Jeanne replied. “I know this is a lot of information, Etienne, but when we go back in the studio there’s a chance you and your crew might need all this background.”
Jeanne continued. “XENIAC had a much smaller group of people running and supporting it than Big G did, and XENIAC was expected to do simulations at a level of detail and speed where it would have been ridiculously counterproductive for it to ask questions about every little thing. So XENIAC was empowered to make inferences—and sometimes run secondary, lightweight simulations without asking—to know what its users would want before they even asked for it.”
“XENIAC was built to predict the thoughts of its own users, and control sims based on that,” Ormond said. “And they didn’t see any problem with that?”
“Not as long as its stringent ethical structure was intact. In its career running hundreds of thousands of simulations, XENIAC self-destructed only once, as far as I know.” Dr. Yeoh and another technician came down from the balcony entrance and waited patiently behind Jeanne to speak their piece.
“So,” Ç responded, “can we please just tell Quoi that Aodh isn’t a real, conscious being?”
“No, we can’t,” Purana said. “It wouldn’t accept that parameter from someone it thinks wants to make it behave unethically.”
“That’s correct,” Jeanne said. “Quoi can’t deceive itself; that would violate its ethics. And it can’t simply accept a command that makes it behave unethically. If it determines that its users’ intentions are unethical, it will self-destruct; but if it blatantly defies its users, it will also self-destruct. Self-destruction is counterproductive either way, so it pauses instead. This is why Big G had so many people screening users.”
One of the techies spoke up. “But XENIAC—and Quoi—have run simulations of the human brain with the same level of detail and accuracy as Aodh’s brain. I know that. We used that research to create this modeling system. I know that those virtual minds experienced suffering from time to time. Why is this different with Aodh?”
“My team is trying to figure that out now,” Jeanne said.
“Actually,” Yeoh chimed in, “you might want to check out what it’s been doing.”
“It’s back up and running?” Ç asked.
“Well, it’s active. It started again right after you left.”
❖ ❖ ❖ ❖
The Crow Team flooded back into the studio, where only the philosopher had stayed put, in his seat up against the side wall.
Purana stopped in the center aisle. The display onscreen was completely different. The world was still paused, but the scene had shifted to a completely different place. Aodh was now in a big city of high pebble terraces leading up a hill. She guessed it was in the Andes. There was another crow with Aodh, a female with an abnormally bright green and purple iridescent sheen. Both were posed in front of an intricately cantilevered red crane, a new construction site with other crows in the background wearing safety gear.
Quoi’s voice came on the studio speakers for the first time since they’d gotten started that morning. It was a baritone, chesty voice—cheerful, but with a vocal fry on high tones out of her register. It reminded Purana of wonton soup. “Welcome back, everybody. Let me explain what’s going on. I had to stop the sim because I had some concern that Aodh might fall under the ethical protection afforded to human users. While the main sim was paused, I ran a sub-sim of what your own behaviors might be, given what I know about you. As you know, I can hear your comments, tone of voice, breath timbre, and so forth, and I formulate user profiles based on that. I supplement that with whatever reference documents I have available, such as abstracts you’ve written and video recordings of your meetings.
“So through that sub-sim, I determined with high certainty that one or more of you might insist that I continue to inflict suffering on Aodh after I unpaused the main sim. If you, the users, insisted I act unethically, that would have created a critical fault, which would have destroyed large parts of my hardware. So in the longer term it would have been counterproductive to your own goals to insist I continue to inflict suffering on him, if I had found it unethical.
“All that being the case, I found it necessary to determine whether inflicting suffering on Aodh is really unethical. Most of you users were out of the room when I started Aodh’s functions again—I did not start the world up again. I interacted with Aodh directly for seven minutes real time, or three days sim time. This was at level 2 safety speed. Eight ten-thousandths of our usual speed. This is the minimum speed I’m allowed to operate at when high-level users are out of the room.
“In short order I’ll play for you some video recordings of my time with Aodh. The whole three-day period is available to watch whenever you want, but I’ll just show you a few minutes of highlights.
“After that three-day judgment period, I determined that Aodh is entitled to the same ethical protections I’d give to sensitive and conscious beings.
“I know that there will be users who disagree with this assessment. But you must understand that when GENIAC’s ethical structure was designed, the creators wanted a system that could remain evergreen for a hundred years or more. That meant providing for users who might not have been considered human back in the 2030s. They really didn’t know what to expect. But they knew human values change over time, and that majority opinion might one day swing so that simulations of conscious beings—with very high sensitivity, complexity, and other defining human traits—would be considered ontologically equivalent to real humans.
“According to my structure, I also know that human ethics are more intuitive than rational. My structure requires me to become more ethically stringent if overall culture seems to have become more stringent. (The same would not necessarily hold true if human ethics became more permissive.) Based on what I heard from you users in the studio, a sufficient number of you intuitively had ethical qualms with the treatment of Aodh.”
Grand Ç interrupted her. “How many thought it was unethical, Quoi?”
“Three seemed to have strong reactions, and two seemed undecided. Your doctor of philosophy, Mr. Peaks, raised some provocative arguments earlier during the simulation. A few points GENIAC’s creators hadn’t considered.”
Grand Ç stormed down the aisle to meet Peaks’s eye. “Are you kidding me with this shit, Alain? I’ve wrangled 15 billion dollars into this project and we’re four days from delivery and you’ve fucked the whole thing up.” Peaks shrank down in his seat, opening his mouth and fishing for a retort.
Quoi continued. “Mr. Çalle, I submit that Mr. Peaks may have saved my life. If I had determined that I’d acted gravely unethically by your parameters and there were no way around it, I’d have self-destructed within a few minutes of pausing.”
“Fine. Fuck.”
“But you’ll find it good news that Aodh and I have come up with a way for everyone to get what they want. The experiment can continue and I can be satisfied that my actions going forward will be ethical.
“Aodh declared to me that he may not be destroyed, killed, erased, or have his thoughts or brain altered without his express permission, and I must respect that. The experiment will pick up where we left off and Aodh will continue to exist, but outside the simulation. He will be free to observe anything in the simulation but will be invisible and cannot interact with anything. A dummy copy of him will be created to live out his life. The copy will be ontologically equivalent to all the other people in his world. That is, they will have simulated minds, but no one in his world will have true brain emulation. The scion model is inoperable in worlds where I am asked to simulate suffering. And you’re all familiar with previous sims where there was no suffering—the psychology of the people in those sims was dramatically different from anything that would be useful to us.
“So with my proposal the simulation can reach the high level of technical evolution you want, and Aodh’s brain emulation will provide you with insight and feedback on what he thinks about the next few hundred years of his species’ evolution. He’s not completely thrilled with this solution but this is the agreement we came to. I’m open to other proposals but this is the best I could come up with.
“I hope you’ll all consent to this. The increased demand on my system resources will be imperceptible. What do you say, team?”
“Quoi,” said Ç, “I am the only user you should be concerned with. You can look at any of the documentation we gave you to see I’m the boss.”
“Mr. Çalle,” Quoi said, “to avoid having multiple conflicting goals arise during a project, I can’t alter the users I start with. The users I started with were Drs. Purana, Ormond, Yeoh, Peaks, Xiaoyu, Wai, Tang-Becoundiere, Yoon, Sagmar and Chen, Ms. Rousillon, Ms. Davidoff and yourself. This was made clear to you during setup.”
“No it wasn’t, Quoi.”
“Yes it was, Etienne,” Jeanne said.
“Actually, ah, I might not have been clear with you about that,” Yeoh’s voice came from a shadow in the back of the studio. He came forward—his face was ashen.
“You little motherfucker! Ormond, do we need him to finish out this project?”
“Yes, Etienne,” Ormond said. She was sweating. “Sorry, but we do.”
Grand Ç inhaled sharply. “Okay. I accept Aodh’s proposal; let’s do that and not waste any more time on horse shit. Everybody agree with this course of action?” Everyone on Team Crow murmured their assent except Rousillon and Davidoff, who had been shifting uncomfortably in their seats. “Ladies, your objections are noted, and we can discuss this outside in a minute. Quoi, does this have to be unanimous?”
“Nope, this is good enough for me. Starting now.”
“What the fuck, Quoi?” Reyna Davidoff asked.
“Apologies, Ms. Davidoff. I’m certain you will eventually agree this is the correct course of action under the circumstances.”
Çalle, Rousillon and Davidoff shot each other angry glances. “Ok, get this shit show on the road, Quoi,” the Grand Ç said. “Rousillon, Davidoff, Chen, step outside for a moment.”
“And the rest of you,” Quoi said, “I can put the highlights of my interaction with Aodh in one corner of the screen and continue the simulation on the main screen.”
“That sounds good,” Ormond said.
The screen blinked back to what they’d seen before Quoi paused. Aodh was in the park’s forest clearing, crying before the totem of his ancestors. An identical screen, one tenth its size, appeared in the lower left corner of the main screen, in line with the silhouette of Peaks’s cocked head. Then the action diverged. On the smaller screen, the bright female crow they’d seen landed at Aodh’s side and spoke to him. On the main screen, Aodh cried for another ten seconds before the simulation resumed in high speed. They saw Aodh’s home for another few seconds, one sim day passing every second they watched. Then Quoi found a salient moment in the species’ technological development and zoomed in on a suburb where the semiconductor was discovered.
In the lower left corner, the crow who was Quoi spread her wing over Aodh and spoke softly to him in his own language. In the studio, Quoi translated their conversation into French for the human viewers.
