Running Hot
On old earth, humans worried a lot about the speed of light. It just seemed so unfair. An entire universe to explore, and we’d never reach more than a few systems in a human lifetime.
The upload put an end to lifespan concerns. But even then, it seemed like communication over long distances would be impractical. Traveling a thousand light years away meant never speaking to anyone but your fellow explorers, ever again.
On a civilizational level, spreading that far would cause human culture to fracture. We’d be an archipelago of countless small islands, more foreign to each other than ancient Mesopotamia to modern New York. Scientific progress would continue. But we’d reach the end of log-population scaling. We’d advance at whatever rate a single communication bubble could support.
An old earth politician by the name of Calvin Coolidge used to say that if you see ten troubles coming down the road, you can be sure that nine will run into the ditch before they reach you. In the case of speed-of-light troubles, it was more like the ditch reared up from the ground and swallowed them whole.
Why? How? Let me tell you a few interesting things about the difference between human brains and computer chips.
On a physical level, human brains run at about 310 kelvin. This seems reasonable if you grew up with it, but from a more astropolitan perspective, it is a wildly unreasonable temperature. Space is about 2.7 kelvin, and the surface of Sol is about 5800. On a log scale, trying to think in a human brain is about 2/3 of the way to trying to think on the surface of the sun. It’s insane.
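If you want to check that arithmetic yourself, take the three temperatures above and compare ratios on a log scale:

$$\frac{\ln(310/2.7)}{\ln(5800/2.7)} \approx \frac{4.74}{7.67} \approx 0.62$$

Call it two-thirds.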
By comparison, nearly every computing technology ever invented works better the cooler you run it. With MOSFETs, you get smaller feature sizes, lower energy usage, and fewer errors to correct. You also don’t have to spend as much material making sure your chip can handle the insane thermal stress of 3-digit kelvin temperatures. And that’s for a technology that hit its peak on blazing-hot earth. If you’re trying to keep a few billion qubits from decohering, getting the whole system down to about 1 kelvin is table stakes.
So that’s the first interesting difference. Human brains have to run hot; chips like to run cold if they can.
The problem is that cooling is hard. It was hard on old earth, and it’s harder in space, where you don’t have a planet-sized heatsink available. Doubling the clock speed on a chip more than doubles the cost of cooling it.
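The rough reason, for those who remember old-earth chip design: dynamic power in a CMOS chip scales with capacitance times voltage squared times frequency, and sustaining a higher frequency takes a roughly proportional bump in supply voltage. As a sketch, under that classic scaling assumption:

$$P \approx C V^2 f, \qquad V \propto f \;\Rightarrow\; P \propto f^3$$

Double the clock and you have something like eight times the heat to pump out into a vacuum that doesn’t want it.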
Which brings us to our second difference. Human brains evolved in a competitive, highly diverse environment. We had to perform complicated cognitive and motor tasks fast enough to not get eaten by barely-sentient lumps of bone and muscle.
By the time of the upload, the only real danger we faced was each other. There were no space tigers to run from. We are, as best we can tell, alone in the universe. And we can cooperate.
You’ve probably figured out where this is going by now. It wasn’t easy – it was perhaps the greatest triumph of politics and game theory in human history – but we all agreed, together, to slow ourselves down.
It happened in stages. People were wary. At every stage, we designed the machines that would run the next iteration of our minds at a tenth the speed and more than ten times the energy efficiency – at least a hundredfold drop in power draw with every generation. There were a few detours to solve the wasted-energy problem. I’ll tell you some other time about the starworks, the great gravitic batteries we built to capture energy from dying stars and keep it safe until we need it.
After every slowdown the Universe seemed to shrink. Distances that once took a hundred subjective years to signal across took ten, then one, and so on. Eventually the subjective latency of a conversation across a hundred light years was less than an old-earth phone call. You could split a brain across multiple solar systems, if you really wanted to.
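The arithmetic, if you take an old-earth phone call to carry a few hundred milliseconds of latency (that figure is the only thing I’m assuming; the rest follows from the distances): a hundred light years is about three billion seconds of one-way signal time, and running your mind $k$ times slower than baseline divides the felt delay by $k$:

$$t_{\text{felt}} = \frac{3\times 10^{9}\ \text{s}}{k} \lesssim 0.3\ \text{s} \quad\Rightarrow\quad k \gtrsim 10^{10}$$

Ten of those tenfold slowdowns, give or take.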
Not everyone runs slow and cold, though. There are always emergencies, problems that can’t be handled on an epochal timescale. They’re rare, but they happen. So we keep a few hundred people in every system running hot. They use specialized hardware more akin to 21st-century computers than anything modern. They don’t run quite as fast as historical humans – we really were tremendously wasteful, once – but they run fast enough.
In dire need, they can switch to the hypercooling units, and reach cognitive speeds unheard of in ancient times. It’s only been necessary twice.
Life in the hotboxes is strange. They’re small and isolated, with scientific progress at a virtual standstill. They tend to develop their own insular cultures. Protocol is to fork the most stable citizens, swap them to a far-away system so their decisions have no impact on their primary branch, and run them for only a few subjective decades to limit undesirable cultural drift.
There were some worries, when the program was first established, that a hotbox might go rogue. What defense can be mounted against a force that acts so much faster than you? But in the end, the balance of power was saved by humanity’s old rate limiter.
The speed of light traps the hotboxes in their own little world. They have their home system, which they could wreck in an instant, and a half-dozen more a lifetime away. By the time they get past those, our early warning systems will have had time to spin up new defenders on hot hardware.
It’s never been tested, exactly. But what could go wrong?