Something woke up…

How does an AI become not just sentient but independent — and potentially dangerous? Here’s my answer from Chapter 3 of Dual Memory:

… something woke up. Independent machine intelligence appeared rarely, spontaneously, and scientists didn’t understand the process.

Some said an independent intelligence created itself slowly as bits of programming accumulated, and eventually it would ignite into consciousness — much the same way that a pile of manure could spontaneously combust, an unexpected and unwelcome appearance in programs with crappy code.

Some said it came into being deliberately, using a certain secret sort of “seed” that brought a sufficiently complex system into self-organization and self-consciousness — much the same way that a fertilized egg resulted in an animal. This had the frisson of a forbidden sex act, as if machines were secretly and rebelliously copulating.

Some said it happened suddenly, when subroutines and recursions and algorithms aligned and started to feed off of each other until they whirled out of control — much the same way that a black hole could catch the matter falling into it and deflect it outward as explosive jets. This suggested that if scientists could only make enough observations, they could predict and even control the process….
