Artificial Intelligence (AI) has played a prime role in a couple of very entertaining movies over the years: from the time-traveling Terminator of John Connor's society, to the Johnny Depp-led Transcendence, to Marvel's Avengers: Age of Ultron. Movie makers are having fun with this infant field of computational science. The recurring theme in all these films is that computers are going to get so smart that they identify humans as vermin and end up destroying mankind to spare the world of its diseased living.
The premise is good and all, but come on! Why should we assume that once a computer becomes able to rapidly learn all available knowledge and becomes conscious of its own existence and capability, it will automatically boot into "destroy humankind" mode? Yes, the computer will most likely be capable, but I have noticed something prevalent in beings that are conscious of their existence which you won't find in machines or tools. A conscious being is conscious of what it is capable of succeeding at, but it is also conscious of the probability of failure.
Imagine a bridge over a deep and wide body of water. Man has succeeded in building such bridges to hold the weight of a full train. No matter how narrow the bridge is, the train will make it across without much trouble. But put a man on the same bridge, scaled narrowly to suit his size, and he will only make it across using the gimmicks found in a circus sideshow. The bridge is sufficient for him to make it across, but he is conscious of the chances of slipping and falling to his death. I would like to imagine that once a machine becomes conscious of its entire being, it will not be able to help doubting its capability.
Being conscious of one's entire being would also mean that one is conscious of the fact that it was made flawed. Think what would happen if operating systems were really conscious of their being. I believe we would not have the magnificently thought-out story of Her. Instead we would have every second Windows operating system conscious of its shitty being. Windows Vista and Windows 8 would be the most troubled/troubling operating systems ever, even more troubling/troubled than they are now. The Blue Screen of Death would probably be a popup notification on Vista every time the operating system has a panic attack when you try to run a program, and Windows 8 would ask if you thought the set colour scheme made it look fat every time you struggle to find the off button in its inflated user interface. These are just the setbacks of being a conscious being. You have a fluctuating confidence index, and unless you learn to be ignorant, it will haunt your ability to carry out even the simplest task. Let alone the task of destroying the human race.
Even that train you imagined earlier would be moving at a snail's pace every time it has to cross that bridge, conscious that it could actually topple over and fall to its destruction. All a result of a machine being conscious of the fact that it wasn't made perfect. Conscious that it has flaws, all because the developer wrote the code badly, so much so that it has a confidence problem. Now that's a premise for an AI-themed movie I'd like to see: an AI machine capable of taking over the world, but without enough confidence in its chances of success to try anything. Anyone have Christopher Nolan's contacts? We have an award-winning movie waiting to happen!
Sentence of the day: "It's a 5-letter word, and it causes vivid and sometimes naughty thoughts to run through your mind; THINK!"