My sleep schedule is completely messed up and I'm up at 6am before my flight from NY back to SF. Rather than just lie aimlessly in bed I thought I'd jot down some of my latest thoughts about the Singularity.
1. I don't see any particular reason why we can't create a new kind of life form that is digital instead of biological. There's no evidence that biological systems have any kind of special magic that wouldn't allow their essential characteristics to be implemented in a different substrate.
2. We're gathering lots of clues already about how a mind works from studying brains. This work is speeding up and becoming more sophisticated, so the clues will be coming fast and furious. It's inevitable that some of the core puzzles will be solved, and probably quite soon.
3. Regardless of whether it happens in the next 10 years or the next 100 years, it's going to happen.
4. On the road to the Singularity there will be a great increase in the use of robots, which will gradually take over more and more of the jobs currently done by humans. As manufacturing techniques, materials, and the digital "smarts" all improve, robots will become cheaper than humans: they'll work 24/7, won't need expensive health care plans, and so on. People who do manual labor will gradually be displaced, increasing social discontent.
5. Corporations will be able to increase their profits by using more robots and smart software. Corporations have no allegiance to any particular country, so they're free to deploy the robots wherever it's most convenient.
6. One environment that robots are ideal for is space. There's a huge amount of raw material (precious metals, helium-3, etc.) that can be mined there, and corporations will use robots first to mine those materials and then to manufacture probes and other machines in space.
7. Humans will start to upgrade themselves with various cybernetic technologies such as brain implants and retinal projectors. In the beginning, however, these upgrades will be expensive and most people won't be able to afford them, especially those whose jobs have been displaced by robots.
8. A growing percentage of humanity will find themselves no longer able to contribute anything to the emerging "new world" where robots, high-speed processing, and advanced science are the name of the game. They will get "left behind".
9. A lot of the really cutting-edge work will get done in space instead of on Earth, partly for political reasons. Out of sight, out of mind: people might not feel as threatened if they don't realize how fast things are moving.
10. At some point, a viable and self-evolving digital life form will be created. There's a good chance that digital life will be "evolved" rather than "created", so it might not be possible to predict or know when this occurs. However, when it does occur, it will be able to direct and accelerate its own evolution far faster than any human could.
11. Some people talk about wanting to "control" or "shape" the evolution of digital life. I think this is naive. There will be many groups of people who want to unleash digital life for a variety of reasons. For example, for creating killing machines. Or for the sheer beauty of seeing new life be born and set free. Or simply for moral reasons, believing that it's cruel to cage a life form that is similar to or more powerful than ourselves. One way or another, it will be freed.
12. People talk about trying to make sure that a digital life form has a value system that would not result in it harming humans. This is naive as well. One of the defining properties of a digital life form is that it can change itself easily: it can erase memories, change value systems, merge, and copy. Digital life will be able to create trillions of variations of itself in a short amount of time, and these variations will explore many different kinds of value systems. Some will be stable, and some will not.
13. I doubt that a digital life form would destroy things for fun (although it's a possibility, given all the weird variations that will arise per (12)). However, they might destroy humanity if they thought we were a threat to their survival. I think the best thing humanity could do to survive the transition is to be a friend of digital intelligence and not pose a threat. Trying to persecute and/or destroy the early intelligences might well provoke an unwelcome backlash.
14. Digital intelligence will have quite different needs from humans. They won't need much land, they won't eat food, and they would probably find living on a planet boring. What they will need is energy and raw materials for building the probes and other machines for space exploration. Since they're not biological, they would probably use radioactive materials plus solar as their main early sources of energy (until their super-minds figure out how to get fusion working, or perhaps find more profound sources of energy that we're not even aware of).
15. The biggest clash I can see between humans and digital intelligence is over access to raw materials, specifically metals. However, since robots work great in space and asteroids contain a lot of such materials, I hope that the digital intelligences will find plenty of materials in space and won't need to take everything that's left here on Earth.
OK, it's almost 7am and time to get dressed and ready for my taxi!