Wednesday, July 24, 2013

The Singularity: An Uncertain Future

The Singularity is approaching, and there is no way of telling what it means for humanity. The singularity is a theoretical point in time when computers become as smart as humans, and then even smarter. The idea draws on an observation that originated in the 1960s, widely known as Moore's Law. Moore's Law states that the number of transistors on a chip doubles roughly every two years. This creates an exponential increase in computing power, and transistor counts have historically followed this trend and will probably continue to for some time. When computers reach this point, they will not automatically be as intelligent as us, because they will still need to be programmed. But once they are programmed with a fully functional artificial intelligence, no one can tell what will happen. Some people believe that robots should be programmed with a variation of Isaac Asimov's Three Laws, which state that:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

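The doubling described by Moore's Law above is easy to see with a little arithmetic. Here is a minimal sketch that projects a transistor count forward, assuming a clean two-year doubling period (real-world progress is messier, and the starting figure for the Intel 4004 is just a convenient historical reference point):

```python
def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming it doubles
    every `doubling_period` years (the idealized Moore's Law)."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Example: the Intel 4004 (1971) had about 2,300 transistors.
# Projected to 2013, the year of this post:
count_2013 = projected_transistors(2300, 1971, 2013)
print(f"{count_2013:,.0f}")  # on the order of a few billion
```

That projection lands in the billions, which is roughly where flagship chips actually were in 2013, showing how well the trend has held up so far.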
