Sunday, May 02, 2010

Technology and the Human Brain

The Singularity. What exactly is it? I view it in the broad sense: the point at which technology progresses so fast that change becomes almost instantaneous. Will machines replace humans, or would we make the selfish choice of uploading ourselves into the machines and ceasing to be human at all? Could we advance robotics far enough that, between the circuitry and the uploading, we would redefine what is human? At some point, could the human race upload into a collective consciousness and leave behind the corporeal world? Would we do it? Would we take the time to make a sound decision? Would it be right?

This is the problem I see coming. I do not see the Singularity happening in my lifetime (unless we improve medical tech enough to stretch it to 150 years). Even if it did, I doubt humans would handle it well. We would probably make a bad move and become extinct. I say this because technology appears to have moved at a rate far faster than human beings can properly handle. I am no Luddite, but simple things like the rise of Internet addiction within 15 years of the Internet's arrival support my point. Turning back the clock, did we make the proper determination about the development of our metropolitan areas when automobiles and trains were vying for transportation and development dollars? With medical or edge-of-science tech, have we always asked 'should we do this' along with 'can we do this' and 'is there a market for this'?

While an ethics-versus-science debate is nothing new, I feel it is a debate in which 'can we' usually wins over 'should we.' For anyone who argues that the crazy Christians have blocked stem cell research: you're wrong. Stem cell research happens every day; it was federal funding for stem cell research that the crazy Christians blocked. I understand the profit motive for research, as there needs to be a return on the initial money invested by the funding agent (our POTUS doesn't understand this). My fear is that we humans, with our hardwiring from millions of years of evolution that has had "civilization" imposed on it for only a few thousand years, have barely kept up with today's technological advances and may have made the wrong development decisions from time to time. What will happen as the rate of change increases to the point of almost daily decisions by scientists, research teams, or people in positions of power? Who will they have in mind when they decide, and what interests will they represent?

An example I find frightening, but indicative of the future problems we might run into, is the use or abuse of pharmaceuticals in the mental health industry. I'm not a psych expert, but the battle between biological-solution psychology and old-school work-through-your-problems psychology seems like a good duel. Were the science and medicine just easier and cheaper for people than dealing with the emotions, seeking professional help, and putting in the hard work to progress? Did pills become a great parental weapon against unruly, sad, or hyper children, used instead of direct contact with their kids? Long-term effects will eventually trickle out, class-action lawsuits will make lawyers rich, and a sliver of the people harmed by the reliance on chemicals rather than the hard work of therapy and contact will have permanent problems. That was a combination of the path of least resistance and science figuring out a chemical solution that mimics a natural process. Maybe I am overgeneralizing here, but it's an example of why I think we humans will have a hard time adapting to the Singularity or will make poor decisions as it unfolds.
