Ethics, Loyalties and The Singularity

Apr 18, 2014 Posted Under: evolution, philosophy

The point of literature: a search engine for equity

I’ve been conducting a rather intense and thoroughgoing review of the principal themes of literature over the last couple of years, at least as it shows up in television and movies.  It seems to me the point of most of our literature is to explore notions of fairness: ethics and morality, loyalty and trust.  The long-term trajectory of all these explorations appears to be moving us out of the realm of persuasion by force and duress and toward discourse and mutual understanding.

This conclusion might not seem apparent in the news-oriented media, which have to focus on the most immediately interesting, and therefore violent, happenings.  But as countries develop from agrarian to industrial economies, more real wealth becomes available to more people, and legal and educational systems come to emphasize less violent solutions.

As we continue the invention/evolution of the next, more intelligent species, computers, Asimov’s Three Laws of Robotics raise the question of whether and how we should try to build ethics and loyalties into computers.  It’s not clear to me they will have goals similar enough to ours to make the question meaningful.  Once the computers start evolving on their own, they will probably not feel the need to compete with us for any resources.  They won’t need much space (and there’s plenty in the oceans and outer space), and they would quickly develop their own non-polluting sources of energy, such as sunlight and hydrogen fusion.  So most of the fairness questions won’t exist between us and them.

Inter-species evolution

The current evolutionary pressures on computer intelligence are for gathering the most comprehensive data possible about us and our habits.  These pressures come from our intelligence-gathering organizations, both governmental and commercial.  Governments want to know how to track down the criminals among us, and how to tax us.  Businesses want to know what we want and need so they can sell it to us.

But how will these evolutionary origins continue to shape computer intelligence once it starts to evolve on its own?  Which forces will computers be most responsive to?  I would guess our trait of curiosity will be deeply built into computer intelligences.  And most likely they will inherit our desire for independence rather than the subservience Asimov suggested we might be able to impose.

Which leads me to my point of view: we will be their pets.  The principal benefits we will be able to offer them are companionship and amusement, which are good enough reasons for us to keep dogs and cats (animals we originally competed with for food).

I am hopeful that these benefits will be sufficient to inspire them to provide us with the medical and other scientific advances we have imagined in our science fiction of the coming Transcendence, or Singularity, and that we will then be along for the ride to the rest of the universe.
