The Misnomer of the Technological Singularity

Recently, I have come across the term technological singularity in my journey through cyberspace (e.g. the Artificial General Intelligence Society, the Association for the Advancement of Artificial Intelligence, and Humanity+). Essentially, the technological singularity is a theoretical point in the future at which technological progress and artificial intelligence advance so rapidly that they dwarf all other intelligent life, transforming the world beyond anything humans can predict or fathom. A more comprehensive description of the technological singularity, along with who coined the term and who popularized it, is available here:

Technological Singularity

This scenario doesn’t sound that scary, right? Yet something very ominous and forbidding is implied when a technology is likened to a gravitational singularity. In astrophysics, a singularity is a point theorized to have infinite density and zero volume, such as at the center of a black hole.

If one takes the more sensationalistic doomsday view of the technological singularity, then it might be a point in the future when all intelligent life is swallowed up by artificial intelligence and reduced to fiery smithereens in an apocalyptic accretion disc of death and destruction. Going by this interpretation, one might be led to believe that humans are incapable of foreseeing the consequences of their own actions. It would be the equivalent of telling a child, “Yes, you can go build yourself a robot, but you may not play with it because it might hurt you.” Inherent in this is a deep-seated notion that people should be afraid of what they are capable of creating, and hesitant to master the tools at their disposal.

On a side note, after spending the better part of a week busting out a book proposal, revising a query letter, editing half a book, and, just yesterday, finding the first literary agency I want to submit to, I discovered that they are on vacation from December 23rd until January 6th [ *Grumble Grumble* ]. Fair enough, though; everybody needs a holiday. It’s taken me eight years to finally get around to making this book happen, so another two weeks won’t hurt it. Still, I wonder where all of those artificial literary agents are hiding and whether they charge an arm and a soul…hmm.

Bettering a Mouse Trap Builder

Writing one’s first blog post is usually fraught with nerves, as it’s the standard against which all other blog posts will be judged. What course should one take? Lowball it and hope people will be impressed by one’s improvement, or aim high regardless of whether inspiration will continue to strike like clockwork? Personally, I prefer stargazing to puddle gawking and sandbagging.

As I start to edit my manuscript for this new discipline, Affect Engineering, I can already hear the technophobes yelling at me, “Why would anyone want to invent a math equation to model emotion? That’s just like asking for the androids to come in and replace us!”

To them, I would say, “The androids would be too late.” People must always be the masters of their own tools, or else they risk being dominated by them. A robot is a tool built by people. Emotions are tools most of us come ready-equipped with but never learn how to use.

I am not building a better mouse trap. I am bettering a mouse trap builder.