The Cylons Cometh: Human-Machine Hybrids and our Impending Immortality

BlogHer Original Post

Calculating the rise and fall of science fiction books, television shows, and movies, I've determined the obvious. Science fiction is no longer easily dismissed as a distraction for geeky misfits or as fanciful tales for children, and that may be because, over the years, the world has watched science fiction become science fact.

So, here I am at 50, a Star-Trek-Twilight-Zone-Outer-Limits-Lost-in-Space-fed child of the 1960s. When I finished high school in the 70s, universities anxiously pitched computer science to graduates with the right test scores, hoping potential recruits could be drafted into the future. My generation may be part of the reason television's been pushing out science fiction shows -- the retired Lost; Fox's Fringe; ABC's FlashForward, which has been cancelled; and the return of V and Battlestar Galactica. The last on the list has given birth to a prequel, Caprica.

[Image: Battlestar Galactica Cylon, by kodiax via Flickr]

My generation grew up on television, pressed the on-buttons of the first personal computers, made playing video games the cool thing to do as we nursed our Pac-Man addictions, and passed our growing dependence on technology on to our children, who flock to movie theaters jonesing for special effects and silver-screen spectacles that make them believe not only that Superman can fly, but that they can too. They dream it into their visual arts, dance, and music, and want so much more.

My daughter, 29, is working on a novel about a female general in a matriarchal society, and I am working on a novel about humans in peril on another planet. She and I had a discussion a few months ago about technology. I said, "I look at the world and see gadgets that 30 years ago would have been called impossible. We're seeing products and events today that were the stuff of science fiction in the fifties and sixties. I mean, I remember when the big deal was to have a car phone, a big clunky thing that cost too much and had to stay in the car. Here we are with cell phones almost anyone can purchase that are small enough to lose in a purse. Cell phones are like the size of Star Trek communicators."

She nodded. Living with me, she's watched me skip popular drama television to watch documentaries that plod along about evolution or the mutation of the Y chromosome or the coming age of nanotechnology. She's joined her brother and me to watch films featuring Stephen Hawking, not the popular stuff he's done lately, but the old stuff where a narrator's dry voice tells of black holes and Schrödinger's Cat, and where interviews with theoretical physicists suggest Hawking is not as bright as less popular physicists. She's been there when my son, now 19, has looked from the television to her to me to his own body and exclaimed, "OMG! We're a nerd herd."

Consequently, she was not surprised when I came home with a copy of the May 22-28 issue of The Economist with the cover "And Man Made Life: The first artificial organism and its consequences." Its editorial piece on that article is subtitled, "Artificial life, the stuff of dreams and nightmares, has arrived."

TO CREATE life is the prerogative of gods. Deep in the human psyche, whatever the rational pleadings of physics and chemistry, there exists a sense that biology is different, is more than just the sum of atoms moving about and reacting with one another, is somehow infused with a divine spark, a vital essence. It may come as a shock, then, that mere mortals have now made artificial life.

Yes, my dear Virginia, humans are near designing artificial life on laptops from the comfort of their homes. As for the nightmare part, all I can say is think of the possibilities for larger creatures. Think of that movie Splice.

Science, Technology, and Bioethics

I talked to blogger Yvette Perry about this latest advance. In her own words, here's Yvette's background:

In my current job my research focus is research ethics, in particular how to educate and train graduate and undergraduate students in the responsible conduct of research. My educational background includes a graduate minor in Bioethics. In my PhD program I focused on the ethics of assisted reproductive technologies and adoption; my dissertation concerned the beliefs about genetics held by adoptive mothers.

Naturally, she's been keeping up with our brave march toward creating artificial life. Speaking specifically of the news in The Economist, I asked a few questions:

Nordette: What are the benefits to humans and what are the dangers of humanity's new frontier of creating life in the lab, even microbial life? What are the ethical concerns?

Yvette: Some of the articles about this do a pretty good job of summarizing some of the concerns. Life--as anyone who has seen Jurassic Park knows--always finds a way. This fact will complicate our ability to see into the future and be able to say what the consequences (intended and unintended) will be for synthetic life.

In talking about cases like this there is a danger of hyperbole -- both in discussing the dangers and the benefits. A greater danger is that people who fall on both sides of that fence are not as scientifically literate as may be required for us to have fruitful conversations and debates on these topics.

Often popular media doesn't help matters. For example, I've read MSM reports that seem to give a picture of a couple of scientists whipping up a new life form. Kind of like Tony Stark in the new Iron Man movie creating a new element by himself in his basement lab out of some duct work--all in the space of a 2 or 3 minute music montage. In this case, though, the research team was huge (over 20 people are co-authors on the article), and they've been at work on this since Bill Clinton was in office. And the cost has been reported at something like $40 million! In addition, a new "creature" was not created, as some people seem to have in their minds. Instead, a new genome was "built," it was inserted into the *cells* of an existing "creature" (Mycoplasma mycoides bacteria), and then the organism took directions from the new "instructions" and was able to successfully replicate.

All this is not to say that ethical worries are unwarranted. Just that a better-informed public might be better able to discuss them.

She's correct. The Economist editorial acknowledges:

... only the DNA of the new beast was actually manufactured in a laboratory; the researchers had to use the shell of an existing bug to get that DNA to do its stuff. Nevertheless, a Rubicon has been crossed. It is now possible to conceive of a world in which new bacteria (and eventually, new animals and plants) are designed on a computer and then grown to order.

Science analysts attribute much of this rapid growth to computers and the Internet, which let us share more information more quickly, and some researchers see the melding of human with computer as the logical next step.

Enter the Terminator or Neo, Our Savior

This kind of new era science -- growing furniture to order, redesigning our genome to extend life, merging our brains with machines for enlightened brilliance -- is also the stuff of a recent New York Times article, "Merely Human? That’s So Yesterday." It begins by telling readers about the BrinBot, a rudimentary robot commanded by Sergey Brin, the co-founder of Google, that maneuvered through a room of students at Singularity University miles away from Brin himself. The robot "consisted of a printer-size base with wheels attached to a boxy, head-height screen glowing with an image of Mr. Brin’s face."

Singularity University, per its website:

... is an interdisciplinary university whose mission is to assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies in order to address humanity’s grand challenges. With the support of a broad range of leaders in academia, business and government, SU hopes to stimulate groundbreaking, disruptive thinking and solutions aimed at solving some of the planet’s most pressing challenges. SU is based at the NASA Ames campus in Silicon Valley.

It was founded, in part, by the brains behind Google. According to the NYT, the nine-day course where 40 people observed the BrinBot cost $15,000.

The people mentioned in the article are a who's who of high-tech, science, and business, and even include someone from the Department of Defense. One of the people mentioned often is Raymond Kurzweil. As I read about him and his plan to live for 700 years and resurrect his father from the dead, I realized that I'd heard of him before. I recalled that he, or something he'd done, had been featured on Oprah, so I looked him up, googling his name with hers. He was featured once on Dr. Oz's radio show in 2008 and also in O Magazine in 2007, in a piece on six beautiful minds, modern-day geniuses.

Although Kurzweil wasn't on Oprah's show, his concepts appeared in some form in a few segments about eating for longevity -- make that eating less, or taking supplements, to live longer -- and high-tech ways to extend your life.

Singularity? You watch Star Trek; you know that word. As I read about Singularity University, I thought its goals encompassed all aspects of the word's definitions. Its students are unique. Its professors speak of life extension, taking us to a point where life seems infinite and humans manipulate biology at will. At night, while the group drinks wine and snacks on popcorn, it discusses science projects that only those with the intelligence of a gifted mathematician may grasp fully -- such as what happens, practically speaking, when the derivative of humanity's journey, treated as a mathematical function, becomes undefined, incomprehensible, which is sort of like the lab blowing up or our reaching the event horizon of a black hole. (Ow! That hurt my head.) Google founders, Google partners, people who've co-founded companies like PayPal and invested in Facebook have these conversations and take these classes. And I thought the old EPIC prognostication was spooky.

I read more and learned that some of these people anticipate the demise of the rest of us, and that the timeline for the Singularity's emergence is 20 years from now, around 2030, with a future of haves and have-nots run by a superior intelligence, a human-machine hybrid. And I start to think these brainy people didn't get the Cylon memo from Battlestar Galactica survivors, and totally ignored what Sarah Connor had to say about Skynet and the warnings from Neo and Morpheus in The Matrix. And those biologists working with them so did not read the "Patternist" series by Octavia Butler or Mary Shelley's Frankenstein! Or maybe they did.

My mind floods with all the monster movies where the hero screams, "Noooooo!" at the Mad Scientist. They're trying to end the world, I think, but the adventurer in me thinks maybe not. Still, we've seen the movies, read the books. Do these trips into science ever end well?

I see benefits, but can't settle my disbeliever bone that rattles at the article's tidbits:

Devin Fidler, a former student, is in the midst of securing funding for a company that will build a portable machine that squirts out a cement-like goop that allows builders to erect an entire house, layer by layer. Such technology could almost eliminate labor costs and bring better housing to low-income areas.

I sigh, "Well, that's good. Those eliminated laborers will need that low-income housing since they will have no jobs."

However, in their midst, they have an insurance man as well, "Guy Fraker, the director of trends and foresight at State Farm." He's interested in prevention, such as the possibility we can one day turn hurricanes away from cities. He's in the business of "cleaning up the mess of unintended consequences," he said. Given our present Gulf Coast BP oil disaster, I thought of externalities and what may die on the altar of human innovation.

Which brings me to the life work of Sonia Arrison, who the NYT tells us is a Singularity University founder and the "wife of one of Google's first employees." She's writing a book that, with the University's work, she hopes will prepare us -- I mean the rest of you humans -- for our inevitable leap in evolution.

I say the rest of you because unless her associate, Kurzweil, comes to visit me personally to redo my health regimen or upload my brain into a computer, and Jason Bobe then sends some of his friends to decode my genome and then hooks me up with folks at Amgen or some whiz kids in a lab redesigning genes "on the cheap," I don't anticipate being around here in 20 or more years, at least not in any shape that makes me a suitable guinea pig.

And yet, I recall a conversation with a domestic violence counselor in 2003. She told me to stop talking like I was going to die just because I'd received bad news about the state of my kidneys. I thought I was being practical to think in terms of not being around, which would lead me to plan for the inevitable and for possibly leaving my children sooner than I hoped.

She said, "You have to be positive. I don't want to hear you talking in terms of not being around. For all you know, with all the advances in science, they could be growing kidneys in the lab by the time you need one."

It makes sense that she would think that way. We were in New Jersey, pharmaceutical alley, with biotech companies standing by. And while the bottom had fallen out of the dot-com industry and Lucent had been wounded, technology and science still bloomed, marking the air with their scent.

Rise of the Singularity People

I couldn't resist that subtitle, Rise of the Singularity People, because we have arrived at the place where science fiction seems wacky no more and ancient prophecies surface, making us pause. If you read the New York Times piece, you'll see that Singularity proponents are akin to religious zealots spreading their gospel. In fact, some of them, per the NYT article, even voice what fundamentalist Christians in particular could perceive as a threat to their existence:

“There are enormous social and political issues that will arise,” Mr. (Richard A.) Clarke (a former head of counterterrorism at the National Security Council and novelist) says. “There are vast groups of people in society who believe the earth is 5,000 years old. If they want to slow down progress and prevent the world from changing around them and they engaged in political action or violence, then there will have to be some sort of decision point.”

A decision point? The article steers away from the doom and gloom of cultural and religious warfare back to Kurzweil's positive vision of faster computers and greater minds helping us solve all the world's ills with science and technology.

It sounds like pipe dreams, but I remember reading about a time not too much before 1977, when I rejected a scholarship in computer science, that a man said "there was no reason for people to have computers in their homes," and an engineer challenged the value of microchips. Skeptics on the cusp of new ages have always chuckled at the "dreaming idiots" in labs.

My father lives with me as well, an 89-year-old World War II veteran who shares stories about days gone by. As I watch this new day glimmer on the horizon, I recall a story he tells about his youth. I've got to record him soon telling how cars once had no glass windows but curtains, how you could dislocate your shoulder cranking one up, and how, without windshield wipers, drivers in his hometown of Vacherie, La., used to rub half an onion on the glass to deflect raindrops so they could see through the rain. And he also tells this story:

I remember when we started hearing about automatic transmissions on cars. We said it would never happen. How can a car be as smart as a man and change gears by itself? That was something.

I talk to my children about this world and how the future of humanity seems to hang in the balance not by decisions from political leaders but by scientists and engineers working quietly in white coats or in jeans at laptops. We have discussed The Immortal Life of Henrietta Lacks and how some humans are willing to ignore the suffering of other humans for the sake of research, as in the case of the Tuskegee Syphilis Study, which gave us the medical ethics doctrine of informed consent. Yvette said that we didn't learn as much as we should have from that case:

... especially when it comes to "vulnerable" research populations. For example, there are still issues even today with research being conducted on children (often Black and Hispanic) in foster care, and with research participants in developing nations.

I'm back to thinking of that conversation I had with my daughter. We talked about speculative fiction, and I said we may be seeing more books centered on magic and the supernatural because technology is advancing so quickly that there's hardly anything that can be written or put on the big screen that we haven't seen or don't suspect is just around the corner.

Watching a documentary once, I heard that rapid advances in technology became one of the challenges of the Star Trek franchise on TV. While it was once the pseudo-prophet of a fantastic future, its writers found themselves hard-pressed to come up with anything, hundreds of years in the future, that science didn't indicate was merely years away, the downscaling of NASA programs notwithstanding. Consequently, I told her, we look to magic and the supernatural again to offer us the mysterious, the Harry Potters and Twilights.

"Yes," she agreed, "Mom, the more technology advances, the more it looks like magic too, and we have to dream beyond that."

More:

This video shows Kurzweil, an inventor and multi-millionaire, explaining "the coming Singularity," both the upside and downside. According to the NYT article, some "Singularitarians aren’t all that fond" of the 62-year-old futurist. They think he's "hijacked the Singularity term." He also has a film, Transcendent Man, that delivers the message "Prepare to Evolve." Captain Kirk (William Shatner) follows him, and singer Stevie Wonder says Kurzweil has "a gift that he's been given and he uses it for the betterment of mankind." However, talk about strange bedfellows, Glenn Beck likes him too.

Nordette Adams is a BlogHer CE & you can find her other stuff through Her 411.
