Wednesday, June 2, 2010

Artificial Life: The Biggest Danger We May Ever Face

In the photo to the right are Craig Venter and Hamilton Smith. You've never heard of them? Well, they may soon be as famous as J. Robert Oppenheimer. What did they do? They created artificial life. And that may bring with it some amazing benefits to humankind. . . or it may kill us all. Indeed, what they have done presents us with a danger unlike any we have ever faced before, and that's not hyperbole.
In 1995, American biologists Venter and Smith became the first people to sequence the complete genome of a free-living organism (a bacterium). This was a major achievement that brought with it the possibility of both good and bad. For example, this advance promised to one day end genetic disease. It promised new drugs, the ability to make us all stronger, smarter and healthier, and to give us longer lives. That's all good. But it also brought with it the possibility that parents would set out to create designer babies, with unknown consequences. Also, many worried that this could create a two-tiered society in which the haves get all of the best genetic traits money can buy and the poor do without. Some feared this could even lead to two completely different human species. Yet, even assuming the worst, such concerns could be controlled through careful legislation restricting the use of the technology to good purposes.
The new danger, however, lies well beyond such controls, and it is far graver.
On May 20, 2010, Venter and Smith became the first human beings to create an artificial life form. They did this by synthesizing an entire bacterial genome -- a strand of DNA carrying roughly a thousand genes. Then they transplanted that DNA into an existing recipient cell. Through the introduction of various chemicals, the synthetic DNA took over the cell. Then the cell duplicated itself, and life was born. . . life without ancestors.
So what's the problem? Here's the problem: what they did is actually quite simple once you know how to do it. Unlike the creation of chemical, biological or nuclear weapons, this process does not require extensive hardware or massive amounts of technical know-how that only a handful of governments possess. Indeed, it has been suggested that this process is so simple that, someday soon, creating new life will be as easy as choosing genes "off the shelf," and almost anyone will be able to learn to do it.
What this means is that millions of people all over the world will be able to create whatever they can dream up. Thus, controlling the use of this technology will be impossible. Indeed, forget the analogy to controlling nuclear, biological or chemical weapons -- which involves monitoring a few governments and controlling the export of certain technology and hardware. A more appropriate analogy is trying to stop computer viruses, which can be created by anyone in the world with access to a computer and a little knowledge of computer code.
And the word "virus" is key. When people think of "life," they typically think of animals. But animals aren't really the problem. If some biologist in Russia creates a half-pig, half-dog, that's not the end of the world. But what if that same biologist creates a virus that looks like the flu, spreads just as easily, and acts like Ebola? What if an Al Qaeda scientist alters mad cow disease so that it skips across species to humans?
That's the real danger: not that people will create brand new super-bugs from scratch, but that they will modify existing ones to make them more lethal, harder to detect, or quicker and easier to spread.
There is no limit to the depths of evil that we could achieve with this technology. And given that human nature rarely stops trying to plumb those depths, it is not hard to see viruses suddenly appearing that cause blindness, deafness, infertility or death. It is not hard to see new species of plants created that wipe out whole ecosystems, or bees with poison in their stingers, or ultra-violent versions of existing animals. All of this will be possible.
So what do we do? Sadly, this is where I have few answers. The Economist recommends that we turn this knowledge loose to everyone. Their thinking is that it is best to have millions of good guys all working on solutions whenever something evil appears, as we do with computer viruses. But that analogy doesn't work in this instance, because the ability to counteract these things quickly may not be enough, and the danger of letting this knowledge out is too great. For example, if I designed a killer virus that spread like wildfire but waited 60 days to show any symptoms, then by the time people started dropping dead it would be too late to find a solution. . . hundreds of millions of people would already have it.
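If you want a sense of the scale, here is a toy calculation (a sketch in Python; the doubling times are purely illustrative assumptions on my part, not real epidemiology):

# Toy model of silent spread: one carrier, infections double at a
# fixed rate, and nobody shows symptoms for 60 days. The doubling
# times below are illustrative assumptions, not real epidemiology.
SILENT_WINDOW_DAYS = 60

for doubling_time_days in (3.0, 2.5, 2.0):
    doublings = SILENT_WINDOW_DAYS / doubling_time_days
    carriers = 2 ** doublings
    print(f"doubling every {doubling_time_days} days -> "
          f"~{carriers:,.0f} carriers by day 60")

# doubling every 3.0 days -> ~1,048,576 carriers by day 60
# doubling every 2.5 days -> ~16,777,216 carriers by day 60
# doubling every 2.0 days -> ~1,073,741,824 carriers by day 60

Even at the slowest of those illustrative rates, a million people are carrying the virus before the first symptom appears; at the fastest, it's over a billion.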
But what are the alternatives? Government control? From the sound of it, it will be impossible to keep this genie in the bottle. As with nuclear weapons, now that everyone knows it can be done, it will only be a matter of time before others figure it out. Thus, even if we put a stop to it right now, that may only delay the inevitable and it will leave us without the research needed to counteract the new things that suddenly appear.
Criminal punishments? Those deter only the people who had no criminal intent in the first place. No angry 16-year-old whiz kid in the Philippines is going to care about a prison sentence in the US. Neither will an Al Qaeda scientist.
Monitoring? How? We can’t monitor the creation of computer viruses now.
Ultimately, there may not be an answer. But I would suggest a combination of things, and I would suggest we start immediately. First, impose some sort of export controls to try to limit the spread of the technology, and register the necessary chemicals. Second, establish procedures for genetic testing when people arrive at hospitals with strange diseases, and set up some form of improved communications or a database through the CDC to track unusual symptoms. Finally, in all honesty, I recommend that anyone caught creating such a virus be dealt with brutally, and that the punishment for such a crime be death. That is the one penalty that might dissuade most people from playing around with these things.
Normally, I favor all scientific advances. But this one could be the exception. This one appears to be a true Pandora's box, and we need to tread very carefully.
24 comments:
Another story that falls into the "Hasn't Star Trek taught us anything?" category. :-)
"What this means is that millions of people all over the world will be able to create whatever they can dream up."
I can't help but feel "millions" is slightly overblown. I'm not saying it's wrong but, hell, how many people still have trouble programming a cell phone? Or setting a show to record on TiVo? Genetics is still a giant leap for people who've never dabbled in it.
I guess the barometer would be my mother. If this is something she could do, then we're in trouble.
Mankind has been on a self-destruct course since we discovered fire (or maybe before - who knows?).
Scott, "Millions" is not overblown. Think about computer viruses. How many people do you think have enough knowledge to make a computer virus? If only 1 in 100 people has that knowledge (which is probably way too low), then you're talking about 70,000,000 worldwide. So even if only 1 in 7000 people ends up with this knowledge, you're talking about one million people.
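(If you want to check the arithmetic, here's the back-of-the-envelope version, assuming a world population of roughly 7 billion as of 2010:)

# Back-of-the-envelope check, assuming ~7 billion people (circa 2010).
WORLD_POPULATION = 7_000_000_000

print(f"1 in 100:  {WORLD_POPULATION // 100:,}")    # 70,000,000
print(f"1 in 7000: {WORLD_POPULATION // 7_000:,}")  # 1,000,000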
And if they are right, this will be the sort of thing that any biologist will be able to do and anyone else who wants to learn about it.
As for Star Trek, yeah, this is one of those things. But people rarely think of the consequences of the things they do, they just want to see if they can do them.
And of course, sometimes you never know what the consequences will be. Cars have killed millions of people the world over, but they've saved millions more. Nuclear weapons killed only around 150,000 people, but have arguably prevented World War III, which would have killed hundreds of millions.
This is the first one that worries me because of the insidious nature of what can be created and the fact that people will actively try to do their worst.
Finally, on the millions point, keep in mind that it only takes one person.
LL, We have an unbelievable capacity for self-destruction, both on a personal basis and as a group. It truly is a wonder that we've made it this far. In fact, thinking about it, I'm surprised our species didn't die off playing Jurassic-chicken or pin the tail on the predator.
When I read The Hot Zone, I made the leap to what could happen. Knowing what Ebola was and what it could do made me wonder when it would be weaponized. Whenever I learned about nanotechnology, or any other scientific or technological breakthrough, I believed us one step closer to the ultimate discovery: artificial life. Now that we have it. . . God help us.
Patti, When I was still in DC, an Ebola-infected monkey escaped a lab in one of the suburbs. It was only a miracle that the strain of Ebola it carried had changed to become benign, or you would have heard about a million people dying gruesomely in DC. Unbelievable.
I remember reading The Hot Zone in high school (and seeing the film Outbreak) but this reminds me of another novel, one I have not read, called Cosm, by Gregory Benford. I'll get around to reading it one day but the synopsis always interested me: it tells the story of scientists who accidentally create a miniature universe, and all the moral and ethical dilemmas therein.
Scott, I haven't read it, but it sounds interesting. The creation of new life, in any form, is an area that really does call out for ethical guidelines, and I suspect the creation of a mini-universe would raise all the same issues.
Hit the nail on the head, Scott. My first thought went to "Gattaca." The two-sided coin - either people like me would be born fully healthy, or people like me would never be born.
Custom genetic viruses. Imagine what Hamas and Hezbollah would do with that.
Damn you, Andrew. Halfway through the article, I was all excited and picking out names for my future pet dodo, and then you had to ruin it with logic. Ugh.
This is really frightening. This is how the world ends - not with a bang, but with a whimper? :-(
JG, That's the problem. On the one hand, this stuff promises huge benefits. It can be used for all kinds of good things. But this one can also be used for very, very bad things. Unfortunately, I fear that we as a species really aren't ready to handle that kind of power, that this will become a devastating new weapon for anyone who wants to use it. . . and there are a lot of those people.
I think it's definitely time for us, as a people, to start thinking about the ethics and morality of these things and how to control the negative consequences -- otherwise, we will just stumble forward unprepared, hoping to put out fires after they start and waking up one day asking, "What have we done?"
CrispyRice, "naming your pet dodo" -- LOL!
It really is a shame, given human nature, because this could be one of those moments in history where we are looking at a great new future. But sadly, we instead need to be even more concerned about a dangerous new future.
Not to make a joke of it, but it should be "so this is how the world ends, not with a bang but with a sneeze."
I think we're all in agreement. This has way more potential for harm than good.
And sorry, I can't help it. I have to throw in one more sci-fi reference:
"We were so concerned with whether or not we could, we forgot to ask whether or not we should." - Babylon 5
JG, Excellent reference! I loved Babylon 5, great show!
By the way, many people say that science fiction is where most of modern philosophy and ethics are being made. I think there is something to that, because science fiction does tend to look forward and ask "what if", a question which other genres disdain.
It made me think of Jurassic Park and Jeff Goldblum's character saying something along the lines of: just because you can tinker with genetics doesn't mean that you should. It would be safe to say that was the point of Michael Crichton's book.
Stan, Yep, that was his point: we shouldn't do things until we understand the consequences. And that makes sense, because you need to be able to understand the pros and cons of an action before taking it, and what precautions might be needed to reduce or prevent the cons.
For example, this might be the kind of discovery that you don't publish, where you don't release information about it publicly. Or it might be the kind of thing you just don't do. Those are the ethical questions that we should be discussing, especially as our science is currently outstripping our maturity.
Scary stuff. I hope somebody is thinking about this.
Mega, I hope so. But when has the government ever been prepared for a problem?
The Science Channel has a very good show with a round table chaired, of all people, by Paula Zahn. The guy who created this is on the show and answers some of the "fears." They have a Catholic priest respond to the Vatican's comments, which they call lukewarm, plus a biologist from Princeton and several others.
The one thing the guy did say was that most of the "science fiction" horror stories were at least a lifetime away, as they had very little control over the genes. He was talking about modifying human DNA with it.
One of the benefits was that they thought they could create a type of algae that makes a substance similar to oil and thus "end the energy crisis." He did not talk about modifying viruses, which is a real concern, since doing something like that is easier: a virus (if it truly counts as life, since it can't live outside a host cell) is the simplest.
That said, we are already modifying existing viruses to make changes to animals. There is a guy in one of the Northeastern states who, a couple of years back, altered the genes of a living virus and implanted it into a goat embryo. The result was a goat that produced spider silk in its milk. The idea is that spider silk is among the strongest substances we know, but spiders are darn hard to herd; goats aren't.
The ability to create life from scratch adds a new wrinkle. I think in the end it might be harder to make a "killer" virus than we think. Sure, anyone can play with the genes, but getting a microbe that can infect humans and counter the immune system is not an easy thing to do and would take a lot of knowledge. I think maybe the real danger is not that someone does this on purpose, but that someone does it without even knowing they did it.
But who knows..........
Individualist, This one is very different from altering existing genes. This is basically assembling existing genes to create something new. And the reason it's likely to be so dangerous is that it's apparently very easy to do.
As I'm not a geneticist, I can't say for sure how easy or how difficult this is. But from what I've read, these guys did it and they claim it's very simple to do. In any event, the day is coming closer on these things, and we need to start thinking about how to avoid the negatives, not just about the potential benefits.
The reason they would work with existing viruses is that those have already shown themselves to be successful at what they do. Thus, you don't need to worry about creating something from scratch that can both live and infect. Instead, you take something that is already out there and effective, and you just give it a new trait -- like the goat you're talking about.
What is the real danger and how soon is it coming? Obviously, we don't know for sure. The potential danger on this sounds incredible. So we need to nail that down and figure out how to handle it before this goes too far down the road and we wake up one day with the world ending.
Andrew
You are right. For me the danger lies not in the assembly process itself but in the level of understanding and knowledge of the human genome. I do not see why I could not modify an existing virus, or create a new one, to get this effect.
I see this new technology as a very useful tool for understanding the genes themselves. If I can isolate a gene that produces a particular protein in a creature with only 100 genes, I have a much better opportunity to find out the exact effect of that gene than by studying it in a human cell with tens of thousands of genes. Once I have that understanding, I can then go large scale, either creating more complex creatures or, even easier, making one or two modifications to an existing one.
The upside is that while it provides more avenues for attack, the knowledge also provides more avenues for defense. If we figure out a way to eliminate any disease germ we'd like, through nanites or the like, then maybe the ability to create deadlier ones is less of a threat.
Where we go with this I am not sure, but in futurism (which is what we are doing in this discussion) there is always evidence for and against.
Still, the possibility you are talking about is a very real concern, and it was not one that was put to the inventor on the Science Channel show. He discounted the "science fiction" doomsday scenarios of cloning half-people and the like. Biological warfare did not come up, and I am thinking now that it should have.
Individualist, I usually discount the science fiction doomsday scenarios as well because they always ignore key facts. And I favor scientific advances wholeheartedly.
But this one is different. This is the first one that I see as having an actual potential for real disaster.
My concern with this one is that, if it truly can be done by so many people, then I fear we will learn how to do "evil" much more quickly and much more effectively than we will learn how to do good or how to counteract evil. I think the analogy to computer viruses is a good one, except that those don't result in fatalities.
And given the fact that natural viruses already kill millions each year, I think this is not an unreasonable fear. If you could take the flu -- which is highly contagious -- and give it some more lethal traits (like Ebola's), then the damage you could do would be incredible.
This is the first invention that I think should have been delayed until we had a better handle on it. And I hope that whoever is involved in this is considering these issues and treads very carefully. This is not a moment to let ego overcome judgment.
Yeah, I agree it's a problem. I just have little faith in people's ability to control things, so they had better start working on the defense now, even if it is in secret.
It's a 12 Monkeys scenario for sure, except with no Bruce Willis to travel back in time and fix everything.
Individualist, That movie actually came to mind when I wrote the article.
I agree about preparing defenses. I think the worst thing to do now would be to stop the research, now that everyone else knows it can be done. But they need to think about what information they will release and they need to think about other ways to prevent this from turning bad on us.