"The singularity is the point at which artificial intelligence reaches the capacity of human intelligence and we have some combination of the two," says Paul Hoffman, chairman of BigThink.com and moderator of a panel about the film at the 2010 Woodstock Film Festival. Kurzweil elaborated: "The core concept of the singularity is the exponential growth of information technology."
Basically, the singularity is the merging of man and machine. Should we be worried about this? Is it good for artificial intelligence to outpace the human intelligence that created it? Will the singularity bring a dystopian, Matrix-like world? Will humans have to fight intelligent machines like those in the film Terminator Salvation?
In Woodstock, Kurzweil reflected on the dangers of technology: "On the one hand, this holds the key to solving the major challenges of humanity, because it’s really only the scale of this exponentially growing technology… that can solve the major challenges we have… energy, environment, disease, longevity, poverty. On the other hand, we’ll have new bases that can be abused. I believe the power of exponentially growing technology is inexorable. But what we do with it—whether we actually do apply it to these major pressing challenges on the one hand, and protect ourselves from people who willfully abuse them on the other hand—that’s very much in our hands."