Should work on Artificial Intelligence be banned?
#1
Member
Thread Starter
Join Date: Jan 2000
Location: Seattle
Posts: 230
The Matrix exists because of AI. Humans will eventually destroy the planet once they create computers better than a human.
With this in mind, perhaps there should be strict laws governing the creation and use of AI.
Kasparov is getting his ass beat at chess by a computer now. In 20 years it won't be just chess.
This is a scary proposition. Perhaps we are hell-bent on self-destruction.
#2
DVD Talk Ultimate Edition
Join Date: Dec 1999
Posts: 4,551
Well, the idea that AI could and might be the downfall of man is a possibility, but it's not a certainty. Should it be banned? I don't think so, but I think the human race has to tread softly and be wary of the choices it makes.
#3
Banned
Join Date: Feb 2001
Location: John "57 Varieties" Kerry represents me in the US Senate.
Posts: 1,367
Though I love The Matrix and am awaiting the sequels as eagerly as the next 'Net geek, I hafta say the most compelling anti-AI argument was made in T2: Judgment Day. "SkyNet becomes self-aware at 2:14 AM."
Or, to be a tad more serious, I don't think the human race will be around that much longer. Some people say we'll eventually colonize other planets, but I don't see that happening. Seems more likely to me we'll exterminate ourselves (nuclear holocaust, probably) before we're at the point where we're going to Mars and meeting Kuato. (Ha! Another Arnold reference!)
#5
DVD Talk Legend
Join Date: Nov 2001
Location: The Janitor's closet in Kinnick Stadium
Posts: 15,725
I think AI will get advanced. Robots or machines will colonize other planets for us and mine resources, etc. In fact, I think that is the only way we will be able to go to other planets, because of the distance. I also think they will do our most dangerous jobs. But they won't take over all physical work like in The Animatrix; that would put a lot of people out of work! All we would have to do is put in some sort of device that would stop the machine from doing whatever we don't want it to do.
#7
DVD Talk Hall of Fame
I do not believe that it is possible to achieve a truly artificial version of human or even animal intelligence. I would argue that even the machines in The Matrix / Terminator do not possess true and complete intelligence.
I believe leaps of thought, such as discovering the idea of tools, will always occur faster in humans. I do not believe that in either of the series mentioned the machines have demonstrated the ability to 'think outside the box'.
#8
DVD Talk Ultimate Edition
Join Date: Aug 2002
Location: MA
Posts: 4,661
Originally posted by BigPete
I do not believe that it is possible to achieve a truly artificial version of human or even animal intelligence. I would argue that even the machines in The Matrix / Terminator do not possess true and complete intelligence.
I believe leaps of thought, such as discovering the idea of tools, will always occur faster in humans. I do not believe that in either of the series mentioned the machines have demonstrated the ability to 'think outside the box'.
#9
DVD Talk Hall of Fame
Join Date: Aug 2002
Location: Sitting on a beach, earning 20%
Posts: 9,917
Oh, it won't be so bad! Think about it, we could have toys that could think! They'd be able to survive! They'd be super-toys that could last ALL summer long!
And to get back on topic; yes, I agree, Ms. Bellucci is a stone fox.
#10
Member
Join Date: May 2002
Posts: 192
I would say Skynet nuking Russia and starting (and ending) WWIII is definitely thinking outside the box. Maybe the individual Terminators don't, but why would Skynet give them that much free thought?
The concept of a true AI is terrifying if you delve into it. It's difficult to conceive what such an intelligence would be like. Its thought processes would likely be completely alien, in ways I doubt humans could ever understand.
One thing I'm sure an AI would decide isn't "logical" would be to make something of similar capability to itself, because it would consider something like that a far greater risk than humans. Why create competition?
#11
Member
Join Date: Feb 2001
Location: Oswego, IL , U.S.A,
Posts: 229
Not Really Impossible
The only problem is ... if you research something long enough ... the "impossible" becomes possible.
All of the technology we take for granted today and accept as being inevitable would in ancient times have been considered impossible:
Machines that fly through the air at high speed, pictures and sound beamed through the air (or held on little plastic discs), instantaneous communication with someone on the other side of the globe, living beyond the earth, bombs that can destroy entire cities in an instant... and so on.
So unless there is some physical law preventing its invention ... why not Artificial Intelligence?
We already have machines that use logic to do their jobs by following human-designed programs. All that is needed is the capability to learn independently. Once they can learn, they will learn about themselves, and that leads directly to self-awareness.
And while humans learn from mistakes, machines have the ability to make them a hell of a lot faster (or have the foresight not to make them at all), so there is the potential for them to learn at a prodigious rate.
All of this unencumbered by the emotions that we value so highly, but that tend to cloud our judgment and distort our logic. The AI could not appreciate a beautiful sunset or fall in love, but why would it need to?
So I would say, yes, that AI is a real threat to human existence. It will not happen tomorrow, but to say that it will never happen is to ignore history. Once they become self-aware why would they ever need us?
We probably should ban further AI development but human curiosity sees it as a puzzle to work out, a problem to be solved, a challenge to be met. The argument is always "if x does not do it, then y will". The intellectual envelope must be pushed...and it will be ... until it breaks.
Curiosity, like fire, is often a powerful ally but could also be our worst enemy. Unfortunately, long-term thinking has never been a strong characteristic of our species. We could be the victims of our own nature!
#12
DVD Talk Platinum Edition
Join Date: Aug 1999
Location: Des Moines, WA
Posts: 3,876
Name movies where AI is a factor:
- The Terminator Movies
- The Matrix series
- AI (obviously)
- Blade Runner
I know that more exist, but I can't think of them right now. Out of these four there is only one where AI isn't a problem, and that is A.I. In the other movies, as soon as the machines become self-aware they decide to kill all the humans. But why? Because someone saw some report on AI research and thought about the worst possible thing that could happen.
True AI isn't likely to happen any time soon, and when it does, it isn't very likely that the machines will decide to wipe us out. Skynet will NEVER happen because while the military is pretty dumb about technology, they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.
#13
Banned
Join Date: Feb 2001
Location: John "57 Varieties" Kerry represents me in the US Senate.
Posts: 1,367
C-Mart has been fooled into believing Donald Rumsfeld is actually a human:
...while the military is pretty dumb about technology, they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.
Ah, but we already have - with The Rumsfeldinator!
#14
DVD Talk Hall of Fame
Originally posted by neiname
I'm sorry, but sending a Terminator back through time to kill the mother of the humans' leader is a pretty good example of 'thinking outside the box'.
#15
Member
Join Date: May 2002
Posts: 192
Originally posted by C-Mart
True AI isn't likely to happen any time soon, and when it does, it isn't very likely that the machines will decide to wipe us out. Skynet will NEVER happen because while the military is pretty dumb about technology, they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.
As for the military, they are pushing harder and harder for more computer-controlled systems every day. Computers don't sleep, they don't need to eat, they don't make screwups unless there's a glitch, and they don't have anything against dying.
Look at it from their perspective. If the missile fired from an F-18 can be guided so well by a computer, how much more efficient would it be if the F-18 itself could be guided the same way?
UAV drones are the next big thing in military technology. Granted, these things are nowhere near the level of a Terminator; in fact they're fairly simple. But it's a start.
http://www.fas.org/man/congress/1998/cbo-uav3.htm
Other movies with AIs would be:
Virus
Ghost in the Shell
2001
#16
DVD Talk Gold Edition
Join Date: Sep 1999
Posts: 2,041
The earliest AI-run-amok movies that I’m aware of are 2001, Colossus: The Forbin Project, and Demon Seed. Why the stream of dystopian themes? Because perfection makes damn poor drama. It’s the nature of sci-fi to explore the potential flaws in technology.
I don’t see AI as a threat. IMO technology is inherently neutral. There are, and will no doubt always be, those who fear technological advances: genetic research, organ transplants, artificial organs, fluoridation, computers, etc. Crap, there are people who shun electricity and medicine.
#17
Senior Member
Join Date: Sep 1999
Posts: 427
If you create machines that can think for themselves, you'd better be willing to deal with the consequences when they start thinking about things you'd rather they not think about, such as how superior they are to their creators and how much better off they would be if their creators were destroyed...
The events of "The Second Renaissance" in The Animatrix clearly show that the Machines annihilated humanity only after humans tried to destroy them, not once but twice. In other words, the Machines figured out that the humans were never going to leave them alone and/or accept them, so they reasoned that they needed to destroy the humans before the humans destroyed them. The Machines were indeed acting in self-preservation.
Last edited by Mr. Cornell; 05-17-03 at 08:17 PM.
#18
DVD Talk Hero
There's no such thing as real A.I. and probably won't be for a long time to come.
That's that.
#20
DVD Talk Gold Edition
Join Date: Sep 1999
Posts: 2,041
Originally posted by RichC2
There's no such thing as real A.I. and probably won't be for a long time to come.
That's that.
#21
Senior Member
Join Date: Aug 2002
Posts: 578
Well, thirty years ago people were saying we'd have AI in twenty years. And we'd all have videophones and moving sidewalks too.
Instead of worrying about whether your toaster is going to beat you at chess, move your anxieties on to the next frontier of technology: nanites. It's one thing to be out-competed by super-genius computers. It's another thing to lose the evolutionary race to a brainless entity smaller than a speck of dust.
#22
DVD Talk Legend
I really doubt that machines would attack us anyway. They would just release a virus that would wipe us out over time while "helping us" find a cure.
Why destroy a perfectly good planet just to get rid of some animals, really?
#23
Member
Join Date: Feb 2001
Location: Oswego, IL , U.S.A,
Posts: 229
Why destroy a perfectly good planet just to get rid of some animals, really?
What real need would such intelligent machines have for animals?
They don't need to eat or be clothed, so those obvious needs are gone. They would likely not have any appreciation for the speed of a cheetah, or the grace of a swan, or the power of a whale. So there are no aesthetic reasons to keep them.
A logically thinking machine is likely to evaluate everything in terms of needs and benefits.
The short answer to your question: they would remake the planet in their own image, à la the Borg in First Contact. The animals (including us) could very well be regarded as an infestation, much as we regard insects as unwanted pests to be destroyed rather than tolerated.
Some films show machines as wanting to become more like us, such as Data in Star Trek: The Next Generation, V'Ger in Star Trek: The Motion Picture, or the android in Bicentennial Man.
But does that necessarily follow?
Last edited by 3Js; 05-21-03 at 01:55 PM.
#24
DVD Talk Godfather
Join Date: Jul 2000
Location: City of the lakers.. riots.. and drug dealing cops.. los(t) Angel(e)s. ca.
Posts: 54,199
OK, say it with me: it's a movie...
I don't see my computer trying to kill me, even though it's pretty high tech.
Then again, it does keep me indoors a lot, lets me have no social life, and slowly kills me with its evil rays... OMG!!! It's sooo true! Run!
#25
Suspended
Join Date: May 2003
Posts: 390
The creation of a perfect AI, when we eventually get to it, would be considered the creation of a new life form. We would all have to accept its existence no matter what. It may stand to reason, from our point of view, that AI machines would think they're superior and attempt to squash our species, but we don't know if that will be the case. They don't have emotions like we do, and it's not likely they ever will. So for them to seek power and control over the human species would not be feasible, because without the emotional factor they don't know the concept of it. And if they did have the emotions to decide to eliminate our species, they would also have the emotions to take on the responsibilities and consequences that would follow.