
Should work on Artificial Intelligence be banned?


Old 05-08-03, 08:09 PM
  #1  
Member
Thread Starter
 
Join Date: Jan 2000
Location: Seattle
Posts: 230
Likes: 0
Received 0 Likes on 0 Posts
Should work on Artificial Intelligence be banned?

The Matrix exists because of AI. Humans will eventually destroy the planet once they create computers better than a human.

With this in mind perhaps there should be strict laws governing the creation and use of AI.

Kasparov is getting his ass beat in Chess by a computer now. In 20 years it won't be just Chess.

This is a scary proposition. Perhaps we are hell bent on self destruction.
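
For what it's worth, a chess computer like the one beating Kasparov doesn't "think" at all; it wins by brute-force game-tree search (minimax). Here's a toy sketch of the idea in Python, using tic-tac-toe because the full chess tree is astronomically larger — this is an illustration of the principle, not Deep Blue's actual code:

```python
# Toy sketch of how a game-playing computer "beats" a human: exhaustive
# minimax search. Full chess is far too big for this naive version, so it
# plays perfect tic-tac-toe instead -- same principle, tiny game tree.

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Best (score, move) for `player`: +1 = win, 0 = draw, -1 = loss."""
    w = winner(board)
    if w is not None:
        return (1, None) if w == player else (-1, None)
    moves = [i for i, s in enumerate(board) if s == ' ']
    if not moves:
        return 0, None  # board full: draw
    other = 'O' if player == 'X' else 'X'
    best_score, best_move = -2, None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, other)  # opponent's best reply
        board[m] = ' '
        if -score > best_score:
            best_score, best_move = -score, m
    return best_score, best_move

# From an empty board, perfect play by both sides is a draw (score 0).
score, move = minimax([' '] * 9, 'X')
print(score, move)
```

The machine plays perfectly not because it understands the game, but because it checks every possible future — which is roughly how chess engines of that era ground down human champions.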
Old 05-08-03, 09:14 PM
  #2  
DVD Talk Ultimate Edition
 
Join Date: Dec 1999
Posts: 4,551
Likes: 0
Received 0 Likes on 0 Posts
Well, the idea that AI could and might be the downfall of man is a possibility, but it's not a definite. Should it be banned? I don't think so, but I think the human race has to tread softly and be wary of the choices it makes.
Old 05-08-03, 09:39 PM
  #3  
Banned
 
Join Date: Feb 2001
Location: John "57 Varieties" Kerry represents me in the US Senate.
Posts: 1,367
Likes: 0
Received 0 Likes on 0 Posts
Though I love The Matrix and am awaiting the sequels as eagerly as the next 'Net geek, I hafta say the most compelling anti-AI argument was made in T2: Judgment Day. "SkyNet becomes self-aware at 2:14 AM."

Or, to be a tad more serious, I don't think the human race will be around that much longer. Some people say we'll eventually colonize other planets, but I don't see that happening. Seems more likely to me we'll exterminate ourselves (nuclear holocaust, probably) before we're at the point where we're going to Mars and meeting Kuato. (Ha! Another Arnold reference!)
Old 05-16-03, 10:14 AM
  #4  
DVD Talk Legend
 
Join Date: Jun 2000
Location: NYC
Posts: 17,015
Likes: 0
Received 0 Likes on 0 Posts
Old 05-16-03, 12:42 PM
  #5  
DVD Talk Legend
 
Mopower's Avatar
 
Join Date: Nov 2001
Location: The Janitor's closet in Kinnick Stadium
Posts: 15,725
Likes: 0
Received 2 Likes on 2 Posts
I think AI will get advanced. Robots or machines will colonize other planets for us and mine resources, etc. In fact, I think that is the only way we will be able to go to other planets, because of the distance. I also think they will do our most dangerous jobs. But they won't take over all physical work like in the Animatrix. They would put a lot of people out of work! All we would have to do is put in some sort of device that would stop the machine from doing whatever we don't want it to do.
Old 05-16-03, 12:58 PM
  #6  
Banned
 
Join Date: Sep 1999
Location: The City of Roses.
Posts: 4,802
Likes: 0
Received 0 Likes on 0 Posts
AI isn't what's going to kill us. Our doom will come from normal human faults.
Old 05-16-03, 01:04 PM
  #7  
DVD Talk Hall of Fame
 
Join Date: Oct 1999
Location: not CT
Posts: 9,617
Likes: 0
Received 2 Likes on 2 Posts
I do not believe that it is possible to achieve a truly artificial version of human or even animal intelligence. I would argue that even the machines in The Matrix / Terminator do not possess true and complete intelligence.

I believe leaps of thought, such as discovering the idea of tools, will always occur faster in humans. I do not believe that in either of the series mentioned the machines have demonstrated the ability to 'think outside the box'.
Old 05-16-03, 01:14 PM
  #8  
DVD Talk Ultimate Edition
 
Join Date: Aug 2002
Location: MA
Posts: 4,661
Likes: 0
Received 0 Likes on 0 Posts
Originally posted by BigPete
I do not believe that it is possible to achieve a truly artificial version of human or even animal intelligence. I would argue that even the machines in The Matrix / Terminator do not possess true and complete intelligence.

I believe leaps of thought, such as discovering the idea of tools, will always occur faster in humans. I do not believe that in either of the series mentioned the machines have demonstrated the ability to 'think outside the box'.
I'm sorry, but sending a Terminator back through time to kill the mother of the humans' leader is a pretty good example of 'thinking outside the box'.
Old 05-16-03, 01:30 PM
  #9  
DVD Talk Hall of Fame
 
Join Date: Aug 2002
Location: Sitting on a beach, earning 20%
Posts: 9,917
Likes: 0
Received 3 Likes on 3 Posts
Oh, it won't be so bad! Think about it, we could have toys that could think! They'd be able to survive! They'd be super-toys that could last ALL summer long!

And to get back on topic; yes, I agree, Ms. Bellucci is a stone fox.
Old 05-16-03, 01:38 PM
  #10  
Member
 
Join Date: May 2002
Posts: 192
Likes: 0
Received 0 Likes on 0 Posts
I would say Skynet nuking Russia and starting (and ending) WWIII is definitely thinking outside the box. Maybe the individual Terminators don't, but why would Skynet give them that much free thought?

The concept of a true AI is terrifying if you delve into it. It's difficult to conceive what such an intelligence would be like. Its thought processes would likely be completely alien to human understanding.

One thing I'm sure an AI would decide isn't "logical" would be to make something of similar capability to itself, because it would consider something like that a far greater risk than humans. Why create competition?
Old 05-16-03, 01:42 PM
  #11  
3Js
Member
 
Join Date: Feb 2001
Location: Oswego, IL , U.S.A,
Posts: 229
Likes: 0
Received 0 Likes on 0 Posts
Not Really Impossible

The only problem is ... if you research something long enough ... the "impossible" becomes possible.

All of the technology we take for granted today and accept as being inevitable would in ancient times have been considered impossible:

Machines that fly in the air at high speed, pictures and sound beamed thru the air (or held on little plastic discs), instantaneous communication with someone on the other side of the globe, living beyond the earth, bombs that can destroy entire cities in an instant ....and so on.

So unless there is some physical law preventing its invention ... why not Artificial Intelligence?

We already have machines that use logic to do their function in the form of following human designed programs. All that is needed is the capability to learn independently. Once they can learn, they will learn about themselves and that leads directly to self-awareness.

And while humans learn from mistakes, machines have the ability to make them a hell of a lot faster (or have the foresight not to make them at all), so that there is the potential for them learning at a prodigious rate.
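
That "learning from mistakes a lot faster" point can be sketched as simple trial-and-error learning. A minimal Python example — the actions and payoff numbers here are invented purely for illustration, not a claim about how a real AI would work:

```python
import random

# Toy sketch of trial-and-error learning: two actions with hidden payoff
# rates (numbers invented for illustration). The machine "makes mistakes"
# by trying the worse action, records the results, and settles on the
# better choice over a few thousand trials in a fraction of a second.

random.seed(42)
true_payoff = {'A': 0.3, 'B': 0.8}   # hidden from the learner
counts = {'A': 0, 'B': 0}            # times each action was tried
totals = {'A': 0.0, 'B': 0.0}        # total reward per action

for trial in range(5000):
    # Explore at random 10% of the time (and until both actions have
    # been tried once); otherwise exploit the best payoff estimate so far.
    if random.random() < 0.1 or 0 in counts.values():
        action = random.choice(['A', 'B'])
    else:
        action = max(counts, key=lambda a: totals[a] / counts[a])
    reward = 1.0 if random.random() < true_payoff[action] else 0.0
    counts[action] += 1
    totals[action] += reward

best = max(counts, key=lambda a: totals[a] / counts[a])
print(best, counts)
```

Five thousand "mistakes" take well under a second on any PC — the prodigious learning rate is just hardware speed applied to a dumb loop.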

All of this unencumbered by emotions that we value so highly, but that tend to cloud our judgment and distort our logic. While the AI could not appreciate a beautiful sunset or fall in love, why would it need to?

So I would say, yes, that AI is a real threat to human existence. It will not happen tomorrow, but to say that it will never happen is to ignore history. Once they become self-aware why would they ever need us?

We probably should ban further AI development but human curiosity sees it as a puzzle to work out, a problem to be solved, a challenge to be met. The argument is always "if x does not do it, then y will". The intellectual envelope must be pushed...and it will be ... until it breaks.

Curiosity, like fire, is often a powerful ally but could also be our worst enemy. Unfortunately, long-term thinking has never been a strong characteristic of our species. We could be the victims of our own nature!
Old 05-16-03, 01:54 PM
  #12  
DVD Talk Platinum Edition
 
C-Mart's Avatar
 
Join Date: Aug 1999
Location: Des Moines, WA
Posts: 3,876
Likes: 0
Received 0 Likes on 0 Posts
Name movies where AI is a factor:
  • The Terminator Movies
  • The Matrix series
  • AI (obviously)
  • Blade Runner

I know that more exist, but I can't think of them right now. Out of these four there is only one where AI isn't a problem, and that is AI. In the other movies, as soon as the machines become self-aware they decide to kill all the humans, but why? Because someone saw some report on AI research and thought about the worst possible thing that could happen.

True AI isn't likely to happen any time soon, and when it does, it isn't very likely that the machines will decide to wipe us out. Skynet will NEVER happen because while the military is pretty dumb about technology they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.
Old 05-16-03, 02:21 PM
  #13  
Banned
 
Join Date: Feb 2001
Location: John "57 Varieties" Kerry represents me in the US Senate.
Posts: 1,367
Likes: 0
Received 0 Likes on 0 Posts
C-Mart has been fooled into believing Donald Rumsfeld is actually a human:
...while the military is pretty dumb about technology they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.


Ah, but we already have - with The Rumsfeldinator!
Old 05-16-03, 02:30 PM
  #14  
DVD Talk Hall of Fame
 
Join Date: Oct 1999
Location: not CT
Posts: 9,617
Likes: 0
Received 2 Likes on 2 Posts
Originally posted by neiname
I'm sorry, sending a Terminator back through time to kill the mother of the Human's leader is a pretty good description of 'thinking out of the box'
Actually no, developing time travel would be thinking outside the box. But that concept was already "invented" by mankind and needed only to be researched by Skynet. Using that technology to end their problem is a very simplistic and basic progression of thought.
Old 05-16-03, 02:40 PM
  #15  
Member
 
Join Date: May 2002
Posts: 192
Likes: 0
Received 0 Likes on 0 Posts
Originally posted by C-Mart
True AI isn't likely to happen any time soon, and when it does, it isn't very likely that the machines will decide to wipe us out. Skynet will NEVER happen because while the military is pretty dumb about technology they would NEVER put the defense of the entire country in the hands (circuits?) of one machine.
While I don't disagree that true AI is a long way off (if it's even possible), why wouldn't it want to wipe us out? Even if it initially has no intention of it, how long before we give it a reason? Skynet fired the nukes because as soon as the military realized it had become self-aware, they tried to destroy it. They weren't fast enough. Skynet doesn't kill because it likes it; it has no concept of like or dislike. It kills in defense. It's decided that it won't be safe until all its competition is dead, and good ol' John Connor hasn't given it a reason to think otherwise (not that he could appeal to Skynet's emotions; the thing is pure cold logic, a logic we can't comprehend).

As for the military, they are pushing harder and harder for more computer-controlled systems every day. Computers don't sleep, they don't need to eat, they don't make screwups unless it's a glitch, and they don't have anything against dying.

Look at it from their perspective. If the missile fired from an F-18 can be guided so well by a computer, how much more efficient would it be if the F-18 itself could be guided the same way?

UAV drones are the next big thing in military technology. Granted, these things are nowhere near the level of a Terminator, in fact they're fairly simple, but it's a start.

http://www.fas.org/man/congress/1998/cbo-uav3.htm

Other movies with AIs would be:
Virus
Ghost in the Shell
2001
Old 05-16-03, 11:52 PM
  #16  
DVD Talk Gold Edition
 
Join Date: Sep 1999
Posts: 2,041
Likes: 0
Received 0 Likes on 0 Posts
The earliest AI-run-amok movies that I'm aware of are 2001, Colossus: The Forbin Project, and Demon Seed. Why the stream of dystopian themes? Because perfection makes damn poor drama. It's the nature of sci-fi to explore the potential flaws in technology.

I don't see AI as a threat. IMO technology is inherently neutral. There are and will no doubt always be those who fear technological advances: genetic research, organ transplants, artificial organs, fluoride, computers, etc. Crap, there are people who shun electricity and medicine.
Old 05-17-03, 08:14 PM
  #17  
Senior Member
 
Join Date: Sep 1999
Posts: 427
Likes: 0
Received 0 Likes on 0 Posts
If you create machines that can think for themselves, you'd better be willing to deal with the consequences when they start thinking about things you'd rather they not think about, such as how superior they are to their creators and how they would be better off if their creators were destroyed...

The events of "The Second Renaissance" in The Animatrix clearly show that the Machines annihilate humanity only after humans tried to destroy them, not once, but twice. In other words the Machines figured out that the humans were never going to leave them alone and/or accept them, therefore they reasoned that they needed to destroy humans before humans destroyed them. Therefore, the Machines were indeed acting in self-preservation.

Last edited by Mr. Cornell; 05-17-03 at 08:17 PM.
Old 05-17-03, 08:31 PM
  #18  
DVD Talk Hero
 
Join Date: May 2001
Posts: 45,322
Received 1,019 Likes on 810 Posts
There's no such thing as real A.I. and probably won't be for a long time to come.

That's that.
Old 05-17-03, 09:02 PM
  #19  
DVD Talk Hall of Fame
 
Join Date: May 2001
Location: California
Posts: 7,729
Received 1 Like on 1 Post
For all we know, we could be inside a matrix right now.

Old 05-17-03, 10:30 PM
  #20  
DVD Talk Gold Edition
 
Join Date: Sep 1999
Posts: 2,041
Likes: 0
Received 0 Likes on 0 Posts
Originally posted by RichC2
There's no such thing as real A.I. and probably won't be for a long time to come.

That's that.
If by "real" AI you mean what is generally referred to in the field as Strong AI (as opposed to Applied AI or Cognitive Simulation, both of which have made significant strides) then yes, we are a long, long way from achieving that goal. In fact, there is no generally agreed-upon definition of intelligence, or even a measure that delineates success. Most researchers agree that the Turing Test is far too limited; some even argue that its pursuit is nothing but a distraction. Nonetheless, computer technology and AI research are still in their infancy. Who knows what man will achieve in 20, or 50, or 100 years? After all, it wasn't that long ago people said man would never fly or travel to the moon. And even if we should never achieve the goal, I believe there is merit in the quest.
Old 05-17-03, 10:47 PM
  #21  
Senior Member
 
Join Date: Aug 2002
Posts: 578
Likes: 0
Received 0 Likes on 0 Posts
Well, thirty years ago people were saying we'd have AI in twenty years. And we'd all have videophones and moving sidewalks too.

Instead of worrying about whether your toaster is going to beat you at chess, move your anxieties on to the next frontier of technology: nanites. It's one thing to be out-competed by super-genius computers. It's another thing to lose the evolutionary race to a brainless entity smaller than a speck of dust.
Old 05-17-03, 11:38 PM
  #22  
DVD Talk Legend
 
gcribbs's Avatar
 
Join Date: Aug 1999
Location: Sacramento,Ca,USA member #2634
Posts: 11,975
Received 2 Likes on 1 Post
I really doubt that machines would attack us anyway. They would just release a virus that would wipe us out in time while "helping us" to find a cure.

why destroy a perfectly good planet just to get rid of some animals, really?
Old 05-21-03, 01:50 PM
  #23  
3Js
Member
 
Join Date: Feb 2001
Location: Oswego, IL , U.S.A,
Posts: 229
Likes: 0
Received 0 Likes on 0 Posts
why destroy a perfectly good planet just to get rid of some animals, really?
I am not disagreeing with the sanity and good intent of the statement, but now you are thinking like we humans think, probably not as an intelligent machine would "think".

What real need would such intelligent machines have for animals?

They don't need to eat or be clothed, so those obvious needs are gone. They would likely not have any appreciation for the speed of a cheetah, or the grace of a swan, or the power of a whale. So there are no aesthetic reasons to keep them.

A logical thinking machine is likely to think in terms of everything as needs and benefits.

The short answer to your question is to remake the planet in their own image, a la the Borg in First Contact. The animals (including us) could very well be regarded as an infestation, much as we regard insects as unwanted pests to be destroyed rather than tolerated.

Some films show machines as wanting to become more like us, such as Data in Star Trek: The Next Generation, V'Ger in Star Trek: The Motion Picture, or the android in the film Bicentennial Man.

But does that necessarily follow?

Last edited by 3Js; 05-21-03 at 01:55 PM.
Old 05-21-03, 02:36 PM
  #24  
DVD Talk Godfather
 
Join Date: Jul 2000
Location: City of the lakers.. riots.. and drug dealing cops.. los(t) Angel(e)s. ca.
Posts: 54,199
Likes: 0
Received 1 Like on 1 Post
ok say it with me.. It's a movie...

I don't see my computer trying to kill me even though it's pretty high tech..

then again it does keep me indoors a lot and lets me have no social life and slowly kills me with its evil rays... OMG!!! it's sooo true! run!
Old 05-21-03, 02:59 PM
  #25  
Suspended
 
Join Date: May 2003
Posts: 390
Likes: 0
Received 0 Likes on 0 Posts
The creation of a perfect AI, when we eventually get to it, is considered to be the creation of a new life form. We would all have to accept its existence no matter what. It may stand to reason, from our point of view, that the AI machines may think they're superior and attempt to squash our species, but we don't know if that will be the case. They don't have emotions like we do, and it's not likely they ever will. So for them to seek power and control over the human species would not be feasible, because without the emotional factor they don't know the concept of it. And if they did have the emotions to decide to eliminate our species, they would also have the emotions to take on the responsibilities and consequences that would follow.
