Phantis Forums
Tzatziki

Ai and Agi: Exciting Or Problematic?


What do you guys think: is the exponential growth in the capabilities of artificial intelligence (AI), and eventually artificial general intelligence (AGI), a good or a bad thing for us in the long run? People like Elon Musk are not keen, and argue that at some point (possibly sooner than most people would guess) the machines will surpass human ability in many areas besides chess, and develop past the point of us being able to control, or even defend ourselves against, them should they turn on us. Others, like Mark Zuckerberg, argue the exact opposite: that this type of advancement would be an incredible advantage to mankind, and the bots would be man's new best friend, assisting us in countless ways and making life better for all.

Myself, I feel it's all down to who is creating them. Even though at some point they may have independent thoughts (I think they already do, actually, but for now we can at least still shut them down; see the Facebook experiment below), their original design or makeup would probably determine whatever they eventually end up becoming or evolving into. If the military controls most of it, well, it could be like the Terminator movies. If it's other companies with less violent aims, like research, computing, safety, clean-up, whatever, then they will go that way.

Have any of you looked into this at all or might know a thing or two about it?

 

 

 

[Attached image: terminator.jpg]


I think humans will merge with machines. It's already happening with replacement parts, including microchips implanted in brains.

The question of AI is an open one. Anything can happen, but it's possible that AI will become its own species once it gains independence of action, can learn, can "procreate," and even becomes conscious. Mind you, I don't have a clue how we'd recognize self-consciousness. I don't know whether you have it, but I assume so since you're human like me... but how about "machines" that may not want to be turned off?!

Stephen Hawking thinks AI will be our doom. [source]


It is times like these that I would like to sell everything I own (which is not much) and just go to the woods or next to the sea and become self-sufficient.

On 12/21/2017 at 3:28 PM, Athens4 said:

It is times like these that I would like to sell everything I own (which is not much) and just go to the woods or next to the sea and become self-sufficient.

Unless you become very primitive, you can't be self-sufficient. No one knows how to do anything any more (from A to Z), unless you're talking about constructing a hut and fishing with primitive tools you also make.


Look up Primitive Technology on YouTube.


Interesting videos. There might be some people on this planet who have the knowledge and ability to survive in a state of nature without modern tools.

However, advancing civilizations brought a division of labor (specialization), which increased efficiency; thus a single person no longer needed to know how to make tools, build shelter, hunt, farm, and so on. In other words, no one needed to be self-sufficient, even though all of those skills are required for survival in a state of nature.

Also, you're not thinking of the diseases, injuries, bad teeth, and accidents you may have had in your life, any of which could have killed you. Primitive people had an average life span of less than 30 years.

On 12/29/2017 at 10:35 PM, athinaios said:

Unless you become very primitive, you can't be self-sufficient. No one knows how to do anything any more (from A to Z), unless you're talking about constructing a hut and fishing with primitive tools you also make.

...when I said no one, I meant it in a general sense: people have left the primitive life behind (and lost the knowledge of survival in a state of nature). We now have a complex life because of the complexity that comes with the division of labor and specialization. I don't need to know how my computer is made (there's no single person who knows all the steps*), but this complexity allows me a richer life.

 

*Even if you provided modern tools to the smartest, most capable individual, no single person could construct a computer from scratch. I mean, to know how to make plastic, procure rare-earth elements, make and shape metal, etc. There are probably many thousands of people involved in making a single computer, including the farmers who grow coffee, the traders, and the coffee makers, et al., who enable the coders and engineers to keep working late into the night under man-made electric light... and who manage to have some free time because they don't have to cut down a tree to make a bow and arrows to go hunting for dinner.

Now that is the complexity of modern civilization.


From the BBC: Humanity 2.0... Smarter, fitter, better? "Philosopher Julian Baggini explains the radical vision of transhumanism - where humans become part-machine."

https://www.bbc.com/ideas/videos/humanity-20-smarter-fitter-better/p05t8q6h?playlist=the-a-z-of-isms

Julian Baggini seems to think that part machine, part human is the only way forward.

On 1/11/2018 at 5:15 PM, Lazarus said:

From the BBC: Humanity 2.0... Smarter, fitter, better? "Philosopher Julian Baggini explains the radical vision of transhumanism - where humans become part-machine."

https://www.bbc.com/ideas/videos/humanity-20-smarter-fitter-better/p05t8q6h?playlist=the-a-z-of-isms

Julian Baggini seems to think that part machine, part human is the only way forward.

There will be a different species (if we survive) in a few thousand years. We're already replacing human body parts with mechanical and electronic parts. Brain diseases like Alzheimer's may one day be treated with a microchip implant; the same goes for eyes, ears, etc. The key is the mind: the intelligence of a person, the self-identity. But this is pliable, I think. You wouldn't know otherwise if you were born into a certain species (transhuman?).

Many species have had an average life span of around 3 million years; we haven't managed that yet. So, what would you choose if faced with extinction, your own or the species'? Would you adapt and accept merging with machines? Would you choose to live in a virtual reality (in the... Matrix) or opt for death?

 

