@martenus This is from an off-topic conversation elsewhere; this topic is to make it on-topic.

All machines only do what they were programmed to do. If the programming involves adapting/learning new things, even to the point of re-writing its own programming, that is still part of its original programming. We too only work within the parameters provided to us (whether or not you believe some other entity provided those parameters to us is another matter).

A useful way to define an AI as being AI (not just a program) might be by comparison to human intelligence. But that leads to some rather serious issues:
- We don't really know how we work, so how can we compare?
- We can't really define how much 'computing power' we have; 'computing power', 'intelligence' etc. are such abstract concepts.
- Different types of intelligence cannot easily be compared.
- Do we want AI with the same type of intelligence as us? Maybe remembering your great aunt's name is great, but brute-force grunt work like computers can do now (or something else entirely) will be better suited for situation x.
- If we want AI with the same type of intelligence as us, maybe the only way (or best, or most efficient way) to do it is to stop using silicon and start using biology. AKA: create me a human, but give him four legs and x-ray vision.

All in all, it's probably not a good idea to define AI in terms of humans.

Screw social issues. If you lose a job due to being surplus to requirements, you change to a field where you are required. If it comes to a paradigm-shifting scenario like the one we are considering, maybe we should change the model entirely. Go to the utopian scenario of not needing to work; rather, we work if we want to, on things we want to work on (how many PA mods would there be if it was people's full-time occupation?). I don't know, maybe the system could work based on your contribution to society: people could see your contribution and hit a 'like' button.
Let it be known I am not advocating Facebook. BTW, I think 'soul' is a bit of a wishy-washy term. We are a function of all our experiences, something so utterly complex that we will likely never understand it in a meaningful or true way. Whether there's some secret sauce in the mix, who knows, but that's the definition of 'soul' I like to use. And here are the words artificial intelligence so anyone searching will find this topic.
True artificial intelligence can:
- Pass the Turing test: it is indistinguishable from a human when communicating with it.
- Learn from previous experiences and adjust behaviour accordingly.
- Truly learn and understand language. Not just knowing that "gravity" is related to "Isaac Newton", but understanding what those terms actually mean.
- Interact with humans in a natural manner, using speech or text.

http://en.wikipedia.org/wiki/The_Singularity_Is_Near
I believe this is correct, but it is too specific a requirement. There are many uses for an AI that do not require the AI to behave anything like a human. Say you use an AI for science, or to control an unmanned probe or spacecraft: you certainly don't want it to react emotionally or deceitfully. It would have to be capable of behaving that way to pass the Turing test, though. I don't think we want to give an AI this capability if we can help it...
I never understood why people try defining intelligence based on human norms. If you want to be able to compare intelligences across the board, you need a single reference point. On that note, I've yet to see a definition of quantifiable intelligence I've agreed with, so that might be tricky. Also, didn't some Unreal bots pass as human, in that people thought they acted more like humans than the human players?
I'd say that's a bit too ambitious for now, considering most humans hardly know what 90% of the words they use every day actually mean.
Link me. The Turing test pits a computer against a human (with a second human acting as the judge) in a text-based chat: the judge converses with both and must decide which is the machine. The computer passes if its chat is indistinguishable from the human's.
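The setup described above can be sketched in a few lines of Python. This is a toy illustration, not a real test harness: the two respondent functions are hypothetical stand-ins, and a real test would involve live back-and-forth conversation rather than canned replies. The point it shows is the pass condition: if the machine is indistinguishable, the judge's guesses are no better than chance.

```python
import random

def human_respondent(question: str) -> str:
    # Stand-in for the human participant.
    return "I'd have to think about that one."

def machine_respondent(question: str) -> str:
    # Stand-in for the machine; here it perfectly mimics the human's style,
    # i.e. it is indistinguishable by construction.
    return "I'd have to think about that one."

def run_imitation_game(judge, questions):
    """Hide the two respondents behind anonymous labels A/B, collect a
    transcript, and ask the judge to name the label they think is the
    machine. Returns True if the judge identified the machine."""
    pair = [human_respondent, machine_respondent]
    random.shuffle(pair)  # the judge must not know which label is which
    labels = {"A": pair[0], "B": pair[1]}
    transcript = {lbl: [(q, fn(q)) for q in questions]
                  for lbl, fn in labels.items()}
    guess = judge(transcript)  # judge returns "A" or "B"
    return labels[guess] is machine_respondent

# With indistinguishable respondents, any judging strategy identifies the
# machine about half the time -- which is exactly what "passing" means.
random.seed(0)
wins = sum(run_imitation_game(lambda t: "A", ["Do you dream?"])
           for _ in range(1000))
print(wins / 1000)  # hovers around 0.5
```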
http://phys.org/news/2012-09-artificially-intelligent-game-bots-turing.html It's not text based, but I'd prefer bullets as a form of communication over words any day.
I was referring to the virtual bullets in UT2004 with the intent of implying that nonverbal behavior is more significant than "traditional" forms of communication.
Well, it specifically cites stupidly pursuing enemies because of grudges, even when at an obvious disadvantage. So nonverbal rage, yes.
As long as they don't make excuses or type "[I actually don't think I should post this swear here...which kind of ruins the joke ]" I'm not convinced.