This thread:
http://forumz.tomshardware.com/Human-Brain-Fastest-Cpu-ftopict222985.html
got me thinking about learning computer systems ... you know, like the computer that became self-aware in the Terminator movies, or the computers in The Matrix movies that turned humans into batteries.
Brief summary:
We have software that can self-modify its code based on branched decision trees, or code that optimizes itself (self-optimizing).
We have other software that claims to "learn" from trends in user input data, adjusting its output accordingly (expert systems, for example). The output, however, remains limited by the software's code, which the program itself cannot alter.
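To make that second category concrete, here's a minimal sketch (all names are my own, purely illustrative) of a program whose output adapts to input history while its code stays fixed:

```python
from collections import Counter

class HistoryLearner:
    """Predicts the most frequent symbol seen so far."""

    def __init__(self):
        self.history = Counter()  # mutable *data*, not mutable *code*

    def observe(self, symbol):
        self.history[symbol] += 1

    def predict(self):
        if not self.history:
            return None
        # Output depends on input history, but the rule mapping
        # history to output is hard-coded and cannot change:
        return self.history.most_common(1)[0][0]

learner = HistoryLearner()
for s in ["a", "b", "a", "a", "c"]:
    learner.observe(s)
print(learner.predict())  # prints "a" -- behavior changed with data, code never did
```

The "learning" here lives entirely in the data structure; the mapping from history to output is frozen at write time, which is exactly the limitation the question is about.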
Do we have any software yet that combines these two, so that it adjusts its output according to input history AND is able to self-modify its code to produce output beyond what the programmer originally envisioned? Can a system really be considered a "learning system" if it's not able to self-modify its code?
I don't know if I'm making much sense, so if I'm not, I can just ask:
How close are we to computers becoming self-aware?
Are there any autonomous learning systems in existence?
[edit]
What is the current state of "learning" computers?
How do people define a "learning" computer?
[/edit]