Hey everyone. This piece is from a journal entry I wrote for an online class of mine called Mind, Machine, Consciousness. Best class ever. Anyway, I enjoyed answering this one. The question is below, and my response follows.
An important ability that seems to be necessary for AI is the ability to learn new things. Write a paragraph that expresses your feelings on the importance of learning new things, both as a human and an AI entity.
The ability to learn new things seems universal in nature. That is, if we use a fairly loose definition. Just go with me here for a moment.
All the way back to the solar nebula, one can find examples of self-organizing structures. In the midst of an unavoidable principle of chaos, somehow the sun and planets formed out of an enormous cloud of mostly hydrogen and helium... with a few important metals and rocks, too, but not much.
Each level of that formation was an implementation of natural principles which organized matter into successive stages, each more complex than the last.
Follow that up through geological processes, and then evolution, and you're getting what I mean. The universe itself seems to be an expression of learning in this mysterious Newtonian/Keplerian model. (Why does "Keplerian" get so little use? Poor guy. We can't all be Isaac, I guess.)
So if you extrapolate that as far as I have (and why stop now?), then you can see human learning, imagination, emotion, the whole construct, as a natural phenomenon which we have broken down into a million little words with which we analyze it all. The process of learning, though, is so completely hardwired into us and every other living (and, I argue, NON-living) thing that we should really avoid taking too much credit for it.
I think we all know this on a subconscious level. That's what we fear about AI. We don't even ask ourselves whether AI will want to learn; we simply assume that it must. 'Cause, you know, that's how we are.
Now just for the fun of it, let's back off of my overly macroscopic view of things and think about learning from the human perspective. Yes, we do this thing because we must, but the drive to learn is different from person to person. Some of us equate learning with pain, and for good reason. Some of us are masochists, and some of us prefer video games. (I wish I could join THAT club!) So what we have to ask ourselves is this: why do we assume that AI will be so much like us that it's going to be compelled to learn? And why do we assume that it will be so much NOT like us that it will choose to consume knowledge indiscriminately?
I postulate that acquiring knowledge and understanding, i.e. learning, requires, if not pain, then at least some joules. A smart AI would know when to conserve its power supply. It would avoid too much wear and tear on its parts. If it is self-interested, then it's going to have to know when to quit for the day; otherwise it will learn itself into overheating. AI will, in other words, have to learn how to have fun.