The computer uses transistors, which open and close to allow only two
positions, making possible a binary code of 1's and 0's. The semiconductors that the
small transistors in the chips are made of will either conduct current or
not conduct current according to the voltage applied to them. So, the millions of
transistors act like switches which are either open or closed, to make a 1 or
0 for the binary code.
Let's consider these switches to be saying either yes or no. Comparing this
to human thought, which has imagination and free will, it seems that human
thought also has a maybe, and not just yes or no.
So, suppose a computer could be designed with transistor switches that say
yes, no, or maybe. Most of the transistors would be the standard variety that
just say yes or no, so the binary code could work in most areas of the
computer the way that it does now.
However, to give the computer free will and imagination, certain key
positions in the circuits of the calculating part of the computer could have
these yes, no, or maybe switches. The computer would exercise free will and
imagination at those points.
The computer would be no longer just a calculator but something beyond that.
How would you design a yes, no or maybe switch?
Well, possibilities could be:
If the switch is on, and you send it the signal to go off, then it either
goes off or stays on. Likewise if the switch is off and you send it the
signal to go on, it either goes on or stays off.
Another possibility is that the switch goes into a mode where it switches
back and forth from on to off, repeatedly.
These special maybe switches could also be set up so that they act like
regular on and off switches sometimes, and then sometimes they say maybe.
Perhaps a random signal generator could be used to help with the maybe
switch.
Experimenting could be done to see if this is feasible.
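One cheap way to experiment is in software before any hardware is built. The
sketch below is only a thought experiment, not a real circuit: it simulates a
hypothetical "maybe switch" (the class name, the maybe_prob parameter, and the
use of Python's random module as the "random signal generator" are all my own
assumptions, not anything established) that usually obeys on/off commands but
sometimes ignores them or starts flipping back and forth, as described above.

```python
import random

class MaybeSwitch:
    """A hypothetical switch that can say yes, no, or maybe.

    Most of the time it behaves like an ordinary on/off switch; with
    probability maybe_prob it ignores the command and either holds its
    current state or enters a mode where it flips back and forth.
    """

    def __init__(self, maybe_prob=0.1, rng=None):
        self.state = False            # False = off (0), True = on (1)
        self.maybe_prob = maybe_prob  # chance of a "maybe" response
        self.rng = rng or random.Random()  # stands in for a random signal generator
        self.oscillating = False      # the back-and-forth mode

    def signal(self, turn_on):
        """Send the switch a command to go on (True) or off (False)."""
        if self.rng.random() < self.maybe_prob:
            # The "maybe" cases: either start oscillating, or simply
            # keep the old state and ignore the command.
            if self.rng.random() < 0.5:
                self.oscillating = True
        else:
            # The ordinary case: obey the command like a normal switch.
            self.oscillating = False
            self.state = turn_on
        return self.read()

    def read(self):
        """Observe the switch; an oscillating switch flips on each read."""
        if self.oscillating:
            self.state = not self.state
        return self.state
```

With maybe_prob set to 0 this behaves exactly like a standard yes/no switch,
so a simulated circuit could mix ordinary switches with a few maybe switches
at key positions, as suggested above, and one could watch how often the
"maybe" answers change the overall result.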
Now, a computer that thinks for itself, just like humans that think for
themselves, would probably sometimes make mistakes. However, the computer
that thinks for itself, having an imagination, might also see things
correctly in a totally new way, and this could be very beneficial because of
the large amounts of data that computers can handle in a short time.
It may also lead to terrible trouble with a rebellious computer that wants
to take over, like HAL in the movie 2001: A Space Odyssey and Colossus in
the movie Colossus: The Forbin Project.
Any comments?