Just because we are conscious does not mean we have the smarts to make consciousness ourselves. Whether (or when) AI is possible will ultimately depend on whether we are smart enough to make something smarter than ourselves. We assume that ants have not achieved this level. We also assume that as smart as chimpanzees are, chimps are not smart enough to make a mind smarter than a chimp, and so have not reached this threshold either. While some people assume humans can create a mind smarter than a human mind, humans may be at a level of intelligence that is below that threshold also. We simply don't know where the threshold of bootstrapping intelligence is, nor where we are on this metric. _KevinKelly
Kevin Kelly has created a "Taxonomy of Minds" as a way of classifying different types of minds and what they might be able to do.
Precisely how a mind could be superior to our minds is very difficult to imagine. One way to help us imagine what greater intelligences would be like is to begin creating a taxonomy of the variety of minds. This matrix of minds would include animal minds, machine minds, and possible minds, particularly transhuman minds, like the ones that science fiction writers have come up with.
Imagine we land on an alien planet. How would we describe or measure the level of the intelligences we encounter there -- assuming they are greater than ours? What are the thresholds of superior intelligence? What are the categories of intelligence in animals on earth? _Read the rest...TaxonomyofMinds
The actual development of superior minds is more likely to occur via evolutionary mechanisms than from straightforward design from first principles. The adaptive landscape graphic above depicts a small portion of an evolutionary adaptive landscape. Creatures that achieve the higher peaks may be capable of greater feats, but may also be more subject to extinction when the environment shifts -- or when the adaptive landscape is enlarged by merging with a previously separate adaptive landscape (building a bridge between islands, tunneling through a mountain chain, digging a canal through an isthmus, or the emergence of an intergalactic wormhole).
Rather than waiting until our minds become capable of creating other minds directly, it is more likely that humans will create an evolutionary landscape from which a mind more intelligent than the human mind might emerge.
Recently, in conversations with George Dyson, I realized there is a fifth type of elementary mind:

5) A mind incapable of designing a greater mind, but capable of creating a platform upon which a greater mind emerges.

This type of mind cannot figure out how to birth an intelligence equal to itself, but it does figure out how to set up the conditions of evolution so that a new mind emerges from the forces pushing it. _Technium

This is the approach to AI which Al Fin cognitive scientists have been promoting and utilising. It would be fooling one's self to imagine that it will be easy to evolve a smarter mind. But at least it is not impossible, as most conventional approaches to AI are proving themselves to be. (Conventional AI researchers are attempting quantitative solutions where qualitative solutions apply.)
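The idea of a mind that sets up conditions of evolution rather than designing a solution directly can be sketched with a toy evolutionary search. Everything below is invented for illustration (the fitness function, parameters, and selection scheme come from no Al Fin or Kelly source): a population climbs peaks on a one-dimensional adaptive landscape using only selection and mutation, with no built-in knowledge of where the peaks are.

```python
import math
import random

def fitness(x):
    # A toy "adaptive landscape": several peaks of differing height
    # over [0, 10]; the highest peak sits near x ~ 8.85.
    return math.sin(3 * x) + 0.5 * math.sin(x) + x / 10.0

def evolve(generations=200, pop_size=30, mut_scale=0.3, seed=0):
    rng = random.Random(seed)
    # The "designer" supplies only a random starting population
    # and the rules of the game, not the answer.
    pop = [rng.uniform(0, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half of the population survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Variation: each survivor leaves one mutated offspring,
        # clamped to stay on the landscape.
        children = [min(10.0, max(0.0, x + rng.gauss(0, mut_scale)))
                    for x in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Nothing in `evolve` encodes the location of any peak; the designer supplies only the landscape and the rules of variation and selection. That is the sense in which a type-5 mind builds a platform rather than a product.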
There is something quite amusing here: the human mind itself can flit among the categories of this taxonomy at any given time. Because of how the human brain evolved, and the developmental paths each of us has taken, each one of us contains multitudes. Without a doubt, we all need better training in using our minds.
More: An interesting set of links to sources that expect or assume the imminent creation of a super-human machine intelligence (and a consequent "singularity"), along with a few sources that are critical of such a "hard take-off" to superintelligence.
Al Fin is among the skeptics of the "techno-singularity" concept. Rather, Al Fin expects any near-term singularity to be of the "bio-singularity" variety.
Taken from an earlier Al Fin posting