Singularity missed – Piekniewski's blog

Every so often in discussions of AI/AGI and whatnot comes the central figure of that whole intellectual movement – Ray Kurzweil. And with him inevitably comes some kind of exponential chart like the one below:

Basically the curve depicts Moore's law (which isn't disputable), with a few additional labels suggesting that a particular level of computing performance is somehow equivalent to the processing power of the brains of various animals.

Superficially this looks fine, but of course the problem is hidden in how we arrive at these equivalences. The typical answer to this question is that perhaps the labels should move around on the curve, left or right, but they sit there somewhere, so it's fine – we might be off by a year or two, who cares. Whether it even makes sense to place brains alongside computers on that chart is usually not even questioned.

Since I want to keep this post short, let's cut straight to the conclusion – this chart alone shows we're off by at least 23 years from the original predictions. Why?

Let's take a closer look: Kurzweil claims we should have been seeing insect-brain capability in a $1000 computer in 2001. Alright, what would we expect from an insect brain? Well, the things an insect is able to do, with perhaps the addition that we somehow issue a command of what to do.

So what can a bee do? It can autonomously navigate in a novel environment for miles, recognize and approach a particular kind of flower, pick up some nectar, navigate back to the hive (without GPS, but perhaps with a magnetometer), and do it many times over in a day.

So we should expect to be able to have e.g. a combat drone that can take a grenade, navigate for miles into enemy territory, pick an enemy target, drop the grenade, and navigate back to base.

But we have drones like that in Ukraine, no? Yes. Remotely controlled…

That is all you need to know to understand that Kurzweil is off not by a year or two but by some twenty years at least. It is 2024 at the time of writing this, and the best military technology that could benefit from insect-brain-like capabilities is still remotely controlled. I could write a similar story about what we should expect if we had robots controlled with the capability of rodents or dogs. But I don't think I have to – you get the point.

So even if Kurzweil is right that whatever the brain does can be viewed as computation (which I have some serious doubts about and prefer to call "control"), and even giving him all the benefit of the doubt, empirical evidence shows we're at least two decades behind. So we shouldn't expect human-like intelligence before the late 2040s. That is of course if we actually get an insect brain this year. Which we won't.
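The arithmetic behind that estimate can be sketched in a few lines. The milestone years here are illustrative assumptions: 2001 for the insect-brain milestone comes from the post itself, while 2029 for human-level AI in a $1000 machine is the date Kurzweil has often cited, not something stated above.

```python
# Rough sketch of the "shifted timeline" argument: if the insect-brain
# milestone placed in 2001 has still not arrived by 2024, then every later
# milestone on the same exponential curve slips by at least the same delay.

predicted_insect_year = 2001   # Kurzweil's $1000-computer insect-brain milestone
observed_year = 2024           # still not achieved at the time of writing

delay = observed_year - predicted_insect_year
print(f"Observed delay: at least {delay} years")

# Kurzweil's oft-cited date for human-level AI in a $1000 machine (assumption)
predicted_human_year = 2029

# Shift the human-level milestone by the observed delay
revised_human_year = predicted_human_year + delay
print(f"Revised human-level estimate: no earlier than ~{revised_human_year}")
```

With these numbers the delay is 23 years and the revised estimate lands around 2052 – consistent with "not before the late 2040s" read as a lower bound.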

