(A couple of exchanges on Facebook with another Objectivist about the possibility of a coming ‘quantum singularity’ inspired this post. I was rather clumsily trying to explain my own opinion that even with exponential growth in computing and mechanized technologies, the complexity of ‘human’ intelligence isn’t going to be duplicated – at least not comparably – any time soon.)
This is gonna be crude, but I wanna get some form of this down as a mathematical representation.
If = C*e/c
left side: results produced from an intelligent system
If => a combined concept of both intelligence and functionality
right side: technological constraints at any point in time
C => capacity
e => efficiency
c => complexity
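If you want to play with the relationship, here’s a minimal sketch in Python (the function name and the numbers are mine, purely illustrative – nothing here is calibrated to anything real):

```python
def intelligent_functionality(capacity: float, efficiency: float, complexity: float) -> float:
    """Toy model of If = C * e / c.

    capacity   (C): raw hardware/software capability
    efficiency (e): how well the software exploits that capacity
    complexity (c): how demanding the desired behavior is
    """
    if complexity <= 0:
        raise ValueError("complexity must be positive")
    return capacity * efficiency / complexity

# Doubling capacity doubles the result only if complexity holds still:
baseline = intelligent_functionality(1.0, 0.5, 10.0)  # 0.05
doubled = intelligent_functionality(2.0, 0.5, 10.0)   # 0.1
```

The point of the division: any gain in C or e can be eaten by a rise in c.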
As I said, this is crude and I’m still working out in my head what affects what and how, but let me try to at least explain my thinking.
If – This deals with the desired, expected or observable result. When speaking of intelligent systems, it would relate to the level of intelligent ability achieved and the functionality with which that intelligence could be applied.
Say for example you were speaking of an android where the desired result was ‘human-like’ behavior, thinking, reactions, speed, interaction with its environment etc. How well it emulated the human model would determine its ‘If’ rating.
It would be as subjective as the results desired or even those incidentally achieved by any such ‘intelligent’ machine and could consist of far more factors than I could list here, including some that might even be overlooked by a human observer with relative expectations.
To some extent it would even entail its own kind of efficiency, but in a different form than I use on the right side. (in this case pertaining to how efficiently the If behavior is acted out, where the ‘e’ on the right pertains to how effectively the ‘hardware’ and ‘software’ technologies interact to achieve it)
C – Capacity is just what it sounds like. The speed, storage, command sets, and any other limit-specific parts making up the combined hardware and software (at least in present-day IT terms) of the system. (e.g. the capacity of an average home computer today is comparable to my 4GHz AMD dual-core Turion processor with a 333MHz bus speed, 4GB of RAM and a 300GB SATA hard drive, tied to 100baseT Ethernet and a Comcast internet connection)
e – Efficiency is a little harder for me to weed out here because it also ties into capacity above. But I thought it useful to include it as it’s not simply a factor of hardware speeds, MIPS, memory size, bus speed, and net speed, but of how the system, its operating system and any software running on top of that work within the confines of the hardware capacity. A poorly written program running on a fast computer will still run poorly, just faster.
And of course there are other things that can factor into ‘e’ – the size of the components, the overall ‘cost’, the energy required to run it, even things such as the facility in which it is housed. (facilities management is a huge factor in large-scale processing ‘clean room’ management – air temperature, security, power, back-up power, etc – even a well-thought-out home computing environment may well include an air-conditioned airspace, a UPS with a surge protector, a shelf full of ‘extra’ hardware components in the event of a partial failure, and a system of off-site back-up storage as ‘normal’ operating procedure)
c – Complexity is also hard because it kinda overlaps with efficiency while also being affected by the desired (or necessary) intelligence and functionality. Basically I see this as the factor that relates to how complex the demands placed on the system are in relation to the resulting intelligence and functionality.
Adding 4+4 is not very complex. Looking at a Picasso painting and drawing from its designs abstract concepts, framing them into coherent thoughts, then proceeding to move the musculature of the lungs (to force air upward), vocal cords (to produce sound at the appropriate volume and pitch) and the throat, jaw, tongue and lips (to create the necessary resonances, enhancements and supplemental sounds) to deliver a strung-out chain of phonic utterances in any given language sufficient to relay the concept coherently to another observer… is not.
Now imagine adding to that the need to breathe, pump blood through the veins, fight off disease, maintain proper balance and ambulatory movement, monitor things such as hunger, thirst and fight/flight responses, and process and interpret visual, audio and other sensory data in real time, feeding it into ongoing processes and behaviors, yadda yadda yadda.
It’s easy to crunch numbers if all you do is crunch numbers in an enclosed box with nothing other than numbers to deal with!
So obviously Capacity (C) is expanding exponentially – there is still some debate over whether programming and other factors of efficiency (e) are improving in step with it. Compiler technologies have always been an issue (the translators between the many programming ‘languages’ and the actual ‘machine’ code).
I can recall when the Cold War was just winding down how fascinated many American programmers were with Russian programming for its efficiency. Russian computers were way behind their American counterparts, but their software was incredibly efficient as a result, and the functionality of their programs wasn’t as far behind (relatively speaking) as things running on much faster and more advanced US computers.
One of the big drives ongoing for over a decade now is to approach programming from what is dubbed a ‘natural language’ perspective, where the need to know C, C++ or Java to program a computer will be a thing of the past. True, intelligent machines will improve the efficiency of such code, but most attempts thus far at ‘natural language programming’ have either involved way too much complexity to be feasible or produced very inefficient code, due to the number of ‘layers’ involved in translating a layperson’s statement of a problem into an actual computer program capable of attempting to solve it.
So let’s for the sake of argument say that ‘C’ is expanding exponentially but ‘e’ is roughly constant, as improvements in compiler technology are met with other requirements or environmental factors that stifle the gains in efficiency. Ahhh, but complexity is growing by leaps and bounds also. The nature of the problems we are addressing with digital solutions, the tasks we are putting to them, is growing at an incredible rate.
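To make that scenario concrete, here’s a toy projection under made-up assumptions (a Moore-style 2-year doubling for C, flat e, and a slower 4-year doubling for c – all of these numbers are placeholders, not measurements):

```python
def project_if(years: int,
               C0: float = 1.0, e: float = 1.0, c0: float = 1.0,
               C_doubling: float = 2.0, c_doubling: float = 4.0) -> list:
    """Yearly values of If = C*e/c when both C and c grow exponentially."""
    results = []
    for t in range(years + 1):
        C = C0 * 2 ** (t / C_doubling)  # capacity: exponential growth
        c = c0 * 2 ** (t / c_doubling)  # demanded complexity: also growing
        results.append(C * e / c)       # If = C*e/c
    return results

# Net growth is 2**(1/2 - 1/4) per year – real, but slower than
# Moore's law alone would suggest.
```

With those numbers If still grows, just at a fraction of the capacity curve – which is the point: complexity growth discounts Moore’s law.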
If you go back to the android example, we’ve seen some very pre-alpha prototypes of human-like machines that talk or dance. But they are far from human. The trend is that each subsequent wave gets closer, but still doesn’t come that close. They will of course get closer over time, but before we see Star Trek’s Data running around, there’s a hell of a long way to go in the growth of both Capacity (C) and complexity (c).
Basically, what I was trying to demonstrate by putting it in a math equation is that Moore’s law (which is simply a factor of the ‘C’ for Capacity in my equation) is only one facet of the issue that leads to the end result (Intelligence and functionality, or If). The complexity desired of the end result weighs against both Capacity and efficiency, and for that reason I am still not fully convinced of any near-pending singularity. At least not in the form of a specific or near-instantaneous (relatively speaking) ‘event’.
Addendum: with all that said, at some future point (with increased knowledge and understanding), the complexity needed to replicate human intelligence and functionality is ultimately a finite quantity. So it is conceivable that there is an amount of Capacity and efficiency that could outweigh the necessary complexity to create a reasonable facsimile (or at least something comparable in overall I and f).
But as I see it, the current issue is that we aren’t even close to fathoming the true complexity necessary to reach that finite quantity of ‘c’, nor do I think we will in the time periods currently being suggested.
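One way to see why the timetable matters so much: if the ‘c’ for human-level If is finite and C keeps doubling, the crossing time depends on the logarithm of how big c really is – so underestimating c by a factor of a thousand shifts the date by decades, not centuries. A hypothetical back-of-the-envelope sketch (every number is a placeholder, since the whole point is that nobody knows c_human):

```python
import math

def years_to_cross(c_human: float, C0: float = 1.0, e: float = 1.0,
                   doubling_years: float = 2.0) -> float:
    """Years until C(t) * e / c_human >= 1, with C doubling every
    `doubling_years`.  Clamped at zero if we're already past it."""
    return max(0.0, doubling_years * math.log2(c_human / (C0 * e)))

# Being wrong about c by 1000x shifts the crossing by ~20 years
# at a 2-year doubling: years_to_cross(1e6) - years_to_cross(1e3) ≈ 19.9
```

So a singularity of some sort isn’t impossible on this model – but the date hangs entirely on a ‘c’ we haven’t come close to measuring.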