Programming: Ever wonder if...?
...it's all just smoke and mirrors?
Ultimately, any program running on 99% of the computers out there today is running the old fetch-execute cycle on a von Neumann architecture: the processor fetches the next instruction, executes it, stores the result, and moves on to the next instruction.
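Stripped of everything else, that loop looks something like this little sketch (the three-field instruction format here is invented purely for illustration--no real ISA is this simple):

```python
# A toy fetch-execute loop. The instruction encoding is made up for
# illustration; real processors are vastly more involved.
def run(program):
    memory = [0] * 16       # one unified store, a la von Neumann
    pc = 0                  # program counter
    while pc < len(program):
        op, a, b = program[pc]              # fetch
        if op == "LOAD":                    # execute...
            memory[a] = b
        elif op == "ADD":
            memory[a] = memory[a] + memory[b]
        elif op == "HALT":
            break
        pc += 1                             # ...store, go to the next one
    return memory

# 2 + 3, the long way around
print(run([("LOAD", 0, 2), ("LOAD", 1, 3), ("ADD", 0, 1), ("HALT", 0, 0)])[0])
```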
Atop that, we've layered: Subroutines, Modules, Functional Programming, Structured Programming, Object-Orientation (Objects, Inheritance, Polymorphism, Composition/Aggregation), Closures, Atoms, Processes, Multithreading, Semaphores, etc.
We're desperately trying to make the digital computer be more than the sum of its parts, just as a living human brain is somehow more than a collection of neurons and synapses. We *desperately* want abstractions...need them, in fact.
That's the whole point of the progression from bare binary programming, through assembly, to high-level languages, etc.--abstraction. The speed and capacity of computers keep growing. The speed and capacity of programmers' minds do not. We fall victim to the same limitation as any other human--we can only hold so much information in our heads at one time. These abstractions let us think about programs in coarser terms. Instead of spelling out the steps--"Take an array, choose a pivot value, partition it into 'less than pivot' and 'more than pivot', then recursively sort those two segments the same way"--I can just think "quicksort."
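That one word stands in for the whole procedure. A minimal sketch of it in Python (my illustration--any real program would just call the built-in sort):

```python
def quicksort(items):
    """Pick a pivot, partition into less/more, recurse on both halves."""
    if len(items) <= 1:
        return items                      # nothing left to partition
    pivot = items[0]                      # simplest possible pivot choice
    less = [x for x in items[1:] if x < pivot]
    more = [x for x in items[1:] if x >= pivot]
    return quicksort(less) + [pivot] + quicksort(more)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```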
* * *
There's a tension there, though...ultimately, abstractions leak. They're lossy. When you abstract or model something, you're glossing over some detail of how the underlying system works, and praying that detail doesn't matter.
Sometimes, it matters.
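One example of my own choosing: floating-point numbers abstract over the reals, and the glossed-over binary representation occasionally pokes through:

```python
# Floats abstract over binary fractions; the abstraction leaks.
print(0.1 + 0.2)          # 0.30000000000000004
print(0.1 + 0.2 == 0.3)   # False -- the glossed-over detail just mattered
```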
So, sometimes, I wonder if the abstractions are just smoke and mirrors that'll bite us someday. :-)
Is the word "lossy" legal in Scrabble?
LOL!
No, it's the antonym of a(nother) jargon word--lossless, which comes to us from the world of compression.
"Lossless" compression (like the Huffman coding algorithm in WinZip/Gzip) remove reduntant sections to make data smaller, but they can decompress back to a bit-by-bit replica of the original.
"Lossy" is...er...not that. JPEG is lossy, because it loses information as it compresses. MPEG-3 (MP3's) does the same thing.
Lossless is perfect for machine use, and lossy works great for human consumption. People's minds "fill in the gaps" as they perceive the information, be it visual, auditory, or textual. ;-)
Yeah, it worries me how many people you come across who don't know any low-level stuff at all, like pointers or binary, or that ASCII 7, Ctrl-G, and Tab are somehow related :-)
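(For anyone wondering what that relation is: a Ctrl-letter character is just the letter's ASCII code with the upper bits masked off--a quick sketch:)

```python
# Ctrl-<letter> is the letter's ASCII code with the upper bits masked off.
print(ord("G") & 0x1F)        # 7  -- Ctrl-G is ASCII 7, the BEL character
print(ord("I") & 0x1F)        # 9  -- Ctrl-I is ASCII 9, better known as Tab
print("\a" == chr(7), "\t" == chr(9))   # True True
```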
I had to modify some assembler in some startup routines to work around a compiler bug when I worked at my previous company, and there was really no one around to give me any pointers. Held my breath for a few releases... :-)
And it's not just computers. If there were suddenly no supermarkets, how many people would starve? I could catch some fish and make a fire, but I've never done the hunting thing, let alone field-dressed a deer, etc.