Serial and parallel programming styles.
Until the past decade or so, most software programming was serial: the code was meant to execute one statement at a time. That was fine; serial code performance kept increasing as long as microprocessor clock rates kept increasing.
Microprocessor clock rates stopped increasing a few years ago, and so people turned to parallel programming to extract more performance from their application programs. One central idea in parallel programming is that independent data and instruction streams may be divided up and executed simultaneously. Most of the underlying computer processors (CPUs, GPUs, and FPGAs) support this style of programming. What I find puzzling is that code development environments do not seem (to me, at least) to fully support a parallel programming style.
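That central idea can be sketched in a few lines of Python. This is a minimal illustration, not a recipe: it splits the data into independent chunks and hands them to a thread pool (note that for CPU-bound work in CPython, processes rather than threads would be needed for true simultaneous execution, because of the global interpreter lock).

```python
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(chunk):
    # Each chunk is independent: no chunk reads or writes another's data.
    return sum(x * x for x in chunk)

data = list(range(1_000))
chunks = [data[i::4] for i in range(4)]  # four independent data streams

# Serial style: process one chunk at a time, one statement after another.
serial_total = sum(sum_of_squares(c) for c in chunks)

# Parallel style: the independent chunks may execute simultaneously.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_total = sum(pool.map(sum_of_squares, chunks))

assert serial_total == parallel_total
```

The point is that nothing in the serial version tells you the chunks are independent; the programmer has to see that structure, which is exactly the kind of thing a drawing makes obvious and a wall of text hides.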
I’m speaking from personal experience here. I know I absolutely have to draw out my parallel code on a large piece of paper, and only then can I implement pieces of the code in a traditional text-based development environment. Much to my surprise, I’ve seen papers in cognitive neuroscience concluding that our human brains use the speech center to work things out in a serial fashion and the visual center to work things out in a parallel fashion. It sure seems like a useful parallel programming environment would fully support coding through visualization instead of coding with text characters from a computer language. To be fair, this may be confirmation bias on my part. Or perhaps a programming environment using only imagery is a bit too far off in the future. I do not know, as I tend to have many more questions than answers. And I’d love to be able to create parallel programs from pictures.
Maybe a computer is a hardware and software system?
Once upon a time, there were three main types of computer people: the scientists, who used the Fortran programming language; the business people, who used the Cobol programming language; and the computer hardware designers. I was in the latter group.
The computer hardware designers used various extremely low-level assembly languages (and later the C programming language) to make their hardware designs work. Cobol and Fortran are high-level, rather abstract computer languages that require several layers of software infrastructure between them and the bare-metal computer. At first, these layers of system software infrastructure were created by the hardware designers themselves. My point here is that the computer hardware designers had to write the software to make their creations work for the scientists and business people.
Unsurprisingly, I came up through the ranks of the computer profession thinking that hardware and software development were two sides of the same coin. I viewed a computer as a system of hardware and software with a very fuzzy and malleable dividing line between the two.
I woke up one day and realized that somehow computer hardware and software had gotten separated and the two groups often viewed each other with distrust and even disdain. This often resulted in much finger-pointing and blaming when things at the hardware/software boundary did not work.
I tried sharing with people my view that a computer is a system requiring thoughtful design tradeoffs between its hardware and software aspects. This view was not well-received. The hardware/software distinction became even fuzzier (in my mind, at least) when hardware developers started using HDLs (Hardware Description Languages) to create their hardware instead of logic schematics. I’ve never been able to convince a hardware person that designing with an HDL looks a lot like software programming.
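To make that resemblance concrete, here is a toy sketch in Python (deliberately not a real HDL) of the register-transfer style that an HDL like Verilog uses: you describe what a piece of hardware does on each clock edge, and a simulator applies that description cycle by cycle. The function name and the loop are my own illustration, not any HDL’s actual syntax.

```python
def counter_step(count, reset):
    """Next-state logic for a 4-bit counter with synchronous reset.

    This plays the role a Verilog always-block would: given the current
    register value and inputs, compute the value after the clock edge.
    """
    if reset:
        return 0
    return (count + 1) % 16  # wraps around, like a 4-bit register

# "Simulate" ten clock cycles, asserting reset on the first one.
state = 0
for cycle in range(10):
    state = counter_step(state, reset=(cycle == 0))
```

Whether you call the above software or a hardware description is, to my eye, exactly the fuzzy boundary in question: it is ordinary code, yet it specifies the behavior of a circuit.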
I returned to University and earned a degree in Computer Science in 1996, thinking that I would then be validated on both sides of the hardware/software coin. Much to my amusement, this software validation caused the hardware developers to distrust my views.
No worries, it is what it is. Hardware and software developers still ply their trade in separate groups, and I still hold firm to the idea that a computer is a system with two overlapping aspects, though I remain amused that all (most?) of my profession makes an arbitrary distinction between computer hardware and software.