Balanced Progress of Hardware and Software

In the early years, software research and development focused on making the most of the limited computing resources available. In a way, the programs and applications of that era were engineered to squeeze the maximum out of the machines they ran on. There was no surplus of computing power. Software had to accomplish its task without demanding much from the hardware; in fact, it couldn't demand anything. The hardware dictated the limits, and the software adjusted accordingly.

Over the years, computing power grew and became cheaper to manufacture. Moore's Law described a long-term trend in the capabilities of computers and electronics, guiding manufacturers, researchers, and developers in planning and setting long-term goals and targets. With the expectation that computing power would roughly double every one to two years, software developers could plan a product around a minimum hardware specification, expecting better performance as better specs came out during the product's lifetime in the market.
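The projection described above is just compound doubling. As a minimal sketch (not from the article, and with the 2-year doubling period as an assumption), the expected gain by a future launch date can be estimated like this:

```python
# Sketch of the Moore's Law projection the article describes: assuming
# computing capacity doubles roughly every `doubling_period_years`,
# estimate the relative computing power available `years_ahead` from now.

def projected_power(baseline=1.0, doubling_period_years=2.0, years_ahead=4.0):
    """Expected relative computing power after `years_ahead` years,
    starting from `baseline`, under a fixed doubling period."""
    return baseline * 2 ** (years_ahead / doubling_period_years)

# A 4-year project planned against today's hardware could expect roughly
# 4x the computing power at launch under a 2-year doubling assumption.
print(projected_power(years_ahead=4.0))  # 4.0
```

This is, of course, exactly the bet the rest of the article questions: the formula says nothing about how many products are simultaneously claiming that projected surplus.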

Developing software like OSes, productivity suites, virus protection programs, e-mail clients, and custom systems can take months or years from a project's inception to implementation. If computing power could be projected early on to reach a certain level by the time the product launched, then there was potentially no limit to the features and functionality that could be put into the product. Software manufacturers only needed to specify their minimum hardware requirements. Thus began the trend where software dictated its own limits while hardware continued to grow separately along Moore's Law.

Today, as computing power continues to grow, so do software's minimum hardware requirements (which, as we all know, really mean you should get something much, much better than that). In fact, some software products now target high-end hardware as their minimum specification. Surprisingly, even OSes demand so much that they actually compete for resources with the applications and programs installed alongside them. The software industry, it seems, fails to apply Moore's Law in a way that would encourage software that still efficiently maximizes the machine it runs on. There is a surplus of computing power, granted, but it becomes meaningless if every software product bets on using that surplus without considering that it may have to share it with other software on the same computer.

In general, there is an imbalance between the way software development is progressing and the real progress of hardware development. As in the early years, software development should still be bound by the limits of currently available hardware. Unfortunately, the software industry took Moore's Law for granted. As it turned out, software can advance faster than hardware can catch up. Instead of working together, hardware and software are now chasing each other. It's not that they're competing; they simply can't seem to match each other's "real" capabilities and requirements.

I think it is time to review today's software R&D practices. There seems to be little concern for the growing amount of hardware resources that new software products require. Some of it may be reasonable, but some of it is too much, especially for software that isn't supposed to do all that much without plug-ins and extensions. Although Moore's Law allows a good projection of the computer's future capabilities, I think software development should slow down and focus only on what's actually available. Rather than merely stating their minimum requirements, I think software manufacturers could develop a truly efficient product by treating those minimum specs as their maximum during the product's development and testing.

I believe that, as in the old days, the hardware should set the limit and the software should adjust accordingly. In reality, a software product is never the only one on a computer; it will always share computing power with other applications and programs on the same machine, which adds to the complexity of the equation. But as long as software R&D doesn't target hardware specs too far in the future, the hardware-software balance should stay in check.
