I always thought it was hardware advancement, or the lack of it, that dictates where software can and will go.
Somewhat true, but not always the case. OpenGL and DirectX are probably the biggest counterexamples to this argument. OpenGL and DirectX routinely advance and provide incredible new features when there are no processors available to take advantage of them. Occasionally games embrace the absolute latest version of these APIs, and eventually the graphics add-on cards come out to support them.
For clarity's sake, I suggest you use "World Wide Web" in place of "Internet." The Internet has been around for a very long time. It was only with the introduction of the WWW, circa 1993, that graphically-oriented online data was made truly accessible to the public.

The only examples I can think of are:
Internet (not really software but a huge catalyst of change) > iMac
PostScript + LaserWriters + PageMaker = publishing revolution.

Adobe's PostScript drove the introduction of Apple's LaserWriters, which created desktop publishing.
As the world developed a taste for full-blown multimedia, processors and sub-processors exploded.

So did the Death Star.