Of course, this is just going to be the rantings of an old fart… but I guess that’s part of the reason to have a blog, right? To rant?
Geek alert… this is going to be a bit technical.
The other day we were discussing the need to move an internal website from a rough-and-tumble, less-than-actively-supported environment to something more mainstream (as in, supported by IT). I reflected that if the prototype environment was confined to a single disk partition, it would be relatively easy to make a virtual image of it, move it to a supported VMware server, and run it intact. (Translation: if the website was on a single disk drive, we could copy that disk drive to a file, which could then be used by a piece of software to simulate a real computer.)
The response I got was to the effect “yeah, that would work and wow do you know your stuff!”
I had to chuckle. Virtualization, the process of having a real computer simulate one or more computers, has been around since 1967, when IBM created an operating system called CP-40 (and later, CP-67). I first encountered virtualization more than a decade later, in 1979, when I was a systems programmer working at Hewlett-Packard in their mainframe data center. I was responsible for installing IBM's then-current virtualizing operating system, VM (for virtual machine), on HP's multi-million dollar Amdahl mainframe. At that time HP had only one mainframe (hey, they cost a lot of money, even for HP), and if a systems programmer wanted to try out a change to the operating system, he'd need to come in on the weekend for the few hours that the data center wasn't running. By installing VM, we could run two simulations of our physical mainframe. One would run our production operating system and allow business to carry on as usual. The other we could use to test new versions of the operating system. No more weekend testing!
Later, as I started to develop PC software, I watched Intel add capabilities to their microprocessor chips. By the time Intel announced the 80386 version of their microprocessor in 1985, they had added everything needed for it to simulate multiple computers using virtualization. Yet it wasn't until 1998 that VMware was formed and created the first software to virtualize the PC.
Being a virtualization aficionado, I've been experimenting with and using VMware's software since 2001. This included (and still includes) running their Mac OS X-specific version, Fusion, on my MacBook Air. For those who are PC-only literate: Fusion allows me to run Apple's Mac OS X operating system AND SIMULTANEOUSLY run Windows 7 on my MacBook Air. I can readily switch between the two environments, including cutting and pasting (a version of which had been present in IBM's VM mainframe operating system in the early 1980s).
VMware is not the only company that provides virtualizing software. There are even open source versions.
So, referring back to the comment that set this off… yeah, I know this stuff. Been there, done that… and even in more than one environment.
But I think the bigger picture is this… in any maturing industry, great ideas are going to be reused. I’ve watched from a point where computers were so expensive that everyone had to share to a point where computers are so cheap everyone has one (or more!). I’ve watched the data move from that centralized model, where all the data is in one place, to a decentralized model (first with distributed minicomputers and later PCs on local area networks) and now back again… to the “cloud.”
Likewise, social media has its roots in technology that was created in the earliest days of computing. There were commercial systems like CompuServe, but even before CompuServe there were systems such as Douglas Engelbart's NLS, demonstrated in 1968. Oh, and by the way, Engelbart demonstrated the mouse at the same time, as part of the user interface to NLS.
Business models need to take this phenomenon into account. Not only do you need to be aware of the side attacks that Clayton Christensen talks about in "The Innovator's Dilemma," you need to see where yesterday's solutions might once again solve the problems your product is solving today.
You've heard it before: history repeats itself. Unfortunately, I think some technology companies believe they are immune from that law. They are not. And there is nothing more embarrassing than losing to the past.
Oh, and if you are looking for an industry to model where the past is new again, look west to Hollywood and east to Broadway. They’ve realized this and profited from it for years.