The Evolution of Computing: Personal Computing
Initially, companies developed applications on minicomputers because these machines gave them more freedom than they had with mainframes. The rules and processes governing minis were typically more flexible than those of the mainframe environment, leaving developers free to be more creative when writing applications. In many ways, minis were the first step toward freedom from mainframe computing. However, with each computer managed the way its owner chose, the lack of accepted policies and procedures often led to a somewhat chaotic environment. Further, because each mini vendor had its own proprietary OS, programs written for one vendor's mini were difficult to port to another. In most cases, changing vendors meant rewriting applications for the new OS. This lack of application portability was a major factor in the demise of the mini.
During the 1980s, the computer industry experienced the boom of the microcomputer era. In the excitement accompanying this boom, computers were installed everywhere, with little thought given to the specific environmental and operating requirements of the machines. From this point on, computing that had previously been done on terminals serving only to interact with the mainframe, the so-called "dumb terminals", would instead be done on personal computers: machines with their own processing resources. This new computing model was the embryo of modern cyberspace, with all the services we know today.
When IBM introduced its first personal computer, the model 5150, in 1981, companies around the world began installing personal computers throughout their organizations and saw great benefits from these simpler, cheaper machines. Mainframes were now seen as relics of the past, and the proliferation of personal computers ended the need for virtualization as a solution for multitasking. Later, in 1988, IBM introduced the Application System/400 (AS/400), which quickly became one of the world's most popular business computing systems.
Moreover, as information technology operations grew more complex, businesses of all sizes became aware of the need to control their IT resources. Client-server computing arrived during the early 1990s, and microcomputers, now called servers, started to find their home in the old computer rooms. The availability of inexpensive, high-quality networking equipment, together with new standards for network cabling, made it possible for businesses to use a hierarchical design and place servers in a dedicated hosting room inside the company. The term "data center" first gained popularity during this era, referring to rooms specially designed and dedicated to housing computers. But the aforementioned lack of compatibility between systems, and the inability to migrate applications among them, were the key factors in the fall of the computing model based on the personal microcomputer.