This is a fast-paced, non-techy, three-part journey through the evolution of the machines that have been doing the heavy lifting for all of us: Users and System Administrators everywhere.
It's a homage to electronic equipment… we know! But we owe these machines a lot!
Part I [1940s-1990s]
Mainframes, Standards and the rise of the Personal Computer.
Let's get one thing straight: servers are still just computers. Fancy, expensive, glorified computers, at least in the most common use of the word, anyway.
Once upon a time…
It's a given that in the 1940s these first computers were huge, clunky and slow, but they started doing some pretty cool stuff right away: the first remote-access demonstration came from George Stibitz, who, at an American Mathematical Society conference, performed calculations on his remote Complex Number Calculator (CNC) over special telephone lines.
In 1942, the Atanasoff–Berry Computer, built by Professor John Atanasoff and a grad student by the name of Clifford Berry, became the first computer to store information in its main memory, and it could solve systems of linear equations (of all things, we know…). This was a huge milestone for computers and definitely raised the bar for the era.
ENIAC kicked off the performance and general-purpose revolution in computing by using electronics (vacuum tubes) instead of electromechanical technology. Built by John Mauchly and J. Presper Eckert from 1943 to 1945, it was time well invested: this 30-ton computing system ran 1,000 times faster than any previous computer! It was also more reliable, and we happen to have a photo to prove how cool it was:
By mid-century, the UNIVAC I was publicly introduced as the first commercial computer, with an original price of US$159,000, far too expensive a toy even for most universities. Still, 46 of these systems were sold and deployed.
The first mass-produced computer was built by IBM. In the first year alone, IBM sold 450 units of the IBM 650, which was fascinating considering its magnetic drum memory already spun at 12,500 rpm, giving it faster access speeds than earlier drum-storage computers.
Later in the decade, the MIT Research Laboratory of Electronics built the TX-0, the first general-purpose programmable computer designed around transistors. This prototype was built to test large magnetic-core memory and transistor circuitry, and it hosted imaginative programming experiments like 3D tic-tac-toe.
Over the course of the 1950s, more than 100 programming languages were developed. Many would run for decades to come, and some of them are still active today.
The 60s and 70s – Computer flower power!
Mainframes were brought to us by the big tech companies of the time, such as IBM and Hitachi, among others. But these can't yet be considered the modern server as we know it: for one, they were locked into their own proprietary protocols and hardware and couldn't speak with foreign systems. They required tailored operating systems and hardware, and implementations of such systems were extremely expensive, with long deployment periods.
These super machines were quite powerful but centralized in their approach. Terminals were just that: endpoints for a central processing unit that managed all resources and all security in a single place.
And this was ok and appropriate… for the time being.
Note: There was actually a lot happening during these decades, but most of it involved computer development itself. True, there were huge improvements in databases and in applications used in academic research, but that's a world in and of itself; we just can't cover it here.
It’s the 80’s, the rise of the PC
Operating systems were now mature enough to address security, reliability and a wide variety of services, as these machines were now required to "talk to and understand" each other. At the same time, the golden age of processor production was gathering strength, with big advances in electronics manufacturing. Together, these greatly boosted servers' computing power.
As personal computers began shipping with modems, and a phone line could carry you into the new digital world, the demand for new services soared. It was only a matter of time before the modern Server found its place in the computer room.
The early modern Servers were little more than PCs with above-average resources: machines that hosted files and could communicate with other computers on their network and the Internet. Most importantly, they could now accept OEM components, which made them the land of dreams for any would-be computer operator.
As wide-area communication networks started connecting various commercial sectors (such as banks, insurance companies and manufacturers) in the early 80s, standardization on the server side became inevitable too.
The PC boom: Welcome to the 90s!
As companies started shifting from paper to massive internal databases, backup and storage infrastructures faced quite a challenge.
The rise of the mighty Web also brought big improvements in web server technology during the 90s.
At the same time, the personal computer had just stepped up as a household device, and many users were now demanding new Internet services without even knowing what those should look like… it's as if everyone knew the Internet was full of potential, but until the 90s no one had figured out how to exploit it.
And so began the reign of the Dot-Com. Everything was e-"this" or virtual-"that". If it was on a computer and over a phone line, it was definitely cool, and we all wanted to be part of it.
As personal computer prices kept dropping, employees were having their first contact with modern GUI operating systems (yes, we're talking about Windows 3.1!), and even the annoying screech of a dialing modem was music to every tech-savvy ear.
Without knowing it, people were training for the jobs of the future, honing their mouse and keyboard skills for the digital shift that companies had already started working towards.
Oh! And companies needed people with computer skills, loads of them…
End of Part I
This article was written by MD3's IT team.