Most, if not all, of the buzz surrounding information technology has to do with what’s coming and future possibilities. It also involves SEO for clients, marketing, AI analysis, and so on. It almost seems like a waste of time to focus on the past when trying to upgrade and enhance the one thing our country can’t seem to live without. But when you look at the information technology industry and how it’s grown, it’s practically mesmerizing. The one thing that’s certain is that advancements have been made steadily for decades, and each new creation has paved the way for bigger and better ones in the future. So why not take a look back at how exactly information technology became what it is today? And how much technical support do fresh users of IT actually need?
What Is Information Technology?
First, what in the world is information technology? In its most basic form, information technology is the use of computers to study, store, and manage data or information. Information technology, or IT, is often used as a synonym for computer networks, but it also includes other information-distribution technologies like phones, radios, and TVs.
Giving Information Technology a Name
The term “information technology,” in its modern sense, first appeared in a 1958 article written by Harold J. Leavitt and Thomas L. Whisler. The article was published in the Harvard Business Review. In it, the authors commented, “the new technology does not yet have a single established name. We shall call it information technology (IT).”
For Leavitt and Whisler, the definition included three categories: the application of statistical and mathematical methods to decision-making, techniques for processing large amounts of information rapidly, and the simulation of higher-order thinking through computer programs.
The History of Information Technology
Information technology has been around for centuries, practically as long as humans have. That’s right; information technology hasn’t always been about computers. Long before electronics, people used technology to communicate, just in very different ways than we do now. The history of information technology is divided into four ages: pre-mechanical, mechanical, electromechanical, and electronic.
The earliest age of information technology is the pre-mechanical age. This age occurred between 3000 B.C. and 1450 A.D. During this age, humans obviously didn’t have computers, televisions, or telephones, but they used their own forms of information technology that you’re likely somewhat familiar with. Humans would attempt to communicate using language or picture drawings called petroglyphs, usually carved into rocks. It was then that some of the first alphabets were created, like the Phoenician alphabet.
Alphabets became more popular, and people started writing information down, so pens and paper were developed. Early paper was made from the papyrus plant, but the most popular form of paper was created by the Chinese, who made it from rags.
As more information was written down, a way to store information was created. It was then that books and libraries were developed. There’s a good chance you’ve at least heard of Egyptian scrolls. Yes, even those were considered a form of information technology!
It was also during the pre-mechanical age that a numbering system was developed. Around 100 A.D. in India, people created the first 1–9 numeral system. Once numbers existed, so did calculators. The calculator was the first sign of an information processor, and the popular model of the day was the abacus.
Between 1450 and 1840, a lot of new technologies were developed thanks to a surge of interest in the area. The slide rule, an analog device used to multiply and divide, was invented. The Pascaline, an early mechanical calculator, was created by Blaise Pascal. Charles Babbage designed the difference engine, which calculated polynomial functions using the method of finite differences.
Sure, these machines were nothing like the ones we’re used to today, given that they can’t do more than one type of calculation each, but these inventions paved the way to the modern processors and calculators we now use. Plus, to the people living during the mechanical age, these gadgets and processes were remarkable.
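For the curious, the method of finite differences that Babbage’s engine mechanized can be sketched in a few lines of Python (a modern illustration, of course, not anything the engine itself ran): once a polynomial’s initial differences are tabulated, every further value falls out of pure addition, which is exactly the kind of work gears and wheels can do.

```python
# Sketch of the method of finite differences behind Babbage's
# difference engine: after the initial differences of a polynomial
# are set up, all later values are produced by addition alone.

def difference_table(values):
    """Build the leading column of the difference table from samples."""
    diffs = [values[0]]
    while len(values) > 1:
        values = [b - a for a, b in zip(values, values[1:])]
        diffs.append(values[0])
    return diffs

def extend(diffs, steps):
    """Generate successive polynomial values using only addition."""
    diffs = list(diffs)
    out = []
    for _ in range(steps):
        out.append(diffs[0])
        # Roll each difference into the one above it, top-down.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

# Example: f(x) = x^2 + x + 1 sampled at x = 0, 1, 2
samples = [1, 3, 7]
diffs = difference_table(samples)   # [1, 2, 2]
print(extend(diffs, 6))             # [1, 3, 7, 13, 21, 31], i.e. f(0)..f(5)
```

Each crank of the engine corresponded to one pass of that inner addition loop, which is why it could tabulate whole books of values without ever multiplying.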
The electromechanical age occurred between 1840 and 1940 and included the start of telecommunication. The telegraph was invented, Samuel Morse developed Morse code in 1835, Alexander Graham Bell created the telephone in 1876, and Guglielmo Marconi built the first radio in 1894. As you can imagine, these inventions were greatly important, given how they led the way to even bigger advances in the field of information technology.
The first automatic digital computer in the United States was the Harvard Mark I, completed in 1944 at Harvard University. The large-scale computer was 8 feet tall and 51 feet long and weighed about 5 tons. Punched paper tape was used to program it.
Finally, there’s the electronic age that we all know and love. The electronic age is the time between 1940 and now. The ENIAC was designed for the U.S. Army to calculate artillery firing tables and was even bigger than the Mark I. It was the first high-speed digital computer capable of being reprogrammed to solve a full range of computing problems. The ENIAC relied on vacuum tubes for its calculations.
Also during the electronic age, high-level programming languages like FORTRAN and COBOL were created. Later generations of machines used microprocessors, central processing units that combine memory, logic, and control circuits on a single chip. The personal computer arrived with machines like the Apple II, as did the graphical user interface (GUI). As information technology grew, so did the need for IT professionals. Large companies like Samsung in South Korea or Apple and Microsoft in the United States seem to be leading the way, but it’s the smaller, lesser-known shops like Mustard IT, or the mom-and-pop shop down the road, that truly keep information technology the thriving area our country now depends on.
Computers and information technology processors have only gotten faster and smarter. It’s a marvel all its own that we’ve simply been able to keep up. There’s no telling what’s next in the future of information technology.