The history that led to the development of IT as it is known today goes back millennia.
But the expression information technology is a relatively new development. The term first appeared in a 1958 Harvard Business Review article that predicted its future effects, titled "Management in the 1980s":
"During the past decade, a new technology has begun to take hold in American business, one so new that its significance remains difficult to assess ... The new technology does not yet have a single established name. We will call itInformation technology."
Information technology has evolved and changed ever since. This article will explore this history and the importance of IT.
What is IT today?
Information technology is no longer just about installing hardware or software, solving computer problems, or controlling who can access a particular system. Today, IT professionals are in demand, and they:
- create policies to ensure that IT systems run efficiently and are aligned with an organization's strategic goals;
- maintain networks and devices for maximum uptime;
- automate processes to improve business efficiency;
- research, implement and manage new technologies to meet changing business needs; and
- maintain service levels, security and connectivity to ensure business continuity and longevity.
In fact, today's hyper-connected data economy would collapse without information technology.
The slow development of computers and computer technology
Before the modern computer ever existed, there were precursors that helped people perform complex tasks.
The abacus, the earliest known calculating tool, has been in use since around 2400 BCE and is still used in parts of the world today. An abacus consists of rows of movable beads on rods that represent numbers.
But it was not until the 19th century that the idea of programming devices really took hold. At this time, the Jacquard loom was developed, allowing weavers to produce fabrics with intricate woven designs. The loom used punch cards fed into it to control the weaving pattern. Well into the 20th century, computers used a similar punch card system to issue machine instructions, until electronic methods eventually replaced it.
In the 1820s, the English mechanical engineer Charles Babbage - known as the father of the computer - invented the Difference Engine to help with navigation calculations. It is considered the first mechanical computing device.
Then in the 1830s he published plans for his Analytical Engine, which would have worked on a punch card system. Babbage's collaborator, Ada Lovelace, expanded on these plans. She took them beyond simple mathematical calculations and designed a set of operational instructions for the machine - now known as a computer program. The Analytical Engine would have been the world's first general purpose computer, but it was never completed and the instructions were never carried out.
Many of the data processing and execution capabilities of modern IT, such as conditional branches (if statements) and loops, stem from the early work of Jacquard, Babbage and Lovelace.
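To make those two constructs concrete, here is a minimal, purely illustrative Python sketch (not tied to any historical machine) that uses a loop and a conditional branch to sum the even numbers in a list:

```python
# A loop repeats the same instructions for each item in a sequence,
# and a conditional branch (if statement) takes one path only when a test holds.
values = [3, 8, 15, 4]

total_even = 0
for value in values:        # loop
    if value % 2 == 0:      # conditional branch
        total_even += value

print(total_even)  # prints 12 (8 + 4)
```

The same control-flow ideas, expressed mechanically through punch cards and gears, are what the designs of Jacquard, Babbage and Lovelace anticipated.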
Herman Hollerith, an American inventor and statistician, also used punched cards to enter data into his census tabulating machine in the 1890s. This was an important precursor to the modern electronic computer. Hollerith's machine recorded statistics by automatically reading and sorting cards numerically coded by perforation position. Hollerith founded the Tabulating Machine Company in 1896 to manufacture these machines; in 1911 it merged with other firms to form the Computing-Tabulating-Recording Company, which was renamed International Business Machines Corp. (IBM) in 1924.
German engineer Konrad Zuse invented the Z2, one of the world's earliest electromechanical relay computers, in 1940. Its operating speed was extremely slow by today's standards. Later in the 1940s came the Colossus computers, developed during World War II by British codebreakers to decipher intercepted, encrypted communications from German teleprinter cipher machines, codenamed "Tunny". Also during the war, the British mathematician Alan Turing designed the Bombe, an electromechanical machine used to decrypt messages from the German Enigma machine.
Turing - immortalized by the Turing test - first conceptualized the modern computer in his 1936 paper "On Computable Numbers". In it, Turing proposed that programmable instructions could be stored in a machine's memory to perform certain tasks. This concept forms the very foundation of modern computing.
In 1951, the British electrical engineering company Ferranti Ltd produced the Ferranti Mark 1, the world's first commercial general purpose digital computer. This machine was based on the Manchester Mark 1, developed at the Victoria University of Manchester.
The IT revolution is gaining momentum
J. Lyons and Co. released the LEO I computer in 1951 and ran its first business application that same year. MIT's Whirlwind, also released in 1951, was one of the first digital computers capable of operating in real time. In 1956, it also became the first computer to let users enter commands with a keyboard.
As computers evolved, so did what eventually led to the field of IT. From the 1960s onwards, the development of the following devices set the stage for an IT revolution:
- screens
- text editors
- the mouse
- hard drives
- fiber optics
- integrated circuits
- programming languages such as FORTRAN and COBOL
Today's IT sector is no longer the exclusive domain of mathematicians. It employs professionals from a variety of backgrounds and skills, such as network engineers, programmers, business analysts, project managers and cyber security analysts.
Read more here about top cyber security careers.
The Information Revolution and the Invention of the Internet
In the 1940s, 50s and 60s, governments, defense institutions and universities dominated IT. However, it also spilled over into business with the development of office applications such as spreadsheets and word processing software. This created a need for specialists who could design, create, customize and maintain the hardware and software required to support business processes.
Different computer languages were created, and experts in those languages emerged. Oracle and SAP programmers ran databases, and C programmers wrote and updated network software. These specialists were in high demand – a trend that continues to this day, particularly in areas such as cyber security, AI and compliance.
The invention of email in the 1970s revolutionized IT and communication. Email began as an experiment to see if two computers could exchange a message, but it evolved into a quick and easy way for people to stay in touch. The term "email" itself was not coined until later, but many of its early standards, including the use of @, are still in use today.
Many IT technologies owe their existence to the Internet and the World Wide Web. But ARPANET, a US government funded network that MIT researchers conceptualized as an "intergalactic computer network" in the 1960s, is considered the forerunner of the modern Internet. ARPANET grew from just four computers into an interconnected network of networks, and it eventually led to the development of the Transmission Control Protocol (TCP) and the Internet Protocol (IP), which allowed remote computers to communicate with each other. Packet switching - breaking data into small blocks that are routed independently across the network and reassembled at the destination - also brought machine-to-machine communication from the realm of possibility to reality.
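As a rough, modern illustration of the machine-to-machine communication that TCP/IP enables, the following Python sketch (using a hypothetical localhost address and an arbitrary port) opens a TCP connection, sends a short message and reads the reply; the operating system's TCP/IP stack handles the packet switching underneath:

```python
import socket

HOST = "127.0.0.1"  # hypothetical address; in practice, any reachable peer
PORT = 50007        # arbitrary unprivileged port chosen for this sketch

def run_server():
    """Accept one TCP connection and echo back whatever arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)          # TCP delivers a reassembled byte stream
            conn.sendall(b"ACK: " + data)   # reply over the same connection

def run_client():
    """Connect to the server, send a message and print the reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello over TCP/IP")
        print(cli.recv(1024).decode())
```

IP routes the individual packets between machines and TCP reorders and reassembles them, so the application sees only a reliable stream of bytes rather than individual packets.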
In 1991, Tim Berners-Lee introduced the World Wide Web, a network of information that anyone with an Internet connection could access. In 1996, the Nokia 9000 Communicator became the world's first Internet-enabled mobile device. By this time, the world's first search engine, the first laptop computer and the first domain search engine were already available. In the late 1990s, the search engine giant Google was founded.
In the early 2000s, WordPress, an open source web content management system, was released. It allowed people to move from being passive web consumers to active participants who publish their own content.
IT continues to expand
Since the invention of the World Wide Web, the field of IT has expanded rapidly. Today, IT includes tablets, smartphones, voice-activated technology, nanometer computer chips, quantum computers, and more.
Cloud computing, whose underlying ideas date back to the 1960s, is now an integral part of many organizations' IT strategies. In the 1960s and 70s, the concept of time-sharing - sharing computer resources among multiple users at the same time - was developed. And in 1994, the cloud metaphor came into use to describe virtualized services and machines that behave like real computer systems.
But it wasn't until 2006 and the creation of Amazon Web Services (AWS) that cloud computing really took off. AWS and its biggest competitors – Google Cloud Platform, Microsoft Azure and Alibaba Cloud – now hold the largest share of the cloud computing market. The top three providers - AWS, Google and Microsoft Azure - accounted for 58% of total cloud spending in the first quarter of 2021.
Learn more about the history of cloud computing here.
Over the past decade, other technological advances have also shaped the IT world. These include developments in:
- social media
- internet of things
- artificial intelligence
- computer vision
- machine learning
- robotic process automation
- big data
- mobile computing - in both devices and communication technologies such as 4G and 5G
Systems and networks are also becoming increasingly interconnected. By 2030, there will be an estimated 500 billion devices connected to the Internet, according to a Cisco report.