Which communication technology was first experimented with in the 1960s and 1970s?

As you might expect for a technology so expansive and ever-changing, it is impossible to credit the invention of the internet to a single person. The internet was the work of dozens of pioneering scientists, programmers and engineers who each developed new features and technologies that eventually merged to become the “information superhighway” we know today.

Long before the technology existed to actually build the internet, many scientists had already anticipated the existence of worldwide networks of information. Nikola Tesla toyed with the idea of a “world wireless system” in the early 1900s, and visionary thinkers like Paul Otlet and Vannevar Bush conceived of mechanized, searchable storage systems of books and media in the 1930s and 1940s. 

Still, the first practical schematics for the internet would not arrive until the early 1960s, when MIT’s J.C.R. Licklider popularized the idea of an “Intergalactic Network” of computers. Shortly thereafter, computer scientists developed the concept of “packet switching,” a method for effectively transmitting electronic data that would later become one of the major building blocks of the internet.
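
Packet switching is easy to illustrate with a short sketch. The example below is a minimal, present-day illustration of the idea only, not how ARPANET actually implemented it; the function names and packet size are invented for this sketch. A message is split into small, independently deliverable packets and reassembled by sequence number at the destination.

```python
# Conceptual sketch of packet switching: split a message into small packets,
# deliver them in any order, and reassemble them by sequence number.
# This illustrates the idea only, not ARPANET's actual implementation.
import random

PACKET_SIZE = 8  # bytes of payload per packet (arbitrary for this example)

def packetize(message: bytes) -> list[tuple[int, bytes]]:
    """Split a message into (sequence_number, payload) packets."""
    return [
        (seq, message[i:i + PACKET_SIZE])
        for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
    ]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Rebuild the original message regardless of arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

packets = packetize(b"a short message sent across the network")
random.shuffle(packets)       # packets may arrive out of order
print(reassemble(packets))    # the original message is restored
```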

The first workable prototype of the internet came in the late 1960s with the creation of ARPANET, or the Advanced Research Projects Agency Network. Originally funded by the U.S. Department of Defense, ARPANET used packet switching to allow multiple computers to communicate on a single network.


On October 29, 1969, ARPANET delivered its first message: a "node-to-node" communication from one computer to another. (The first computer was located in a research lab at UCLA and the second was at Stanford; each one was the size of a small house.) The message, "LOGIN," was short and simple, but it crashed the fledgling network anyway: The Stanford computer only received the note's first two letters.

The technology continued to grow in the 1970s after scientists Robert Kahn and Vinton Cerf developed Transmission Control Protocol and Internet Protocol, or TCP/IP, a communications model that set standards for how data could be transmitted between multiple networks. 
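
As a modern aside, TCP is still what programs use today when they need a reliable, ordered byte stream between two machines. The sketch below is a minimal example using Python's standard socket and threading modules; the loopback address and port number are arbitrary choices for this illustration, not anything tied to the history described above.

```python
# Minimal TCP example: a tiny server and client exchange one message over a
# local TCP/IP connection. The port number (50007) is arbitrary.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007

# Bind and listen before the client connects, so the connection is not refused.
server = socket.create_server((HOST, PORT))

def serve_once() -> None:
    """Accept a single connection and echo back what was received."""
    conn, _addr = server.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"received: " + data)

worker = threading.Thread(target=serve_once)
worker.start()

with socket.create_connection((HOST, PORT)) as client:
    client.sendall(b"LOGIN")
    print(client.recv(1024))   # b'received: LOGIN'

worker.join()
server.close()
```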

ARPANET adopted TCP/IP on January 1, 1983, and from there researchers began to assemble the "network of networks" that became the modern internet. The online world then took on a more recognizable form in 1990, when computer scientist Tim Berners-Lee invented the World Wide Web. While it's often confused with the internet itself, the web is actually just the most common means of accessing data online in the form of websites and hyperlinks. 

The web helped popularize the internet among the public, and served as a crucial step in developing the vast trove of information that most of us now access on a daily basis.

The history that led to the development of IT as it's known today goes back millennia.

But the term information technology is a relatively recent development. The phrase first appeared in a 1958 Harvard Business Review article titled "Management in the 1980s," which predicted the technology's future effects:

"Over the last decade a new technology has begun to take hold in American business, one so new that its significance is still difficult to evaluate ... The new technology does not yet have a single established name. We shall call it information technology."

Information technology has evolved and changed ever since. This article will explore that history and the meaning of IT.

What is IT today?

Information technology is no longer just about installing hardware or software, solving computer issues, or controlling who can access a particular system. Today, IT professionals are in demand, and they also:

  • create policies to ensure that IT systems run effectively and are aligned with an organization's strategic goals;
  • maintain networks and devices for maximum uptime;
  • automate processes to improve business efficiency;
  • research, implement and manage new technologies to accommodate changing business needs; and
  • maintain service levels, security and connectivity to ensure business continuity and longevity.

In fact, today's hyper-connected data economy would collapse without information technology.

The slow evolution of computers and computing technology

Before the modern-day computer ever existed, there were precursors that helped people achieve complex tasks.

The abacus is the earliest known calculating tool, in use since 2400 B.C.E. and still used in parts of the world today. An abacus consists of rows of movable beads on rods that represent numbers.

But it wasn't until the 1800s that the idea of programmable devices really came along. At this time the Jacquard loom was developed, enabling looms to produce fabrics with intricate woven patterns. The system used punched cards that were fed into the loom to control the weaving pattern. Well into the 20th century, computers used a similar punched card system to receive machine instructions, until electronic input methods eventually replaced it.

In the 1820s, English mechanical engineer Charles Babbage -- known as the father of the computer -- invented the Difference Engine to aid in navigational calculations. It is regarded as the first mechanical computing device.

Then in the 1830s, he released plans for his Analytical Engine, which would have operated on a punch card system. Babbage's collaborator, Ada Lovelace, expanded on these plans. She took them beyond simple mathematical calculations and designed a series of operational instructions for the machine -- what we would now call a computer program. The Analytical Engine would have been the world's first general-purpose computer, but it was never completed, and the instructions were never executed.

Many of the data processing and execution capabilities of modern IT, such as conditional branches (if statements) and loops, are derived from the early work of Jacquard, Babbage and Lovelace.
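
For readers who have never seen them spelled out, the two control structures named above look like this in a modern language. This is purely a present-day illustration of the concepts; none of this notation existed in the era of Jacquard, Babbage or Lovelace.

```python
# A loop repeats the same steps for each value; a conditional branch (if
# statement) chooses whether to act. Modern illustration only.
numbers = [3, 14, 15, 92, 65, 35]

total_even = 0
for n in numbers:        # loop: visit each value in turn
    if n % 2 == 0:       # conditional branch: act only when the condition holds
        total_even += n

print(total_even)        # 14 + 92 = 106
```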

Herman Hollerith, an American inventor and statistician, also used punch cards to feed data to his census-tabulating machine in the 1890s. This was an important precursor of the modern electronic computer. Hollerith's machine recorded statistics by automatically reading and sorting cards numerically encoded by perforation position. In 1896, Hollerith founded the Tabulating Machine Company to manufacture these machines. In 1911, it merged with several other firms to form the Computing-Tabulating-Recording Company, which was renamed International Business Machines Corp. (IBM) in 1924.

German engineer Konrad Zuse built the Z2, one of the world's earliest electromechanical relay computers, in 1940. It had very low operating speeds that would be unimaginable today. Later in the 1940s came the Colossus computers, developed during World War II by British codebreakers. These machines helped decipher intercepted, encrypted communications produced by the German Lorenz cipher machines, which the codebreakers code-named "Tunny." Around the same time, British mathematician Alan Turing designed the Bombe, an electromechanical machine that helped decipher messages encrypted by the German Enigma machine. 

Turing -- immortalized by the Turing Test -- first conceptualized the modern computer in his paper "On Computable Numbers" in 1936. In this piece, Turing suggested that programmable instructions could be stored in a machine's memory to execute certain activities. This concept forms the very basis of modern computing technology.
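
The stored-program idea can be made concrete with a toy interpreter: the program itself sits in memory as ordinary data, and the machine fetches and executes one instruction at a time. The three-instruction set below is invented purely for this sketch; it is not Turing's formalism or any real machine's instruction set.

```python
# Toy stored-program machine: instructions are data held in memory and are
# fetched and executed one by one. The instruction set is invented for this
# sketch and is not Turing's formalism.
def run(program: list[tuple], memory: dict) -> dict:
    pc = 0                                    # program counter
    while pc < len(program):
        op, *args = program[pc]               # fetch the next instruction
        if op == "SET":                       # SET register value
            memory[args[0]] = args[1]
        elif op == "ADD":                     # ADD target source
            memory[args[0]] += memory[args[1]]
        elif op == "JUMP_IF_LESS":            # JUMP_IF_LESS register limit target
            if memory[args[0]] < args[1]:
                pc = args[2]
                continue
        pc += 1
    return memory

# A small stored program: count up to 5 by repeatedly adding 1.
program = [
    ("SET", "counter", 0),
    ("SET", "one", 1),
    ("ADD", "counter", "one"),
    ("JUMP_IF_LESS", "counter", 5, 2),
]
print(run(program, {}))   # {'counter': 5, 'one': 1}
```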

In 1951, British electrical engineering company Ferranti Ltd. produced the Ferranti Mark 1, the world's first commercially available general-purpose digital computer. This machine was based on the Manchester Mark 1, developed at the Victoria University of Manchester. 

The IT revolution picks up pace

J. Lyons and Co. released the LEO I computer in 1951 and ran its first business application that same year. MIT's Whirlwind -- also released in 1951 -- was one of the first digital computers capable of operating in real time. In 1956, it also became the first computer that enabled users to input commands with a keyboard.

As computers evolved, so did the discipline that eventually became the field of IT. From the 1960s onward, the development of the following technologies set the stage for an IT revolution:

  • screens
  • text editors
  • the mouse
  • hard drives
  • fiber optics
  • integrated circuits
  • programming languages such as FORTRAN and COBOL

Today's IT sector is no longer the exclusive domain of mathematicians. It employs professionals from a variety of backgrounds and skill sets, such as network engineers, programmers, business analysts, project managers and cybersecurity analysts.


The information revolution and the invention of the internet

In the 1940s, '50s and '60s, governments, defense establishments and universities dominated computing. However, IT also spilled over into the corporate world with the development of office applications such as spreadsheets and word processing software. This created a need for specialists who could design, create, adapt and maintain the hardware and software required to support business processes.

Various computer languages were created, and experts in those languages appeared alongside them. Oracle and SAP programmers emerged to run databases and enterprise applications, and C programmers to write and update networking software. These specialists were in high demand -- a trend that continues to this day, especially in areas such as cybersecurity, AI and compliance.

The invention of email in the 1970s revolutionized IT and communications. Email began as an experiment to see if two computers could exchange a message, but it evolved into a fast and easy way for people to stay in touch. The term "email" itself was not coined until later, but many of its early conventions, including the use of the @ symbol to separate the user name from the host, are still in use today.

Many IT technologies owe their existence to the internet and the World Wide Web. However, ARPANET, a U.S. government-funded network conceptualized in the 1960s as an "intergalactic" computer network by MIT scientists, is considered the precursor of the modern internet. ARPANET grew from just four connected computers into an interconnected network of networks. It eventually led to the development of Transmission Control Protocol (TCP) and Internet Protocol (IP), which enabled distant computers on different networks to communicate with each other reliably. Packet switching -- breaking data into small blocks that are routed independently and reassembled at their destination -- turned machine-to-machine communication from a possibility into a reality.

In 1991, Tim Berners-Lee made the World Wide Web publicly available: a web of linked information that anyone on the internet could retrieve. In 1996, the Nokia 9000 Communicator became the world's first internet-enabled mobile device. By this time, the world's first search engine, the first laptop computer and the first domain search engine were already available. In the late '90s, search engine giant Google was established.

The turn of the century saw the development of WordPress, an open source web content management system. Tools like this enabled web users to move from passive consumers to active participants, posting their own content.

IT continues to expand

Since the invention of the world wide web, the IT realm has quickly expanded. Today, IT encompasses tablets, smartphones, voice-activated technology, nanometer computer chips, quantum computers and more.

Cloud computing, whose conceptual roots date back to the 1960s, is now an inseparable part of many organizations' IT strategies. In the 1960s and '70s, the concept of time-sharing -- letting multiple users share the same computing resources at the same time -- was developed. And by 1994, the cloud metaphor was being used to describe virtualized services and machines that behave like real computer systems.

But it wasn't until 2006 and the launch of Amazon Web Services (AWS) that cloud computing really took off. AWS and its top competitors -- Microsoft Azure, Google Cloud Platform and Alibaba Cloud -- now hold the largest slice of the cloud computing market. The top three providers -- AWS, Microsoft Azure and Google Cloud -- accounted for 58% of total cloud spending in the first quarter of 2021.


Over the past decade, other technological advancements have also influenced the world of IT. These include developments in:

  • social media
  • internet of things
  • artificial intelligence
  • computer vision
  • machine learning
  • robotic process automation
  • big data
  • mobile computing -- in both devices and communications technologies such as 4G and 5G

Connectivity between systems and networks is also on the rise. By 2030, there will be an estimated 500 billion devices connected to the internet, according to a Cisco report.

What was the world's first major communication network?

The telegraph is generally regarded as the world's first major communication network; it was also the first electronic form of communication.

Who developed the first major communication network?

In what was arguably the first communications network installation, telephone pioneers Alexander Graham Bell and Thomas Watson strung telegraph cable around their Boston neighborhood in 1876 and held a conversation over a two-mile distance.
