The History of Information Technology | Complete I.T. (2024)

Timeline of important IT milestones

Although this section could go as far back as 2400 BC, when the abacus, the first known calculating device, appeared in Babylonia, it will focus on the information technology boom of recent centuries.

The first mechanical computing device was conceptualised and invented by English mechanical engineer and polymath Charles Babbage in the early 19th century. Called the ‘Difference Engine’, it was originally created to aid in navigational calculations. Often referred to as the ‘Father of the Computer’, Babbage went on to design the more general ‘Analytical Engine’ in 1833, which could be applied to fields other than navigation. Funding constraints meant that Babbage died without seeing his machine completed; however, his son Henry completed a much simpler version in 1888, which was successfully demonstrated to the public in 1906.

Practical computers did not appear until the first half of the 20th century, when a compact analogue electromechanical computer that used trigonometry was installed on submarines to solve the problem of firing torpedoes at moving targets.

The Z2, the first electromechanical digital computer, was invented by engineer Konrad Zuse in 1939; it used electric switches and relays to perform its calculations. Devices like the Z2 had very low operating speeds, and Zuse succeeded it in 1941 with the Z3, the first fully automatic, programmable computer.

Colossus, a set of computers built between 1943 and 1945, is widely recognised as the world’s first programmable electronic digital computer. Put to use during World War II, the Colossus machines helped decipher intercepted German High Command communications encrypted with the Lorenz cipher, rather than the better-known Enigma machine. English computer scientist, mathematician, and theoretical biologist Alan Turing had already conceptualised the modern computer in his seminal 1936 paper ‘On Computable Numbers’, in which programmable instructions are stored in the memory of a machine.

Another early programmable computer was the Manchester Mark 1, developed at the Victoria University of Manchester. Frederic C. Williams, Tom Kilburn, and Geoff Tootill began work on the machine in August 1948, but the first operational version was not available for use until 1949. The Manchester Mark 1 caused controversy when British media outlets referred to it as an ‘electronic brain’, provoking a long-running debate with the Department of Neurosurgery at Manchester University over whether an electronic computer could ever be truly creative.

It was not until 1951, when electrical engineering company Ferranti created the Ferranti Mark 1, that the world’s first general-purpose computer became commercially available. Also called the Manchester Electronic Computer, the Ferranti Mark 1 was first used by the Victoria University of Manchester.

The first computer used to process commercial business applications, LEO I, was developed in 1951 by the British catering firm J. Lyons and Co. to increase business output.

A brief timeline of some other important events is listed below:

1835 – Morse code invented by Samuel Morse

1838 – Electric telegraph invented by Charles Wheatstone and Samuel Morse

1843 – Typewriter invented by Charles Thurber

1877 – Microphone invented by Emile Berliner

1888 – Heinrich Hertz produces radio waves

1893 – Wireless communication demonstrated by Nikola Tesla

1895 – Radio signals transmitted by Guglielmo Marconi

1898 – Remote control invented by Nikola Tesla

1907 – Radio amplifier invented by Lee de Forest

1919 – James Smathers develops the first electric typewriter

1927 – Electronic television demonstrated by Philo Farnsworth

1933 – FM radio patented by inventor Edwin H. Armstrong

1936 – Alan Turing conceptualises the stored-program computing machine

1948 – One of the first programmable computers, the Manchester Mark 1, is designed by Frederic C. Williams, Tom Kilburn, and Geoff Tootill

1951 – MIT’s Whirlwind becomes the first computer in the world to allow users to input commands with a keyboard

1956 – Optical fibre invented by Basil Hirschowitz, C. Wilbur Peters, and Lawrence E. Curtiss

– The hard disk drive invented by IBM

1958 – Silicon chip: the first integrated circuits are produced independently by Jack Kilby and Robert Noyce

1959 – The first photocopier, the Xerox 914, enters the consumer market as the first successful commercial plain-paper copier

1961 – Optical disc invented by David Paul Gregg

1963 – Computer mouse invented by Douglas Engelbart

– The concept underlying cloud computing proposed by Joseph Carl Robnett Licklider

1967 – Hypertext software invented by Andries van Dam and Ted Nelson

1971 – E-mail invented by Ray Tomlinson

– Liquid crystal display (LCD) invented by James Fergason

– Floppy disk invented by David Noble

– First commercially available microprocessor, the Intel 4004, is released

1972 – The first video game console designed for use on TVs, the Magnavox Odyssey, is released

1973 – Ethernet invented by Bob Metcalfe and David Boggs

– Personal computer developed by Xerox

1976 – The inkjet digital printer is invented by Hewlett-Packard

1982 – WHOIS (pronounced ‘who is’) is released as one of the earliest tools for looking up domain registrations

1984 – The first laptop computer enters the commercial market

1989 – World Wide Web invented by Sir Tim Berners-Lee

1990 – A student at McGill University in Montreal develops the first search engine, named Archie

1992 – Complete I.T. founded

1993 – Benny Landa unveils the E-Print 1000 as the world’s first digital colour printing press

1996 – The Nokia 9000 Communicator is released in Finland as the first internet-enabled mobile device

1998 – Google established

– PayPal is launched, enabling large-scale payment via the internet

2000 – Microsoft develops the first tablet computer

2001 – Digital satellite radio launches

– Apple releases the iPod

2003 – WordPress, an open-source website content management system, is launched by Mike Little and Matt Mullenweg

– LinkedIn is established

2004 – Emergence of Web 2.0: users move from passively consuming internet material to actively participating in creating it

– Facebook established by Mark Zuckerberg

2005 – USB flash drives displace floppy disks

– Google Analytics established

– YouTube is launched as a video platform

2006 – Twitter is launched to the public

2007 – Apple Inc. debuts the iPhone

– Amazon releases the Kindle, marking a new era in reading and book technology

2009 – Bitcoin is developed by an unknown programmer or group working under the name Satoshi Nakamoto

2010 – Apple debuts the iPad

– The beginning of responsive website design

2011 – 22-nanometre computer chips enter mass production

2012 – Quad-core smartphones and tablets are released, offering faster processing power

2014 – 14-nanometre computer chips are released

– The market for smartwatches reaches 5 million units

2015 – Apple releases the Apple Watch

2016 – Supercomputers reach 100 petaflops

– Mobile devices overtake wired devices as a means of accessing the internet

2017 – 10-nanometre chips enter service

2018 – Commercial 5G networks begin to launch, and AI-powered services enter mainstream public use

2019 – Google claims ‘quantum supremacy’ after its Sycamore quantum processor completes a calculation that would be impractical for even the world’s fastest supercomputers

– Sharp acquires Complete I.T.

2020 – GPT-3, a text-generating AI that powers chatbot technology, is released

– The COVID-19 pandemic accelerates digital transformation, driving remote work and online education

2021 – GitHub Copilot, an AI programming assistant, is released

– Continued development of electric vehicles (EVs), with advancements in battery technology

2022 – ChatGPT, a chatbot and text-generating AI, is released

– Metaverse concepts continue to expand

2023 – Microsoft releases ChatGPT-powered Bing

– Sharp launches a first-to-market Virtual Showroom, an immersive and interactive experience

The Virtual Showroom is an award-winning 3D virtual environment that allows users to view Sharp’s products and services in a ‘real-life’ setting, with interactive elements such as video and hotspots and touchpoints that users can engage with to view and learn more about each product or solution.


FAQs

What is information technology?

Information technology (IT) is the use of any computers, storage, networking and other physical devices, infrastructure and processes to create, process, store, secure and exchange all forms of electronic data.

Why is information technology needed?

Information technology plays a prominent role in business and provides a foundation for much of our current workforce. From communications to data management and operational efficiency, IT supports many business functions and helps drive productivity.

What are the historical phases of information technology?

Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC — 1450 AD), mechanical (1450 — 1840), electromechanical (1840 — 1940), and electronic (1940 to present).

What is information technology, briefly?

Information technology (IT) is the use of computer systems to manage, process, protect, and exchange information. It's a vast field of expertise that includes a variety of subfields and specializations. The common goal between them is to use technology systems to solve problems and handle information.

What are some basic IT interview questions?

IT Interview Questions and Answers
  • What is an IP Address? ...
  • What do you do when you can't solve an Issue? ...
  • Tell us about a time you took the lead on a project. ...
  • How do you stay up to date about new technology? ...
  • How familiar are you with the different operating systems? ...
  • How would you make sure a computer network is secure?

What is technology, in simple terms?

Technology is the use of scientific knowledge for practical purposes or applications, whether in industry or in our everyday lives. So, basically, whenever we use our scientific knowledge to achieve some specific purpose, we're using technology.

What are the basics of information technology?

The four most basic and primary elements involving the use of all information technology include: information security, computer technical support, business software development and database and network management.

In which five areas is information important?

Those five areas are (in no particular order of importance): 1) decision-making, 2) problem-solving, 3) understanding, 4) improving processes, and 5) understanding customers.

Is information technology hard to learn?

IT requires a great deal of technical knowledge: To be successful in IT, you need to have strong technical skills across a diverse array of topics. This can be difficult for people who don't have a natural aptitude for technology or who haven't had much exposure to it.

What are the 4 main periods of the history of information technology?

History of Information Technology
  • The Premechanical Age: 3000 BC – 1450 AD.
  • The Mechanical Age: 1450 – 1840.
  • The Electromechanical Age: 1840 – 1940.
  • The Electronic Age: 1940 – Present.

What is the origin of the term ‘information technology’?

The term ‘IT’ did not appear until the mid-20th century, when an influx of early office technology arrived. It was first published in the 1958 Harvard Business Review, where authors Harold J. Leavitt and Thomas L. Whisler coined it.

When does the history of technology begin?

The history of technology begins roughly 3 million years ago with the invention of stone tools. Since then, technology has exponentially evolved.

What are 3 examples of IT?

Examples of information technology include computer hardware and software, networks and telecommunications systems, databases and information management systems, and Internet and web-based technologies.

What is information technology, in a paragraph?

Information Technology (IT) plays a vital role in today's personal, commercial, and not-for-profit uses. In its simplest terms, IT is the application of computers and other electronic equipment to receive, store, retrieve, transmit, and manipulate data.

What is a major in IT?

As an information technology (IT) major, you'll study computer science, business, and communications. Along the way, you might focus on one specialty such as web development or digital communications.

What does information technology mean?

Information technology (IT) is the hardware and software used to create, store, transmit, manipulate, and display information and data. Metaphorically, it is the lifeblood of the Information Age.

What is information technology, and what do IT professionals do?

Information technology (IT) is a broad professional category covering functions including building communications networks, safeguarding data and information, and troubleshooting computer problems.

What is data in information technology?

Data refers to raw, unprocessed information. In the context of information technology (IT) and computing, it is information that a software application collects and records. Data is typically stored in a database and includes the fields, records and other information that make up the database.
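The distinction between fields and records in that answer can be made concrete with a small sketch in Python, using only the standard-library sqlite3 module. The ‘customers’ table and its columns below are invented purely for illustration:

```python
import sqlite3

# A purely illustrative example: the table and field names are hypothetical.
conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Ada", "London"), (2, "Konrad", "Berlin")],
)

# Each row returned is a record; each column (id, name, city) is a field.
records = list(conn.execute("SELECT id, name, city FROM customers"))
for record in records:
    print(record)
conn.close()
```

Here each inserted row is one record, and id, name, and city are the fields that every record shares.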

What is an information technology class about?

Students will learn about technical theory, networking, programming, and computer hardware. Depending on the specific major, students may also learn about database management, systems analysis, computer circuitry, website development, and IT management.
