The Internet - the largest collection of computers (and people) sharing information across distributed networks.
Automating shit's been a hot idea for a really long time. Remember the wheel? Automated walking.
I saw a tweet a while back that went something like:
CPUs are just rocks we've tricked into thinking. Of course, you have to flatten the rock and put lightning inside it first.
It's not surprising that technology has become so advanced as to appear indistinguishable from magic - what is surprising is how rapidly it happened. A rare combination of factors produced material conditions that sped along technology for automating the reading, writing, computation, and transmission of information.
Modern computing traces its history back to World War II - networks come a bit after that.
Depending on who you ask, you'll get different answers on what ought to be considered the 'invention' of computers.
Purists will probably talk about Babbage and Lovelace's (19th century) Analytical and Difference Engines. These were machines of wood, iron, and gears that could calculate results for arithmetic operations with minimal effort from the user, and could be configured (programmed) to run different sets of calculations. Arguably, the abacus could fit this description too.
The work Babbage and Lovelace did was important in promoting the idea that one could use material objects to emulate algorithmic operations. Due to the limitations of their time, their machines didn't advance past running simple arithmetic operations. The efficiency of the machines was limited too - they were massive, and slow to run.
Babbage and Lovelace's work was not directly continued, but inspired generations of polymaths and engineers to work on programmable machines for algorithmic operations on data.
The first mass use of computing hardware was the American census machines of the late 1800s. They were basically fancy punch card machines! Probably not even as theoretically advanced as Babbage's work.
The census dude was Herman Hollerith, whose Tabulating Machine Company eventually merged into what got renamed International Business Machines (IBM) in 1924.
A quick note - by this time, an international communications network had already formed. The telegraph! For those with money, globalization had already happened.
Some credit must of course be given to Alan Turing (early to mid 20th century), a popular figure in the history of computers. Turing was enlisted by the British state during WWII to work with other mathematicians, cryptologists, and linguists (nerds) to crack the German Enigma code. The Imitation Game is a dramatic re-telling of this period. Though the film is notable for its inaccuracies about Turing's personal life, it is enjoyable nonetheless and worth checking out.
Enigma machines were the German state's data encryption & decryption machines. A soldier would grab an Enigma machine, configure it with the day's code (provided by officers), and type data into it. The machine would encrypt the data using the configured code and output encrypted text. With the aid of another Enigma machine configured with the same code, one could type the encrypted text back in and receive decrypted output.
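The key property is symmetry: the same machine with the same daily code both encrypts and decrypts. Here's a toy sketch of that property using a simple XOR cipher - far simpler than Enigma's actual rotor mechanics, and the key "1942" is made up for illustration:

```python
# Toy illustration of a symmetric cipher: the same machine + the same
# daily key both encrypts and decrypts. XOR-ing each character with a
# repeating key twice gives back the original text. (Not how Enigma's
# rotors actually worked - this just shows the symmetry property.)

def enigma_like(text, daily_key):
    """Apply a repeating XOR key; running it twice round-trips."""
    return "".join(
        chr(ord(c) ^ ord(daily_key[i % len(daily_key)]))
        for i, c in enumerate(text)
    )

ciphertext = enigma_like("ATTACK AT DAWN", "1942")
plaintext = enigma_like(ciphertext, "1942")  # same daily key decrypts
```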
In order to crack the code, the Brits had rooms of nerds trying their damnedest to manually brute-force the Enigma code. The thing is, German codes changed daily, and by the time a nerd managed to crack one, it had already changed and the decrypted data was no longer useful as military intelligence.
Turing's contribution was the design and development of an electromechanical machine for automating the calculations the nerds had been doing manually. This drastically reduced the labour needed to crack a daily code. Think difference engine, but with electrical relays and motors instead of hand-cranked gears. Necessity is the mother of invention and hacking 😀
Also related to the war effort, Americans were having a solid go at vacuum tube and binary calculation improvements.
Transistors were a BIG deal. Compared to vacuum tubes, they were way smaller, cheaper, more reliable, and faster to switch. Great for binary too - a transistor is basically a tiny on/off switch.
Around the same time, John von Neumann was working on... lots of stuff. Dude's a certifiable genius.
Neumann was not a fan of how long it took to reprogram computers. You needed teams of people who would take weeks to debug and re-configure a machine. Neumann's solution was to store the algorithms as data inside the machine instead of having them external. This meant the introduction of more complex parts into computers - like control and processing units (CPUs!), input and output devices (keyboards! screens!), and memory for storing data (RAM!). This is generally referred to as von Neumann architecture today, and makes up the criteria for what one could consider a modern computer.
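The stored-program idea can be sketched as a toy machine whose memory holds instructions and data side by side, with a control unit running a fetch-decode-execute loop. The instruction set and memory layout here are invented for illustration:

```python
# A toy stored-program machine: instructions and data share one memory,
# and a control unit fetches, decodes, and executes in a loop.

memory = [
    ("LOAD", 6),    # 0: load memory[6] into the accumulator
    ("ADD", 7),     # 1: add memory[7] to the accumulator
    ("STORE", 8),   # 2: write the accumulator to memory[8]
    ("HALT", 0),    # 3: stop
    None, None,     # 4-5: unused cells
    2, 3,           # 6-7: data (operands)
    0,              # 8: data (result lands here)
]

acc = 0             # accumulator register
pc = 0              # program counter
while True:
    op, arg = memory[pc]        # fetch & decode
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break
```

Reprogramming this machine means rewriting a few memory cells, not rewiring hardware - that's the whole point of the architecture.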
Computers had become powerful, robust, and cheap enough to find use outside of cases like WWII code cracking. They were still really big and expensive in absolute terms though, so they ended up limited to well-off American campuses, research organizations, the government, and big business.
In the 50s, the American census department buys a whole bunch of these early commercial computers (UNIVACs) - probably to replace the old punch card systems.
In and around the 1970s, Bell Labs was responsible for a lot of developments in computer science. Bell Labs had enormous resources thanks to the Bell System's monopoly on telecommunications in North America.
One of the most influential early operating systems, Unix, was made at Bell. An operating system is a collection of stored programs & algorithms that allow for modification of and interaction with locally stored data on the machine. Typically, this means a system for organizing files, navigators for exploring the files, and shells for interacting with the inner operating system services.
Unix made it much easier for programmers to develop new software (programs) and run it on a computer - they didn't need to worry about the specifics of interacting with the computer hardware, as it was managed by the operating system. The spread of Unix meant that more and more people were working on writing software.
Bell Labs had already developed modems (a device that turns data into a transmissible format) for phone lines, and found they would also apply to moving data from one computer to another. The client-server model for requesting and responding with data evolved around these technologies.
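The client-server model boils down to: one side writes a request, the other reads it and writes a response. A minimal sketch - `socket.socketpair()` gives two connected sockets in a single process, standing in for two networked machines, and the "GET time" request format is made up:

```python
# Minimal client-server exchange: the client writes a request, the
# server reads it and replies. socketpair() stands in for a real
# network connection between two machines.
import socket

client, server = socket.socketpair()

client.sendall(b"GET time")       # client issues a request
request = server.recv(1024)       # server receives it...
if request == b"GET time":
    server.sendall(b"12:00")      # ...and replies
response = client.recv(1024)      # client reads the response

client.close()
server.close()
```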
The ability to network computers to share data, plus the fact that more and more people were writing software (data describing algorithmic operations - programs), meant it was only a matter of time before someone took the initiative to network the distributed hotbeds of computer and software research into a unified system.
ARPANET (Advanced Research Projects Agency Network) (what a silly name) used telephone lines to link up computers associated with the American Pentagon and a number of academic institutions (the first four nodes were at UCLA, SRI, UC Santa Barbara, and the University of Utah). This Britannica article does a pretty good job of covering it: https://www.britannica.com/topic/ARPANET.
ARPANET was the first big network built on packet switching technology, which involves breaking up data and indexing it, then transmitting it in chunks across networks. It means it's less likely data gets corrupted in transmission! If the receiver is missing a specific chunk of the data, it can request a copy of it. It's easier to parity-check smaller chunks of data too.
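A sketch of the packet-switching idea - split data into indexed chunks, deliver them in any order, reassemble by index, and re-request anything missing. The function names and chunk size are mine, not any real protocol's:

```python
# Sketch of packet switching: data is split into (sequence number,
# chunk) packets; the receiver reassembles by sequence number and can
# name exactly which packets it still needs.

def packetize(data, size):
    """Break data into (sequence number, chunk) packets."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets, total):
    """Rebuild the data, or report which sequence numbers are missing."""
    received = dict(packets)
    missing = [seq for seq in range(total) if seq not in received]
    if missing:
        return None, missing      # receiver asks for these again
    return b"".join(received[seq] for seq in range(total)), []

packets = packetize(b"hello, distributed world", 5)
shuffled = list(reversed(packets[1:]))      # out of order, packet 0 lost
data, missing = reassemble(shuffled, len(packets))      # missing: [0]
# after "retransmission" of packet 0:
data, missing = reassemble(shuffled + [packets[0]], len(packets))
```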
Packet switching software is the basis for pretty much any software handling data transmission - it's in the modem and router sitting in many homes, for example.
ARPANET really only starts to take off in the 70s. And by 'takes off', I mean there were about 30-odd hosts across America in 1971.
Once the nerds got connected, WOW! Stuff starts to take off. People start writing mail programs, file transfer programs, etc. Telnet got created during this time.
The transmission protocols implemented by ARPANET weren't published as open standards - they were probably sitting in a binder in a desk somewhere. Inevitably, other local computer networks sprang up taking after ARPANET. In order to connect them, some common system for identifying and sending data between distributed network systems became necessary.
The set of models, tools, algorithms, and protocols for this is called the Internet protocol suite. The two main protocols in this suite are the Transmission Control Protocol (an implementation of packet switching) and the Internet Protocol (instructions for packaging/directing the data into datagrams, which use an IP address to specify destination and origin) - TCP/IP for short.
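A toy sketch of the datagram idea: a fixed header carrying origin and destination addresses, packed in front of the payload. The header layout here is invented for illustration - real IP headers carry much more (version, TTL, checksum, and so on) - but the 4-byte addresses mimic IPv4-sized fields:

```python
# Toy "datagram": pack a fixed header (origin address, destination
# address, payload length) in front of the payload bytes, then parse
# it back out on the receiving end.
import struct

HEADER = struct.Struct("!4s4sH")  # origin, destination, payload length

def make_datagram(src, dst, payload):
    return HEADER.pack(src, dst, len(payload)) + payload

def parse_datagram(datagram):
    src, dst, length = HEADER.unpack_from(datagram)
    return src, dst, datagram[HEADER.size:HEADER.size + length]

dgram = make_datagram(b"\x0a\x00\x00\x01", b"\x0a\x00\x00\x02", b"hi!")
src, dst, payload = parse_datagram(dgram)
```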
Sidenote - at some point, some geniuses came along and gave the world DNS (the Domain Name System). DNS lets us substitute human-readable names for IP addresses, because a bunch of servers around the world host lookup tables mapping names to IP addresses.
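At its core, a DNS resolver is a lookup table. Real DNS is a distributed, cached hierarchy of such tables; the records below are illustrative stand-ins:

```python
# DNS in miniature: a table mapping human-readable names to IP
# addresses, and a resolver function that consults it.

dns_records = {
    "example.com": "93.184.216.34",
    "localhost": "127.0.0.1",
}

def resolve(name):
    """Return the IP address on record for a hostname."""
    try:
        return dns_records[name]
    except KeyError:
        raise LookupError(f"no DNS record for {name!r}")

ip = resolve("example.com")
```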
TCP/IP became the standard for network communications, and one of its first widespread implementations shipped with BSD, Berkeley's offshoot of Bell Labs' Unix. (Sun's early SunOS was itself BSD-derived.)
By the late 70s, computers have gotten pretty manageably sized, and IBM makes a killing selling 'em to regular people (eventually contracting Microsoft to build the OS). Apple's in the game too.
So at this point there's, practically speaking, a mass internet. But shit's kinda fractured - people can communicate over TCP/IP, but tools like mail only move messages between individuals; they're a poor substitute for hypertext. There needed to be an easy way to host information on a server that anyone could access and read.
Enter Tim Berners-Lee around 1990. We can credit Tim with HTTP and HTML. HTTP is a generic transmission protocol for transferring hypertext data between two computers in a client-server model. HTML is a syntax for marking up data such that an HTML reader can render it as a human-readable document.
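Here's the shape of an HTTP exchange - the hostname and page are made up, and no actual networking happens; we just build and pull apart the plain-text messages:

```python
# The shape of an HTTP exchange: the client sends a plain-text request,
# the server answers with a status line, headers, and a body (often
# HTML).

request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body><h1>hello, web</h1></body></html>"
)

# A browser splits the response at the blank line into headers and
# body, then renders the HTML body as a human-readable document.
head, _, body = response.partition("\r\n\r\n")
status_line = head.split("\r\n")[0]
```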
HTTP software, combined with HTML documents, web browsing software (HTML readers & HTTP request-er / response-reader), DNS software, TCP/IP implementations, and the modern computer make up what we consider today to be the material basis of the internet.
Hardware development has traditionally been commercial. Manufacturers like IBM, Microsoft, Apple, and Sun, plus the Internet service providers (who historically would route an internet connection through your existing phone lines for a monthly fee), are running the game here. I'm missing lots here because I'm missing lots of the history.
After HTTP/HTML's mass adoption, software protocols for data transmission have mainly been developed by free software communities, from what I understand.
But that's the gist of it!