Here’s what computer stands for:
Computer doesn’t stand for “Common Operating Machine Purposely Used for Technological and Educational Research.”
Computer isn’t an acronym at all.
Computer comes from the Latin computare, a combination of the words com and putare, meaning respectively, “with” (com) and “reckon” (putare).
So if you want to learn all about the tech term computer, then you’re in the right place.
Let’s jump right into it!
What Is the Computer Full Form?
Remember on Star Trek when the Captain used to start a conversation with the LCARS (Library Computer Access/Retrieval System) by saying, “Computer…?”
We’ve always enjoyed personifying computers, haven’t we?
Even in day-to-day life, we get a kick out of talking to Siri, Alexa, or Google Assistant, imagining that we’re conversing with a humanlike persona, our loyal “Computer” that follows our commands.
In fact, forecasts suggest that by 2024, the number of digital voice assistants will reach 8.4 billion units.
Have you ever wondered where the title “Computer” came from or what the correct computer full form is?
What About that Computer Acronym?
The easiest answer suggests the full form of computer is “Common Operating Machine Purposely Used for Technological and Educational Research.”
Unfortunately, this easy answer is an urban legend: there is no evidence of that acronym appearing in the 20th century, when modern computers debuted.
Besides, that acronym is misleading when you consider both the etymology and function of a computer.
A programmable machine “computes” or calculates equations.
Technically speaking, and in the words of programmers, a computer’s function is to respond to a set of instructions and execute them in sequence.
For us, in simple words, these devices:
- Accept and interpret data
- Process the information
- Produce the desired results
- Store and retrieve data
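The four steps above can be sketched as a toy program. This is a minimal illustration only; the `compute` function and `storage` dictionary are invented for the example and don’t belong to any real system:

```python
# A toy sketch of the four-step cycle: accept, process, produce, store.
# All names here are made up for illustration.

storage = {}  # stands in for persistent memory

def compute(label, raw_data):
    """Accept raw input, process it, store the result, and return it."""
    numbers = [int(x) for x in raw_data]   # 1. accept and interpret data
    total = sum(numbers)                   # 2. process the information
    storage[label] = total                 # 3. store it for later retrieval
    return total                           # 4. produce the desired result

print(compute("groceries", ["3", "4", "5"]))  # prints 12
print(storage["groceries"])                   # retrieved later: prints 12
```

Every device from the abacus to the smartphone fits some version of this loop; what changed over the centuries is who (or what) performs the “process” step.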
Truly understanding the computer full form requires a bit of a history lesson.
What Are Human Computers?
According to BBC News, since the verb “compute” was used to refer to performing mathematical calculations, humans were the first computers.
We simply used pre-mechanical age devices, like the abacus, to compile our data.
The term “compute” comes from the Latin word computare.
According to Professor William J. Rapaport, the word is a combination of Com and Putare, meaning respectively, “with” (com) and “reckon” (putare).
The compound word, Computare, refers to a reckoning of arithmetic, or to “settle an account.” Now that sounds like something a human might have to do. Robots, not so much!
Ancient Blurbs About Computers
References to the computer date well before the 20th century, but they are always about the act of computing, a distinctly human activity.
For example:
- The Roman poet Virgil wrote in Georgics of computing, or pruning vines
- Historian Tacitus used the word when counting soldiers
- Pliny wrote in Natural History that “the breadth of Asia should be rightly calculated,” using the same word, computetur
- Author Richard Braithwait wrote in The Yong Mans Gleanings that he “had read the truest computer of times”
- Navy Administrator Samuel Pepys wrote of computing money for ships
The first known use of the word computer was in 1613, in a book called The Yong Mans Gleanings by English writer Richard Braithwait (Wikipedia).
Computers for Hire
Looking for employment today and searching for “computers” would return very different results than a job search in the 18th or 19th centuries. It was quite common for old job ads to say “hiring computers.”
Nineteenth-century steam-powered robots, you ask? Unlikely! No, the ad implied they wanted human calculators or “computers.”
Essayist and Reverend Jonathan Swift once wrote in the Edinburgh Weekly Journal of a young married woman whose job as a computer was to keep account of her husband’s income.
The perception changed, of course, when inventors introduced the first mechanical computer.
Computers By Any Other Name
We have to assume that before the word “computer” existed, people simply called their pre-mechanical devices “tools” or something similar.
For thousands of years, humans used tools to perform equations and keep account of numbers.
- The Ishango Bone dates back to prehistoric Africa and is thought to be a primitive calculator. Etchings are situated in three columns with marks that have been grouped into sets, implying decimals or prime numbers.
- The abacus dates back to at least 2400 B.C. in ancient Babylonia, perhaps even earlier. Since the word abacus literally means “tablet,” it could etymologically be the closest relative to a mechanical “computer.”
- The Antikythera mechanism is more complex, so much so that this 100 B.C. era device is considered the first primitive analog computing device.
- Physicist Derek J. de Solla Price once wrote that the Antikythera mechanism could calculate astronomical positions.
What Did We Call Modern Computers?
We didn’t call modern computers “computers,” at least not when they first debuted.
Charles Babbage was the “father of the computer,” but to him and his 19th-century contemporaries, these steam-driven devices were called “machines.”
Babbage said, “A tool is usually more simple than a machine; it is generally used with the hand, while a machine is frequently moved by animal or steam power.”
His Difference Engine was designed to calculate the mathematical tables used in navigation; his far more advanced successor design he called the “Analytical Engine.”
Perhaps the reason these amazing analytical machines were never actually called computers back in the day was that they weren’t yet “programmable” in the literal sense.
They were limited to performing only specific functions.
They were also called all sorts of nicknames other than “computer,” such as:
- H. L. Hazen and Vannevar Bush’s “mechanized calculus” machine (1927)
- George Stibitz’s Model K “Adder” (1937)
- Alan Turing’s Universal Turing Machine, or “computing machine”
The First Robotic “Computer”
By 1938, the name “computer” applied to non-human computers, that is, the programmable electromechanical and analog systems we all know and love.
Curiously, the first time we used the word “robot” was in 1920, when Czech author Karel Čapek coined it from robota, meaning “forced labor.” That’s right, so whenever Bender Rodriguez complains about the abuse of robots at the hands of humans, you know he’s got a good point.
The modern colloquial usage of the word “computer” also changed after 1938, with new achievements such as:
- The United States Navy naming their submarine-bound analog system a “Torpedo Data Computer”
- John Vincent Atanasoff and Clifford Berry naming the first electronic digital machine the Atanasoff-Berry Computer
- By 1946, the military-built Electronic Numerical Integrator and Computer (ENIAC) had debuted; its complete system weighed 30 tons and reportedly dimmed the lights of Philadelphia when it ran!
- The ENIAC could perform numerous functions besides ballistics, including weather prediction, atomic energy calculations, and other “general purposes”
For more background on the ENIAC, read more about the first computers and their technologies in Scientific American.
The Marketed “Personal Computer”
The last evolution in computer etymology is the result of great achievements in manufacturing and marketing work alike.
Although modern computers developed rapidly from the 1950s to the 1970s, it took some time to transform the computer’s image from a giant machine used for highly specialized calculations into a “microcomputer,” a tool for small businesses and individuals.
Much of the groundwork for the modern computer was laid in 1968 when Douglas Engelbart of SRI International gave the “mother of all demos,” showcasing many modern concepts like the mouse, email, word processing, and even video conferencing.
The identity of the “computer” was only then taking the shape of something a modern, non-technical audience could embrace: something humanlike, a virtual assistant.
The First Microcomputers
Computers only became identifiable, and thus personified, when microcomputers appeared. Competition during the 1970s and 1980s was intense, with many companies fighting over this brand-new market:
- Micral N debuted in 1972 as a mostly “hobbyist” product
- IBM’s SCAMP (Special Computer APL Machine Portable) arrived in 1973 and gained a reputation as the first real PC
- IBM 5100 was a milestone, as it could be programmed in BASIC and APL
- Xerox Alto debuted a Graphical User Interface
- Altair 8800, which was the first true commercial success for a PC
- In 1981, the Epson HX-20 became the first portable “laptop” computer to run on rechargeable batteries
The next revolution happened in 1976, when Steve Jobs and Steve Wozniak debuted the first Apple computer, which moved beyond the kit-style look of previous machines and was marketed to the non-technical consumer.
Apple’s revenue from its Macintosh computers has seen an overall growth up to this day, with sales at around 7.1 billion U.S. dollars for the third quarter of 2020 alone.
Tandy made more strides with the TRS-80, while Commodore released the Amiga, which introduced modern principles like multitasking and a windowed operating system.
The Life and Death of a Computer
By 1982, “the computer” was a household word, publicized by Time magazine, which named it “Machine of the Year.”
Marketing evolutions continued in the 1980s and 1990s with video game systems and the universal software-focused approach of Windows and DOS.
All was going well until…well, the computer died! Not so much in a Hal 9000 sort of way, but simply the end of a generation and the reinvention of an old market.
By the time desktops and laptops lost ground in the 2000s to smartphones and tablets, the colloquial use of “computer” was all but finished.
These new internet-ready devices were computers, but no one called them that anymore, nor was a computer acronym necessary.
Worldwide smartphone sales have increased steadily over the years; forecasts suggested that around 1.5 billion units would be sold to end users in 2021.
The “end” of the market even prompted many programmers to give a symbolic eulogy for the death of the computer, a persona we once voted “Machine of the Year.”
Ironically, humans were the first computers, not machines. We didn’t even think of these amazing machines as “computers” until we understood the groundbreaking idea that machines could operate as humans do.
To Be, or Not to Be: Other Vital Tech Words and Their Meanings
Here’s a list of particularly interesting tech acronyms—or maybe they’re not even acronyms?
Learn what these stand for: