What was the first computer?
Was ENIAC the first computer? The answer depends on the definition of “computer”. A number of prior machines had some of the characteristics of modern computers (such as program-controlled processing or digital computation), while machines that came after the ENIAC had even more of the features we typically associate with computers today (such as the ability to store a program as software in memory). Many of the features most popularly associated with today’s computers, such as graphical user interfaces and real-time interactive operator input, were not developed until decades after ENIAC. However, the ENIAC was the first computer built to take full advantage of electronic processing speeds and to “think” for itself using conditional branching and nested subroutines. This made it the first machine capable of being programmed to solve a full array of computing problems. Additionally, all modern computers trace their lineage back to the ENIAC and the design for the ENIAC’s successor machine, the EDVAC. For both of these reasons, the ENIAC deserves to be called “the first computer.”
One early computer dictionary defines computer as a “device capable of accepting information, applying prescribed processes to the information, and supplying the results of these processes”. ANSI defines computer as “a data processor that can perform substantial computation, including numerous arithmetic operations, without intervention by a human operator during a run.”
These definitions differ to the extent that automaticity—freedom from the need for constant human input for operation—is stressed in the latter definition. But neither definition makes explicit that this automaticity must include automatic testing of intermediate processing results for certain conditions and automatic selection of how processing should proceed based on the results of those tests. Such a feature is what we call “conditional branching”. Furthermore, neither definition requires the capacity to make multiple such decisions so as to create what we call “loops” and “nested subroutines”. Without these features, a program-controlled computing machine is so limited in its usefulness that it can hardly be called a computer, because such a machine would not have the ability to “think” for itself, that is, to make decisions and act on those decisions in ways not strictly provided for by the programmer as a linear list of operations. The decision-making process is what distinguishes computers from calculators.
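The distinction drawn here, between a fixed linear list of operations and a machine that tests intermediate results and decides how to proceed, can be sketched in modern terms. The following Python fragment is purely illustrative; the function names and numbers are invented for this example and do not correspond to any historical machine’s instruction set.

```python
def calculator_style(x):
    # A calculator-like linear list of operations: every step is
    # fixed in advance by the programmer, with no decisions made.
    x = x + 5
    x = x * 2
    return x

def computer_style(x):
    # Conditional branching inside a loop: the machine tests an
    # intermediate result and selects how processing proceeds,
    # repeating until a condition is met. This is the capacity
    # the text calls the machine "thinking" for itself.
    steps = 0
    while x < 100:          # loop: repeat until the test fails
        if x % 2 == 0:      # conditional branch on an intermediate result
            x = x * 3
        else:
            x = x + 1
        steps += 1
    return x, steps
```

The first function always performs the same two operations; the second takes a different path, and a different number of steps, depending on values computed along the way, which is the behavior neither dictionary definition explicitly requires.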
Until World War II, computing machines were “analog”, estimating answers based on mechanical or electrical analogy, or if they were digital, they were either mechanical (using cogs, gears, pulleys, etc.) or electromechanical (using motors, relays, etc.), using moving parts that were slow and prone to failure. After the war—after the ENIAC—computers were built to be fully electronic and digital. The ENIAC changed everything. Once the power of electronic digital computing was realized, subsequent machines followed in the ENIAC’s footsteps.
Even so, the history of computers is rich with clever advances and technological dead-ends. Though none of them led directly to the computers we have today, some of the following machines and concepts have claims to being “the first computer”—depending on the definition of computer.
Babbage’s Analytical Engine
British mathematician and inventor Charles Babbage worked on his design for a mechanical “Analytical Engine” from 1837 to 1871, but he never finished, and his machine was never built. Most probably, Babbage’s shortfall lay not in any flaw in his design but in the lack of political will to finance the machine’s construction, which would have required first tackling technological challenges in toolmaking. The Analytical Engine was designed to be programmed using punched cards, and it had separate units for processing and memory. It was a digital machine that supported conditional branching and nested subroutines, making it far ahead of its time. Despite Babbage’s cleverness, however, because his designs were never pursued to fruition, within 50 years they had been forgotten, becoming so obscure that 20th-century computer developers hadn’t heard of Babbage when they began their work. The Analytical Engine thus never influenced the design of modern computers.
Zuse Z3 and Z4
In 1941, German engineer Konrad Zuse demonstrated his Z3 computer, a fully relay-based computing machine. A marvel both of logical design and wartime engineering, it was programmable and used the binary floating-point numeral system common in computers today, but it could not conditionally branch. The following year, Zuse began work on a successor machine, the Z4, but his work was interrupted by the Allied invasion of Germany in 1945 and not resumed until 1948; by that time, the explosion of electronic computer development in the U.S. and Great Britain had passed Zuse by. Because of wartime exigencies, lack of publication on his machines, and perhaps also because of the destruction of the Z3 in a 1943 bombing raid, Zuse’s early work, though impressive, never influenced the development of electronic computers.
Colossus
British cryptologists and engineers in 1943 and 1944 completed their Colossus 1 and Colossus 2 computers. Like the ENIAC, these were electronic computers that used vacuum tubes, but they were special-purpose—their only task was to crack Axis codes. For reasons of British national security, the work on these machines was kept secret until the 1970s, and thus they, too, had no influence on the development of the modern electronic digital computer.
What about the Atanasoff-Berry Computer (ABC)?
Iowa State College math and physics professor John V. Atanasoff and electrical engineering graduate student Clifford Berry designed a computing machine in the late 1930s and early 1940s. It was noteworthy for using vacuum tubes in its calculation logic, which made the device partially electronic. Yet it required extensive human intervention; rather than running from a program, a human operator had to guide the machine’s progress after every significant step. This lack of automation and programmability means the ABC was a form of calculator and not a computer. Despite this, a judge in 1973 ruled that the ABC was a computer in his decision to invalidate Mauchly and Eckert’s ENIAC patent.
- Sippl, Computer Dictionary and Handbook, p. 67 (Bobbs-Merrill 1966).
- American National Standards Institute (ANSI), American National Dictionary for Information Processing (X-3/TR-1-77).
- “Mathematical Machines and Myths Concerning Their Makers or Babbage vs. Gutenberg”, NBS Colloquium talk given by John Mauchly, Ambler, Pennsylvania, February 23, 1973.
- The Z3 had two predecessors: the Z1 was not programmable, and the Z2 was essentially an interim prototype for the Z3.