A microcomputer is a complete computer on a small scale, designed for use by one person at a time. Microcomputer is now an antiquated term; such machines are primarily called personal computers (PCs), and common examples today include laptops and desktops. In modern usage, microcomputer also refers to complete computer systems that are smaller than a normal PC, such as single-board computers (SBCs).
Smaller than a mainframe or minicomputer, a microcomputer uses a single integrated semiconductor chip for its central processing unit (CPU). It also contains RAM, input/output (I/O) ports, and a bus or system of interconnecting wires, all housed in a single unit usually referred to as a motherboard.
Common I/O devices include keyboards, monitors, printers and external storage.
History of microcomputers
The term microcomputer dates back to the 1970s. Early computers before this time were all mainframes, consisting of racks of equipment that may have taken up an entire floor or room in a building. These mainframes had the work of the CPU spread over many chips and logic boards. While these mainframes could weigh several tons, the later minicomputers could fit in a single rack of equipment and weigh “only” hundreds of pounds. The processor functions of minicomputers were still usually divided among several chips and logic boards.
The advent of a complete CPU on a single integrated circuit (IC), called a microprocessor, paved the path to the creation of the microcomputer. The first mainstream microprocessor was the Intel 4004 in 1971, followed by the Intel 8008 in 1972 and the Intel 8080 in 1974. Now, an entire computer could fit on a desk and weigh only tens of pounds.
The first microcomputer was the Micral, released in 1973 by Réalisation d’Études Électroniques. Based on the Intel 8008, it was the first nonkit computer based on a microprocessor. In 1974, the Intel 8008-based MCM/70 microcomputer was released by Micro Computer Machines Inc., later known as MCM Computers.
Though released after the Micral and MCM/70, the Altair 8800 is often considered the first successful commercial microcomputer. Released in 1974, it was designed by Micro Instrumentation Telemetry Systems and was based on the Intel 8080 microprocessor. It retailed for around $400 in kit form and $600 assembled — $2,550 and $3,826 in 2024 dollars, respectively.
As microprocessor chip design matured, so did the processing capacity of microcomputers. By the 1980s, microcomputers were being used for more than games and computer-based recreation, finding widespread use in personal computing, workstations and academia. By the 1990s, microcomputers were being produced as pocket-sized personal digital assistants and later came in the form of cellphones and portable music players.
Microcomputer applications
Personal microcomputers are often used for education and entertainment. Beyond laptops and desktops, microcomputers can include video game consoles, computerized electronics and smartphones.
In the workplace, microcomputers have been used for applications such as data and word processing, electronic spreadsheets, professional presentation and graphics programs, communications and database management systems.
They have been used in business for tasks such as bookkeeping, inventory management and communication; in medical settings to record and recall patient data, manage healthcare plans, complete schedules and process data; in financial institutions to record transactions, track billing, prepare financial statements and payrolls, and perform auditing; and in military applications for training devices, among other uses.
Microcomputers and IoT
The term microcomputer is today most often applied to complete computers that are smaller than traditional desktops or laptops but that can still run a full operating system (OS). These may be SBCs or specialized PC form factors, such as handheld computers.
The Raspberry Pi is the most popular SBC. It is often used for internet of things (IoT) prototyping, education and hobbyist applications. It has been joined by many other SBCs with different features, such as the Nvidia Jetson. These types of microcomputers can be further minimized into a compute module form factor so they can be easily plugged into larger systems to act as the brains of the unit.
In IoT applications, microcomputers can be used for many of the same tasks as microcontrollers. Certain IoT devices, such as smart TVs, refrigerators and other connected appliances, are sometimes referred to as microcomputers.
Where a microcomputer fits in computer hierarchy
The ascending hierarchy of general computer sizes is as follows:
- Embedded systems, which are fixed inside something and don’t support direct human interaction but nonetheless meet all other criteria of microcomputers.
- Microcomputers, such as single-board computers.
- Workstations, which were formerly described as a more powerful PC for special applications.
- Minicomputers, which are now called midrange servers.
- Mainframes, which are now usually referred to by manufacturers as large servers or server racks.
- Supercomputers, which are large servers, sometimes including systems of computers using parallel processing.
- Parallel processing systems, which are systems of interconnected computers that work on the same application together, sharing tasks that can be performed concurrently.
Microcomputers vs. microcontrollers
A microcontroller is an IC designed to govern a specific operation in an embedded system. These are usually system on a chip (SOC) designs that have onboard RAM, read-only memory (ROM) and peripherals.
These systems usually run on low power and do not run a full OS. Instead, they often directly run a compiled program stored on the SOC.
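That OS-free execution model is often called the "superloop" pattern: initialize once, then poll inputs and react forever. The following is a minimal sketch of that pattern, written in Python purely for illustration; read_sensor and the heater logic are hypothetical stand-ins for the memory-mapped I/O a real compiled firmware image would touch directly.

```python
# Illustrative sketch of the "superloop" firmware pattern a microcontroller
# runs instead of a full OS: initialize once, then poll and react in a loop.
# read_sensor() is a hypothetical stand-in for reading an on-chip ADC.

def read_sensor(readings):
    # Pops simulated samples; a real firmware would read a hardware register.
    return readings.pop(0) if readings else None

def run_firmware(readings, threshold=30):
    """Poll the sensor and decide an output state on each loop iteration.

    A real microcontroller would loop forever (while True); here the loop
    ends when the simulated samples run out so the sketch can terminate.
    """
    log = []
    while True:
        sample = read_sensor(readings)
        if sample is None:
            break  # simulation only; real firmware never exits its loop
        heater_on = sample < threshold  # e.g., turn a heater on when cold
        log.append((sample, heater_on))
    return log

if __name__ == "__main__":
    print(run_firmware([25, 31, 28]))
```

The design point is that there is no scheduler, file system or process model; the single compiled loop is the entire software stack.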
Microcontrollers have been referred to as single-chip microcomputers.
Microcomputers vs. microprocessors
A microprocessor is a computer processor on a microchip that contains all or most CPU functions. Microprocessors do not have RAM, ROM or other peripherals. As such, microprocessors cannot perform standalone tasks.
Rather, microprocessors are built into systems, such as microcomputers, that contain all the additional components needed to form a full computer.
A microcomputer can technically be described as the combination of a microprocessor and its peripheral I/O devices, circuitry and memory — just not on a single chip.
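The relationship described above can be illustrated with a toy simulation, written in Python with a purely hypothetical instruction set: the "microprocessor" holds only CPU state and the fetch-decode-execute loop, and it can do nothing useful until wired to external RAM and an I/O device, which is what the surrounding microcomputer provides.

```python
# Toy illustration (hypothetical instruction set): the Microprocessor class
# holds only CPU state (registers) and the fetch-decode-execute loop. It is
# useless on its own; the RAM (program) and output device wired to it are
# what turn the microprocessor into a working "microcomputer".

class Microprocessor:
    def __init__(self):
        self.acc = 0   # accumulator register
        self.pc = 0    # program counter

    def run(self, ram, output):
        while True:
            op, arg = ram[self.pc]       # fetch from external RAM
            self.pc += 1
            if op == "LOAD":             # decode and execute
                self.acc = arg
            elif op == "ADD":
                self.acc += arg
            elif op == "OUT":
                output.append(self.acc)  # write to an I/O device
            elif op == "HALT":
                return

# A "microcomputer": the CPU plus RAM holding a program and an I/O buffer.
program = [("LOAD", 2), ("ADD", 3), ("OUT", None), ("HALT", None)]
screen = []
Microprocessor().run(program, screen)
print(screen)  # the CPU alone could not produce this; RAM and I/O did
```

Everything outside the class body, the program list and the screen buffer, stands in for the memory chips and peripheral circuitry a real microcomputer places around its microprocessor.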
Microcomputers vs. minicomputers
While microcomputers generally refer to laptops or desktops, minicomputers were a variety of computer primarily used from the 1960s to the 1980s. Minicomputers were larger than microcomputers — some stood more than six feet tall and weighed up to 700 pounds — and boasted higher processing speeds at a significantly smaller size and price than the mainframes and supercomputers available at the time.
While microcomputers were often used at home and in the office, minicomputers were primarily found in academia, research labs and small companies, and they were used for word processing, accounting and teaching aids.
Digital Equipment Corporation’s Programmed Data Processor-1 (PDP-1) was announced in 1960 and sold for $120,000 — $1,275,870 in 2024 dollars. Its descendant, the PDP-8, was introduced in 1965 and sold for nearly $18,500 — $184,629 in 2024 dollars. Considered one of the most successful minicomputers and the first example of a commercial minicomputer, the 12-bit PDP-8 has been compared in size to a small household refrigerator.
Minicomputers did not contain microprocessors. In the 1980s, the minicomputer’s prevalence declined as microprocessors became more powerful and available at lower cost.
Minicomputer is now an antiquated term; these systems are often referred to as midrange computers.
Microcomputers vs. mainframes
A mainframe computer is a high-performance computer used for large-scale computing purposes that require greater availability and security than small-scale machines can provide. Mainframes can process requests from a number of users simultaneously, whereas a microcomputer is designed to be used by one person at a time.
As such, a mainframe computer can be described as a system that interconnects a number of microcomputers.