Why software APIs and hardware protocols are two expressions of the same modular system
When people speak of computer science and computer engineering, they often imagine two separate territories: one abstract, algorithmic, and mathematical; the other tangible, electrical, and physical. But step back, and a different picture emerges. Both fields assemble building blocks into a larger, self-organizing order — modules define clear functionality, interfaces enforce shared rules, and integration yields systems greater than the sum of their parts.
A POSIX system call and a PCI bus handshake may look worlds apart, but at their essence, they are both contracts. They define how one part of a system communicates with another, regardless of the implementation beneath. This lens — seeing software and hardware as parallel expressions of the same principle — reveals not a divide, but an emergent order that underpins all of computer science and engineering.
“A System” → The Modular Lens
Computer engineering could be perceived as a process of discovering various productive arrangements of functional modules (or functional blocks). Such an abstract module can be expressed as a self-contained unit with two key qualities:
- Functionality — What it does.
- Interface rules — How inputs are furnished and outputs channeled.
This principle is universal, whether it is the components of an IoT device talking to a cloud storage service, an Android phone multicasting audio to Alexa, or a graphics pipeline inside a compute cluster powering AI. The principle takes different concrete forms in the software and hardware views of the world, but the modular lens reveals their symmetry: the same qualitative attributes — functionality and interface rules — are integral to both paradigms, as the diagram below illustrates.

The distinction between software and hardware is merely one of methodology; both are, conceptually, an integration of abstract modules communicating via shared rules to achieve a larger purpose.
“Software & Hardware Architecture” → Functions and Contracts
Software is a network of modules linked by recognizable data structures. The output of one module often becomes the input of another, creating layered stacks of functionality.
At higher levels, interfaces may be standardized — for example, the POSIX APIs exposed by operating systems. Internally, however, each module may define its own contracts. Even at the assembly level, object code is an interface: once an instruction is decoded, the hardware ALU or load/store unit is signaled to act. Ultimately, the behavior of a generic processor depends on both the sequence and the content of the application's object code.
Hardware design mirrors this approach. High-level IP blocks interconnect through bus protocols like AMBA, PCI, or USB. Each functional block samples recognizable input patterns, processes them, and emits outputs across agreed-upon channels. For example, a DMA unit connected to multiple RAM types must support multiple port interfaces, each with its own protocol. A unified abstract representation of such a functional block is attempted below.

“Functional Block” → The Atom
At a highly abstract level, computer science is either about designing such functional blocks or about integrating them into coherent wholes. An electronic product, then, can be perceived as an organization of abstract functional modules, each communicating via shared interface rules; together they deliver complex use cases valuable to the end user.
At scale, human cognition recognizes objects by their abstract, high-level qualities, not their gritty details. But details matter when developing these functional blocks. A process intended for engineering a complex system at scale must therefore harness specialized, detailed knowledge dispersed across many individuals and the functional modules they implement.
For instance, an application engineer cannot be expected to comprehend the file system's on-disk representation of data, and a middleware engineer can afford to be ignorant of a device driver's read/write protocols as long as the driver module plays by its documented interface rules. An integration engineer needs to know only the abstract functionality of the modules and their corresponding interface rules to combine them into a product.
Thus, by lowering the knowledge barrier, we reduce cost and time to market. The challenge of implementing functional blocks then lies in balancing abstraction with performance: too much modular generality slows a system; too little makes it rigid and fragile.
“Open v/s Closed Source” → Impact of the Extended Order
The open-source model accentuates the advantages of this modular construction. While proprietary systems evolve only among a set of known collaborators, open source leverages a global extended order, enabling contributions from both known and unknown individuals. From a development perspective, it harnesses the expertise of a much larger group.
Market economics is about finding the most productive employment of time and resources — in our case, about discovering all the possible uses of an abstract functional module. The lower barrier to knowledge within the open-source market accelerates this discovery process, and the modular structure coordinates the dispersed expertise. In other words, anyone can integrate new functional blocks, or improve and tailor existing ones, according to their individual expertise. For instance, a generic Linux kernel driver might eventually end up in a server, a TV, or a smartphone, depending on how that module is combined with the rest of the system.

The above Venn diagrams illustrate how the nature of an order can influence the development, cohesion and organization of these functional blocks.
“Universal Epistemological Problem” → The Knowledge Challenge
What emerges from these modular interactions is not merely technology, but an order — a living system shaped by countless contracts, shared rules, and dispersed expertise.
This is the emergent order of computer science and engineering: a subset of the larger economic order, subject to the same knowledge problem Friedrich Hayek famously described. No single mind can master it — yet through modularity, openness, and shared rules, it flourishes.