Why do computers talk a bit at a time these days?

tl;dr

Handling one bit at a time is easier to engineer and allows for much faster communication than handling multiple bits at once, thanks to the imperfect nature of reality.

Time scales, the speed of light not in a vacuum, and too much chonk

It might be counterintuitive that the vast majority of computer communication, at least in the high-speed domains of hundreds of megabytes to gigabytes per second, happens one bit at a time. After all, if communication in the past was largely multiple bits at a time, how come we couldn't improve on that approach?

It has to do with three things:

  • Time scales
    As we approach hundreds of megabytes to gigabytes per second, the time between each bit of information sent shrinks to nanoseconds (1/1,000,000,000 of a second) and picoseconds (1/1,000,000,000,000 of a second). At those time scales, even the speed of light in a vacuum becomes something you can picture. Grace Hopper, an early pioneer of computer programming, used to hand out 30 cm lengths of wire to turn the abstract ideas of light speed and nanoseconds into something concrete: 30 cm is roughly how far light travels in a vacuum in one nanosecond.
  • Speed of light, not in a vacuum
    The speed of light is often quoted as approximately 300,000 km/sec, but that's in a vacuum. The speed an electromagnetic wave travels depends on the medium. For example, the speed of light in water is about 225,000 km/sec. In electrical conductors it varies; the wave propagation speed in a Cat5e Ethernet cable is about 192,000 km/sec. The ratio between a medium's propagation speed and the vacuum speed of light is known as the velocity factor.
  • Bulk of the signal carrier
    Or to put it another way: two wires, the minimum needed to transmit a single bit at a time, are nothing compared to, say, the 64 wires needed for a reliable 32-bit transmission over a cable, or the number of traces needed on a circuit board to support the same.

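To put rough numbers on the physics points above, here's a quick back-of-the-envelope sketch. The 192,000 km/sec Cat5e figure is the one quoted in the list; everything else is arithmetic:

```python
# Rough numbers behind the list above: Grace Hopper's wire and the
# velocity factor of Cat5e cable. Only the Cat5e speed is from the text.

C_VACUUM = 299_792  # speed of light in a vacuum, km/s
CAT5E = 192_000     # wave propagation speed in Cat5e, km/s (from the text)

# Distance light travels in a vacuum in one nanosecond (Hopper's wire):
cm_per_ns_vacuum = C_VACUUM * 1e5 / 1e9  # km/s -> cm/ns
print(f"Light in a vacuum covers ~{cm_per_ns_vacuum:.1f} cm per nanosecond")

# The same bit travelling down a Cat5e pair covers less ground:
cm_per_ns_cat5e = CAT5E * 1e5 / 1e9
print(f"A signal in Cat5e covers ~{cm_per_ns_cat5e:.1f} cm per nanosecond")

# Velocity factor: ratio of the medium's speed to the vacuum speed
print(f"Velocity factor of Cat5e: ~{CAT5E / C_VACUUM:.2f}")
```
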
The bulkiness factor makes it easy to see why engineers didn't simply keep adding more bits to communication lines. But surely, like all engineering problems, a compromise could be found between how many bits are transferred at once and how big the cables have to be. This is where time scales and the speed of light come into play.

In contrast to what we slow humans perceive, electrical signals don't travel instantly from point A to point B, and as mentioned earlier, they don't even travel at the vacuum speed of light. Physical aspects of the circuit, including how far the signal has to travel and even the temperature of the conductor, impede the progress of an electromagnetic wave. This creates minuscule amounts of what's called propagation delay.

Propagation delay creates a problem for sending bits in parallel: not all of the bits arrive at the same time. This isn't a problem in slower communication, as the bits have some leeway to arrive before the receiver samples what arrived. But as communication gets faster, that tolerance shrinks. For a bus transmitting at 1 GHz, each bit has to arrive within 1 nanosecond of the others. Making sure X number of wires, traces, or whatever can deliver all their bits within that 1 nanosecond window is pretty hard to do.
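A small sketch of the arithmetic: the skew budget is one bit period, so it shrinks in direct proportion to the clock rate. The 19.2 cm/ns figure reuses the Cat5e propagation speed quoted earlier and is only illustrative:

```python
# How much timing skew a parallel bus tolerates as clock rates climb.
# The allowed window is one bit period, so it shrinks as 1/frequency.

def bit_window_ns(freq_hz: float) -> float:
    """Time between bits, in nanoseconds, for a given transfer rate."""
    return 1e9 / freq_hz

for freq, label in [(1e6, "1 MHz"), (100e6, "100 MHz"), (1e9, "1 GHz")]:
    print(f"{label}: bits must land within ~{bit_window_ns(freq):g} ns of each other")

# At ~19.2 cm/ns (the Cat5e propagation speed from earlier, used here
# just for illustration), a 1 ns skew corresponds to roughly a 19 cm
# difference in path length. At 1 MHz the 1000 ns window allows metres
# of mismatch -- that's the "leeway" slower buses enjoy.
skew_cm_at_1ghz = bit_window_ns(1e9) * 19.2
print(f"Max path mismatch at 1 GHz: ~{skew_cm_at_1ghz:.1f} cm")
```
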

Take a look at a circuit board in your computer whenever you have the chance. You may notice squiggly lines on it similar to this:

(Image of serpentine delay-matching traces, probably from a Reddit post)

These squiggly lines, known as serpentine traces, deliberately add delay so that the signal they're carrying arrives at roughly the same time as another signal.

So instead of trying to get a number of communication lines to deliver their signals within a tight timing window, transmit everything one bit at a time. That said, high-speed communication is often sent as a differential signal for noise tolerance. While this might make it seem like the communication is effectively 2-bit, the two wires of a differential pair can be routed very close to each other, which greatly reduces the problem of matching propagation delay (it still has to be accounted for, just not as much as with, say, 8 separate bits).
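To illustrate why a differential pair still carries only one logical bit, here's a toy model of the idea: the transmitter drives complementary voltages and the receiver looks only at the difference, so noise that hits both wires equally cancels out. The ±1 V levels are made up for illustration; real transceivers use different voltages and circuitry:

```python
# Toy model of differential signalling: one bit per wire *pair*.
# The receiver takes the difference between the wires, so common-mode
# noise (hitting both wires equally) cancels out.

def tx(bit: int) -> tuple[float, float]:
    """Drive complementary voltages onto the pair (illustrative +/-1 V)."""
    return (1.0, -1.0) if bit else (-1.0, 1.0)

def rx(p: float, n: float) -> int:
    """Recover the bit from the voltage difference alone."""
    return 1 if (p - n) > 0 else 0

noise = 0.4  # a noise spike that couples into both wires equally
for bit in (0, 1):
    p, n = tx(bit)
    assert rx(p + noise, n + noise) == bit  # noise cancels in the difference
print("Both bits survive the common-mode noise")
```
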

With some smart engineering, we can still achieve effectively X bits at once by combining multiple single-bit transmitters and reconstructing the data at the receiver. The PCI Express bus uses this approach, calling each single-bit link a lane, to achieve higher and higher transfer rates.
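As a rough sketch of the lane idea, here's a byte stream striped across several independent serial streams and reassembled afterwards. Real PCI Express layers encoding, scrambling, and per-lane deskew on top of this, so treat it only as the core concept:

```python
# Sketch of the PCIe-style lane idea: deal bytes round-robin across N
# independent serial streams, then re-interleave them at the receiver.

def stripe(data: bytes, lanes: int) -> list[bytes]:
    """Deal bytes round-robin across the lanes."""
    return [data[i::lanes] for i in range(lanes)]

def merge(streams: list[bytes]) -> bytes:
    """Re-interleave the per-lane streams into the original order."""
    out = bytearray()
    # zip assumes equal-length lanes, fine for this evenly sized sketch
    for chunk in zip(*streams):
        out.extend(chunk)
    return bytes(out)

payload = b"PCIeLane"            # 8 bytes across 4 lanes -> 2 bytes each
per_lane = stripe(payload, 4)
print(per_lane)                  # [b'PL', b'Ca', b'In', b'ee']
assert merge(per_lane) == payload
```
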

To recap, the reasons why computer communication these days is one bit at a time rather than multiple bits at a time are:

  • Faster transmission rates mean less leeway for the bits to arrive, which is really hard to do with a lot of bits
  • The signal propagation speed itself is affected by physical aspects of the circuit, further complicating when bits arrive
  • Cables, circuit board traces, and the like would take up a lot of space
