10 Laws Every Geek Should Know
If we’re going to add Koomey’s Law to the canon, suggesting that power requirements of a unit of computing will decline by half every 18 months, we thought it was important to make sure our readers were also aware of some of the other big laws governing technology today. So get your geek on as we explain 10 other principles that govern the growth and use of technology and why they matter. It’s unlikely this will come in handy at a cocktail party, but it might help you win an argument on the internet.
- Amdahl’s Law. This is named after computer architect Gene Amdahl and is used to find the maximum expected improvement to an overall system when only part of the system is improved. Put another way: It explains why adding more of something doesn’t always result in a doubling of capacity. It’s applicable to the computing industry as chipmakers add more cores to their chips and to areas like high-performance computing, where adding more nodes doesn’t linearly improve performance.
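Amdahl’s Law has a simple closed form, and a few lines of Python make the core-count point concrete (the function name here is ours, just for illustration):

```python
# Amdahl's Law: overall speedup when a fraction p of a workload
# is sped up by a factor s; the serial (1 - p) part never shrinks.
def amdahl_speedup(p, s):
    """p: fraction of the work that benefits (0..1); s: speedup of that fraction."""
    return 1.0 / ((1.0 - p) + p / s)

# Even with an effectively unlimited number of cores, a 5% serial
# portion caps the overall speedup at about 20x.
print(round(amdahl_speedup(0.95, 1_000_000), 1))  # 20.0
```

That cap is why throwing more cores or nodes at a problem eventually stops paying off.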
- Brooks’ Law. This one is for the software people out there and comes from Fred Brooks, who stated it originally: “Adding manpower to a late software project makes it later.” The law has proven itself in other industries too, but not everyone takes it as empirical fact. (How could they? It’s not like physics is involved here.)
- Godwin’s Law. This is a bit lighthearted, but relevant enough for the geekerati. The law originally stated, “As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.” That was back in 1990, and it has since been expanded to any discussion on the web. I do wonder if we can include the Downfall meme in that.
- The Jevons paradox. This isn’t a law, but it’s worth mentioning, since energy efficiency is gaining in relevance. The paradox, attributed to English economist William Stanley Jevons, says when improvements in technology make it possible to use a fuel more efficiently, consumption of that fuel tends to go up. Outside of energy I’ve also wondered if it could apply to the microprocessor market. As we make more energy-efficient chips, we’ll use more of them and thus use more energy.
- Marconi’s Law. If Guglielmo Marconi had his way, we’d have a world without wires, and fittingly, his law says the maximum signaling distance of a given device will vary as the square of the height of the antenna. Put simply, taller antennas work better, although this ignores the problems of water and buildings in the way, which is one reason building a cell network is so hard.
- Metcalfe’s Law. The law, named after Robert Metcalfe, co-inventor of Ethernet, describes the network effect: the value of a network grows roughly with the square of the number of connected users. One person on Twitter is useless, but between two, there is a relationship. Add more, and you get a valuable service. This is used most often in terms of building web services or marketplaces but can also refer to device connections.
- Moore’s Law. Named after Gordon Moore, the co-founder of Intel, this law holds that the number of transistors on a chip will double roughly every two years. Moore’s Law is the reason we have more computing power on a smartphone than we had on supercomputers of a few decades ago and why you can buy ever-increasing storage drives for less and less money.
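The power of Moore’s Law is compounding, which a one-liner makes obvious (the figures here are purely illustrative):

```python
# Compounding under Moore's Law: transistor counts doubling
# roughly every two years.
def doublings(years, period=2):
    return 2 ** (years / period)

# Over 20 years that's 2**10, or 1024x as many transistors.
print(doublings(20))  # 1024.0
```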
- Ohm’s Law — No, not Om’s law, but the law named after Georg Simon Ohm, a German scientist who figured out that the current through a conductor is directly proportional to the voltage applied across it. The result is an equation that electrical engineers and chip designers use constantly, because it describes the relationship between voltage, current and resistance in an electric circuit.
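Ohm’s Law is just V = I × R, so given any two of the three quantities, the third follows. A minimal sketch:

```python
# Ohm's Law: V = I * R.
def voltage(current_a, resistance_ohm):
    """Voltage in volts from current (amps) and resistance (ohms)."""
    return current_a * resistance_ohm

def current(voltage_v, resistance_ohm):
    """Current in amps from voltage (volts) and resistance (ohms)."""
    return voltage_v / resistance_ohm

print(voltage(2.0, 5.0))   # 10.0 volts
print(current(12.0, 4.0))  # 3.0 amps
```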
- Shannon’s Law. Broadband nerds, Claude Shannon is your guy. The father of information theory, Shannon first envisioned the bit, the most basic unit of digital communication. His law defines the maximum amount of usable data that can be transmitted over any communications channel, whether wireless, wireline or even plain speech — a limit wireless and optical engineers are starting to bump up against today.
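That limit comes from the Shannon–Hartley theorem, C = B·log₂(1 + S/N): capacity depends on bandwidth and signal-to-noise ratio, and nothing else. A quick sketch with illustrative numbers:

```python
import math

# Shannon-Hartley capacity: C = B * log2(1 + S/N), where B is
# bandwidth in Hz and S/N is the linear signal-to-noise ratio.
def channel_capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 1 MHz channel at an SNR of 1000 (30 dB) tops out just
# under 10 Mbit/s, no matter how clever the modulation.
print(channel_capacity(1e6, 1000))
```

Engineers can add bandwidth or improve SNR, but they can’t out-engineer the formula — which is why spectrum is so valuable.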
- Zuckerberg’s Law. Three years ago at a Web 2.0 Summit, Facebook founder and CEO Mark Zuckerberg got up on stage and said, “I would expect that next year, people will share twice as much information as they share this year, and next year, they will be sharing twice as much as they did the year before.” This isn’t really an empirical law that can be tested, so hardcore geeks might shudder, but those who appreciate Godwin’s Law, and even Metcalfe’s Law, will likely feel Zuck’s statement belongs. The rest of us can go back to our caves to hide from all the oversharing.
And finally, because I think it bears repeating in this age of ever increasing technological accomplishment and wonder, I’ll offer Clarke’s Third Law, which is named after the sci-fi author Arthur C. Clarke and states, “Any sufficiently advanced technology is indistinguishable from magic.”