Software Development is a recent discipline; we are just beginning to find out how to build good software.
A few years ago, “serverless” systems were unknown; 5 years ago we were all working on relational databases; 10 years ago almost nobody was writing unit tests; 15 years ago we followed waterfall processes instead of agile.
We developers are writing the history of Software Development. It is exciting to be at the beginning of a new discipline.
As I read books and Wikipedia, I keep discovering laws like these, so I wanted to compile them into one big list. Let me know if I’m missing any, so I can update it.
Amdahl’s Law
Parallel computing pays off only up to a point: the serial fraction of a program caps the total speedup, no matter how many processors you add.
If a program needs 20 hours using a single processor core, and a particular part of the program which takes one hour to execute cannot be parallelized, while the remaining 19 hours of execution time can be parallelized, then regardless of how many processors are devoted to a parallelized execution of this program, the minimum execution time cannot be less than that critical one hour. [Wiki]
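The 20-hour example can be checked numerically. A minimal sketch; the function name `amdahl_speedup` is mine, not from any library:

```python
def amdahl_speedup(parallel_fraction, processors):
    """Overall speedup under Amdahl's law for a program whose
    `parallel_fraction` can be spread across `processors` cores."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / processors)

# The example above: 19 of the 20 hours parallelize, so p = 0.95.
for n in (2, 16, 1024):
    print(f"{n:>4} processors -> {amdahl_speedup(0.95, n):5.2f}x speedup")

# No processor count beats 1 / (1 - p) = 20x:
# the runtime can never drop below the critical one hour.
```

Even with 1024 processors the speedup stays just under 20x, which is the whole point of the law.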
Augustine’s Second Law
For every scientific (or engineering) action, there is an equal and opposite social reaction
Too often, science or engineering is perceived as the cause of problems rather than the solution.
Benford’s Law of Controversy
Passion is inversely proportional to the amount of real information available.
That could explain why some of us stop coding after 10 or 15 years of experience (or maybe it’s the Peter Principle :P).
Brooks’s Law
Adding manpower to a late software project makes it later.
Nine women can’t make a baby in one month. Seriously!
Church-Turing Thesis
Every function which would naturally be regarded as computable can be computed by the universal Turing machine.
Any real-world computation can be translated into an equivalent computation involving a Turing machine.
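To make the thesis concrete, here is a toy single-tape Turing machine simulator. The encoding (a dict of transitions, `_` as the blank symbol, `run_turing_machine` as the entry point) is an illustrative choice of mine, not a standard API:

```python
def run_turing_machine(tape, transitions, state="start", accept="halt"):
    """Run a single-tape Turing machine until it reaches `accept`.

    `transitions` maps (state, read_symbol) -> (next_state, write_symbol, move),
    where move is -1 (left), 0 (stay) or +1 (right); blank cells read as "_".
    """
    cells = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    while state != accept:
        symbol = cells.get(head, "_")
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# A toy machine that flips every bit and halts at the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine("1011", flip))  # -> 0100
```

Trivial, but the same loop can in principle run any computable function, which is exactly the claim of the thesis.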
Clarke’s Third Law
Any sufficiently advanced technology is indistinguishable from magic.
This could explain why engineers are so excited by new technologies during Microsoft Keynotes.
Conway’s Law
Any piece of software reflects the organizational structure that produced it.
If you have four groups working on a compiler, you’ll get a 4-pass compiler.
The Dilbert Principle / The Peter Principle
The most ineffective workers are systematically moved to the place where they can do the least damage: management.
In a hierarchy, every employee tends to rise to his level of incompetence.
That would explain why I’m an engineering manager!
Deutsch’s Seven Fallacies of Distributed Computing
Reliable delivery; Zero latency; Infinite bandwidth; Secure transmissions; Stable topology; Single administrator; Zero cost.
Wow, we lied to ourselves! Isn’t the cloud supposed to fix most of those? 🙂
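The first fallacy (assuming reliable delivery) is the one code hits most often; the standard defense is retrying with exponential backoff. A minimal sketch, with hypothetical names (`call_with_retry`, `flaky_fetch`):

```python
import time

def call_with_retry(op, attempts=4, base_delay=0.1):
    """Retry a flaky operation with exponential backoff.

    Retries are the simplest answer to the first fallacy: delivery is
    NOT reliable, so every remote call needs a failure plan.
    """
    for attempt in range(attempts):
        try:
            return op()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure
            time.sleep(base_delay * 2 ** attempt)

# A stand-in for a remote call that drops the first two attempts.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] <= 2:
        raise ConnectionError("packet lost")
    return "response"

print(call_with_retry(flaky_fetch))  # -> response (succeeds on the third try)
```

Capped attempts matter: retrying forever just converts an outage into a hang.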
Murphy’s Law
Anything that can go wrong, will.
The Law of False Alerts
As the rate of erroneous alerts increases, operator reliance, or belief, in subsequent warnings decreases.
This is why it matters to have the right alerting mechanisms.
Flon’s Law
There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs.
Gilder’s Law
Bandwidth grows at least three times faster than computer power.
Grosch’s Law
The cost of computing systems increases as the square root of the computational power of the systems.
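The square-root relationship is easy to state as code; this helper (`grosch_cost_ratio`, a name I made up) gives the relative cost for a given power ratio:

```python
import math

def grosch_cost_ratio(power_ratio):
    """Relative cost of a system `power_ratio` times as powerful,
    under Grosch's law (cost grows as the square root of power)."""
    return math.sqrt(power_ratio)

print(grosch_cost_ratio(4))    # -> 2.0 (4x the power for 2x the cost)
print(grosch_cost_ratio(100))  # -> 10.0 (100x the power for 10x the cost)
```

Put the other way around: doubling your budget buys you four times the power, which was the original argument for big shared machines.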
Hartree’s Law
Whatever the state of a project, the time a project leader will estimate for completion is constant.
Hoare’s Law of Large Programs
Inside every large problem is a small problem struggling to get out.
Divide and conquer!
Hofstadter’s Law
A task always takes longer than you expect, even when you take into account Hofstadter’s Law.
Jakob’s Law of Internet User Experience
Users spend most of their time on other sites. This means that users prefer your site to work the same way as all the other sites they already know.
Not sure this applies to people working at Google or Facebook, though.
Linus’s Law
Given enough eyeballs, all bugs are shallow.
Pair programming is a good way to root-cause big, nasty bugs.
Lister’s Law
People under time pressure don’t think faster.
Metcalfe’s Law
In network theory, the value of a system grows as approximately the square of the number of users of the system.
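The quadratic growth is worth seeing as numbers; `metcalfe_value` is an illustrative helper of mine, and the constant is arbitrary since only ratios between networks are meaningful:

```python
def metcalfe_value(users, k=1.0):
    """Network value under Metcalfe's law: proportional to users squared.
    The constant k is arbitrary; only ratios between networks matter."""
    return k * users ** 2

# Doubling the user base quadruples the value:
print(metcalfe_value(2_000) / metcalfe_value(1_000))  # -> 4.0
```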
Moore’s Law
The number of transistors on an integrated circuit will double in about 18 months.
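The doubling compounds quickly; a sketch with an invented helper name (`transistor_count`), using the 18-month period quoted above (the commonly quoted period varies between 18 and 24 months):

```python
def transistor_count(start_count, months, doubling_period_months=18):
    """Project a transistor count forward under periodic doubling."""
    return start_count * 2 ** (months / doubling_period_months)

# Six years = 72 months = four doublings: a 16x increase.
print(transistor_count(1_000_000, 72) / 1_000_000)  # -> 16.0
```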
The Ninety-Ninety Rule
The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time.
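The joke encodes real arithmetic: two phases that each take “90% of the time” add up to 180% of the original estimate. The numbers below are illustrative:

```python
estimate = 100.0  # planned hours (illustrative)
first_90_percent_of_code = 0.9 * estimate
remaining_10_percent_of_code = 0.9 * estimate  # the "other" 90%
actual = first_90_percent_of_code + remaining_10_percent_of_code
print(f"{actual / estimate:.0%} of the original estimate")  # -> 180% of the original estimate
```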
Law of the Conservation of Catastrophe
The solutions to one crisis pave the way for some equal or greater future disaster.
Avoid patching prod 🙂
Everything that’s worth understanding about a complex system can be understood in terms of how it processes information.
Occam’s Razor
The explanation requiring the fewest assumptions is most likely to be correct.
Osborn’s Law
Variables won’t; constants aren’t.
Parkinson’s Law
Work expands so as to fill the time available for its completion.
Next time, don’t pad your estimates: it won’t give you a buffer, it will just give you extra work :).
Red Queen Principle
For an evolutionary system, continuing development is needed just in order to maintain its fitness relative to the system it is co-evolving with.
The 60/60 Rule
Sixty percent of software’s dollar is spent on maintenance, and sixty percent of that maintenance is enhancement.
Yep, the money is NOT in building systems but in maintaining them. Why do you think we have so many consulting firms?
Tesler’s Law of Conservation of Complexity
You cannot reduce the complexity of a given task beyond a certain point. Once you’ve reached that point, you can only shift the burden around.
Tesler’s Theorem (the AI Effect)
Artificial Intelligence is whatever hasn’t been done yet.
Spafford’s Adoption Rule
For just about any technology, be it an operating system, application or network, when a sufficient level of adoption is reached, that technology then becomes a threat vector.
Wirth’s Law
Software gets slower faster than hardware gets faster.
Eagleson’s Law
Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.
Lehman’s Laws of Software Evolution
A system must be continually adapted, or it becomes progressively less satisfactory.
As a system evolves, its complexity increases unless work is done to maintain or reduce it.
As a system evolves, all those associated with it (developers, sales personnel and users, for example) must maintain mastery of its content and behaviour to achieve satisfactory evolution. Excessive growth diminishes that mastery; hence the average incremental growth remains invariant as the system evolves.