Amdahl’s law states that the overall performance improvement gained by optimising a single part of a system is limited by the fraction of total time in which the improved part is actually used.
Takeaway: optimise the largest/most expensive parts of your program.
This is especially relevant when writing parallelised code: if only a small fraction of the program can run in parallel, the maximum overall speedup is capped, and parallelising may not be worth the effort.
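The law can be written as speedup = 1 / ((1 − p) + p/s), where p is the fraction of the program that benefits and s is the speedup of that fraction. A minimal sketch (function name is my own, not from any library):

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the work is sped up by factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# Parallelising 90% of a program across 10 cores:
print(amdahl_speedup(0.9, 10))    # ~5.26x, not 10x

# Even with effectively infinite cores, the serial 10% caps speedup near 10x:
print(amdahl_speedup(0.9, 1e12))  # ~10x
```

Note how the serial fraction (1 − p) dominates as s grows, which is why optimising the largest part of the program matters most.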
See also
- Law of diminishing returns, the real-life equivalent