...the time to open your application or to display your website...
...the number of requests, the bundle size, the page load time, ... aka Web Performance Optimization (WPO)...
...the time to process requests, the number of requests/sec, scalability, ...
...hardware resources (CPU, memory, I/O) ...
...the time per business transaction, the number of business transactions per second, aka Throughput...
... cloud costs, uptime, competitors...
The word performance in computer performance means the same thing that performance means in other contexts, that is, it means ...
(... in all situations ...)
When your website or application is broken, nobody will use it ;)
The slower your website or application loads and displays, the fewer people will use it
We can assume the system is working well :(
...but in fact, we just don't know
Anscombe's quartet: four datasets that have nearly identical simple statistical properties, yet look very different when graphed
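A minimal sketch of the idea in Python, using the first two of the quartet's four published datasets:

```python
import statistics

# First two datasets of Anscombe's quartet: they share the same x values,
# and their y values have (nearly) identical mean and variance.
x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

# Summary statistics agree to two decimal places...
print(round(statistics.mean(y1), 2), round(statistics.mean(y2), 2))          # → 7.5 7.5
print(round(statistics.variance(y1), 2), round(statistics.variance(y2), 2))  # → 4.13 4.13
# ...yet plotted, y1 is a noisy line and y2 a clean parabola.
```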
We don't want something to be fast on average; we want it to be fast for most users
Most people have more than the average number of legs - Hans Rosling
The page loads slower than average for 30% of the users, and takes more than 21 secs for 5%
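A sketch of why the average misleads, with Python's `statistics` module on a hypothetical latency sample (the numbers are made up for illustration):

```python
import statistics

# Hypothetical page-load times in seconds: most users load fast,
# but a small tail of very slow loads skews the average.
latencies = [1.2] * 70 + [3.0] * 25 + [25.0] * 5  # 100 samples

mean = statistics.mean(latencies)
q = statistics.quantiles(latencies, n=100)  # 99 percentile cut points
p50, p95 = q[49], q[94]

# The mean (~2.8 s) looks acceptable; the 95th percentile (~24 s) does not.
print(f"mean={mean:.2f}s  p50={p50:.1f}s  p95={p95:.1f}s")
```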
You won't see ANY performance issues during unit tests
You won't see ALL performance issues in Development
You can’t optimize everything; you will never have the time to work only on performance
Don't guess, and never blindly apply tips or best practices: optimizing performance is meaningless if you do not have effective tools for measuring it.
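A minimal "measure before optimizing" sketch with Python's `timeit`, on a hypothetical workload (building a string two different ways):

```python
import timeit

# Measure, don't guess: time two implementations of the same task
# before deciding which one needs optimizing.
def concat(n=1000):
    s = ""
    for i in range(n):
        s += str(i)  # repeated concatenation
    return s

def join(n=1000):
    return "".join(str(i) for i in range(n))  # single join

t_concat = timeit.timeit(concat, number=200)
t_join = timeit.timeit(join, number=200)
print(f"concat={t_concat:.4f}s  join={t_join:.4f}s")
```

The point is the habit, not these two functions: whichever variant you believed was faster, the numbers decide.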
Benchmarking is hard: it’s very easy to end up not measuring what you think you are measuring
Common pitfalls: cold start, dynamic selection of benchmarks, loop optimizations, dead-code elimination, constant folding, non-throughput measures, synchronized iterations, multi-threaded sharing, multi-threaded setup/teardown, false sharing, asymmetric benchmarks, inlining
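The cold-start pitfall can be sketched even in Python (the regex and its cached compilation stand in for JIT warm-up, class loading, caches, etc.):

```python
import re
import timeit

# Cold-start pitfall: the very first call pays a one-time cost
# (here, compiling and caching the regex), so a single measurement
# does not reflect steady-state performance.
text = "x" * 10_000 + "needle"

def search():
    return re.search(r"ne+dle", text)  # pattern is compiled and cached on first use

cold = timeit.timeit(search, number=1)            # one cold call, includes compile cost
warm = timeit.timeit(search, number=1000) / 1000  # steady-state average per call

print(f"cold={cold:.6f}s  warm={warm:.6f}s")
```

Serious harnesses (JMH on the JVM, for instance) run explicit warm-up iterations before measuring for exactly this reason.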
It's great if your site performs well for a single user...
But how will it do when a storm of users hits?
Throughput is not linear: it depends on the load.
Response time increases with user load, and climbs sharply once the system saturates.
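One way to see why: under a simple M/M/1 queueing model (an assumption — real systems are messier), mean response time is R = S / (1 − U), where S is the service time and U the utilization. A sketch:

```python
# Back-of-the-envelope M/M/1 model: response time explodes as
# utilization approaches 100%, long before throughput stops growing.
service_time = 0.1  # seconds per request (hypothetical)

results = {}
for utilization in (0.5, 0.8, 0.9, 0.95, 0.99):
    results[utilization] = service_time / (1 - utilization)
    print(f"U={utilization:.0%}  R={results[utilization]:.2f}s")
```

At 50% utilization a request takes 0.2 s; at 99% the same request takes 10 s — the load, not the code, changed.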
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%
Donald Knuth
Tip: use percentiles, not just averages
As the software evolves, performance regressions creep in
Never give up your performance accidentally
Rico Mariani