Performance testing, automating

Using TPC-H to evaluate a database engine

  • Improving an analytics platform
  • Example from an unnamed database vendor: generate data at a given scale, then run the benchmark queries against that data (timing sketch below)
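
As a minimal sketch of this kind of evaluation, the snippet below times a TPC-H-style query (Q1) against a local SQLite copy of the lineitem table. The database file name is an assumption; a real vendor evaluation would use the engine's own driver and data generated by the official dbgen tool.

    import sqlite3
    import time

    # Time one TPC-H-style query (Q1) against a local database.
    # "tpch.db" is an assumed file name; the lineitem columns follow
    # the TPC-H schema.
    conn = sqlite3.connect("tpch.db")

    QUERY = """
        SELECT l_returnflag, l_linestatus,
               SUM(l_quantity), SUM(l_extendedprice),
               AVG(l_discount), COUNT(*)
        FROM lineitem
        WHERE l_shipdate <= date('1998-12-01', '-90 days')
        GROUP BY l_returnflag, l_linestatus
        ORDER BY l_returnflag, l_linestatus
    """

    start = time.perf_counter()
    rows = conn.execute(QUERY).fetchall()
    elapsed = time.perf_counter() - start
    print(f"Q1: {len(rows)} groups in {elapsed:.3f}s")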

Write performance stories

  • A performance speculation can be written up as a story
  • Some teams are unable to estimate a performance story
  • The PO should give performance acceptance criteria (example test below)
  • Should each story allow x time to find performance issues?
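
One way to make such acceptance criteria concrete is to encode them as a test. In the sketch below, create_order and the 200 ms p95 threshold are hypothetical stand-ins for whatever the story actually names.

    import statistics
    import time

    from myapp import create_order  # hypothetical function under test

    def test_create_order_p95_under_200ms():
        # Acceptance criterion from the story: 95th-percentile latency
        # for order creation stays under 200 ms (threshold illustrative).
        samples = []
        for _ in range(100):
            start = time.perf_counter()
            create_order()
            samples.append(time.perf_counter() - start)
        p95 = statistics.quantiles(samples, n=20)[18]  # 95th percentile
        assert p95 < 0.200, f"p95 latency {p95 * 1000:.0f} ms exceeds 200 ms"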

Make it work

  • Make it work well
  • Make it work fast

JMeter

  • How does a process scale with the number of users and with throughput? (load-generator sketch below)
  • Finding the limits of the system: is a 20-second response, for example, too slow, or is that a subjective metric?
  • Definition of
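
JMeter drives this kind of test from a thread group in a test plan; as a rough illustration of the question it answers, here is a sketch that ramps concurrent users against a single endpoint (the URL is an assumption) and prints throughput and worst-case latency.

    import concurrent.futures
    import time
    import urllib.request

    URL = "http://localhost:8080/health"  # assumed endpoint under test

    def hit(_):
        # One simulated user request; returns its latency in seconds.
        start = time.perf_counter()
        urllib.request.urlopen(URL, timeout=30).read()
        return time.perf_counter() - start

    # Ramp the number of concurrent "users", the way a JMeter thread
    # group with an increasing thread count would.
    for users in (1, 5, 10, 20, 50):
        with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
            t0 = time.perf_counter()
            latencies = list(pool.map(hit, range(users * 10)))
            wall = time.perf_counter() - t0
        print(f"{users:3d} users: {len(latencies) / wall:6.1f} req/s, "
              f"max {max(latencies):.2f}s")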

Solution:

  • Set baselines for metrics
  • At each release, we want to find out if we slowed down or sped up.
  • How do we handle tradeoffs that interact, e.g. thread-lock interactions where threads slow each other down?
  • Set up a CI system that always runs performance tests: no commit shall decrease performance (baseline-check sketch below)
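
A minimal sketch of the gate such a CI job could run. The perf_baseline.json file, its format, and the 10% tolerance are all assumptions, and the workload is a placeholder for the real scenario.

    import json
    import sys
    import time

    BASELINE_FILE = "perf_baseline.json"  # assumed to be checked in
    TOLERANCE = 1.10  # fail the build if >10% slower than baseline

    def benchmark():
        # Placeholder workload; in CI this would be the real scenario.
        start = time.perf_counter()
        sum(i * i for i in range(1_000_000))
        return time.perf_counter() - start

    current = min(benchmark() for _ in range(5))  # best of 5 to cut noise

    with open(BASELINE_FILE) as f:
        baseline = json.load(f)["seconds"]

    if current > baseline * TOLERANCE:
        print(f"FAIL: {current:.3f}s vs baseline {baseline:.3f}s")
        sys.exit(1)  # break the build: no commit may decrease performance
    print(f"OK: {current:.3f}s (baseline {baseline:.3f}s)")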

When do we evaluate performance?

Performance evaluation requires a different skill set than other development.

  • Does this require a different set of eyes? A "Performance QA" role who can watch file I/O, network I/O, the GUI, thread locking, the operating system, L1 cache hit/miss rates, etc.? (sampling sketch below)
  • How does one learn these skills?
    • College? These skills are not always taught well in college, or not directed at performance.
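
One way to start building that eye is to sample what a running process is doing. The sketch below uses the third-party psutil package; the PID is a hypothetical stand-in for the app under test, and io_counters() is only available on some platforms (e.g. Linux and Windows). Deeper counters such as L1 cache hit rates or lock contention need OS-level tools like perf or VTune instead.

    import psutil  # third-party: pip install psutil

    # Watch a running process the way a "Performance QA" pair might:
    # CPU, memory, thread count and file I/O, sampled once a second.
    proc = psutil.Process(12345)  # hypothetical PID of the app under test

    for _ in range(10):
        io = proc.io_counters()  # Linux/Windows only
        print(f"cpu={proc.cpu_percent(interval=1.0):5.1f}% "
              f"rss={proc.memory_info().rss // 2**20} MB "
              f"threads={proc.num_threads()} "
              f"read={io.read_bytes} written={io.write_bytes}")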