Performance testing, automating


Using TPC-H to evaluate a database engine

  • Improving an analytics platform
  • An unnamed database vendor: data, scale, and queries against that data (a query-timing sketch follows these notes)
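
In practice, this kind of evaluation can start as simply as timing the benchmark queries against the engine. A minimal sketch, assuming a JDBC driver for the engine under test is on the classpath; the connection URL, credentials, and the abbreviated TPC-H Q1 text are illustrative placeholders, not a real harness:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: time one TPC-H-style query over JDBC. The URL, credentials,
// and the abbreviated Q1 text below stand in for the engine and TPC-H scale
// factor actually under test.
public class TpchTiming {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/tpch"; // hypothetical target
        // TPC-H Q1 (pricing summary report), abbreviated for illustration
        String q1 = "SELECT l_returnflag, l_linestatus, SUM(l_quantity) "
                  + "FROM lineitem WHERE l_shipdate <= DATE '1998-09-02' "
                  + "GROUP BY l_returnflag, l_linestatus";
        try (Connection c = DriverManager.getConnection(url, "user", "pass");
             Statement s = c.createStatement()) {
            long start = System.nanoTime();
            try (ResultSet rs = s.executeQuery(q1)) {
                int rows = 0;
                while (rs.next()) rows++; // drain the result set so fetch time is counted
                long ms = (System.nanoTime() - start) / 1_000_000;
                System.out.println("Q1: " + rows + " rows in " + ms + " ms");
            }
        }
    }
}
```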

Write performance stories

  • We have a speculation that becomes a story
  • Some teams are unable to estimate a story for performance
  • The PO should include performance in the acceptance criteria (see the test sketch after this list)
  • Should each story allow x time for finding performance issues?
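
One way to make a performance acceptance criterion concrete is to write it as a test. A minimal sketch using JUnit 5's assertTimeout; SearchService, its search call, and the 500 ms budget are hypothetical, the point being that the PO's number lives in an executable test rather than a wiki page:

```java
import static org.junit.jupiter.api.Assertions.assertTimeout;

import java.time.Duration;
import org.junit.jupiter.api.Test;

// Minimal sketch of a performance acceptance criterion expressed as a test.
// The service and the 500 ms budget are hypothetical stand-ins.
class SearchPerformanceTest {
    @Test
    void searchMeetsAcceptanceCriterion() {
        // Fails the build if the call runs longer than the agreed budget.
        assertTimeout(Duration.ofMillis(500), () -> {
            new SearchService().search("widgets"); // hypothetical call under test
        });
    }
}

class SearchService {
    Object search(String term) { return term; } // stand-in for the real call
}
```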

Make it work

  • Make it work well
  • Make it work fast

JMeter

  • How does a process scale with the number of users? With throughput?
  • Finding the limits of the system: is a 20-second response, for example, too slow, or is that a subjective judgment? (see the results sketch after this list)
  • Definition of …
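
Whatever the threshold, the discussion gets easier once a JMeter run is reduced to numbers. A minimal sketch that summarizes a CSV results file; it assumes the default .jtl layout (timeStamp in column 1, elapsed milliseconds in column 2, header row, no embedded commas), a file named results.jtl, and an illustrative 20-second limit:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Minimal sketch: reduce a JMeter CSV results file to throughput and a
// 95th-percentile response time, so "is 20 seconds too slow?" becomes a
// number you can argue about.
public class JtlSummary {
    public static void main(String[] args) throws Exception {
        List<String> lines = Files.readAllLines(Paths.get("results.jtl"));
        List<Long> elapsed = new ArrayList<>();
        long first = Long.MAX_VALUE, last = Long.MIN_VALUE;
        for (String line : lines.subList(1, lines.size())) { // skip header row
            if (line.isBlank()) continue;
            String[] f = line.split(",");
            long ts = Long.parseLong(f[0]);    // timeStamp (epoch ms)
            elapsed.add(Long.parseLong(f[1])); // elapsed (ms)
            first = Math.min(first, ts);
            last = Math.max(last, ts);
        }
        Collections.sort(elapsed);
        long p95 = elapsed.get((int) Math.ceil(elapsed.size() * 0.95) - 1);
        double seconds = Math.max(1, last - first) / 1000.0;
        System.out.printf("samples=%d throughput=%.1f/s p95=%d ms%n",
                elapsed.size(), elapsed.size() / seconds, p95);
        if (p95 > 20_000) System.out.println("p95 exceeds the 20 s limit");
    }
}
```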

Solution:

  • Set baselines for metrics
  • At each release, we want to find out if we slowed down or sped up.
  • How do we handle tradeoffs when thread-lock interactions slow each other down?
  • Set up a CI system that always runs performance tests, so that no commit decreases performance (a gate sketch follows this list)
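
The "no commit shall decrease performance" rule can be enforced as a build step that compares the current run against the stored baseline and fails on regression. A minimal sketch; the property-file names, the p95.ms metric key, and the 10% noise tolerance are assumptions:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

// Minimal sketch of a CI performance gate: compare this build's measured
// metric against a stored baseline and exit non-zero on regression, which
// is all most CI servers need to mark the build red.
public class PerfGate {
    public static void main(String[] args) throws Exception {
        Properties baseline = new Properties();
        Properties current = new Properties();
        try (var in = Files.newInputStream(Paths.get("baseline.properties"))) { baseline.load(in); }
        try (var in = Files.newInputStream(Paths.get("current.properties"))) { current.load(in); }
        double base = Double.parseDouble(baseline.getProperty("p95.ms"));
        double now = Double.parseDouble(current.getProperty("p95.ms"));
        double tolerance = 1.10; // allow 10% run-to-run noise before failing
        System.out.printf("baseline=%.0f ms, current=%.0f ms%n", base, now);
        if (now > base * tolerance) {
            System.err.println("Performance regression: failing the build");
            System.exit(1);
        }
        if (now < base) {
            System.out.println("Faster than baseline; consider ratcheting the baseline down");
        }
    }
}
```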

When do we evaluate performance?

Performance evaluation requires a different skill set than other development work.

  • Does this require a different set of eyes? A "Performance QA" agent who can watch file I/O, network I/O, the GUI, thread locking, operating systems, L1 cache hit/miss rates, etc.? (one such instrument is sketched after this list)
  • How does one learn these skills?
    • College? These topics are not always taught well in college, or not directed at performance.
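
Some of that watching can be done with instruments the platform already ships. A minimal sketch using the JDK's ThreadMXBean, which reports how often and how long threads block on monitors; it observes only the JVM it runs in, contention monitoring is JVM-dependent, and the 5-second observation window is arbitrary:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Minimal sketch of one "Performance QA" instrument: the JDK's ThreadMXBean,
// which counts monitor-blocking events per thread. Run inside (or alongside)
// the application under observation.
public class LockContentionProbe {
    public static void main(String[] args) throws Exception {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        if (mx.isThreadContentionMonitoringSupported()) {
            mx.setThreadContentionMonitoringEnabled(true); // required for blocked-time stats
        }
        Thread.sleep(5_000); // arbitrary window: let the application run a while
        for (ThreadInfo t : mx.dumpAllThreads(false, false)) {
            if (t.getBlockedCount() > 0) {
                System.out.printf("%s blocked %d times (%d ms)%n",
                        t.getThreadName(), t.getBlockedCount(), t.getBlockedTime());
            }
        }
    }
}
```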