From 0 percent to greater than 0 percent Test Coverage

Session notes:

Should you write UI tests first if you have low or 0% test coverage?

UI-test only the "brittle things" - the parts that break each release

Use JBehave as a part of continuous deployment

- connect Jira requirements to an actual executable specification (see the sketch below)
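
A minimal sketch of what that linkage could look like, assuming JBehave with JUnit 4: the story file name carries the Jira key, and a Java steps class makes the specification executable. The Jira key (SHOP-123), the scenario, and all class names here are invented for illustration.

 // Story file (plain text), named after the Jira issue it specifies:
 // SHOP-123_returning_customer_discount.story
 //   Scenario: returning customers get a 10% discount
 //   Given a returning customer
 //   When they check out a 100 dollar order
 //   Then the total is 90 dollars
 
 import org.jbehave.core.annotations.Given;
 import org.jbehave.core.annotations.Then;
 import org.jbehave.core.annotations.When;
 import static org.junit.Assert.assertEquals;
 
 public class CheckoutDiscountSteps {
 
     // Tiny domain stubs so the sketch is self-contained.
     static class Customer { final boolean returning; Customer(boolean r) { returning = r; } }
     static class Checkout {
         double process(Customer c, double amount) { return c.returning ? amount * 0.9 : amount; }
     }
 
     private Customer customer;
     private double total;
 
     @Given("a returning customer")
     public void aReturningCustomer() { customer = new Customer(true); }
 
     @When("they check out a $amount dollar order")
     public void theyCheckOut(double amount) { total = new Checkout().process(customer, amount); }
 
     @Then("the total is $expected dollars")
     public void theTotalIs(double expected) { assertEquals(expected, total, 0.01); }
 }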

A prerequisite is to manage dependencies (dependency injection) - see the sketch below
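
A minimal sketch of why dependency injection is the prerequisite, assuming plain constructor injection (PaymentGateway and InvoiceService are invented names): the collaborator is passed in, so a test can substitute a fake instead of hitting a real external system.

 interface PaymentGateway {
     boolean charge(String account, double amount);
 }
 
 class InvoiceService {
     private final PaymentGateway gateway;
 
     // Injected rather than constructed internally, so tests control it.
     InvoiceService(PaymentGateway gateway) { this.gateway = gateway; }
 
     boolean settle(String account, double amount) {
         if (amount <= 0) return false;   // guard logic we want under test
         return gateway.charge(account, amount);
     }
 }
 
 class InvoiceServiceDemo {
     public static void main(String[] args) {
         // Fake gateway: always succeeds, never touches a real payment system.
         PaymentGateway fake = (account, amount) -> true;
         InvoiceService service = new InvoiceService(fake);
         System.out.println(service.settle("acct-1", 50.0));  // true - charged
         System.out.println(service.settle("acct-1", -5.0));  // false - rejected
     }
 }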

Two strategies - turn the ice cream cone into a pyramid, OR improve test usability/quality

- "I don't live in an ideal world, I can only control how many sprinkles I put on top"

Potentially meta-tag certain tests to run for fast feedback when developing in a certain area (see the sketch below)
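
One way that tagging might look, using JUnit 5's @Tag as a stand-in (the tag and class names are invented); any tagged subset can then be run on its own for fast feedback.

 import org.junit.jupiter.api.Tag;
 import org.junit.jupiter.api.Test;
 import static org.junit.jupiter.api.Assertions.assertEquals;
 
 @Tag("billing")   // area tag: run only these while working on billing
 class BillingCalculationTest {
 
     @Test
     @Tag("fast")  // speed tag: candidates for a quick pre-commit run
     void appliesDiscount() {
         assertEquals(90.0, 100.0 - 10.0, 0.001);
     }
 }
 
 // With Maven Surefire (JUnit 5), run just the tagged subset:
 //   mvn test -Dgroups=billing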

If not doing round-the-clock development -> run the whole suite at night

- What to do if a large number of tests fail overnight?

Code coverage (can a tool tell us what needs to be run, or does it need to be manually tagged)

- Clover, a unit-test coverage tool, could be useful

Devs wish they got feedback from UI tests more quickly (not segmented)

The average dev does not usually have the patience to do UI testing

- they want to feel productive, so they do not want to focus on tests

TDD for bug fixes (see the sketch below)
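
A minimal sketch of the idea, assuming JUnit 4 (the bug and all names are invented): write a failing test that reproduces the bug first, then fix the code until it passes; the test then stays in the suite as a regression guard.

 import org.junit.Test;
 import static org.junit.Assert.assertEquals;
 
 public class DiscountBugTest {
 
     // Production code under fix. Before the fix it returned a negative
     // total whenever the discount exceeded the order total.
     static double applyDiscount(double total, double discount) {
         return Math.max(0, total - discount);  // the fix: clamp at zero
     }
 
     @Test
     public void totalNeverGoesNegative() {
         // Written first, fails against the unfixed code, then passes.
         assertEquals(0.0, applyDiscount(5.0, 10.0), 0.001);
     }
 }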

Techniques to deal with the ice cream cone

Code coverage (less about absolute numbers and more about trends)

- NCover does not work for Mono-based projects

- Rcov (Ruby)

- Clover => can look at recently changed files; worth looking into

- Sonar => similar to Clover

- publishing of code coverage

-- publish code coverage once per release (1 month)

-- better yet, publish it on every commit!

-- make sure to announce as many positive things as you do broken builds

- viewing of test results

-- what is the max acceptable time for test results to come back?

--- keep it under 10 min? even that may be stretching it

--- aim for roughly 5 min

-- identify the super-reliable tests; is it worth telling devs to care about that particular set?

-- make sure there is team ownership of tests

--- if a test fails (even if because it is flaky), have a QA and a dev pair on it rather than pointing fingers

--- you can time-box even flaky-test improvement to 10 min each time it fails

--- if you cannot fix it in 10 min, add it to the tech-debt board (see the quarantine sketch below)
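
One possible way to keep flaky tests visible rather than silently ignored, again using JUnit 5 tags as a stand-in (the tag and class names are invented): quarantine them under a tag, exclude them from the fast feedback loop, and track them on the tech-debt board.

 import org.junit.jupiter.api.Tag;
 import org.junit.jupiter.api.Test;
 import static org.junit.jupiter.api.Assertions.assertTrue;
 
 class SearchPageTest {
 
     @Test
     @Tag("flaky")  // fails intermittently; owned by the team, on the debt board
     void resultsAppearWithinTimeout() {
         assertTrue(true);  // stands in for the intermittently failing UI check
     }
 }
 
 // Keep quarantined tests out of the main feedback loop:
 //   mvn test -DexcludedGroups=flaky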

Are pre-commit and CI tests different?

- yes: unit tests run pre-commit; integration/UI tests run in CI (see the sketch below)
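
A sketch of how that split is often expressed in a Java/Maven build, using the standard Surefire/Failsafe naming convention (class names invented): *Test classes run pre-commit with "mvn test", *IT classes run in CI with "mvn verify".

 import org.junit.Test;
 import static org.junit.Assert.assertEquals;
 
 // Fast unit test: no I/O, runs pre-commit via Surefire (mvn test).
 public class PriceCalculatorTest {
     @Test
     public void addsTax() {
         assertEquals(110.0, 100.0 * 1.10, 0.001);
     }
 }
 
 // Integration/UI test: talks to real systems, so it is named *IT and
 // runs only in CI via Failsafe (mvn verify):
 // public class CheckoutFlowIT { ... }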

It does not happen overnight; people may lose patience if they do not see value

Gut feeling: you need to get to about 60% coverage to feel confident in regression testing

Fail fast, particularly in CD

- does it matter if it is rock solid, or can you fix things as you go in the next delivery?

Cost of defect curve may be flattening with CD

Delivering at an "OK enough" level, defect costs are low enough that it doesn't matter

- but this can encourage heroism and insanity during releases

Maybe the mindset built into the team matters more than code coverage metrics

- it must be more than just the dev team; it needs to be the whole company

- "our customers are resilient" - high barrier to switching so reasonably safe

When is it important to find bugs?

- requirements => get a dev in there to understand the different architectures, etc.

- before customers see them => people may be used to computers screwing up, but it still doesn't reflect well


Off topic but interesting:

- Specification by Example - Gojko Adzic

- strategies for analysis

-- need a cultural shift to allow kickoffs with dev/QA/BA

-- low barrier to entry, make the conversations quick and easy

-- is there a checklist of QA things? i.e. sanity-check ideas