AcceptanceTesting

From CitconWiki
Revision as of 04:42, 19 September 2009

Top 5(ish) reasons why teams fail with acceptance testing

  1. No collaboration
  2. Focusing on 'how' not on 'what'
  3. Tests unusable as live documentation
  4. Acceptance testing is not considered a 'value-adding' activity
  5. Expecting acceptance tests to be a full regression suite
  6. Focusing on tools
  7. Automation code is not considered as important as 'production code' - 'it's only test code' - normal code rules are not applied - 'test code' is not maintained 'with love'

Acceptance tests are a specification of a system - to be a good specification they should be exemplars, and they don't need to cover every single edge case (if they are to remain readable/usable as documentation)

You could split out more exhaustive testing into a separate section, separate suite, or (better?) a separate tool.
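The 'exemplar, not exhaustive' idea can be sketched in code. A minimal, hypothetical example (the discount rule and all names are invented for illustration, not taken from the session): one readable test per agreed behaviour serves as the specification, while boundary cases live in a separate suite.

```python
def discount(order_total):
    """10% discount on orders of 100 or more (the agreed business rule)."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

# Acceptance tests: one readable exemplar per agreed behaviour.
# These double as live documentation of the rule.
def test_orders_of_100_or_more_get_10_percent_off():
    assert discount(150.00) == 15.00

def test_small_orders_get_no_discount():
    assert discount(40.00) == 0.0

# Exhaustive edge cases (boundaries, rounding) belong in a separate,
# more technical suite so the specification above stays readable.
def test_boundary_cases():
    assert discount(99.99) == 0.0
    assert discount(100.00) == 10.00
```

The split keeps the top tests usable as documentation while still giving the edge cases a home.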

Don't reject acceptance testing because you don't like the tool - start with the tasks you need to achieve. If something is difficult to automate, that doesn't mean it can be ignored - it is still an 'acceptance test' and it still needs to be run.

Definition of 'acceptance test': whatever you've agreed with the client (not just what can be automated)

Some dislike the term 'acceptance testing' - it could mean the definition above (Test Driven Requirements, Example Driven Requirements, etc.), but it is often taken to be equivalent to 'UAT' - that is *not* what is being discussed here. 'Specification Workshop' has been successful as a term.


Business people and testers collaborating

Currently in 2nd Sprint

  • Results from the earlier approach were not good
  • 50k hours in one release
  • Siloed teams
  • It was 9 months before the software was finished
  • now switching to 6 scrum teams (2 designers, 1 tester, 4 developers)
  • (but also switching to a new application)
  • positive results so far

Collaboration with a sysadmin -> team 'hands over' the application ...

  • lots of arguing during deployment
  • team started to ask sysadmin to verify things 'up front'
  • then brought sys admin into team
  • eventually contributing to prioritisation of stories

Another story

  • Waterfall model, siloed
  • To help a move to agile, have management showcase the project
  • Writing requirements is a collaborative activity, involving the whole team
  • Everyone can voice an opinion + help define the acceptance criteria
  • Try to automate as much as possible

The way the F15 was designed

  • Customers said 'we want a 2.5 mach airplane'
  • Designers attempted it, and couldn't (for the right cost)
  • The designers went back and asked 'why?'
  • 'We need to get away from Russian planes really quickly'
  • 'How would a more agile plane work?'
  • 'Yes, yes - that would be fine!'
  • Developers know the technical limitations - tell them what the problem is, and maybe they'll come up with a different/better solution - get everyone in the same room to discuss it

If you have a waterfall project with lots of specifications, should you throw them away?

  • Yes - but be mindful of the political ramifications - perhaps suggest that they need 'clarification'?


Other notes

  • not easy - because of silo-culture
  • the problem of fractional people
  • it is an accounting issue - refuse to carry the cost of a partial resource when you aren't getting the benefit
  • look at the throughput of delivered value, not the cost of resource utilisation
  • also, risk of resources being pulled, 'unreliable resources'
  • don't take a requirements document, then write acceptance tests from that
  • 'translation problem': http://www.brokenpicturetelephone.com