AcceptanceTesting

Top 5(ish) reasons why teams fail with acceptance testing

  1. No collaboration
  2. Focusing on 'how' not on 'what'
  3. Tests unusable as live documentation
  4. Acceptance testing is not considered a 'value-adding' activity
  5. Expecting acceptance tests to be a full regression suite
  6. Focusing on tools
  7. Automation code is not considered as important as 'production code' - 'it's only test code' - normal code rules are not applied and 'test code' is not maintained 'with love' (see the sketch after this list)
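
To make that last point concrete, here is a minimal sketch (a hypothetical shopping-cart example, invented for illustration, not from the session) of automation code held to normal code rules: shared setup lives in a named, documented helper instead of being copy-pasted across tests.

  # Hypothetical example: test automation code treated like production code -
  # intention-revealing names, a reusable helper, no copy-paste.
  from dataclasses import dataclass, field

  @dataclass
  class Cart:
      items: dict = field(default_factory=dict)

      def add(self, sku: str, qty: int = 1) -> None:
          self.items[sku] = self.items.get(sku, 0) + qty

      def total_units(self) -> int:
          return sum(self.items.values())

  def cart_with(*skus: str) -> Cart:
      """Test helper, maintained 'with love': build a cart in one readable step."""
      cart = Cart()
      for sku in skus:
          cart.add(sku)
      return cart

  def test_adding_the_same_sku_twice_accumulates_quantity():
      assert cart_with("BOOK-1", "BOOK-1").items["BOOK-1"] == 2

  def test_total_units_counts_every_item():
      assert cart_with("BOOK-1", "PEN-7").total_units() == 2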

Acceptance tests are a specification of a system - in order to be a good specification, they should be exemplars, but they don't need to deal with every single edge case (if they are to remain readable/usable as documentation).
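
As an illustration (a hypothetical discount rule, invented for this sketch), exemplar-style acceptance tests name the business rule and give one representative example each, describing 'what' rather than 'how':

  # Hypothetical spec-by-example sketch: one readable exemplar per business
  # rule, so the tests double as living documentation.
  def discounted_price(list_price: float, loyalty_years: int) -> float:
      """Assumed rule: 5% off per loyalty year, capped at 20%."""
      return round(list_price * (1 - min(0.05 * loyalty_years, 0.20)), 2)

  def test_new_customers_pay_the_list_price():
      assert discounted_price(100.00, loyalty_years=0) == 100.00

  def test_loyalty_earns_five_percent_off_per_year():
      assert discounted_price(100.00, loyalty_years=2) == 90.00

  def test_the_discount_is_capped_at_twenty_percent():
      # One exemplar of the cap is enough here; exhaustive boundary-hunting
      # belongs in a separate suite (see the next sketch).
      assert discounted_price(100.00, loyalty_years=10) == 80.00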

You could split out more exhaustive testing into a separate section, separate suite, or (better?) a separate tool.
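
One possible shape for that split (a sketch using the hypothesis property-based testing library, reusing the assumed discount rule from the previous sketch): the exemplars stay as the readable spec, while exhaustive edge-case coverage moves into a separate suite that nobody has to read as documentation.

  # Hypothetical separate 'exhaustive' suite, kept apart from the exemplar spec.
  # Requires the hypothesis library (pip install hypothesis).
  from hypothesis import given, strategies as st

  def discounted_price(list_price: float, loyalty_years: int) -> float:
      # Same assumed rule as in the exemplar sketch above.
      return round(list_price * (1 - min(0.05 * loyalty_years, 0.20)), 2)

  @given(
      list_price=st.floats(min_value=0, max_value=10_000, allow_nan=False),
      loyalty_years=st.integers(min_value=0, max_value=100),
  )
  def test_price_stays_between_eighty_and_one_hundred_percent_of_list(
      list_price, loyalty_years
  ):
      price = discounted_price(list_price, loyalty_years)
      assert round(list_price * 0.80, 2) <= price <= round(list_price, 2)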

Don't reject acceptance testing because you don't like the tool - start with the tasks you need to achieve. If something is difficult to automate, that doesn't mean it can be ignored - it is still an 'acceptance test' and it still needs to be run.

Definition of 'acceptance test': whatever you've agreed with the client (not just what can be automated).

Dislike the term 'acceptance testing' - it could mean the definition above (Test Driven Requirements, Example Driven Requirements, etc.), but it is often taken as equivalent to 'UAT' - this is *not* what is being discussed. 'Specification Workshop' has been successful as a term.

Business people and testers collaborating

Currently in 2nd Sprint

Result of the earlier approach:
50k hours in one release
Siloed teams
9 months before the software was finished

Now switching to 6 scrum teams (2 designers, 1 tester, 4 developers) - but also switching to a new application.

Collaboration as a sysadmin -> the team would 'hand over' the application ...
Lots of arguing during deployment
The team started to ask the sysadmin to verify things 'up front'
Then brought the sysadmin into the team
Eventually the sysadmin was contributing to the prioritisation of stories

Not easy - because of the silo culture

The problem of fractional people
Also, the risk of resources being pulled - 'unreliable resources'