My Definition of Done for performance and load test cases

Load and performance test cases are frequently used for multiple objectives:

  • Determining where a system’s breaking point is
  • Determining whether a system is able to meet a required load

Here are the criteria I use to make sure my load/performance test cases are good enough to use and maintain for all of these objectives. I’ve been using this list for regular load tests on a huge SharePoint farm at one of my clients.

Criteria and explanations
The following information is parametrized into context parameters (see the sketch after this list):

  1. Server name(s)

  2. Site name(s)

  3. List/library/page name(s)

  4. Filenames (for upload and download) specific to the test case
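
Assuming these are Visual Studio web tests (the built-in context parameter names used later in this post, such as WebTestUserId, point that way), here is a minimal coded sketch in C# of reading such context parameters. The parameter names and the URL are made-up examples, not values from the actual test cases.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Minimal sketch: environment details come from context parameters instead of
// being hard-coded, so the same test case can run against different farms.
// The parameter names (ServerName, SiteName, DocumentLibrary) are made up.
public class BrowseLibraryWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        string server  = this.Context["ServerName"].ToString();
        string site    = this.Context["SiteName"].ToString();
        string library = this.Context["DocumentLibrary"].ToString();

        WebTestRequest request = new WebTestRequest(
            string.Format("http://{0}/sites/{1}/{2}/Forms/AllItems.aspx", server, site, library));
        yield return request;
    }
}
```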

User credentials are included in the test case.
Think times on the various requests are configured as specified in the test design. I always check that the agreed think times are present, and I configure the load test to either use them all or ignore them all.
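
Again assuming a coded Visual Studio web test (the URL is made up): the think time is attached to the request in seconds, and the load test's think profile then decides at run time whether think times are used or ignored, which matches the "use them all or ignore them all" approach.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: the think time (in seconds) is attached to the request itself.
// Whether it is applied is controlled by the load test's think profile at run time.
public class ThinkTimeExampleWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest homePage = new WebTestRequest("http://intranet/sites/team/default.aspx");
        homePage.ThinkTime = 8;   // agreed think time from the test design
        yield return homePage;
    }
}
```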
The test case includes a transaction that encapsulates only the requests relevant to the scenario. Test results always state how fast individual pages are, but we also want to see how fast the system responds to user scenarios that consist of multiple requests.
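
A hedged sketch of what such a transaction can look like in a coded web test; the scenario, the transaction name and the URLs are invented for illustration.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: BeginTransaction/EndTransaction group the requests of one user
// scenario, so the load test reports a response time for the scenario as a
// whole instead of only for individual pages. URLs are invented.
public class OpenDocumentScenarioWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        this.BeginTransaction("OpenDocument");

        yield return new WebTestRequest("http://intranet/sites/team/Shared%20Documents/Forms/AllItems.aspx");
        yield return new WebTestRequest("http://intranet/sites/team/Shared%20Documents/Report.docx");

        this.EndTransaction("OpenDocument");
    }
}
```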
The test case has logic to fail if generic errors or error messages are returned.
The test case has logic to fail if the correct content has not been returned. A simple validation rule that checks for a specific string in the output is usually enough.
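
Both of these checks can be covered with find-text validation rules. A sketch, with the URL and the text fragments chosen purely as examples (the generic SharePoint error string may differ per version and language):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

// Sketch only: one rule fails the test when SharePoint's generic error text
// appears, the other fails it when the expected content is missing.
public class ContentValidationWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request = new WebTestRequest("http://intranet/sites/team/default.aspx");

        ValidationRuleFindText noGenericError = new ValidationRuleFindText();
        noGenericError.FindText = "An unexpected error has occurred";
        noGenericError.IgnoreCase = true;
        noGenericError.PassIfTextFound = false;   // finding this text means failure
        request.ValidateResponse += new EventHandler<ValidationEventArgs>(noGenericError.Validate);

        ValidationRuleFindText expectedContent = new ValidationRuleFindText();
        expectedContent.FindText = "Team Site Home";
        expectedContent.IgnoreCase = true;
        expectedContent.PassIfTextFound = true;   // not finding this text means failure
        request.ValidateResponse += new EventHandler<ValidationEventArgs>(expectedContent.Validate);

        yield return request;
    }
}
```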
The test case has logic to determine whether a performance threshold has been exceeded.
The decision to fail the test case based on that threshold can be enabled or disabled without changing the test case. I usually include a 'response time goal' validation rule and set its level to Low or Medium. At runtime I can then indicate which level of validation rules should apply to the entire load test. That allows me to run the entire load test to stress the farm without a few performance delays causing the run to fail.
The test case is able to run standalone or as part of a load test with concurrent users.
The test case should only delete or modify information that it owns. If this is not possible, the test case must gracefully accept errors when concurrent instances of it touch that information.
Test cases that create, modify or delete information must include information that uniquely identifies the specific instance of the test case. In practice I concatenate the following values into a string (see the sketch after this list):

  1. The id of the virtual user (WebTestUserId)

  2. The id of the iteration (WebTestIteration)

  3. The id of the agent running the test (AgentName)
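
A sketch of how these three values can be combined into one identifier. I am assuming the built-in WebTestUserId, WebTestIteration and AgentName properties of the web test context (their exact names may vary between Visual Studio versions), and the file name pattern is just an example.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: combine agent, virtual user and iteration into one identifier and
// use it for the data this instance of the test case creates (here a file name).
public class UploadDocumentWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        string instanceId = string.Format("{0}_{1}_{2}",
            this.Context.AgentName,          // agent running the test
            this.Context.WebTestUserId,      // virtual user id
            this.Context.WebTestIteration);  // iteration id

        // Make the identifier available to later requests, e.g. as the name of the
        // file that this instance uploads and later deletes again.
        this.Context["UploadFileName"] = "LoadTest_" + instanceId + ".txt";

        WebTestRequest request = new WebTestRequest("http://intranet/sites/team/default.aspx");
        yield return request;
    }
}
```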

Test cases follow the test design in their decision to parse and execute dependent requests (.css, .js, images, and so on).
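
Assuming a coded Visual Studio web test again, parsing of dependent requests is an explicit per-request property, which makes it easy to match the test design. A minimal sketch with an invented URL:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: fetching dependent requests (.css, .js, images) is an explicit
// per-request decision, so it can be made to match the test design.
public class NoDependentsWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest page = new WebTestRequest("http://intranet/sites/team/default.aspx");
        page.ParseDependentRequests = false;   // this scenario measures only the page itself
        yield return page;
    }
}
```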