MICROSOFT

Visual Studio Performance
Testing Quick Reference
Guide
A quick reference for users of the Team Testing
performance features of Visual Studio
Visual Studio Performance Testing Quick Reference Guide
4/1/2010

VSTS Rangers
This content was created by Geoff Gray with help from the Visual Studio Rangers team. “Our
mission is to accelerate the adoption of Visual Studio by delivering out of band solutions for
missing features or guidance. We work closely with members of Microsoft Services to make sure
that our solutions address real world blockers.” -- Bijan Javidi, VSTS Rangers Lead

Visual Studio Performance Testing Quick Reference Guide

Page 1

Summary
This document is a collection of items from public blog sites, Microsoft® internal discussion aliases
(sanitized) and experiences from various Test Consultants in the Microsoft Services Labs. The idea is to
provide quick reference points around various aspects of Microsoft Visual Studio® performance testing
features that may not be covered in core documentation, or may not be easily understood. The different
types of information cover:

- How does this feature work under the covers?
- How can I implement a workaround for this missing feature?
- This is a known bug and here is a fix or workaround.
- How do I troubleshoot issues I am having?

The document contains two Tables of Contents (high level overview, and list of every topic covered) as
well as an index. The current plan is to update the document on a regular basis as new information is
found.

The information contained in this document represents the current view of Microsoft Corporation
on the issues discussed as of the date of publication. Because Microsoft must respond to changing
market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and
Microsoft cannot guarantee the accuracy of any information presented after the date of
publication.
This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS,
IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Microsoft grants you a license to this document under the terms of the Creative Commons
Attribution 3.0 License. All other rights are reserved.
© 2010 Microsoft Corporation.
Microsoft, Active Directory, Excel, Internet Explorer, SQL Server, Visual Studio, and Windows are
trademarks of the Microsoft group of companies.
All other trademarks are property of their respective owners.


Revision History

- Version 2.0
  o Released 2/16/09
  o Available externally on CodePlex
  o Major reformat of document
  o Added comprehensive index
- Version 3.0
  o Release Candidate published 3/23/2010
  o Added many VS 2010 performance testing articles
  o Added and updated articles about VS 2010 how-to’s, issues, etc.
  o Added or updated articles for features “changed in 2010”
  o Updated many articles on issues with VS 2008
  o Added some deep dive articles about how VS performance testing works (both 2008 and 2010)
- Version 3.0a
  o Final release version for 3.0. This is the official release that should be used.
  o Published on 4/1/2010

NOTE
All items that are not marked with a version note should be considered to apply to both VS 2008 and VS 2010


List of Topics

NOTE FROM THE AUTHOR ... 8

HOW IT WORKS ... 9
How Web Tests Handle HTTP Headers ... 9
General Info (including order of execution) of load and web test plugins and rules ... 9
Client Code does not execute because Web Tests Work at the HTTP Layer ... 12
File Downloads, Download Size and Storage of files during Web Tests ... 12
When is the “Run unit tests in application domain” needed? ... 12
How the “Test Iterations” Setting impacts the total number of tests executed ... 12
Test timeout setting for load test configuration does not affect web tests ... 13
How user pacing and “Think Time Between Test Iterations” work ... 13
Load test warmup and cool down behaviors ... 13
What is the difference between Unique, Sequential and Random Data Sources ... 14
Comparing new users to return users ... 14
Goal based user behavior after the test finishes the warmup period ... 17
Threading models in Unit tests under load ... 18
Simulation of Browser Caching during load tests ... 19
The difference between Load Test Errors and Error Details ... 20
How parameterization of HIDDEN Fields works in a webtest ... 21
Testing execution order in Unit Tests ... 23
How machines in the test rig communicate ... 25
Changing the Default Port for Agent-Controller Communication ... 26
How to Add Agents To A Test Rig ... 26

ITEMS NEW TO VS 2010 ... 27
“Find” feature now available in Webtest playback UI ... 27
“Go To Web Test” feature now available in Webtest playback UI ... 28
Recorder Log Available ... 29
Add extraction rule directly from the playback UI ... 30
New “Reporting Name” property for web requests ... 31
LoadTestResultsTables now differentiate between GET and POST requests ... 32
Virtual user visualization now available ... 33
New Excel reporting features built into load test results ... 39
New Load Test and Load Test Rig Licensing and configurations ... 40
New test mix: “Sequential Test Mix” ... 44
Query String and FORM POST URLs get parameterized ... 46
New options on Load Test Scenarios ... 47
Loops and Conditionals ... 48

CONFIGURATIONS AND SETTINGS ... 50
How to Change the Location Where Agents Store Run Files ... 50
How to set a proxy server for web tests ... 50
How to configure Web Tests so Fiddler can capture playback info ... 50
Controlling the amount of memory that the SQL Server Results machine consumes ... 51
How to configure the timeouts for deployment of load tests to agents ... 51
How to set the number of Load Test Errors and Error Details saved ... 52
Multi-proc boxes used as agents should have .NET garbage collection set to server mode ... 53
Location of list of all agents available to a controller ... 53

NETWORKS, IP SWITCHING, TEST STARTUPS ... 54
IP Address Switching anatomy (how it works) ... 54
Gotcha: IP Address Switching is ONLY for WEB TESTS ... 54
Gotcha: IP Addresses used for switching are not permanent ... 54
How to Setup IP Switching ... 55
Troubleshooting invalid view state and failed event validation ... 58
Startup: Slowness Restarting a Test Rig with Agents Marked as “Offline” ... 58
Startup: Multiple Network Cards can cause tests in a rig to not start ... 59
Startup: Slow startup can be caused by _NT_SYMBOL_PATH environment variable ... 59
Startup: tests on a Rig with Agents on a Slow Link ... 60
“Not Bound” Exception when using IP Switching is not really an error ... 60
How to configure the timeout for deployment of load tests to agents ... 61

PERFORMANCE COUNTERS AND DATA ... 62
Customizing the Available Microsoft System Monitor counter sets ... 62
Performance Counter Considerations on Rigs with slow links ... 64
Increase the performance counter sampling interval for longer tests ... 65
Changing the default counters shown in the graphs during testing ... 65
Possible method for fixing “missing perfmon counters” issues ... 65
How and where Performance data gets collected ... 66

DATA AND RESULTS ... 67
Custom Data Binding in UNIT Tests ... 67
Verifying saved results when a test hangs in the “In Progress” state after the test has finished ... 67
The metrics during and after a test differ from the results seen ... 68
How new users and return users affect caching numbers ... 69
Data sources for data driven tests get read only once ... 70
Consider including Timing Details to collect percentile data ... 71
Consider enabling SQL Tracing through the Load Test instead of separately ... 72
How to collect SQL counters from a non-default SQL instance ... 72
How 90% and 95% response times are calculated ... 72
Transaction Avg. Response Time vs. Request Avg. Response Time ... 73
Considerations for the location of the Load Test Results Store ... 73
Set the recovery model for the database to simple ... 73
How to clean up results data from runs that did not complete ... 74
InstanceName field in results database are appended with (002), (003), etc. ... 74
Layout for VSTS Load Test Results Store ... 74
How to view Test Results from the GUI ... 75
SQL Server Reporting Services Reports available for download ... 75
How to move results data to another system ... 75
Load Test Results without SQL NOT stored ... 76
Unable to EXPORT from Load Test Repository ... 76
Web Test TRX file and the NAN (Not a Number) Page Time entry ... 77
Proper understanding of TRX files and Test Results directory ... 78
Understanding the Response Size reported in web test runs ... 79

ERRORS AND KNOWN ISSUES ... 80
CSV files created in VSTS or saved as Unicode will not work as data sources ... 80
Incorrect SQL field type can cause errors in web tests ... 80
Leading zeroes dropped from datasource values bound to a CSV file ... 80
Recorded Think Times and paused web test recordings ... 80
After opening a webtest with the VS XML Editor, it will not open in declarative mode ... 81
Calls to HTTPS://Urs.Microsoft.Com show up in your script ... 81
Possible DESKTOP HEAP errors when driving command line unit tests ... 81
Goal based load tests in VSTS 2008 do not work after applying SP1 ... 82
Using Named Transactions in a Goal-Based Load Profile can cause errors ... 82
Debugging Errors in Load Tests ... 83
Debugging OutOfMemory Exceptions in Load Tests ... 83
Memory leak on load test when using HTTPS ... 83
“Not Trusted” error when starting a load test ... 84
Detail Logging may cause “Out of disk space” error ... 85
Error details and stack traces no longer available in VSTS 2010 ... 85
VSTS does not appear to be using more than one processor ... 85
Changes made to Web Test Plugins may not show up properly ... 85
Socket errors or “Service Unavailable” errors when running a load test ... 86
Error “Failed to load results from the load test results store” ... 87
Hidden Field extraction rules do not handle some fields ... 87
Test results iteration count may be higher than the max test iterations set ... 87
In flight test iterations may not get reported ... 88
Completion of Unit Test causes spawned CMD processes to terminate ... 88
Bug with LoadProfile.Copy() method when used in custom goal based load tests ... 89
Errors in dependent requests in a Load Test do not show up in the details test log ... 90
WCF service load test gets time-outs after 10 requests ... 92
Loadtestitemresults.dat size runs into GBs ... 92

TROUBLESHOOTING ... 93
How to enable logging for test recording ... 93
Diagnosing and fixing Web Test recorder bar issues ... 93
User Account requirements and how to troubleshoot authentication ... 94
How to enable Verbose Logging on an agent for troubleshooting ... 95
Error that Browser Extensions are disabled when recording a web test ... 95
Troubleshooting invalid view state and failed event validation ... 96
Troubleshooting the VSTS Load Testing IP Switching Feature ... 97
Troubleshooting Guide for Visual Studio Test Controller and Agent ... 99

HOW TO, GOTCHAS AND BEST PRACTICES ... 111
How to call one coded web test from another ... 111
How to use methods other than GET and POST in a web test ... 111
How to filter out certain dependent requests ... 111
How to handle ASP.NET Cookie-less Sessions ... 112
How to use Client-side certificates in web tests ... 112
How to remove the “If-Modified-Since” header from dependent requests ... 113
How to handle custom data binding in web tests ... 113
How to add a datasource value to a context parameter ... 113
How to test Web Services with Unit Tests ... 114
How to add random users to web tests ... 114
How to add think time to a Unit Test ... 114
How to add details of a validation rule to your web test ... 115
How to mask a 404 error on a dependent request ... 116
How to parameterize Web Service calls within Web Tests ... 117
How to pass Load Test Context Parameters to Unit Tests ... 117
How to create Global Variables in a Unit Test ... 117
How to use Unit Tests to Drive Load with Command Line Apps ... 118
How to add Console Output to the results store when running Unit tests under load ... 118
How to add parameters to Load Tests ... 119
How to Change the Standard Deviation for a NormalDistribution ThinkTime ... 119
How to programmatically access the number of users in Load Tests ... 120
How to create a webtest plugin that will only execute on a predefined interval ... 120
How to support Context Parameters in a plug-in property ... 121
How to stop a web test in the middle of execution ... 122
How To: Modify the ServicePointManager to force SSLv3 instead of TLS (Default) ... 122
How To: Stop a Test in the PreRequest event ... 123
How to make a validation rule force a redirection to a new page ... 123
How to add a Web Service reference in a test project ... 127
How to remotely count connections to a process ... 129
How to hook into LoadTest database upon completion of a load test ... 129
How to deploy DLLs with MSTEST.EXE ... 130
How to authenticate with proxy before the test iteration begins ... 131
How to enumerate WebTextContext and Unit TestContext objects ... 132
How to manually move the data cursor ... 132
How to programmatically create a declarative web test ... 133
How to modify the string body programmatically in a declarative web test ... 134
Gotcha: Check Your Validation Level in the Load Test Run Settings ... 134
Gotcha: Do not adjust goals too quickly in your code ... 134
Gotcha: Response body capture limit is set to 1.5 MB by default ... 134
Gotcha: Caching of dependent requests is disabled when playing back Web Tests ... 135
Best Practice: Blog on various considerations for web tests running under load ... 135
Best Practice: Coded web tests and web test plug-ins should not block threads ... 135
Best Practice: considerations when creating a dynamic goal based load test plugin ... 136
Best Practice: Add an Analysis Comment ... 136
Best Practice – Using comments in declarative webtests ... 136

EXTENSIBILITY ... 138
New Inner-text and Select-tag rules published on Codeplex ... 138
How to Add Custom Tabs to the Playback UI ... 139

ITEMS NOT SPECIFIC TO THE VSTS TESTING PLATFORM ... 146
Using the VSTS Application Profiler ... 146
VSTS 2008 Application Profiler New Features ... 146
Using System.NET Tracing to debug Network issues ... 146
Logparser tips and tricks ... 147
Logparser WEB Queries ... 147
LogParser Non-Web Queries ... 148

OLDER ARTICLES ... 149
Content-Length header not available in Web Request Object ... 149
SharePoint file upload test may post the file twice ... 149
Some Hidden Fields are not parameterized within AJAX calls ... 149
(FIX) Unit Test threading models and changing them ... 149
Bug in VSTS 2008 SP1 causes think time for redirected requests to be ignored in a load test ... 150
New Load Test Plugin Enhancements in VSTS 2008 SP1 ... 150
Four New Methods added to the WebTestPlugin Class for 2008 SP1 ... 150

INDEX ... 151

Note from the author
This new version of the Quick Reference Guide has been rearranged to attempt to make things easier to
find. Many of the sub-topics have been removed and all of the main topics have been changed to reflect
actions or needs instead of the components of the tool.
There is a full section near the beginning devoted to new features in Visual Studio 2010. That list is not
even close to complete with respect to all of the new performance testing features, let alone the many
other testing features in general. You will also find information about changes to 2010 and issues with
2010 throughout the rest of the document. All of these should have a balloon stating that the item is
new or different.
Also please note that the Microsoft Visual Studio team has renamed the suite. Now the full suite (which
contains the load testing features) is called “Visual Studio Ultimate”. Therefore you will see me referring
to much of the 2010 stuff with “VS 2010” as opposed to the older style “VSTS 2008”.
Thanks to all of the people who have contributed articles and information. I look forward to hearing
feedback as well as suggestions moving forward.
Sincerely,
Geoff Gray, Senior Test Consultant – Microsoft Testing Services Labs


How It Works
How Web Tests Handle HTTP Headers
There are three different types of HTTP headers handled by Web tests:
1) Recorded headers and headers explicitly added to the request. By default, the Web test
recorder only records these headers:
   - “SOAPAction”
   - “Pragma”
   - “x-microsoftajax”
   - “Content-Type”
2) You can change the list of headers that the Visual Studio 2008 and 2010 web test recorder
records by using regedit to open:
   HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest
   (the 9.0 in this path is for VS 2008; for VS 2010 the version number in the path is 10.0)
   Add a string value under this key with the name "RequestHeadersToRecord" and the
   value "SOAPAction;Pragma;x-microsoftajax;Content-Type;Referrer".
If you do this and re-record your Web test, the Referrer header should be included in the request like
this:

[Figure: Referrer header in a declarative web test]

3) Headers handled automatically by the engine. Two examples: 1) headers sent and received as
part of authentication. These headers are handled in the Web test engine and can’t be
controlled by the test. 2) cookies, which can be controlled through the API.
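For headers in the first category, you can also add a header explicitly in a coded web test. The sketch below is illustrative only; the URL and the header value are placeholder examples, not values from the guide:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Coded web test sketch: explicitly adding a header to a single request.
// The URL and header value are placeholder examples.
public class HeaderExampleWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request1 = new WebTestRequest("http://contoso.example/home");
        // Add the header so it is sent with this request even if the
        // recorder would not have captured it
        request1.Headers.Add(new WebTestRequestHeader("x-microsoftajax", "Delta=true"));
        yield return request1;
    }
}
```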

General Info (including order of execution) of load and web test plugins and rules
WebTestPlugins get tied to a webtest at the main level of the test. The order of precedence is:
class WebTestPluginMethods : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e) { }
    public override void PreTransaction(object sender, PreTransactionEventArgs e) { }
    public override void PrePage(object sender, PrePageEventArgs e) { }
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
    public override void PostPage(object sender, PostPageEventArgs e) { }
    public override void PostTransaction(object sender, PostTransactionEventArgs e) { }
    public override void PostWebTest(object sender, PostWebTestEventArgs e) { }
}

- PreWebTest fires before the first request is sent.
- PreTransaction fires before every user-defined transaction in the test.
- PrePage fires before any explicit request in the webtest. It also fires before any
  PreRequest method.
- PreRequestDataBinding fires before data from the context is bound into the request,
  giving you an opportunity to change the data binding.
- PreRequest fires before ALL requests made, including redirects and dependent
  requests. If you want it to act only on redirects, or to skip redirects, use the
  e.Request.IsRedirectFollow property to control code flow.
- All Post methods follow the exact opposite order as the Pre methods.

WebTestRequestPlugins are set at an individual request level and only operate on the request(s) they
are explicitly tied to, plus all redirects and dependent requests of those requests.
class WebTestRequestPluginMethods : WebTestRequestPlugin
{
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
}
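As a hedged illustration of a request plugin in action, the sketch below turns off automatic redirect following for whatever request it is attached to. This is an example written for this section, not a plugin that ships with Visual Studio:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Example request plugin: disable automatic redirect following for the
// request(s) this plugin is explicitly attached to.
public class NoRedirectRequestPlugin : WebTestRequestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // FollowRedirects controls whether the engine automatically
        // issues the follow-up request for a 3xx response
        e.Request.FollowRedirects = false;
    }
}
```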

ValidationRules can be assigned at the request level and at the webtest level. If the rule is assigned at
the webtest level, it will fire after every request in the webtest. Otherwise it will fire after the request it
is assigned to.
public class ValidationRule1 : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e) { }
}
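For illustration, a filled-in rule might check the response body for a required string. This is a sketch written for this section (the ExpectedText property name is arbitrary), not a built-in rule:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Example validation rule: mark the request as failed when the response
// body does not contain an expected marker string.
public class ContainsTextValidationRule : ValidationRule
{
    // Text to look for; set when the rule is attached to a request or test
    public string ExpectedText { get; set; }

    public override void Validate(object sender, ValidationEventArgs e)
    {
        bool found = e.Response.BodyString != null &&
                     e.Response.BodyString.Contains(ExpectedText);
        e.IsValid = found;
        if (!found)
        {
            e.Message = "Expected text not found: " + ExpectedText;
        }
    }
}
```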

ExtractionRules can be assigned at the request level and fire after the request they are assigned to.
public class ExtractionRule1 : ExtractionRule
{
    public override void Extract(object sender, ExtractionEventArgs e) { }
}
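Here is a sketch of a filled-in extraction rule that copies a response header into the web test context. The property names are illustrative, and depending on the Visual Studio version the base class may require overriding additional members such as ContextParameterName:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Example extraction rule: copy a response header value into the web test
// context so later requests can bind to it.
public class HeaderExtractionRule : ExtractionRule
{
    public string HeaderName { get; set; }
    public string TargetContextParameter { get; set; }

    public override void Extract(object sender, ExtractionEventArgs e)
    {
        string value = e.Response.Headers[HeaderName];
        if (value != null)
        {
            e.WebTest.Context[TargetContextParameter] = value;
            e.Success = true;
        }
        else
        {
            e.Success = false;
            e.Message = "Header not found: " + HeaderName;
        }
    }
}
```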

NOTE: If you have multiple items attached to a request, then the order of
precedence is:
1) PostRequest (request plugins fire before WebTestRequest plugins)
2) Extract
3) Validate
Visual Studio Performance Testing Quick Reference Guide

Page 10

LoadTestPlugins get tied to the load test directly. With VS 2005 and VS 2008, there can be only one
plugin per load test, while VS 2010 allows more than one per test as well as LoadTestPlugin properties
so that they are consistent with WebTestPlugins. The methods available are divided into three
categories as shown below:

class LoadTestPlugins : ILoadTestPlugin
{
    // Category 1
    void LoadTest_LoadTestStarting(object sender, EventArgs e) { }
    void LoadTest_LoadTestFinished(object sender, EventArgs e) { }
    void LoadTest_LoadTestAborted(object sender, LoadTestAbortedEventArgs e) { }
    void LoadTest_LoadTestWarmupComplete(object sender, EventArgs e) { }

    // Category 2
    void LoadTest_TestFinished(object sender, TestFinishedEventArgs e) { }
    void LoadTest_TestSelected(object sender, TestSelectedEventArgs e) { }
    void LoadTest_TestStarting(object sender, TestStartingEventArgs e) { }

    // Category 3
    void LoadTest_ThresholdExceeded(object sender, ThresholdExceededEventArgs e) { }
    void LoadTest_Heartbeat(object sender, HeartbeatEventArgs e) { }
}

1) Category 1 methods fire based on the load test (meaning each one will fire only once during a full test run).
2) Category 2 methods fire once per test iteration, per vUser.
3) Heartbeat fires once every second, on every agent.
4) ThresholdExceeded fires each time a given counter threshold is exceeded.

NOTE: Each method in category 1 will fire once PER physical agent machine; however, since the agent
machines are independent of each other, you do not need to worry about locking items to avoid
contention.
NOTE: If you create or populate a context parameter inside the LoadTest_TestStarting method, it will
not carry across to the next iteration.
Changed in 2010

- In VSTS 2010, you can have more than one LoadTest plugin, although there is no guarantee about
  the order in which they will execute.
- You can now control whether a validation rule fires BEFORE or AFTER dependent requests.
- At the end of recording a Web test, a Response Time Goal Validation rule is now automatically added
  at the Web test level. This doesn’t help much unless you click the toolbar button that lets you edit
  the response time goal, as well as the Think Time and Reporting Name for the page, for all recorded
  requests in a single grid.


Client Code does not execute because Web Tests Work at the HTTP Layer
The following blog outlines where and how web tests work. This is important to understand if you are
wondering why client side code is not tested.
http://blogs.msdn.com/slumley/pages/web-tests-work-at-the-http-layer.aspx

File Downloads, Download Size and Storage of files during Web Tests

The web test engine does not write responses to disk, so you don’t need to specify a location for the file.
It does read the entire response back to the client, but it only stores the first 1.5 MB of the response in
memory. You can override that limit using the WebTestRequest.ResponseBodyCaptureLimit property in
the request’s section of a coded web test.
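In a coded web test, raising the limit looks roughly like the following sketch; the 5 MB value and the URL are arbitrary examples:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: raise the response capture limit for a single large download.
public class LargeDownloadWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request1 = new WebTestRequest("http://contoso.example/largefile.zip");
        // Keep up to 5 MB of the response in memory instead of the 1.5 MB default
        request1.ResponseBodyCaptureLimit = 5 * 1024 * 1024;
        yield return request1;
    }
}
```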

When is the “Run unit tests in application domain” needed?
When a unit test is run by itself, a separate application domain is created in the test process for each
unit test assembly. There is some overhead associated with marshalling tests and test results across the
application domain boundary. An app domain is also created by default when running unit tests in a load
test. You can turn off the app domain for a load test run through the “Run unit tests in application
domain” option in the Load Test editor’s Run Settings. Turning it off provides some performance boost in
terms of the number of tests per second that the test process can execute before running out of CPU.
Note that the app domain is required for unit tests that use an app.config file.

How the “Test Iterations” Setting impacts the total number of tests executed
In the properties for the Run Settings of a load test, there is a property called “Test Iterations” that tells
VSTS how many test iterations to run during a load test. This is a global setting, so if you choose to run
5 iterations and you have 10 vusers, you will get FIVE total passes, not fifty. NOTE: you must enable this
setting by changing the property “Use Test Iterations” from FALSE (the default) to TRUE.


Test timeout setting for load test configuration does not affect web tests
The “Test Timeout” setting in the Test Run Configuration file (in the “Test -> Edit Test Run Configuration”
menu) does not have an effect in all cases.

Uses the setting:
o Running a single unit test, web test, ordered test, or generic test by itself
o Running any of the above types of tests in a test run started from Test View, the Test
  List editor, or mstest
o Tests running in a load test (except Web tests)

Does not use the setting:
o Running a Web test in a load test
o The load test itself

This particular test timeout is enforced by the agent test execution code, but load test and Web test
execution are tightly coupled for performance reasons and when a load test executes a Web test, the
agent test execution code that enforces the test timeout setting is bypassed.

How user pacing and “Think Time Between Test Iterations” work
The setting “Think Time Between Test Iterations” is available in the properties for a load test scenario.
When a user completes one test, this think time delay is applied before the user starts the next
iteration. The setting applies to each iteration of each test in the scenario mix.
If you create a load test that has a test mix model “Based on user pace”, then the pacing calculated by
the test engine will override any settings you declare for “Think Time Between Test Iterations”.

Load test warmup and cool down behaviors
For information about how warmup and cooldown affect the results, see the next section.

Warmup:
When you set a warmup time for a load test, VSTS will start running test iterations with a single
user, and will ramp up to the proper initial user count over the duration of the warmup. The number
of users ramped up is as follows:
o Constant User Load – the total number of users listed
o Step Load Pattern – the initial user count. The test will ramp from this number to the
  maximum number of users during the actual test run.

Cool down:
Changed in 2010
o In 2008, the Load test Terminate method does not fire unless you use a cool down period.
o In 2010, the Load test Terminate method always fires.

What is the difference between Unique, Sequential and Random Data Sources
Single Machine running tests
Sequential – This is the default and tells the web test to start with the first row and then fetch rows in
order from the data source. When it reaches the end of the data source, it loops back to the beginning
and starts again, continuing until the load test completes. In a load test, the current row is kept for each
data source in each web test, not for each user. When any user starts an iteration with a given Web test,
they are given the next row of data and then the cursor is advanced.
Random – This indicates to choose rows at random. Continue until the load test completes.
Unique – This indicates to start with the first row and fetch rows in order. Once every row is used, stop
the web test. If this is the only web test in the load test, then the load test will stop.
Multiple machines running as a rig
Sequential – This works the same as if you are on one machine. Each agent receives a full copy of the
data and each starts with row 1 in the data source. Then each agent will run through each row in the
data source and continue looping until the load test completes.
Random – This also works the same as if you run the test on one machine. Each agent will receive a full
copy of the data source and randomly select rows.
Unique – This one works a little differently. Each row in the data source will be used once. So if you
have 3 agents, the data will be spread across the 3 agents and no row will be used more than once. As
with one machine, once every row is used, the web test will stop executing.

Comparing new users to return users
There is a property in the Load Test Scenario settings for “Percentage of new users”. This setting has an
impact on a few different aspects of the load test execution. The percentage is a measure of how many
of the simulated users are pretending to be “brand new” to the site, and how many are pretending to be
“users who have been to the site before”.
A better term to describe a new user is “One Time User”. This is because a new user goes away at
the end of its iteration. It does not “replace” a different user in the pool. Therefore, the term “New
User” should be considered to be a “One Time” user.


The “Percentage of New Users” affects the following whether the tests contained within the load test
are Web tests or unit tests:

- The value of the LoadTestUserId in the LoadTestUserContext object. This only matters for unit
  tests and coded Web tests that use this property in their code. On the other hand, if you set
  the number of test iterations equal to the user load, then you should get a different
  LoadTestUserId regardless of the setting of “Percentage of New Users”.
- If you are using the load test feature that allows you to define an “Initial Test” and/or a
  “Terminate Test” for a virtual user, then it affects when the InitializeTest and TerminateTest are
  run: for “new users” (a more accurate name might be “one time users”), the InitializeTest is run
  for the virtual user, the “Body Test” is run just once, and then the “Terminate Test” is run. For
  users who are NOT “new users”, the InitializeTest is run once, the Body Test is run many times
  (until the load test completes), and then the TerminateTest runs (which might be during the
  cool-down period).

The “Percentage of New Users” affects the following Web test features that are not applicable for unit
tests:

- The simulation of browser caching. The option affects how the VUser virtual browser cache is
  maintained between iterations of tests. “New users” have an empty cache (note: the responses
  are not actually cached, only the URLs are tracked), while “return users” have a cache. So if this
  value is 100%, all VUsers starting a test will be starting with an empty browser cache. If this
  value is 0%, all VUsers will maintain the state of the browser cache between iterations of Web
  tests. This setting affects the amount of content that is downloaded: if an object sits in a
  VUser’s cache and the object has not been modified since the last time the VUser downloaded
  it, the object will not be downloaded again. Therefore, new users will download more content
  than returning users with items in their browser cache.
- The handling of cookies for a Web test virtual user: new users always start running a Web test
  with all cookies cleared. When a user who is not a “new user” runs a Web test after the first
  one run, the cookies set during previous Web tests for that virtual user are present.


The below graphs (taken from test runs in VSTS 2010) demonstrate the difference between a new user
and a return user. The graphs are based on a 10 user / 50 iteration run, but with different percentages
for “new users” on each run. NOTE: The graphs below are new to VSTS 2010, but the way in which users
are simulated is the same as in VSTS 2008. For a better understanding of these graphs, go to the section
called “Virtual user visualization now available”.

Zero percent new users shows a graph where each of the 10 vusers is constantly reused.

Fifty percent new users shows a graph where each of the 10 vusers is constantly reused by half of the
iterations, but the other half are split out among new vusers which never get reused.

One hundred percent new users shows a graph where none of the vusers is ever reused.


Goal-based user behavior after the test finishes the warm-up period
1. The user load starts at the value specified by the Initial User Count property of the Goal Based
Load Pattern.
2. At each sampling interval (which defaults to 5 seconds, but can be modified by the “Sample
Rate” property in the load test run settings), the performance counter defined in the goal based
load pattern is sampled. (If it can’t be sampled for some reason, an error is logged and the user
load remains the same.)
3. The value sampled is compared with the “Low End” and “High End” properties of the “Target
Range for Performance Counter”.
4. If the value is within the boundaries of the “Low End” and “High End”, the user load remains the
same.
5. If the value is not within the boundaries of the “Low End” and “High End”, the user load is
adjusted as follows:
 The midpoint of the target range for the goal is divided by the sampled value of the goal
performance counter to calculate an “adjustment factor”.
 For example, if the goal is defined as “% Processor Time” between 50 and 70, the midpoint
is 60. If the sampled value for % Processor Time is 40, then AdjustmentFactor = 60/40 =
1.5; if the sampled value is 80, then AdjustmentFactor = 60/80 = 0.75.
 The AdjustmentFactor is multiplied by the current user load to get the new user load.
However, if the difference between the new user load and the current user load is greater
than the “Maximum User Count Increase/Decrease” property (whichever applies), then the
user load is only adjusted by as much as the max increase/decrease property allows. My
experience has been that keeping these values fairly small is a good idea; otherwise the
algorithm tends to cause too much fluctuation (the perf counter keeps going above and
below the target range).
 The new user load can also not be larger than the value specified by the goal based pattern’s
MaximumUserCount property or less than the Minimum User Count property.
 Two more considerations based on special properties of the goal based load pattern:
o If the property “Lower Values Imply Higher Resource Use” is True (which you might
use, for example, for a performance counter such as Memory\Available MBytes), then
the user load is adjusted in the opposite direction: the user load is decreased when
the sampled counter value is less than the Low End of the target range and
increased when the sampled counter value is greater than the High End of the target range.
o If the property “Stop Adjusting User Count When Goal Achieved” is True, then once
the sampled goal performance counter is within the target range for 3 consecutive
sampling intervals, then the user load is no longer adjusted and remains constant
for the remainder of the load test.
 Lastly, as is true for all of the user load patterns, in a test rig with multiple agents, the new
user load is distributed among the agents equally by default, or according to the “Agent
Weightings” if these are specified in the agent properties.
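The adjustment steps above can be sketched in code. This is a simplified model of the published algorithm, not the actual load test engine code; all names are illustrative:

```csharp
using System;

// A simplified sketch of the goal-based load adjustment described above.
static class GoalBasedLoadSketch
{
    public static int NextUserLoad(int currentLoad, double sample,
                                   double lowEnd, double highEnd,
                                   int maxChange, int minUsers, int maxUsers)
    {
        // Sampled value is within the target range: no change.
        if (sample >= lowEnd && sample <= highEnd)
            return currentLoad;

        // AdjustmentFactor = midpoint of target range / sampled value
        // (e.g. 60/40 = 1.5, or 60/80 = 0.75).
        double midpoint = (lowEnd + highEnd) / 2.0;
        double factor = midpoint / sample;
        int newLoad = (int)Math.Round(currentLoad * factor);

        // Clamp the step to "Maximum User Count Increase/Decrease"...
        int delta = Math.Max(-maxChange, Math.Min(maxChange, newLoad - currentLoad));

        // ...and the result to the Minimum/Maximum User Count properties.
        return Math.Max(minUsers, Math.Min(maxUsers, currentLoad + delta));
    }
}
```

For the 50–70 “% Processor Time” example above, a sample of 40 at 100 users with a 25-user cap yields 125 users rather than the unclamped 150.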


Threading models in Unit tests under load
When running unit tests in a load test, there is one thread for each virtual user that is currently running
a unit test. The load test engine doesn’t know what’s going on inside the unit test and needs to run each
on a separate thread to ensure that a thread will be available to start the next unit test without delay.
However, if you specify the Test Mix Based on User Pace feature (or specify a non-zero value for “Think
Time Between Test Iterations” (a property on each Scenario in the load test)), then the number of
concurrent virtual users is less than the total number of virtual users, and there is only one thread
needed in the thread pool for each concurrent virtual user.
There is an extra thread for each unit test execution thread that is used to monitor the execution of the
unit test, implement timing out of the test, etc. However, the stack size for this thread is smaller than
the default size so it should take up less memory.
More information can be found at: http://blogs.msdn.com/billbar/pages/features-and-behavior-of-loadtests-containing-unit-tests-in-vsts-2008.aspx


Simulation of Browser Caching during load tests
In a VSTS load test that contains Web tests, the load test attempts to simulate the caching behavior of
the browser. Here are some notes on how that is done:
 There is a property on each request in a Web test named “Cache Control” in the Web test editor (and named “Cache” on the WebTestRequest object in the API used by coded Web tests).
 When the Cache Control property on a request in the Web test is false, the request is always issued.
 When the Cache Control property is true, the VSTS load test runtime code attempts to emulate the Internet Explorer caching behavior (with the “Automatically” setting). This includes reading and following the HTTP cache control directives.
 The Cache Control property is automatically set to true for all dependent requests (typically images, style sheets, etc. embedded in the page).
 In a load test, the browser caching behavior is simulated separately for each user running in the load test.
 When a virtual user in a load test completes a Web test and a new Web test session is started to keep the user load at the same level, sometimes the load test simulates a “new user” with a clean cache, and sometimes it simulates a return user that has items cached from a previous session. This is determined by the “Percentage of New Users” property on the Scenario in the load test. The default for “Percentage of New Users” is 0.

Important Note: When running a Web test by itself (outside of the load test), the Cache Control
property is automatically set to false for all dependent requests so they are always fetched; this is so
that they can be displayed in the browser pane of the Web test results viewer without broken images.
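In a coded Web test, the same setting is exposed as the Cache property on the WebTestRequest object mentioned above. A minimal sketch (the URL is illustrative):

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class CacheDemoWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest home = new WebTestRequest("http://contoso/app/default.aspx");
        // true  = emulate IE caching behavior for this request
        //         (the default for dependent requests in a load test)
        // false = always issue the request
        home.Cache = true;
        yield return home;
    }
}
```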


The difference between Load Test Errors and Error Details
There's a distinction between "Errors" and "Error Details" within Load Test results.
1. “Load Test Errors” refers to any type of error that occurs in the load test. The info saved is the
user/requestURI/error text information. By default the load test results will save only 1000
errors of a particular type. This value is configured through a config file.
2. "Load Test Error Details" refers to the additional detail we capture for errors on Web test
requests: mostly the request and response body. The default value is 100. This value is
configured in the Load Test GUI.

This is the display of the Errors table in the test results viewer. Each row in the table is one of the
“errors” entries (#1), and each error type gets its own quantity of “errors” (#1) and “error details”
(#2). The number of “errors” is shown in the Count column; clicking on one of the numbers brings up
the Load Test Errors dialog. There is no count displayed for “error details”. Any “errors” entry (#1)
that has associated “error details” will have a link in one or both of the last columns; click these
links to get the details about that specific error instance.


How parameterization of HIDDEN Fields works in a webtest
For each “Extract Hidden Fields” rule in a webtest, any context items with the same name are removed
before the new values are extracted. So if request 1 extracts 4 hidden values into a context called
“Hidden1”, and request 2 then extracts only 2 hidden values, also into “Hidden1”, the resulting
collection for “Hidden1” will contain ONLY the two values extracted for request 2.
“Hidden Field Buckets”
Context names such as Hidden1 and Hidden2 represent hidden field buckets. We call the number at the
end the bucket number; e.g., $HIDDEN0 is bucket 0.
The easiest example to explain is a frames page with two frames. Each frame will have an independent
bucket, and requests can be interleaved across the frames. Other examples that require multiple
buckets are popup windows and certain AJAX calls (since web tests support correlation of viewstate in
ASP.NET AJAX responses).
Hidden field matching
The algorithm to determine that a given request matches a particular bucket uses the heuristic that the
hidden fields parsed out of the response will match form post fields on a subsequent request.
E.g., if the recorder parses the following hidden fields out of a response (values elided here):

<input type="hidden" name="Field1" value="..." />
<input type="hidden" name="Field2" value="..." />

and then on a subsequent post we see Field1 and Field2 posted, this request and response match and a
hidden field bucket will be created for them. The first available bucket number is assigned to the hidden
field bucket.
Once a bucket is “consumed” by a subsequent request via binding, that bucket is made available again.
So if the test has a single frame, it will always reuse bucket 0:





 Page 1
o Extract bucket 0
 Page 2
o Bind bucket 0 params
 Page 3
o Extract bucket 0
 Page 4
o Bind bucket 0 params

If a test has 2 frames that interleave requests, it will use two buckets:







Frame 1, Page 1
o Extract bucket 0
Frame 2, Page 1
o Extract bucket 1
Frame 2, Page 2
o Bind bucket 1 params
Frame 1, Page 2
o Bind bucket 0 params

Or if a test uses a popup window, or Viewstate, you would see a similar pattern as the frames page
where multiple buckets are used to keep the window state.
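In coded Web tests, a bucket corresponds to the ContextParameterName of an ExtractHiddenFields rule: bucket 0 stores values as $HIDDEN0.fieldname, which later requests bind with {{$HIDDEN0.fieldname}}. A sketch of the extract-then-bind pattern (URLs and field names are illustrative; this mirrors the style of code the recorder generates, not any specific recording):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

public class BucketDemoWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Page 1: extract hidden fields into bucket 0 ($HIDDEN0.*).
        WebTestRequest page1 = new WebTestRequest("http://contoso/app/page1.aspx");
        ExtractHiddenFields extractRule = new ExtractHiddenFields();
        extractRule.ContextParameterName = "0";   // the bucket number
        page1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractRule.Extract);
        yield return page1;

        // Page 2: bind bucket 0 params back into the form post.
        WebTestRequest page2 = new WebTestRequest("http://contoso/app/page1.aspx");
        page2.Method = "POST";
        FormPostHttpBody body = new FormPostHttpBody();
        body.FormPostParameters.Add("__VIEWSTATE", "{{$HIDDEN0.__VIEWSTATE}}");
        page2.Body = body;
        yield return page2;
    }
}
```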
Why are some fields unbound?
Some hidden field values are modified in JavaScript, such as __EVENTARGUMENT. In that case, it won't
work to simply extract the value from the hidden field in the response and play it back. If the recorder
detects this is the case, it puts the actual value that was posted back as the form post parameter value
rather than binding it to the hidden field.
A single page will have just one hidden field extraction rule applied. If there are multiple forms on a
given page, there is still just one downstream post of form fields, resulting in one application of the
hidden field extraction rule.


Testing execution order in Unit Tests
I think most confusion comes from some users' expectation that MSTest executes like the NUnit
framework. They execute differently: NUnit instantiates a test class only once when executing all
the tests contained in it, whereas MSTest instantiates each test method's class separately during the
execution process, with each instantiation occurring on a separate thread. This design affects 3 specific
things which often confuse users of MSTest:
1. ClassInitialize and ClassCleanup: Since ClassInitialize and ClassCleanUp are static, they are only
executed once even though several instances of a test class can be created by MSTest.
ClassInitialize executes in the instance of the test class corresponding to the first test method in
the test class. Similarly, MSTest executes ClassCleanUp in the instance of the test class
corresponding to the last test method in the test class.
2. Execution Interleaving: Since each instance of the test class is instantiated separately on a
different thread, there are no guarantees regarding the order of execution of unit tests in a
single class, or across classes. The execution of tests may be interleaved across classes, and
potentially even assemblies, depending on how you choose to execute your tests. The key thing
here is: all tests could be executed in any order; it is totally undefined.
3. TestContext Instances: TestContexts are different for each test method, with no sharing
between test methods.

For example, if we have a Test Class:
[TestClass]
public class VSTSClass1
{
    private TestContext testContextInstance;
    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        Console.WriteLine("Class Setup");
    }

    [TestInitialize]
    public void TestInit()
    {
        Console.WriteLine("Test Init");
    }


    [TestMethod]
    public void Test1()
    {
        Console.WriteLine("Test1");
    }

    [TestMethod]
    public void Test2()
    {
        Console.WriteLine("Test2");
    }

    [TestMethod]
    public void Test3()
    {
        Console.WriteLine("Test3");
    }

    [TestCleanup]
    public void TestCleanUp()
    {
        Console.WriteLine("TestCleanUp");
    }

    [ClassCleanup]
    public static void ClassCleanUp()
    {
        Console.WriteLine("ClassCleanUp");
    }
}

(This consists of 3 Test Methods, ClassInitialize, ClassCleanup, TestInitialize, TestCleanUp and an explicit
declaration of TestContext)
The execution order would be as follows:
Test1 [Thread 1]: new TestContext -> ClassInitialize -> TestInitialize -> TestMethod1 ->
TestCleanUp
Test2 [Thread 2]: new TestContext -> TestInitialize -> TestMethod2 -> TestCleanUp
Test3 [Thread 3]: new TestContext -> TestInitialize -> TestMethod3 -> TestCleanUp ->
ClassCleanUp
The output after running all the tests in the class would be:
Class Setup
Test Init
Test1
TestCleanUp
Test Init
Test2
TestCleanUp
Test Init
Test3
TestCleanUp
ClassCleanUp


How machines in the test rig communicate
The Visio diagrams below show which ports are used during setup and when the agent and
controller run tests.

Controller-Agent Communications

And here are the connections used during agent setup:


Controller-Agent Communications

Changing the Default Port for Agent-Controller Communication
The default port for communication is 6910. To change this, see the following post:
http://blogs.msdn.com/billbar/archive/2007/07/31/configuring-a-non-default-port-number-for-the-vsteam-test-controller.aspx

How to Add Agents To A Test Rig
When you uninstall the controller software and reinstall it, the local user group that contains the agent
accounts used to connect is reset. You must repopulate the group with the appropriate users. From
Start -> Run, type in “lusrmgr.msc” and then expand the Groups items and open the
“TeamTestAgentService” group. Add the user account(s) used when setting up your agents.
Next, open VSTS and open up the Test Rig Management dialog (Test -> Administer Test Controllers) and
add each agent back to the list.
Or if you have VS 2010, you can go to each agent and re-run the config tool, which will automatically add
the agent back to the controller.

Items new to VS 2010
“Find” feature now available in Webtest playback UI
In VS 2010, you can now directly search for values in the playback window of the UI. With the playback
window active, press Ctrl-F to open the “find” dialog box. You then type in the phrase to search for. You
can also choose whether to look in the request, the response, the headers, all text, etc. You can further
refine the search by limiting to the currently highlighted request.

You can also right-click on a form post or query string parameter in the request tab to start a search.


“Go To Web Test” feature now available in Webtest playback UI
In VS 2010, you can now highlight a specific value shown in the playback window, right-click, and choose
“Go to web test”. This will open the web test window itself and highlight the item whose value you
chose. The feature works on the specific request currently highlighted, so if you have several requests
with the same parameter name, you will be directed to the request that directly corresponds to the
request you were looking at in the playback window.


Recorder Log Available
In VS 2010, as you record a new Web test the recorded requests are saved to a Web test log file. Any
time you are in a new playback screen for this Web test, you can click on the Recorded Result menu bar
command to open the recorded requests and responses. (NOTE: if you upgrade a project from 2008 or
if you manually delete the original playback file, the button will be grayed out).
The recording will have the same name appended with “[Recorded]”. This gives you the ability to see
the requests the browser made and the responses during recording, and compare them to what the
web test is sending and receiving. You can also search the recording for specific values that were
recorded.


Add extraction rule directly from the playback UI
In the playback window, you can highlight any static value from a response that you wish to extract for
use in future requests. Simply highlight the value, right click, and choose Add Extraction Rule. It will
automatically name the rule, name the parameter and add the rule to the right request in the test. You
will still have to go to the subsequent request(s) where you want to use the parameter and add the
parameter to the request. If the value is found in the Web test, you will also be prompted to do a search
and replace of the value with the context parameter binding.
Tip: if this value changes each time the test is run, the value from the result viewer will not be in the
editor. So rather than adding the extraction rule from the test result, add it from the recorder log
instead (since this will have the recorded value, which will also be in the Web test).


New “Reporting Name” property for web requests
Web requests now have a new property exposed called “Reporting Name.” This property allows you to
define any string to use in test results instead of the actual request URL. This is very handy for requests
with very long URLs or tests where there are several requests to the exact same URL. In the following
Web test, most requests are to the same URL, but the results are changed to show the “Reporting
Name” values set.

A request without any
reporting name defined.


Load test results tables now differentiate between GET and POST requests
If the webtest in the previous section (“Reporting Name Property”) is executed in a load test, there are
two features you can see in the results.
1) Any Reporting Names you used will show up in the results table.
2) Any requests with the same name but with different methods will be reported separately.

The call from above with a reporting name.

The calls from above without a reporting name. Even though they are the same requests, some use a
GET method and some use a POST method.


Virtual user visualization now available
NOTE: This feature is only available on tests where the “Timing Details Storage” property for the Run
Settings is set to “All Individual Details”

How to view activity visualization
In VSTS 2010, you can view a map of the virtual users' activity AFTER a test run completes by clicking on
the “Details” button in the results window.


What is shown in the visualization window
The detail type combo box offers 3 choices:
1) Test
2) Transaction
3) Page

The view shows users in relation to each other (Y-axis) and the duration of a single instance of each
user's measured activity (X-axis). For complete details on this, see the entry “New users versus One
Time users”.
Use the “Zoom to time” slider to control how much of the test details you wish to see. Hover the
mouse pointer over an instance to get a popup of the info about that instance.


More Information
Here are the table definitions from the LoadTest2010 Results Store:
For the LoadTestTestDetail table, the big differences are that you get the outcome of the tests, which
virtual user executed it, and the end time of the test.
[LoadTestRunId] [int] NOT NULL ,
[TestDetailId] [int] NOT NULL ,
[TimeStamp] [datetime] NOT NULL ,
[TestCaseId] [int] NOT NULL ,
[ElapsedTime] [float] NOT NULL,
[AgentId] [int] NOT NULL,
[BrowserId] [int],
[NetworkId] [int],
New to 2010
[Outcome] [tinyint],
[TestLogId] [int] NULL,
[UserId] [int] NULL,
[EndTime] [datetime] NULL,
[InMeasurementInterval] [bit] NULL

For the LoadTestPageDetail table, you now get the end time of the page as well as the outcome of the
page.
[LoadTestRunId] [int] NOT NULL ,
[PageDetailId] [int] NOT NULL ,
[TestDetailId] [int] NOT NULL ,
[TimeStamp] [datetime] NOT NULL ,
[PageId] [int] NOT NULL ,
[ResponseTime] [float] NOT NULL,
[ResponseTimeGoal] [float] NOT NULL,
[GoalExceeded] [bit] NOT NULL,
New to 2010
[EndTime] [datetime] NULL,
[Outcome] [tinyint] NULL,
[InMeasurementInterval] [bit] NULL


For the LoadTestTransactionDetail table, the big changes are that you get the response time of the
transaction and the end time. Statistics for transactions such as Min, Max, Avg, Mean, StdDev, 90%,
95% and 99% are calculated. These statistics are based on the ResponseTime column, not the
ElapsedTime. The difference between the two is that elapsed time includes think time whereas
response time does not.
[LoadTestRunId] [int] NOT NULL ,
[TransactionDetailId] [int] NOT NULL ,
[TestDetailId] [int] NOT NULL ,
[TimeStamp] [datetime] NOT NULL ,
[TransactionId] [int] NOT NULL ,
[ElapsedTime] [float] NOT NULL,
New to 2010
[EndTime] [datetime] NULL,
[InMeasurementInterval] [bit] NULL,
[ResponseTime] [float] NULL
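As a sketch of running your own analysis against these tables, the following assumes a results store database named LoadTest2010 on a local SQL Express instance and a known LoadTestRunId (both assumptions; adjust for your environment):

```csharp
using System;
using System.Data.SqlClient;

class TransactionStatsSketch
{
    static void Main()
    {
        // Connection string is an assumption; point it at your results store.
        const string conn = @"Data Source=.\SQLEXPRESS;Initial Catalog=LoadTest2010;Integrated Security=true";
        const string sql =
            @"SELECT TransactionId, AVG(ResponseTime) AS AvgResponse, MAX(ResponseTime) AS MaxResponse
              FROM LoadTestTransactionDetail
              WHERE LoadTestRunId = @runId AND InMeasurementInterval = 1
              GROUP BY TransactionId";

        using (SqlConnection c = new SqlConnection(conn))
        using (SqlCommand cmd = new SqlCommand(sql, c))
        {
            cmd.Parameters.AddWithValue("@runId", 42);  // pick a run id from your store
            c.Open();
            using (SqlDataReader r = cmd.ExecuteReader())
                while (r.Read())
                    Console.WriteLine("{0}: avg {1:F3}s max {2:F3}s", r[0], r[1], r[2]);
        }
    }
}
```

Note the query groups on ResponseTime (not ElapsedTime), matching how the built-in transaction statistics are computed.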

Another change in VS 2010 is that the default for whether or not to collect details has changed. In VS
2005 and VS 2008 the default was to not collect this detail data. In VS 2010, the default is to collect the
detail data. This is controlled by the Timing Details Storage property on the Run Settings node in a load
test.
So you can still run your own analysis on this data, but there is also a new view in VS that you can use to
get a look at the data. The view is the Virtual User Activity Chart. When a load test completes, there will
be a new button enabled on the load test execution toolbar. It is the detail button below:

When you click on this button you will be brought to the Virtual User Activity Chart. It looks like the
following:


Here is what you are looking at: each horizontal row represents a virtual user, and each line in a
horizontal row represents a test, page or transaction. If you look at the top left of this view, you will
see a combo box that shows which type of detail you are looking at; in my case this is showing pages.
Each color represents a different page in the test, and the length of the line represents the duration of
the page, so you can quickly tell which pages are running long.
If you look at the bottom of the chart, you will see a zoom bar. The zoom bar allows you to change the
range that you are looking at. The zoom bar overlays one of the graphs from the graph view. So
whichever graph is selected in the graph view, you will see that on the zoom bar. This makes it very
easy to correlate spikes in a graph with what tests/pages/transactions are occurring during that spike.
The legend on the left also has some filtering and highlight options. If you uncheck a page, then all
instances of that page are removed from the chart. If you click to Highlight Errors, then all pages that
failed will have their color changed to red. If you look at bottom part of the legend, you will see all the
errors that occurred during the test. You can choose to remove pages with certain errors or remove all
successful pages so you only see errors.
There is one other very useful feature of this view. You can hover over any line to get more information
about the detail and possibly drill into the tests that the detail belongs to. For example this is what it
looks like when you hover a detail:


You see information about the user, scenario, test, URL, outcome, etc. For this detail, there is also a
test log link. If you click this, you will see the actual test that the page was a part of. For example,
when I click test log, I see the following:

You see the full set of details collected for the test in the usual web test playback view that you are
used to. If it was a unit test, you would see the unit test viewer instead.


New Excel reporting features built into load test results
There are two new features for reporting through Excel built into the load test results window
1) Load Testing Run Comparison Report
http://blogs.msdn.com/slumley/archive/2009/11/07/vsts-2010-feature-load-testing-run-comparisonreport-in-excel.aspx
2) Load Test Trend Report
http://blogs.msdn.com/slumley/archive/2009/05/22/dev10-feature-load-test-excel-reportintegration.aspx


New Load Test and Load Test Rig Licensing and configurations
This information was taken straight from a blog post by Ed Glas
(http://blogs.msdn.com/edglas/archive/2010/02/07/configuration-options-for-load-testing-with-visualstudio-2010.aspx)
Using Visual Studio Ultimate enables you to generate 250 virtual users of load. To go higher than 250
users, you need to purchase a Virtual User Pack, which gives you 1000 users. You can use the 1000 users
on any number of agents. Note that if you install the Virtual User Pack on the same machine as Visual
Studio Ultimate, you do not get 1250 users on the controller; the 250 virtual users you get with Ultimate
can only be used on “local” runs, not on a Test Controller. If you need to generate more than 1000 users,
you purchase additional Virtual User Packs, which aggregate on the Test Controller. In other words,
installing 2 Virtual User Packs on one controller gives you 2000 Virtual Users, which can be run on any
number of agents.
Configuration 1: “Local” Load Generation

This is what you get when you install Visual Studio Ultimate, which is the ability to generate
load “locally” using the test host process on the same machine that VS is running on. In addition
to limiting load to 250 users, it is also limited to one core on the client CPU.
Note that purchasing Ultimate also gives you the ability to collect ASP.NET profiler traces by
using a Test Agent as a data collector on the Web server.


Configuration 2: Distributed Test Controller and Test Agents

This is a common configuration if you are scaling out your load agents. With this configuration,
the Test Controller and each Test Agent is on a separate machine.
The advantage of this configuration is the controller is easily shared by team members, and
overhead from the controller does not interfere with load generation or operation of the client.
Note the Test Controller must have one or more Virtual User Packs installed to enable load
testing. Load agents in this configuration always use all cores on the machine.


Configuration 3 A and B: Stacked Configuration

With configuration A, you install the Test Controller and Test Agent on the same machine as VS,
then configure the Test Controller with Virtual User Packs. This enables you to generate >250
virtual users from the client machine, and unlocks all cores in the processor. Configuration B
shows an alternative configuration, enabled if you configure the machine with Virtual User
Packs using the VSTestConfig command line.
Note that a Virtual User Pack can only be used on one machine at a time, and configuring it on a
machine ties it to that machine for 90 days. So you can’t have the same Virtual User Pack
installed on both the VS client and a separate machine running the Test Controller. See the
Virtual User Pack license for details.


Configuration 4: Stacked Controller, Distributed Agents

In this configuration, the controller is running on the same machine as the Test client, with
distributed agents running as load generators. This configuration is recommended if you have a
solo performance tester. If your test controller and test agents will be shared by a team, we
recommend running the controller on a separate box. Note that test agents are tied to a single test
controller. You can’t have two test controllers controlling the same agent.

If you are using Visual Studio 2008, these options should look familiar to you as the VS 2008
load agents and controller offered the same configuration options. The new twist with VS 2010 is
the Virtual User Packs, which offer you more flexibility in how you configure your load agents.

The Test Controller and Test Agent are “free” when you purchase Ultimate.


New test mix: “Sequential Test Mix”

It is not recommended to use ordered tests in a load test. In the load test results, you do not get the
pass/fail results, test timings or transaction timings for any of the inner tests. You just get a Pass/Fail
result and duration for the overall ordered test.
To address this issue, there is a new test mix type in VS2010 called Sequential Test Mix. Here is what it
looks like in the load test wizard:

For this mix type, you set the order of tests that each virtual user will run through. You can mix web and
unit tests in the mix and you will get the individual test, page and transaction results. When a virtual
user completes the last test in the mix, it will cycle back to the first test in the mix and start over.


If you just want to control the order of web tests, you could also use a main web test that calls all of the
tests in order as “nested tests”. This is called “Web Test Composition.” For example, suppose I have
WebTest1 and WebTest2 and I want 1 to run before 2. I would create a third web test that has no
requests, but references tests 1 and 2. To create this kind of test, first record web tests 1 and 2. Then
add a third web test and just hit stop in the web test recorder. When you are back in the web test
editor, right click on the root node and select “Add Call to Web Test...”

This will launch a dialog; select WebTest1. Then do the same steps and add WebTest2. Now just
run WebTest3 and you will execute both tests. Web test composition has been available since VS 2008.
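A rough coded-test equivalent of this composition is sketched below, assuming WebTest1 and WebTest2 are coded Web tests in the same project. The declarative "Add Call to Web Test..." is the supported route; this manual enumeration is only an illustration and ignores per-request context and validation wiring:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// WebTest3 has no requests of its own; it simply replays WebTest1's
// requests followed by WebTest2's.
public class WebTest3 : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        foreach (WebTest inner in new WebTest[] { new WebTest1(), new WebTest2() })
        {
            IEnumerator<WebTestRequest> e = inner.GetRequestEnumerator();
            while (e.MoveNext())
                yield return e.Current;   // issue each inner request in order
        }
    }
}
```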


Query String and FORM POST URLs get parameterized

When you choose to parameterize the web servers in a web test, you may see more web servers listed
than your test actually calls. This is expected behavior: the parameter parser is finding websites that
reside inside query strings, and you can see this in the .webtest file. The “Parameterize Web Servers”
button causes VSTS to detect URLs and create parameters for them, so a web test with only ONE
request can still end up with four detected web servers. The rules are:

 Any Query String that has a URL gets added to the server list.
 Any Form Post parameter that has a URL gets added to the server list.
 NO added header value makes it into the list.
 If the form post or query parameter NAME is a URL (not the value, but the name of the
parameter), it does NOT get added.


New options on Load Test Scenarios
There are some new properties exposed for load test scenarios that make it easier to control how your
tests run.

Agents to Use
The agent names entered should be the simple computer names (as shown in the “Computer Name” field
in the Control Panel) of agents that are connected to the controller to which the load test will be
submitted. Unfortunately, at this time, if you switch to submitting the load test to a different controller,
you will need to change the value of “Agents to Use”; there is no way to parameterize this list to vary
depending on the controller used. The list designates a subset of the agents that are connected to the
controller, that are in the Ready state when the load test starts (an agent may be running a different load
test or other test run while this load test is queued, as long as it becomes Ready when the load test is
taken out of the Pending state and starts running), and that meet any agent selection criteria for the test
run. The Scenario will run on all agents in the list that meet these criteria, and the user load for the
Scenario will be distributed among those agents either evenly (the default) or according to any agent
weightings specified in the Agent properties for the agents (from the “Administer Test Controllers” dialog
in Visual Studio).

Delay Start Time
Amount of time to wait after the load test starts before starting any tests in this scenario.
Disable During Warmup
If true, the delay time does not begin until after warmup completes.


Loops and Conditionals
In Visual Studio 2008, if you wanted to conditionally execute some requests, or loop through a
series of requests a given number of times, you had to convert a declarative web test to a coded
web test. In VS 2010, these options are exposed directly in declarative web tests.
The ability to add these is exposed by right-clicking on a request and selecting the option you
want from the context menu:

The context menu showing the loop and condition insert options

Sample dialog box for setting the properties of a loop


What the entries look like in the declarative test

Loop results when the test is played back

What results look like if a conditional call fails

What the results look like if a conditional call succeeds.


Configurations and Settings
How to Change the Location Where Agents Store Run Files
If you need to move the location that an agent uses to store the files downloaded to it for executing
tests, the following steps will take care of this. On each agent machine:

1. Open QTAgentService.exe.config.
2. Add a setting with the new location under the appSettings node.
3. Create the folder at the new location.
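As a sketch, the edited QTAgentService.exe.config might look like the following. The key name
WorkingDirectory and the sample path are assumptions here; verify the exact key name against the
documentation for your agent version:

```xml
<configuration>
  <appSettings>
    <!-- Assumption: the agent reads its run-file location from a
         "WorkingDirectory" key; confirm the key name for your version. -->
    <add key="WorkingDirectory" value="D:\AgentRunFiles" />
  </appSettings>
</configuration>
```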

How to set a proxy server for web tests
By default, there is no proxy set on a web test, so it doesn’t matter what the Internet Explorer® (“IE”)
proxy settings are. If your test sets a specific proxy server within the web test then the IE setting is still
not used. In coded web tests or web test plug-ins, you can set the proxy name using the WebProxy
property of the WebTest class. NOTE that this method is broken in Visual Studio Team Test (“VSTT”)
2008 RTM, but is fixed in SP1 for VSTT 2008.
If you wish to use the machine’s IE proxy settings then you can set the Proxy property to “default”
(without the quotes). In this case you should turn off Automatic Proxy Detection on each agent.
Automatic Proxy detection is very slow and can greatly impact the amount of load you can drive on an
agent.

How to configure Web Tests so Fiddler can capture playback info
Changed in 2010

In 2008
By default, web test playback ignores proxy servers set for localhost, so enabling a proxy for 127.0.0.1
(which is where Fiddler captures) will not result in any captured data. To make this work, either add a
plugin with the following code, or put the following code in the Class constructor for a coded web test:
this.Proxy = "http://localhost:8888";
WebProxy webProxy = (WebProxy)this.WebProxy;
webProxy.BypassProxyOnLocal = false;
In 2010
To get Fiddler to work in VS 2010, simply open Fiddler, then start playing the web test. There is no need
to add any code.


Controlling the amount of memory that the SQL Server Results machine consumes
By default, SQL Server consumes as much memory as it thinks it can, and the workload on the machine
may prevent SQL Server from correctly identifying memory pressure and giving memory back. You can
configure SQL Server with a maximum memory limit, which should be fine if all you are doing is
inserting results.
The following sets the limit to 512 MB. The right size will vary based on the machine, the testing, and
how much memory you have.
sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'max server memory', 512
RECONFIGURE
GO

How to configure the timeouts for deployment of load tests to agents
The file to change is “Microsoft Visual Studio 9.0\Xml\Schemas\vstst.xsd”. Look for the run config
schema, then search for “timeout”. Change the values as needed, and note that the times are in
milliseconds.


How to set the number of Load Test Errors and Error Details saved
Load Test Errors:
You can change the total number of errors stored for a run in the appropriate configuration file
(depending on whether this is for local runs or for test rig runs):
Version  Run Type  File Name                Location
2008     Local     VSTestHost.exe.config    \Microsoft Visual Studio 9.0\Common7\IDE\
2008     Remote    QTController.exe.config  \Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\
2010     Local     DevEnv.exe.config        \Microsoft Visual Studio 10.0\Common7\IDE\
2010     Remote    QTController.exe.config  \Microsoft Visual Studio 10.0\Common7\IDE\

Add a key to the "appSettings" section of the file (add the "appSettings" section if needed) with the
name "LoadTestMaxErrorsPerType" and the desired value.
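As a sketch, the resulting appSettings section might look like the following. The value 2000 is purely
illustrative; use whatever maximum suits your runs:

```xml
<configuration>
  <appSettings>
    <!-- Maximum number of errors of each type stored per load test run.
         The value 2000 is illustrative, not a recommended default. -->
    <add key="LoadTestMaxErrorsPerType" value="2000" />
  </appSettings>
</configuration>
```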




Load Test Error Details:


Multi-proc boxes used as agents should have .NET garbage collection set to server
mode
Changed in 2010

In 2008
To enable your application to use Server GC, you need to modify either the VSTestHost.exe.config or
the QTAgent.exe.config. If you are not using a Controller and Agent setup, then you need to modify the
VSTesthost.exe.config. If you are using a controller and agent, then modify the QTAgent.exe.config for
each agent machine. Open the correct file. The locations are
VSTestHost.exe.config - C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE
QTAgent.exe.config - C:\Program Files\Microsoft Visual Studio 9.0 Team Test Load
Agent\LoadTest

To enable server GC, add the gcServer line in the runtime section:

<configuration>
  <runtime>
    <gcServer enabled="true" />
  </runtime>
</configuration>
In 2010
The agent service in VS 2010 is now set to Server GC by default. No need to take any action here.

Location of list of all agents available to a controller
Changed in 2010

To retrieve a list of agents assigned to a controller without using the VSTS IDE, look in:
In 2008
\Microsoft Visual Studio 9.0 Team Test Load
Agent\LoadTest\QTControllerConfig.xml
In 2010
\Microsoft Visual Studio
10.0\Common7\IDE\QTControllerConfig.xml


Networks, IP Switching, Test Startups
IP Address Switching anatomy (how it works)
Each agent is assigned a range of up to 256 IP addresses to use. At the start of a test run, the agent
service configures the IP addresses on the network card. When the test starts running, new connections
are round-robined through the pool of IP addresses.
The most common use for IP Switching is when load testing against a load balancer. Load balancers
typically use the IP address to route requests to a particular web server in the farm. So if you have 2
agents driving load to 3 web servers, all traffic comes from two IPs (one on each agent), and only
two of the three web servers would get all the traffic. IP Switching provides a way to have traffic come
from multiple IPs on the same agent, enabling the load balancer to balance load across the farm.
VSTT currently limits the number of unique IP addresses to 256 per agent. In most testing situations, this
will be plenty of addresses. The main place where this limitation might impact you is if you are running a
large test where every single user must have a separate IP Address for some sort of session state. This is
pretty unusual.
In VS 2008, there is no way to make a given virtual user always use the same IP. That is, with IP switching
turned on, a given user will use multiple IPs out of the IP pool, and may use different IPs on subsequent
iterations. In VS 2010, the web test engine tries to ensure that the same user will always use the same IP
address, but there is no guarantee that this will be the case.
The biggest problem with assigning unique IP Addresses to every user is that currently the IP switching
configuration limits you to a range of 256 IP addresses per agent, which would mean you would also be
limited to 256 virtual users per agent. One solution is to use VMs to get multiple load test agents on a
single physical machine.
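As a conceptual sketch only (not the actual VSTS implementation), the round-robin behavior described
above works roughly like this:

```python
from itertools import cycle

def assign_connections(ip_pool, num_connections):
    """Round-robin new connections through a pool of source IP addresses."""
    pool = cycle(ip_pool)
    return [next(pool) for _ in range(num_connections)]

# One agent configured with a range of 3 addresses (illustrative values):
agent_ips = ["192.168.10.1", "192.168.10.2", "192.168.10.3"]
sources = assign_connections(agent_ips, 7)
# Connections cycle through the pool, so a load balancer that routes by
# source IP sees traffic from all 3 addresses, not just one per agent.
```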

Gotcha: IP Address Switching is ONLY for WEB TESTS
The IP Switching feature will NOT work with Unit Tests

Gotcha: IP Addresses used for switching are not permanent
When you choose to use multiple IP addresses from each agent machine during load testing (known as
IP address switching or spoofing), most testing tools require you to add those IP addresses to the NIC of
the machine, where they are always available and always show up on the machine. VSTS instead allows
you to set a range of IP addresses directly in the test project. VSTS then dynamically adds the addresses
to the agent(s) when the test run starts, and removes them when the test run stops. If you need to
perform IP switching, a controller/agent setup is required.


How to Setup IP Switching
There are 2 parts to setting up IP Switching. First, you must configure the Test Rig Agents to use IP
Switching. Then you must tell the Load Test itself that it should take advantage of that. Here are the
steps and the pitfalls involved:
Setting up the agents
1. Open up the Test Rig Administration dialog (Test -> Administer Test Controller)
2. Highlight each of the agents and bring up the Properties for the agent
3. Fill out all of the appropriate information (as outlined in the picture below)

Where to configure Agent Properties


Make sure you pick the correct adapter here. Use the Network Connections properties built into
Windows along with the IPCONFIG command to see which NIC is assigned to what subnet (see below).

The base address is 3 octets and should be representative of the subnet you are on. If you are using a
class B subnet, you still need a third octet for the base.

Getting the proper IP Address info for spoofing

The output from the IPCONFIG command in a CMD window:

C:\Documents and Settings>ipconfig

Windows IP Configuration

Ethernet adapter Secondary:

   Connection-specific DNS Suffix  . :
   IP Address. . . . . . . . . . . . : 10.69.200.3
   Subnet Mask . . . . . . . . . . . : 255.255.0.0
   Default Gateway . . . . . . . . . : 10.69.0.1

Ethernet adapter Primary:

   Connection-specific DNS Suffix  . :
   IP Address. . . . . . . . . . . . : 10.99.3.3
   Subnet Mask . . . . . . . . . . . : 255.255.0.0
   Default Gateway . . . . . . . . . : 10.99.0.1

The information as shown in the Network Connections dialog box in Windows. You may need to hover
the mouse over the NIC to see the entire name of the NIC.

Setting up The Load Test
Once the test rig is set up, you can configure which Load Test will actually use IP Switching by setting the
correct property for the Load Test:

Where to enable IP Switching for the Load Test Itself (after configuring the agents to use it)


Troubleshooting invalid view state and failed event validation
ASP.NET uses __VIEWSTATE and __EVENTVALIDATION hidden fields to round-trip information across
HTTP requests. The values for these fields are generated on the server and should be posted unchanged
on a post back request. By default, these values are signed with a so-called validationKey to prevent
tampering with the values on the client.
If you just record the values in a web test and post the recorded values, you can run into ASP.NET error
messages about invalid view state or failed event validation. The Visual Studio web test recorder will
normally automatically detect the __VIEWSTATE and __EVENTVALIDATION hidden fields as dynamic
parameters. This means the dynamically extracted values will be posted back instead of the recorded
values.
However, if the web server is load balanced and part of a web farm you may still run into invalid view
state and failed event validation errors. This occurs when not all servers in the web farm use the same
validationKey and the post back request is routed to a different server in the farm than the one on
which the page was rendered.
To troubleshoot, ViewState MAC checking can be disabled by setting enableViewStateMac to false.
However, this is not suitable for use on a production environment because it disables an important
security feature and has performance implications. The recommended fix is to define the same value for
the validationKey on all machines.
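As a sketch, the fix implies a web.config fragment of the following kind on every machine in the farm.
The key values shown are placeholders, not usable keys; generate your own per the articles referenced
below:

```xml
<configuration>
  <system.web>
    <!-- Placeholder values: generate real keys and use the SAME values
         on every server in the farm. -->
    <machineKey
      validationKey="[your generated validation key]"
      decryptionKey="[your generated decryption key]"
      validation="SHA1" />
  </system.web>
</configuration>
```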
Instructions for manually creating a validationKey are detailed at
http://msdn.microsoft.com/en-us/library/ms998288.aspx. For IIS 7, a machine key can easily be created
through IIS Manager; see http://technet.microsoft.com/en-us/library/cc772287(WS.10).aspx.
For more background information on ViewState and EventValidation go to
http://msdn.microsoft.com/en-us/magazine/cc163512.aspx.

Startup: Slowness Restarting a Test Rig with Agents Marked as “Offline”
If you have agent machines that are either disabled (powered off, service stopped, etc) or that no longer
exist, but you only mark them as “Offline” in the “Administer Test Controllers” dialog, restarting the rig
will take a long time. The controller will attempt to contact all agents listed in the dialog regardless of
their status, and it will take approximately one minute or more for each missing machine.


Startup: Multiple Network Cards can cause tests in a rig to not start
Problem: When running tests against a controller and test agents the tests start with pending state but
then nothing else happens.
Visual Studio Client Resolution: The problem is that you have two network adapters on the client
machine. The following entries in the controller log confirm that this is the problem:
[I, 2972, 11, 2008/06/26 13:02:59.780] QTController.exe: ControllerExecution: Calling
back to client for deployment settings.
[E, 2972, 11, 2008/06/26 13:06:51.155] QTController.exe: StateMachine(RunState):
Exception while handling state Deploying: System.Net.Sockets.SocketException: A
connection attempt failed because the connected party did not properly respond after a
period of time, or established connection failed because connected host has failed to
respond 65.52.230.25:15533

This is exactly the type of error message we see when the controller communication with Visual Studio
fails because the client has multiple network cards. To configure your Visual Studio installation to
communicate with the controller, try this:
In regedit:

1. Find the key:
   HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools
2. Add a new key under the above key named “ListenPortRange”
3. In the key “ListenPortRange”, add a new string value with the name “BindTo” and the IPv4
   address for the client (65.52.230.25 in this example) as the BindTo value.
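The same change can be expressed as a .reg file (the IP address is taken from the example above;
substitute your own client address):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\ListenPortRange]
"BindTo"="65.52.230.25"
```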

Test Rig Resolution:
Read the following support article for the steps to resolve this issue on a test rig:
http://support.microsoft.com/kb/944496

Startup: Slow startup can be caused by _NT_SYMBOL_PATH environment variable
If you have the environment variable _NT_SYMBOL_PATH defined on your systems, your tests may stay
in the “pending” state for a long time. This happens whenever the symbol path defines a symbol server
that is external to your environment and you do not have a local cache of symbols available. To work
around this, do one of the following:
1. Remove _NT_SYMBOL_PATH from the environment where you start devenv.exe.
2. Change _NT_SYMBOL_PATH by putting a cache location in front of the symbol store location.
For more information about symbol paths and symbol servers, go to:
http://msdn.microsoft.com/en-us/library/ms681416(VS.85).aspx


Startup: Slow-starting tests on a Rig with Agents on a Slow Link
The load test does not actually start on any agents until deployment of all files has occurred to all agents
(this also means that the slow start of a load test on a rig with many agents could be caused by slow
deployment to one or more agents).
A common root cause is the _NT_SYMBOL_PATH variable defined in the environment pointing to a
somewhat slow symbol server (like \\symbols\symbols).
Try one of these workarounds:

- Undefine _NT_SYMBOL_PATH in the environment where you start devenv.exe.
- Change _NT_SYMBOL_PATH by putting a cache in front, such as cache*c:\symcache. This will
  make the first run just as slow, but all subsequent runs fast.

“Not Bound” Exception when using IP Switching is not really an error
New to 2010

The error below may appear several times when running a load test that uses IP Switching. In
most cases, it can be ignored.

00:51:35  AGENT02  Exception  LoadTestException  “151 Web test requests were not bound to either
the correct IP address for IP switching, or the correct port number for network emulation, or both.”

The one situation where the presence of this error may indicate a real issue with the test is when the
application is relying on a given iteration to always come through on the same IP address for purposes of
maintaining a session (such as a load balancer like Microsoft ISA Server with the IP Sticky setting turned
on).


How to configure the timeout for deployment of load tests to agents
Changed in 2010

You might encounter timeouts when deploying load tests to agents when the deployment contains
many or large files. In that case you can increase the timeout for deployment. The default value is 300
seconds.
In 2010
You have to change the .testsettings file that corresponds to your active test settings in Visual Studio,
because the deployment timeout setting is not exposed via the Visual Studio UI. Check via the menu
Test | Select Active Test Settings (Visual Studio 2010) which file is active. You can find the file in the
Solution Items folder of your solution. Open it in the XML editor, by right clicking it, choosing “Open
With…” and selecting “XML (Text) Editor”.
The TestSettings element will have an Execution element. Add a child element called Timeouts, if not
already present, to the Execution element. Give it a deploymentTimeout attribute with the desired
timeout value in milliseconds. For example:

<TestSettings (…)>
  <Execution>
    <Timeouts deploymentTimeout="600000" />
  </Execution>
</TestSettings>
IntelliSense should help you out when adding/editing this.
In 2008
In 2008 you have to change the .testrunconfig file that corresponds to your active test run configuration,
Add a child element Timeouts under the TestRunConfiguration element if no such element is already
present. Check via the menu Test | Select Active Test Run Configuration which file is active. You can find
the file in the Solution Items folder of your solution. Give it a deploymentTimeout attribute with the
desired timeout value in milliseconds. For example:

<TestRunConfiguration (…)>
  <Timeouts deploymentTimeout="600000" />
</TestRunConfiguration>
IntelliSense should help you out when adding/editing this.


Performance Counters and Data
Customizing the Available Microsoft System Monitor counter sets
The counter set templates for VSTS are located in the following directory (assuming a typical install):
In 2008
C:\Program Files\Microsoft Visual Studio
9.0\Common7\IDE\Templates\LoadTest\CounterSets
In 2010
C:\Program Files\Microsoft Visual Studio
10.0\Common7\IDE\Templates\LoadTest\CounterSets

These files are standard XML files and can be modified to allow for quick and easy re-use of custom sets.
It is recommended that you copy the counter set you wish to enhance and add the name CUSTOM to it
so you will always remember that it is a custom counter set. Or you can create your own totally
independent counter set. The following shows the layout of the file:






















Key points called out in the annotated file layout:

- The full counter definition needs to be on one line. Make sure you format it properly when
  putting it in the final file.
- New to 2010: Range specifies the graph range.
- New to 2010: HigherIsBetter is used for highlighting better or worse results in the Excel reports.
- New to 2010: RangeGroup uses a common range for all counters in that range group.

Performance Counter Considerations on Rigs with slow links
Having a slow WAN between the controller and agents may definitely cause some timeouts or delays in
performance counter collection. Each performance counter category is read in a separate operation:
that’s one method call at the level of the .NET classes that we call, and I don’t know if each call results in
just one or more than one network read.
There are some timeout settings for performance counter collection that you can change by editing the
QTController.exe.config file (or the VSTestHost.exe.config file when running locally on VSTS 2008, or
devenv.exe.config for 2010) and adding these lines:

<appSettings>
  <add key="LoadTestCounterCategoryReadTimeout" value="9000" />
  <add key="LoadTestCounterCategoryExistsTimeout" value="9000" />
</appSettings>
The values are in milliseconds, so 9000 is 9 seconds. If you make this change, also change the load test
sample rate to be larger than this: at least 10, or preferably 15, seconds. With many agents located far
from the controller, it is also recommended to delete most of the categories in the Agent counter set
(perhaps just leave Processor and Memory).
The .NET API that is used to read the performance counters is
PerformanceCounterCategory.ReadCategory(), so the entire category is read even if the counter set
definition only includes one counter and one instance. This is a limitation at the OS level in the way
performance counters are read.
The defaults in VSTS are:

- LoadTestCounterCategoryReadTimeout: 2000 ms (2 seconds)
- LoadTestCounterCategoryExistsTimeout: 10000 ms (10 seconds)


Increase the performance counter sampling interval for longer tests
Choose an appropriate value for the “Sample Rate” property in the Load Test Run Settings based on the
length of your load test. A smaller sample rate, such as the default value of five seconds, requires more
space in the load test results database. For longer load tests, increasing the sample rate reduces the
amount of data collected.
Here are some guidelines for sample rates:
Load Test Duration    Recommended Sample Rate
< 1 Hour              5 seconds
1 - 8 Hours           15 seconds
8 - 24 Hours          30 seconds
> 24 Hours            60 seconds
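Back-of-the-envelope arithmetic shows why the sample rate matters for repository size (a rough sketch;
actual storage per sample varies with the number of counter instances collected):

```python
def samples_per_counter(duration_hours, sample_rate_seconds):
    """Number of samples stored per performance counter instance."""
    return duration_hours * 3600 // sample_rate_seconds

# An 8-hour run at the default 5-second rate...
at_default = samples_per_counter(8, 5)       # 5760 samples per instance
# ...versus the recommended 15-second rate for 1-8 hour tests:
at_recommended = samples_per_counter(8, 15)  # 1920 samples per instance
# Tripling the sample rate cuts stored rows by a factor of 3,
# multiplied across every counter instance being collected.
```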

Changing the default counters shown in the graphs during testing
If you want to change the default set of counters that show up in the graphs when you start a test, you
can go into each of the .counterset XML files (in the same directory as above) and set or add to the
DefaultCounter entries in the section at the bottom of the files.
Possible method for fixing “missing perfmon counters” issues
On the controller machine for your rig, map a drive to each of the machines you will be collecting perf
counters for within the load test. Then, before you kick off a test, open each drive you mapped and
verify that you have connectivity. Leave the window open during the test.


How and where Performance data gets collected
There are two types of data collected by VSTS during a test run: real perfmon counters and pseudo
perfmon counters. All real perfmon counters are collected directly by the VSTS Controller machine.
In the Load Test editor, all of the performance counter categories that start with “LoadTest:” (see the
LoadTest counter set in the load test editor) contain data that is collected on the agents by the load test
runtime engine. These are not real Perfmon counters in the sense that if you try to look at them with
Perfmon you won’t see them, though we make them look like Perfmon counters for consistency in the
load test results database and display. The agents send some of this data (see below) in messages to
the controller every 5 seconds, and the controller rolls up the agent data (e.g. Requests/sec across the
entire rig rather than per agent). The controller returns the rolled-up results to Visual Studio for display
during the run and also stores them in the load test results database.
[Requests Per Second Counters] The VS RPS does not count cached requests, even though VSTS is
sending an http GET with if-modified-since headers.
What data is sent every 5 seconds? We do everything possible to limit how much data is sent back in
that message. What we do send back is the average, min, and max values for all of the pseudo
performance counters in the categories that start with “LoadTest:” that you see under the “Overall”,
“Scenarios” and “Errors” nodes in the load test analyzer tree (nothing under the “Machines” node).
Note that the biggest factor in the size of these result messages is the number of performance counter
instances, which for web tests is mostly determined by the number of unique URLs reported on during
the load test. We also send back errors in these 5-second messages, but details about the failed
requests are not sent until the end of the test, so tests with lots of errors will have bigger messages.
Lastly, we only send back metadata such as the category names and counter names once, and use
numeric identifiers in subsequent messages, so the messages at the start of the load test may be slightly
larger than later messages.
One thing you could do to reduce the size of the messages is to reduce the level of reporting on
dependent requests. You can do this by setting the “RecordResult” property of the
WebTestRequest object to false. This eliminates the page- and request-level reporting for that request,
but you could add a transaction around that single request, and the transaction time would closely
match the page time for that request.


Data and Results
Custom Data Binding in UNIT Tests
The first thing to do is create a custom class that does the data initialization (as described in the first
part of this post: http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx). Next,
instantiate the class inside your unit test as follows:
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class VSTSClass1
{
    private TestContext testContextInstance;

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        string m_ConnectionString = @"Provider=SQLOLEDB.1;Data Source=dbserver;Integrated Security=SSPI;Initial Catalog=Northwind";
        CustomDs.Instance.Initialize(m_ConnectionString);
    }

    [TestMethod]
    public void Test1()
    {
        Dictionary<string, string> dictionary = CustomDs.Instance.GetNextRow();
        //......Add the rest of your code here.
    }
}

Verifying saved results when a test hangs in the “In Progress” state after the test has
finished
If you run a test and either the test duration or the number of iterations needed for completion of the
test have been reached, but the test stays in the “In Progress” state for a long time, you can check if all
of the results have been written to the load test results repository by running this SQL query against the
LoadTest database:
select LoadTestName, LoadTestRunId, StartTime, EndTime from
LoadTestRun where LoadTestRunId=(select max(LoadTestRunId) from
LoadTestRun);
If the EndTime has a non-NULL value then the controller is done writing results to the load test results
database and it should be safe to restart the rig (killing anything if needed).
This doesn’t necessarily mean that all results from all agents (if the agents got hung) were successfully
written to the load test database, but it does mean that there’s no point in waiting before killing the
agents/tests.

The metrics during and after a test differ from the results seen.
Scenario 1:
When you run load tests and look at the numbers you get while the tests are running, the values you see
may not be the same values that you get when you load the completed test results at a later point. This
behavior is not unexpected, based on warmup and cooldown settings.

Comparison of a test with and without warmup. Notice the total number of tests run is different, but the recorded times are
close enough to be valid for reporting.

Scenario 2:
When you compare the summary page results to the detailed results values, there can be a difference in
what is reported. This is due to the implementation of collecting the timing details, which are currently
flushed when a test iteration ends. For iterations that are in progress with in-flight requests, we give the
iteration 10 seconds (configurable via cooldown) to complete any in-flight requests. If they do not
complete, the transactions in those iterations are not counted in the details, but are counted in the
summary page.


How new users and return users affect caching numbers
Comparing VSTS Results to IIS Results for 100% new vs. 100% return
This section shows how VSTS handles caching and how to interpret the numbers shown for total
requests and cached requests.

From
IIS Logs

TOR 09 - Caching - ReturnUsers
HTM 268
HTML 263
GIF 83
BMP 32719
200 OK - 3871
304 Not Modified - 29462
VSTS Requests: 33,333
VSTS Requests Cached: 84,507

From
IIS Logs

TOR 10 - Caching - NewUsers
HTM 276
HTML 271
GIF 276
BMP 90243
200 OK - 46639
304 Not Modified - 44427
VSTS Requests: 89,384
VSTS Requests Cached: 43,758

Comparing New Users to Return Users (WRT caching):
New users are simulated by “clearing” the cache at the start of
each new iteration, whereas the cache is carried from iteration
to iteration for return users.
This results in many more requests being cached with return
users.
NOTE: The total # of requests made by VSTS is a sum of the two
VSTS values. In other words, “Total Requests” in the IDE does
not include cached requests.

From
IIS Logs

Comparing the same tests using HTML’s Content Expiration setting
TOR 12 - Caching - ReturnUsers - Content Expiration
HTM 270
HTML 264
GIF 85
BMP 3330
200 OK - 3874
304 Not Modified - 75
VSTS Requests: 3,949
VSTS Requests Cached: 84,842

From
IIS Logs

TOR 11 - Caching - NewUsers - Content Expiration
HTM 268
HTML 262
GIF 268
BMP 44622
200 OK - 45286
304 Not Modified - 134
VSTS Requests: 44,742
VSTS Requests Cached: 42,090

These runs look at the impact of content expiration on the overall network and web server activity. (For more information, see the section "Add an Expires or a Cache-Control Header" at http://developer.yahoo.com/performance/rules.html.) Notice that VSTS honors the content expiration; this is actually handled by the underlying System.Net component. However, VSTS still reports the cached file request even though no call went out on the wire. This is expected behavior, since the request was a part of the site. To see how many requests actually went out on the wire, you need to use the IIS logs or a network trace.
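The effect of content expiration shows up directly in these numbers: the reported cache-hit count barely changes, but on-the-wire traffic drops sharply because the cached entries no longer need the conditional (304) revalidation round-trips. Comparing the two return-user runs:

```python
# Return-user runs from the tables above.
without_expiration = {"requests": 33333, "cached": 84507}  # TOR 09
with_expiration = {"requests": 3949, "cached": 84842}      # TOR 12

# Roughly the same number of cache hits are reported either way...
cached_delta = with_expiration["cached"] - without_expiration["cached"]
print(cached_delta)  # 335

# ...but on-the-wire traffic drops by almost 90% once the 304
# revalidation round-trips are eliminated by the Expires header.
reduction = 1 - with_expiration["requests"] / without_expiration["requests"]
print(f"{reduction:.0%}")  # 88%
```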


Notes:
- All four tests above were run for the same duration with the same number of users executing the same test.
- Although the numbers do not match exactly, they are close enough to show the behavior of the tests. The discrepancies are due to a few things, including the cool-down period of the test and possible misalignment of the query used to gather data from the IIS logs.
- The IIS log counts for "200 - OK" and "304 - Not Modified" were gathered using LogParser and the following query:
SELECT sc-status, COUNT(*) AS Total
FROM *.log
WHERE to_timestamp(date, time)
    BETWEEN timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
    AND     timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY sc-status

Data sources for data-driven tests are read only once
When a data-driven test initializes, the data is read from the data source ahead of time and only retrieved once. Therefore there is no need to optimize the connection to the data source.
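The behavior can be pictured with a small sketch (plain Python, not VSTS code): all rows are pulled into memory once at initialization, and every subsequent iteration is served from that in-memory copy, so the connection to the source is never touched again.

```python
import csv
import io

# Hypothetical data source; in VSTS this would be a CSV, XML, or
# database binding attached to the web test.
csv_text = "user,password\nalice,pw1\nbob,pw2\ncarol,pw3\n"

# Read once, up front -- this is the only time the source is accessed.
rows = list(csv.DictReader(io.StringIO(csv_text)))

def row_for_iteration(iteration: int) -> dict:
    """Sequential access wraps around; no further I/O against the source."""
    return rows[iteration % len(rows)]

print(row_for_iteration(0)["user"])  # alice
print(row_for_iteration(4)["user"])  # bob
```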


Consider including Timing Details to collect percentile data
There is a property on the Run Settings in the Load Test Editor named "Timing Details Storage". If Timing Details Storage is enabled, the time to execute each individual test, transaction, and page during the load test is stored in the load test results repository. This allows 90th and 95th percentile data to be shown in the load test analyzer in the Tests, Transactions, and Pages tables. VS 2010 adds 99th percentile and standard deviation statistics, and in VS 2010 this setting is on by default. Consider turning it off for very large load tests: with a many-agent test, processing all the timing details can take up to half the duration of the load test itself. In other words, a 12-hour load test running on 30 agents could take 6 hours to collect and crunch all the data. In VS 2010, the details data is also used to populate the virtual user activity chart.
The amount of space required in the load test results repository to store the Timing Details data may be very large, especially for longer-running load tests. The time to store this data at the end of the load test is also longer, because the data is held on the load test agents until the load test has finished executing, at which point it is written to the repository. For these reasons, Timing Details was disabled by default in earlier versions (as noted above, it is on by default in VS 2010). If sufficient disk space is available in the load test results repository, you may wish to enable Timing Details to get the percentile data. Note that there are two choices for enabling Timing Details in the Run Settings properties: "StatisticsOnly" and "AllIndividualDetails". With either option, all of the individual tests, pages, and transactions are timed, and percentile data is calculated from the individual timing data. The difference is that with the StatisticsOnly option, once the percentile data has been calculated, the individual timing data is deleted from the repository. This reduces the amount of space required in the repository when using Timing Details. However, advanced users may want to process the timing detail data in other ways using SQL tools, in which case the AllIndividualDetails option should be used so that the timing detail data remains available for that processing.
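How percentile figures are derived from the individual timings can be sketched as follows. A nearest-rank calculation is assumed here purely for illustration; the exact method VSTS uses internally is not documented in this guide.

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest sample with at least
    p% of the samples at or below it."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[max(rank - 1, 0)]

# Hypothetical page response times (seconds) collected during a run.
times = [0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 2.5, 3.0]
print(percentile(times, 90))  # 2.5
print(percentile(times, 95))  # 3.0
```

With StatisticsOnly, values like these are computed once and the raw samples are discarded; AllIndividualDetails keeps the raw samples so you can recompute or slice them later.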


Consider enabling SQL Tracing through the Load Test instead of separately
There is a set of properties on the Run Settings in the Load Test Editor that allow the SQL tracing feature
of Microsoft SQL Server to be enabled for the duration of the load test. If enabled, this allows SQL trace
data to be displayed in the load test analyzer on the "SQL Trace" table available in the Tables dropdown.
This is a fairly easy-to-use alternative to starting a separate SQL Profiler session while the load test is
running to diagnose SQL performance problems. To enable this feature, the user running the load test
(or the controller user in the case of a load test run on a rig) must have the SQL privileges needed to
perform SQL tracing, and a directory (usually a share) where the trace file will be written must be
specified. At the completion of the load test, the trace file data is imported into the load test repository
and associated with the load test that was run so that it can be viewed at any later time using the load
test analyzer.

How to collect SQL counters from a non-default SQL instance
If you want to collect performance counters from a SQL Server instance while running a load test, you can do this easily by checking the SQL counter set in the "Manage Counter Sets" dialog in the VSTS load test editor. Doing this includes the default counter set for SQL Server in your load test. The performance counter category names specified in this counter set begin with "SQLServer:", for example "SQLServer:Locks".
However, if you are monitoring a SQL Server instance that is not the default instance, the performance counter categories for that instance have different names. For example, if your SQL Server instance is named "INST_A", then this performance counter category is named "MSSQL$INST_A:Locks". To change the load test to collect these performance counters, the easiest approach is to open the .loadtest file with the XML editor or a text editor and replace all instances of "SQLServer:" with "MSSQL$INST_A:" (substituting your own instance name).
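The search-and-replace can be done with any tool. As a sanity check, here is the substitution applied to a representative fragment; the element and attribute names are illustrative rather than the exact .loadtest schema, and "INST_A" is a stand-in for your instance name:

```python
# Representative counter-category reference from a .loadtest file
# (element/attribute names are illustrative, not the exact schema).
fragment = '<PerformanceCounter CategoryName="SQLServer:Locks" CounterName="Lock Waits/sec" />'

# Replace only the category prefix; the suffix (":Locks") stays intact.
fixed = fragment.replace("SQLServer:", "MSSQL$INST_A:")
print(fixed)
# <PerformanceCounter CategoryName="MSSQL$INST_A:Locks" CounterName="Lock Waits/sec" />
```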

How 90% and 95% response times are calculated
Within the load test results summary page, the percentile values mean that:

- 90% of the total transactions were completed in less than 
