Modifying Test Case Content

After recording a test case, you'll almost always need to customize it before it's ready for load testing. The recording captures exactly what happened during one session, but load testing requires flexibility: handling variations, removing unnecessary elements, and configuring data-driven behavior.

This guide covers the most common customization tasks: file uploads with different files per virtual user, dynamic file downloads with unique URLs, editing JSON and XML content for API testing, changing recorded hostnames and URLs, and removing ad-servers or click-trackers that have no business being in your load test.


Why Customization Matters

A raw recording is rarely ready for load testing. Here's why:

  1. File uploads: By default, Load Tester uploads the same file recorded during the session. For realistic testing, each virtual user should upload different content.

  2. Dynamic file downloads: When file downloads use unique identifiers in the URL path (rather than query parameters), Load Tester can't automatically correlate them, so you need to configure extraction and substitution.

  3. JSON/XML data: Modern applications send structured data (JSON, XML) in HTTP requests. You need to customize these payloads to vary data per user or iteration.

  4. URL and hostname changes: You may need to test against a different environment (staging vs. production) or modify recorded URLs to handle dynamic content.

  5. Third-party services: Ad-servers and click-trackers inflate metrics, trigger fraud detection, and make configuration harder. It's usually best to remove them entirely.


File Uploads: Different Files Per User

When a test case has a file upload, Load Tester automatically handles the upload during replays and load tests. In many tests, however, each virtual user should upload different file contents (and possibly use a different filename). Load Tester streamlines this configuration into a few short steps.

Step 1: Gather the Files to Upload

Collect all the files you want virtual users to upload during the load test. Store them in a folder on your local machine.

Example:

/uploads/
  document1.pdf
  document2.pdf
  document3.pdf
  image1.jpg
  image2.jpg

Step 2: Import Files into Load Tester

  1. Open Preferences: Window → Preferences
  2. Navigate to: Web Performance → File Upload
  3. Click: Add Files button
  4. Browse to the folder containing your upload files
  5. Select all files you want to import
  6. Click: Open

Load Tester copies the files into its own private area (so if you change the contents of the original files later, you'll need to re-import them for the changes to be recognized).


Step 3: Generate Dataset from Imported Files

After importing files, Load Tester can automatically generate a dataset mapping filenames to file paths:

  1. Still in File Upload Preferences, click Generate Dataset
  2. Enter dataset name: e.g., UploadFiles
  3. Click: OK

Load Tester creates a dataset with two columns:

  • filename: The name of each file (e.g., document1.pdf)
  • filepath: The full path to the file in Load Tester's storage
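The generated dataset can be pictured as a simple two-column table, one row per imported file. The sketch below is illustrative only: the storage paths are hypothetical placeholders, and the round-robin assignment is an assumption (Load Tester's actual row-assignment behavior is configurable).

```python
# Illustrative sketch of the generated dataset: two columns, one row per
# imported file. The storage paths here are hypothetical placeholders.
dataset = [
    {"filename": "document1.pdf", "filepath": "/loadtester/storage/document1.pdf"},
    {"filename": "document2.pdf", "filepath": "/loadtester/storage/document2.pdf"},
    {"filename": "document3.pdf", "filepath": "/loadtester/storage/document3.pdf"},
]

def row_for_user(user_index: int) -> dict:
    """Assumed round-robin assignment: each virtual user gets the next row."""
    return dataset[user_index % len(dataset)]

print(row_for_user(0)["filepath"])
print(row_for_user(4)["filename"])
```

With three rows, virtual user 4 wraps around to the second file, so different users upload different content.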

Step 4: Find the File Field in the Test Case

  1. Open test case in Test Case Editor
  2. Open Fields View: Window → Show View → Fields View
  3. Select the transaction with the file upload (usually a POST request)
  4. Locate the file upload field in Fields View (field type will show as File)

Step 5: Configure Field to Use Dataset

  1. Double-click the file upload field to open Field Assignment dialog
  2. Datasource: Select Dataset from dropdown
  3. Choose dataset: Select the dataset you created (e.g., UploadFiles)
  4. Choose field: Select the filepath column
  5. Click: OK

The test case is now ready to run! Load Tester will automatically send the files to the load engine prior to starting the next load test.

Automatic File Distribution

Load Tester copies the uploaded files into its own private area and automatically distributes them to load engines before running load tests. You don't need to manually copy files to engine machines.

Re-Import After File Changes

If you modify the contents of the original upload files after importing them into Load Tester, you must re-import the files (repeat Step 2) for the changes to be recognized and sent to load engines.

Ask the AI to Configure File Uploads

If you're having trouble configuring file uploads:

I need to configure my test case so each virtual user uploads a different
PDF file. I have 50 PDF files in a folder. Can you walk me through the
steps to import them and configure the upload field?

The AI can:

  • Guide you through the file import and dataset generation process
  • Help locate the file upload field in Fields View
  • Troubleshoot file upload configuration issues
  • Explain how Load Tester distributes files to load engines

File Downloads: Handling Dynamic URLs

In most cases, Load Tester handles file downloads during a test case automatically. Fundamentally, there's no difference between downloading a spreadsheet and downloading an image on a page.

In some cases, however, Load Tester cannot automatically handle a dynamic download URL. Most commonly, these are cases where:

  • The file is being generated on-demand
  • The file is assigned a unique identifier that is part of the URL path (not a query parameter)
  • There's no unique field name or query parameter name Load Tester can use to locate the dynamic value

Example of problematic dynamic file URL:

https://example.com/docs/e5fb4c74-31ef-4f5b-b920-0a28016c5969/RecordedDocument.pdf

In this URL, two path segments are dynamic:

  • e5fb4c74-31ef-4f5b-b920-0a28016c5969 (unique file ID)
  • RecordedDocument.pdf (filename)

Because there's no query parameter name (like ?fileId=...), Load Tester doesn't automatically correlate these values. Instead, it requests the same URL each time, which fails when the file ID changes between sessions.
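The difference is easy to see when the two URL styles are parsed side by side. In this sketch (plain Python, not Load Tester code), the query-parameter form exposes a parameter name a tool can match on, while the path form yields only anonymous positional segments:

```python
from urllib.parse import urlsplit, parse_qs

# Query-parameter style: the parameter NAME ("fileId") gives a tool a
# stable handle for locating the dynamic value.
q = urlsplit("https://example.com/download?fileId=e5fb4c74-31ef")
print(parse_qs(q.query))  # {'fileId': ['e5fb4c74-31ef']}

# Path-segment style: the dynamic values are unnamed positions in the
# path -- there is no name to match on, so automatic correlation fails.
p = urlsplit("https://example.com/docs/e5fb4c74-31ef-4f5b-b920-0a28016c5969/RecordedDocument.pdf")
print(p.path.split("/")[1:])  # ['docs', <file ID>, 'RecordedDocument.pdf']
```

Nothing in the second URL says which segment is the file ID, which is why the extraction and substitution steps below are needed.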


Customizing for Dynamic File Downloads (3 Steps)

  1. Locate the source for the dynamic values
  2. Configure an extractor to put those values into user state
  3. Configure the URL to use the extracted values

Step 1: Locate the Source of Dynamic Values

The source will typically be in the previous web page. For our example, the source might look like this in the HTML response:

<a href="/docs/e5fb4c74-31ef-4f5b-b920-0a28016c5969/RecordedDocument.pdf">Download</a>

Find this:

  1. Select the transaction before the file download in Test Case Editor
  2. Open Response Content view: Window → Show View → Response Content
  3. Search for the dynamic value (use the Search tab)
  4. Note the pattern around the value (the href attribute in this case)

Step 2: Configure Extractor to Capture Values

  1. Open Actors View: Window → Show View → Actors
  2. Select: Extractors tab
  3. Click: Add Extractor (+) button

Configure the extractor:

  • Extractor Type: Regular Expression
  • Applied To: Select the transaction containing the source (previous page)
  • Pattern: /docs/([\w\-]*)/([\w\.]*)
       • First capture group ([\w\-]*) matches the file ID
       • Second capture group ([\w\.]*) matches the filename
  • Variable Names: file_path file_name (space-separated, one name per capture group)
  • Instance: 1 (extract first match)

Verify extraction: The Value selected for extraction field at the bottom of the dialog should show the values extracted from the recorded page. If it's blank, your pattern doesn't match. Adjust the regular expression.

Regular Expressions

Regular expressions are a complex topic outside the scope of this guide. There are many great sources available that cover regex in detail. The key is to create a pattern that matches the dynamic parts of the URL and uses capture groups (...) to extract specific values.
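As a sanity check, the pattern from Step 2 can be exercised outside Load Tester before you paste it into the extractor dialog. A quick Python sketch against the recorded href (Python's re engine behaves the same as most engines for a pattern this simple):

```python
import re

# The HTML source from Step 1 and the pattern from Step 2.
html = '<a href="/docs/e5fb4c74-31ef-4f5b-b920-0a28016c5969/RecordedDocument.pdf">Download</a>'
pattern = r"/docs/([\w\-]*)/([\w\.]*)"

match = re.search(pattern, html)
file_path, file_name = match.groups()  # one value per capture group
print(file_path)  # e5fb4c74-31ef-4f5b-b920-0a28016c5969
print(file_name)  # RecordedDocument.pdf
```

If `re.search` returns None here, the extractor's "Value selected for extraction" field will be blank too, and the pattern needs adjusting.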


Step 3: Configure URL to Use Extracted Values

With the extractor configured, the final step is to configure the download URL to use these values during replay.

Edit the Request-Line (URL path):

  1. Select the file download transaction in Test Case Editor (the one with the dynamic URL)
  2. Open Headers View: Window → Show View → Headers
  3. Click: Edit the Start Line button (next to the URL/Start Line)

Configure each dynamic path segment:

  1. Select the second path segment (the file ID: e5fb4c74-31ef-4f5b-b920-0a28016c5969)
  2. Choose: Use User Variable
  3. Enter variable name: file_path (the variable name from the extractor)
  4. Repeat for the third path segment (filename: RecordedDocument.pdf)
  5. Choose: Use User Variable
  6. Enter variable name: file_name
  7. Click: OK

That's it! The test case will now download the correct file during each replay, using the values extracted from the previous page.


JSON and XML: Editing Structured Content

Modern web applications and APIs use JSON and XML to send structured data in HTTP requests. Load Tester automatically recognizes JSON and XML content and breaks it down into editable fields.

Automatic JSON Recognition (v4.3+)

Starting with version 4.3, Load Tester automatically recognizes JSON content in any HTTP request. Each JSON element becomes a configurable name-value pair field in the Fields View.

Example JSON in a POST request:

{
  "customer_id": "12345",
  "address_id": "67890",
  "address": "123 Main St"
}

In the Fields View, you'll see:

  • customer_id = 12345
  • address_id = 67890
  • address = 123 Main St

Each field can be configured with a different datasource (constant, dataset, user variable, script).
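The decomposition into name-value pairs can be sketched in a few lines. This mirrors what Fields View displays for the example above; it is not Load Tester's internal code:

```python
import json

# The JSON request body from the example above.
body = '{"customer_id": "12345", "address_id": "67890", "address": "123 Main St"}'

# Each top-level element becomes a name-value pair, as in Fields View.
fields = dict(json.loads(body))
for name, value in fields.items():
    print(f"{name} = {value}")
```

Each resulting name is a separate field that can be pointed at its own datasource.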


ASM Auto-Configuration for JSON

Whenever Load Tester's Application State Management (ASM) tool runs, it searches for incoming name-value pairs (in JSON responses) that have matching outgoing name-value pairs (in JSON requests).

When ASM detects a match, it automatically configures the test case to:

  1. Extract the value from the JSON response
  2. Store it in user state
  3. Inject it into subsequent JSON requests

This works for:

  • JSON numbers
  • JSON strings

JSON Auto-Configuration Limitations

While other types of auto-configuration performed by Load Tester are highly reliable, the correctness of auto-configuration in JSON content cannot in general be guaranteed. Automatically configured JSON fields will be incorrect in a handful of situations.

When to verify JSON configuration manually:

  • Complex nested JSON structures
  • JSON arrays with variable-length content
  • Fields with the same name but different meanings in different contexts

Always run a replay after ASM and verify that JSON requests contain the correct values.


Manual JSON Field Configuration

If Load Tester doesn't automatically recognize JSON content (e.g., JSON hidden inside a query parameter), you can manually mark any field as JSON content:

  1. Open Fields View and select the transaction with JSON
  2. Double-click the field containing JSON content
  3. In Field Assignment dialog, select Parsers tab
  4. Choose parser: JSON
  5. Click: OK

Load Tester now treats that field's content as JSON, breaking it down into editable sub-fields.


XML Auto-Configuration

Load Tester automatically recognizes XML content in HTTP requests and responses, just like JSON.

XML elements appear in Fields View as name-value pairs, and ASM automatically correlates XML values between responses and requests.

Example XML in a SOAP request:

<soap:Envelope>
  <soap:Body>
    <GetCustomer>
      <customerId>12345</customerId>
      <sessionToken>abc-def-ghi</sessionToken>
    </GetCustomer>
  </soap:Body>
</soap:Envelope>

In the Fields View:

  • customerId = 12345
  • sessionToken = abc-def-ghi

If sessionToken was provided by the server in an earlier response, ASM will automatically configure extraction and correlation.
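The same decomposition applies to XML: leaf elements become name-value pairs. The sketch below illustrates this with Python's ElementTree (again, not Load Tester's internal code); the xmlns:soap declaration is added so the snippet parses standalone:

```python
import xml.etree.ElementTree as ET

# The SOAP body from the example above, with a namespace declaration
# added so the snippet is parseable on its own.
xml_body = """
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetCustomer>
      <customerId>12345</customerId>
      <sessionToken>abc-def-ghi</sessionToken>
    </GetCustomer>
  </soap:Body>
</soap:Envelope>
"""

root = ET.fromstring(xml_body)
# Leaf elements (no children) become name-value pairs, as in Fields View.
fields = {el.tag: el.text for el in root.iter() if len(el) == 0}
print(fields)
```

Here `sessionToken` surfaces as an ordinary field, which is what lets ASM match it against the same name in an earlier response.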

XML Automation (v5.0+)

Load Tester PRO 5.0 introduced automatic XML recognition and auto-configuration, making it much easier to configure SOAP web service tests and other XML-based applications. ASM handles XML correlation just like it handles form fields and JSON content.


Editing URLs and Hostnames

There are several scenarios where you need to modify recorded URLs:

  1. Change hostname: Test against a different environment (staging vs. production)
  2. Edit path segments: Handle dynamic content in URL paths
  3. Modify query parameters: Vary data per user or iteration
  4. Delete unwanted URLs: Remove transactions that aren't part of the test scenario

Changing Hostnames (Testing Different Environments)

When you record against one environment (e.g., staging.example.com) but need to test against another (e.g., production.example.com), you can change the hostname for all transactions at once.

Method 1: Hostname Resolution (Global)

  1. Open Preferences: Window → Preferences
  2. Navigate to: Web Performance → Hostname Resolution
  3. Click: Add
  4. Original hostname: staging.example.com
  5. Replacement hostname: production.example.com
  6. Click: OK

All transactions to staging.example.com will now be sent to production.example.com during replay.

Hostname Resolution vs. Editing URLs

Hostname Resolution is global: it affects all test cases in your workspace. If you only want to change the hostname for one specific test case, use Method 2 (edit URLs directly).


Method 2: Edit URLs Directly (Per Test Case)

  1. Select the transaction with the URL you want to edit
  2. Open Headers View
  3. Click: Edit the Start Line button
  4. Modify the URL components as needed:
       • Protocol: http or https
       • Hostname: Change to the target environment
       • Port: Change if needed
       • Path segments: Edit individual path elements
       • Query parameters: Edit parameter values
  5. Click: OK

Editing URL Path Segments for Dynamic Content

For URLs with dynamic path segments (like the file download example earlier), you can configure each segment to use a user variable, dataset value, or constant.

Example dynamic URL:

https://api.example.com/v2/users/12345/orders/67890

To make 12345 (user ID) and 67890 (order ID) dynamic:

  1. Select the transaction in Test Case Editor
  2. Open Headers View
  3. Click: Edit the Start Line
  4. Select path segment: 12345
  5. Choose: Use User Variable or Use Dataset
  6. Enter variable name (e.g., user_id) or select dataset field
  7. Repeat for 67890 (order ID)
  8. Click: OK

Removing Unwanted Transactions

Why Remove Ad-Servers and Click-Trackers

We get a lot of questions about load testing websites that include third-party components like advertisements and user tracking. For most users, we recommend leaving these out of the load test entirely.

Here's why.


Advantages of Including Third-Party Services

  1. The entire system is load tested - including these third-party systems
  2. The test is more realistic because it does more of what a real user would do

That first point seems compelling. Surely a thorough test should measure the capabilities of these important third-party services, right?


Disadvantages of Including Third-Party Services

Not necessarily. There are several reasons including them may be counterproductive, depending on your situation:

  1. Results may not be useful: Since you don't control these systems (and you're testing a live system), there's little reason to expect the performance witnessed during one load test to hold from day to day (or minute to minute), as the third-party system's traffic rises and falls with its other customers or its system changes.

  2. Results may not be actionable: There's little you can do to improve the performance of third-party systems you don't control.

  3. Page render time vs. total page load time: If your page architecture has been well-optimized, then the end-user should not experience noticeable changes in page render time (when enough of the page is visible for the user to start interacting) even when these 3rd-party systems are slow. Most load-testing tools (including ours) measure total page load time, not page render time, so in many cases the tests will better reflect reality when these 3rd-party servers are excluded.

  4. Inflated user-tracking metrics: The user-tracking metrics collected for your site could be affected by the load tests, artificially inflating visit rates.

  5. Ad fraud detection triggered: If the ad vendors are paying for each impression on your site, fraud-detection algorithms may be triggered by the load test, possibly resulting in actions that could interrupt ad revenue for the site.

  6. Much harder to configure: Test cases are typically much harder to configure when third-party services are included. This is not unique to our product; all load testing tools require additional work to handle these cases.


How to Remove Third-Party Services (Transaction Blocking)

If you agree, the next question is how to keep these services out of your load test.

Use Transaction Blocking during recording:

  1. Open Preferences: Window → Preferences
  2. Navigate to: Web Performance → Recording → Transaction Blocking
  3. Click: Add Host button
  4. Enter hostname to block: e.g., doubleclick.net
       • Use wildcards for multiple subdomains: .adsrus.com blocks server1.adsrus.com, server2.adsrus.com, etc.
  5. Click: OK

After entering these hosts, record your test case and you'll find that requests to blocked hosts have been automatically left out of the test case.
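The wildcard behavior amounts to suffix matching on the hostname. A minimal sketch of that rule, assuming a leading dot means "block every subdomain" and a bare name means an exact match (Load Tester's exact matching rules may differ in edge cases):

```python
def is_blocked(host: str, patterns: list[str]) -> bool:
    """Sketch of blocked-host matching: a leading dot in the pattern
    blocks every subdomain; a bare name blocks only that exact host."""
    for pat in patterns:
        if pat.startswith("."):
            if host.endswith(pat):  # e.g. server1.adsrus.com ends with .adsrus.com
                return True
        elif host == pat:
            return True
    return False

blocked = [".adsrus.com", "doubleclick.net"]
print(is_blocked("server1.adsrus.com", blocked))  # True
print(is_blocked("doubleclick.net", blocked))     # True
print(is_blocked("example.com", blocked))         # False
```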

Common Ad-Servers to Block

Common third-party services to block:

  • .doubleclick.net (Google Ads)
  • .googlesyndication.com (Google AdSense)
  • .googleadservices.com (Google Ad Services)
  • .facebook.net (Facebook tracking)
  • .scorecardresearch.com (Analytics)
  • .quantserve.com (Analytics)

Deleting URLs from Existing Test Cases

If you've already recorded a test case and want to remove specific transactions:

Method 1: Delete Transactions Individually

  1. Select the transaction in Test Case Editor
  2. Right-click → Delete
  3. Confirm deletion

Method 2: Delete Multiple Transactions

  1. Hold Ctrl (Cmd on macOS) and click each transaction you want to delete
  2. Right-click → Delete
  3. Confirm deletion

Delete Carefully

Only delete transactions that are truly unnecessary (ads, trackers, analytics). Deleting transactions that the application depends on (JavaScript libraries, CSS files, API calls) will cause replay failures.


Troubleshooting Customization

File Upload Fails: "File not found"

Symptom: Replay fails with error indicating file cannot be found.

Likely causes:

  1. Files not imported into Load Tester
  2. Field not linked to dataset properly
  3. Dataset column name wrong (should use filepath column, not filename)

Solution:

  • Verify files are imported: Window → Preferences → Web Performance → File Upload
  • Check field configuration: Fields View → Double-click file upload field → Verify datasource is Dataset and column is filepath

Dynamic File Download Gets 404 Error

Symptom: File download transaction fails with 404 Not Found during replay.

Likely causes:

  1. Extractor not configured to capture dynamic URL values
  2. URL not configured to use extracted variables
  3. Extractor pattern doesn't match the source content

Solution:

  • Check extractor: Actors View → Extractors tab → Verify extractor shows "Value selected for extraction" (if blank, pattern doesn't match)
  • Check URL configuration: Headers View → Edit Start Line → Verify path segments use correct user variables
  • Test extraction manually: Run a replay and check Fields View to see if user variables are populated

JSON Fields Not Appearing in Fields View

Symptom: JSON content in request body, but Fields View doesn't show individual JSON fields.

Likely causes:

  1. Content-Type header incorrect (not application/json)
  2. JSON hidden inside another field (query parameter, form field)
  3. Malformed JSON (syntax error)

Solution:

  • Check Content-Type: Headers View → Verify Content-Type is application/json
  • Manually mark as JSON: Fields View → Double-click field → Parsers tab → Select JSON
  • Validate JSON syntax: Copy JSON to online validator to check for syntax errors
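Instead of pasting the payload into an online validator, you can check it locally. A small sketch using Python's standard json module, which also reports where the syntax error sits:

```python
import json

def check_json(text: str) -> str:
    """Validate JSON locally and report the error position if invalid."""
    try:
        json.loads(text)
        return "valid"
    except json.JSONDecodeError as err:
        return f"invalid: {err.msg} at line {err.lineno}, column {err.colno}"

print(check_json('{"customer_id": "12345"}'))
print(check_json('{"customer_id": 12345,}'))  # trailing comma -> invalid
```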

ASM Doesn't Correlate JSON Values

Symptom: ASM runs successfully, but JSON fields that should be correlated aren't auto-configured.

Likely causes:

  1. Field names don't match exactly between response and request
  2. Values are JSON arrays or objects (ASM only correlates strings and numbers)
  3. Complex nested structures confuse auto-correlation

Solution:

  • Check field names: Fields View → Compare field names in response vs. request (must match exactly)
  • Configure manually: Create extractor for response value, link to request field using user variable
  • Simplify JSON structure: If possible, flatten nested structures

Best Practices

1. Block Third-Party Services During Recording

Why: Easier to configure test cases without ad-servers and analytics. Cleaner recordings focus on your application's performance.

How: Configure Transaction Blocking (Window → Preferences → Web Performance → Recording → Transaction Blocking) before recording.


2. Use Datasets for File Uploads

Why: Realistic testing requires each virtual user to upload different content.

How: Import files → Generate dataset → Link file upload field to dataset's filepath column.


3. Verify JSON/XML Auto-Configuration After ASM

Why: JSON/XML auto-correlation is not guaranteed to be correct in all situations.

How: After running ASM, run a small replay and check Fields View to verify JSON request fields contain correct dynamic values from previous responses.


4. Test Hostname Changes with Small Replays First

Why: Changing hostnames affects all transactions. A small replay (1 virtual user) quickly verifies the target environment is reachable and configured correctly.

How: Configure Hostname Resolution → Run 1-user replay → Verify all transactions succeed → Then run full load test.


5. Keep Deleted Transactions in a Backup Test Case

Why: If you delete transactions and later realize you need them, recovering them from a backup is easier than re-recording.

How: Before deleting transactions, duplicate the test case (right-click → Duplicate) for backup.


Next Steps

After customizing your test case, verify it replays successfully. Once the replay is clean, move on to advanced configuration, and then to load testing with your customized test cases.