
Analysis Troubleshooting

Problems analyzing load test results? This page covers analysis-specific issues. Find the symptom that matches what you're seeing:


Report and Dashboard Access Issues

Report won't open or generate

Symptom: Clicking "Open Report" does nothing, or report generation fails.

Most common causes:

1. Load test still running (the report is only available after completion)
2. Result file corrupted or incomplete
3. Insufficient memory to generate the report
4. Report template missing or corrupted

Fix:

1. Wait for test completion - The report is only available after the load test finishes
2. Check result file - Navigator → Right-click result → Properties; verify the file size is reasonable (a near-empty file suggests an incomplete or corrupted result)
3. Increase Java heap - Help → Preferences → General → Increase maximum heap size to 4GB or higher
4. Re-run test - If the result file is corrupted, re-run the load test

See: Results Overview for report access methods.


Dashboard won't open

Symptom: Clicking "Dashboard" button does nothing or shows error.

Most common causes:

1. Dashboard view closed or hidden
2. Browser component initialization failed
3. Port conflict with the embedded web server
4. Insufficient memory

Fix:

1. Open Dashboard view manually - Window → Show View → Embedded Analytics Dashboard
2. Restart Load Tester - Close and reopen the application to reinitialize the browser component
3. Check port availability - The dashboard uses port 8888 by default (configurable in preferences); a quick check is sketched below
4. Increase memory - Help → Preferences → General → Increase heap size

See: Embedded Analytics Dashboard for dashboard features.
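
If you suspect a port conflict, the quickest test is to try binding to the dashboard's port yourself: if the bind fails, another application already holds it. A minimal sketch, assuming the default port 8888 mentioned above (adjust if you changed it in preferences):

```python
import socket

def port_is_free(port, host="127.0.0.1"):
    """Return True if nothing is currently bound to host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

port = 8888  # Load Tester dashboard default, per this page
print(f"Port {port} is {'free' if port_is_free(port) else 'already in use'}")
```

Run it while Load Tester is closed: "already in use" then means some other application owns the port, and you should configure a different one in preferences.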


"Out of memory" when generating report

Symptom: Report generation fails with OutOfMemoryError.

Most common causes:

1. Large load test results (millions of transactions)
2. Java heap size too small
3. Too many metrics enabled (transaction metrics, detailed durations)
4. Multiple reports open simultaneously

Fix:

1. Increase heap size - Help → Preferences → General → Maximum Heap Size → Set to 8GB for large tests
2. Close other reports - Close unnecessary reports to free memory
3. Disable optional metrics - Load Test Settings → Turn off transaction metrics if not needed
4. Generate reports selectively - Use time range filters to analyze portions of large tests


Chart and Visualization Issues

Charts not displaying or blank

Symptom: Report opens but charts are empty or missing.

Most common causes:

1. No data for the selected time range
2. Metrics not enabled for this test
3. User-level analysis configured incorrectly
4. Chart rendering issue

Fix:

1. Check time range - Verify you're viewing a time range that contains data
2. Verify metrics enabled - Load Test Settings → Confirm the metrics you want are enabled
3. Reset zoom - Double-click the chart to reset zoom in case you're zoomed to an empty range
4. Check user-level settings - Report Settings → User-Level tab → Verify levels are configured

See: Understanding Metrics for metric configuration.


"No data at selected user levels"

Symptom: User-level charts show "no data" message.

Most common causes:

1. User-level analysis not configured
2. Test didn't reach the configured user levels
3. User-level interval too narrow
4. Load test used a custom ramp profile that doesn't match the configured levels

Fix:

1. Configure user levels - Report Settings → User-Level tab → Set levels (e.g., 50, 100, 150, 200)
2. Use automatic detection - Report Settings → User-Level → Select "Automatic" to detect levels from the test
3. Widen intervals - Use intervals (e.g., "Every 50 users ± 10") to capture more data; see the sketch below
4. Check test duration - Short tests may not have enough samples at each user level

See: Performance Analysis Workflow - Step 2: Configure Analysis Methods.
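
To see why a narrow interval can leave a level empty, it helps to picture how samples are bucketed by concurrent-user count. The sketch below is illustrative only (the sample data is invented and this is not the product's actual algorithm); it shows that widening the tolerance pulls more samples into each level:

```python
# Illustrative only: how "Every 50 users ± 10"-style intervals bucket samples.
# Each pair is (concurrent users when sampled, response time in seconds) -- invented data.
samples = [(48, 1.2), (52, 1.4), (97, 2.1), (103, 2.3), (160, 3.0)]
levels = [50, 100, 150, 200]
tolerance = 10  # widen this to capture more samples per level

for level in levels:
    bucket = [rt for users, rt in samples if abs(users - level) <= tolerance]
    if bucket:
        print(f"{level} users: {len(bucket)} samples, avg {sum(bucket) / len(bucket):.2f}s")
    else:
        print(f"{level} users: no data at this level")
```

With tolerance 10, the 200-user level has no samples at all, which is exactly the situation behind the "no data at selected user levels" message.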


Scatter plots not showing

Symptom: Scatter plot charts are missing or empty.

Most common causes:

1. Scatter plot view not enabled in report settings
2. Too many data points (performance issue)
3. Filtering applied that excludes all points

Fix:

1. Enable scatter plots - Report Settings → Charts tab → Enable "Scatter Plots"
2. Filter to specific pages - For large tests, generate scatter plots for individual pages only
3. Check filters - Clear any time range or page filters that might exclude data
4. Use the dashboard - The Embedded Analytics Dashboard handles large scatter plots better than static reports


Missing or Incorrect Metrics

Server metrics not showing in report

Symptom: Server monitoring data is missing from report.

Most common causes:

1. Server monitoring not configured for this test
2. Agent connection failed during the test
3. Firewall blocked the server monitoring ports
4. Server monitoring started after the test began

Fix:

1. Verify server monitoring setup - Check that servers were configured before the test started
2. Check agent logs - On the monitored server, check the agent logs for connection errors
3. Review firewall rules - Ensure ports 2099-2109 are open; a quick connectivity check is sketched below
4. Re-run test with monitoring - Set up server monitoring before starting the load test

See: Server Monitoring Introduction for setup.
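
To verify the monitoring ports from the controller side, you can attempt a TCP connection to each one. A minimal sketch; the hostname is a placeholder, and the 2099-2109 range comes from this page:

```python
import socket

HOST = "monitored-server.example.com"  # placeholder: your monitored server

for port in range(2099, 2110):  # ports 2099-2109 inclusive
    try:
        with socket.create_connection((HOST, port), timeout=3):
            print(f"{HOST}:{port} reachable")
    except OSError as err:
        print(f"{HOST}:{port} blocked or closed ({err})")
```

If every port reports blocked while the agent is running, the firewall is the likely culprit; if only some do, check which ports the agent was actually configured to listen on.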


"Performance goals not evaluated"

Symptom: Report shows "goals not evaluated" or no goal analysis.

Most common causes:

1. Performance goals not configured before the test
2. Goals configured for the wrong pages
3. Insufficient data at user levels
4. Analysis method mismatch

Fix:

1. Set goals before testing - Load Test Settings → Performance Goals → Configure thresholds
2. Check page names match - Goals must match the exact page names from the recording
3. Verify user levels - Goals are evaluated at the configured user levels
4. Choose an appropriate analysis method - Average, 95th percentile, max duration, etc.

See: Performance Analysis Workflow - Step 1: Define Performance Goals Early.


Transaction metrics missing

Symptom: Only seeing page-level metrics, not individual transactions.

Most common causes:

1. Transaction metrics not enabled before the test
2. Test case has no transaction boundaries defined
3. Large test with transaction metrics disabled for performance

Fix:

1. Enable transaction metrics - Load Test Settings → Enable "Collect transaction metrics" (before running the test)
2. Define transactions in the test case - Use Begin/End Transaction markers in the test case editor
3. Accept the performance trade-off - Transaction metrics add overhead for very large tests

Note: Transaction metrics must be enabled before running the test. You cannot add them after the fact.


Performance Goal Analysis Issues

All goals showing "failed" or "passed" incorrectly

Symptom: Goal analysis seems wrong: everything passes or everything fails.

Most common causes:

1. Goals set to the wrong thresholds
2. Wrong analysis method selected
3. User levels don't match the test
4. Time-based vs. user-level confusion

Fix:

1. Review goal thresholds - Verify goals are reasonable (e.g., 2.0s, not 0.02s)
2. Check analysis method - Average is forgiving, the 99th percentile is strict; ensure the method matches your requirements (see the sketch below)
3. Verify user levels - Make sure the configured user levels match the actual test
4. Use the correct view - Check both time-based and user-level analysis

See: Performance Analysis Workflow - Step 7: Analyze Performance Goals.
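
The analysis method alone can flip a goal between pass and fail on identical data. A quick illustration with invented response times and a 2.0-second goal; the percentile math here is the generic textbook calculation, not necessarily the product's exact implementation:

```python
import statistics

# Invented response times (seconds) for one page at one user level.
times = [0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 2.5, 6.0]
goal = 2.0  # seconds

percentiles = statistics.quantiles(times, n=100)  # 99 cut points
methods = {
    "average": statistics.mean(times),   # forgiving: outliers get diluted
    "95th percentile": percentiles[94],  # stricter: outliers dominate
    "99th percentile": percentiles[98],  # strictest
}

for name, value in methods.items():
    verdict = "PASS" if value <= goal else "FAIL"
    print(f"{name}: {value:.2f}s -> {verdict}")
```

Here the average passes (about 1.8s) while both percentile methods fail, because the two slow outliers barely move the mean but dominate the tail. If everything passes or everything fails, check which method the goals are using first.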


"Not enough data to evaluate goals"

Symptom: Some goals show "not evaluated" due to insufficient data.

Most common causes:

1. Test duration too short for the test case length
2. Pages not executed at certain user levels
3. Errors prevented pages from loading
4. User level ramp too fast

Fix:

1. Increase test duration - Allow more time at each user level so all pages execute
2. Slow the ramp rate - Ramp users more gradually to collect more samples
3. Fix errors first - If pages are failing, fix the errors before analyzing goals
4. Check page execution frequency - Some pages may only execute once per session; a rough sample estimate is sketched below
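
A back-of-the-envelope way to judge whether a level can produce enough samples: multiply the users held at the level by the ratio of hold time to session length. Illustrative numbers only:

```python
# Invented numbers: estimate how many times a page executes at one user level.
users_at_level = 50
time_at_level_s = 120     # how long the test holds this user level
session_length_s = 180    # one full pass through the test case

executions = users_at_level * (time_at_level_s / session_length_s)
print(f"~{executions:.0f} executions of a once-per-session page at this level")
# ~33 here; a short hold time or a long test case quickly leaves
# too few samples for goal evaluation at that level.
```

If the estimate is in the single digits, lengthen the hold time or slow the ramp before trusting the goal results.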


Analysis View and UI Issues

Metrics view not updating

Symptom: Metrics view shows old data or doesn't refresh.

Most common causes:

1. Wrong test result selected in the Navigator
2. Auto-refresh disabled
3. View showing cached data
4. Eclipse workspace sync issue

Fix:

1. Select the correct result - Click the result you want to analyze in the Navigator
2. Refresh the view - Click the refresh button in the Metrics view toolbar
3. Close and reopen the view - Window → Show View → Metrics (forces re-initialization)
4. Refresh the workspace - F5 or right-click in the Navigator → Refresh


"Cannot compare: incompatible test configurations"

Symptom: Trying to compare two tests produces an error about incompatible configurations.

Most common causes:

1. Different test cases being compared
2. Different metrics collected
3. One test has server monitoring, the other doesn't
4. Time ranges don't overlap

Fix:

1. Compare the same test case - Only compare results from the same test case
2. Match metric configuration - Both tests should have the same metrics enabled
3. Accept limitations - Some comparisons aren't meaningful (different test cases, different goals)
4. Use AI analysis - The dashboard AI can explain differences even with configuration mismatches

See: Embedded Analytics Dashboard for comparison features.


Export and Sharing Issues

Export to PDF/HTML fails

Symptom: Export function fails or produces corrupted output.

Most common causes:

1. Large reports exceed PDF size limits
2. Special characters in page names
3. Missing images or resources
4. Insufficient disk space

Fix:

1. Export smaller sections - Export specific pages or time ranges instead of the entire report
2. Check disk space - Ensure sufficient space in the export directory; a quick check is sketched below
3. Simplify page names - Avoid special characters in page names
4. Use dashboard export - The dashboard's export is more robust for large datasets
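
If exports keep failing, free space in the export directory is the easiest factor to rule out first. A minimal sketch; the path is a placeholder for your actual export directory:

```python
import shutil

export_dir = "/path/to/export"  # placeholder: your export directory
usage = shutil.disk_usage(export_dir)
print(f"Free: {usage.free / 1024**3:.1f} GiB of {usage.total / 1024**3:.1f} GiB")
```

A large report with many charts can take far more space during export than the final file size suggests, so leave generous headroom.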


Exported report missing charts

Symptom: Exported HTML/PDF has text but charts are blank.

Most common causes:

1. Chart rendering timeout
2. Memory issue during export
3. Charts not generated before export
4. Browser compatibility (for HTML exports)

Fix:

1. Wait for charts to render - Let the report fully load before exporting
2. Increase memory - Set a higher heap size in preferences
3. Export from the dashboard - The dashboard export handles charts better
4. Use a modern browser - Open exported HTML in Chrome, Firefox, or Edge


Dashboard-Specific Issues

AI assistant not responding

Symptom: AI panel is blank or prompts don't get responses.

Most common causes:

1. No test result selected
2. Network connectivity issue (if the AI requires a cloud service)
3. AI service unavailable
4. Invalid prompt format

Fix:

1. Select a test result - The AI needs active test data to analyze
2. Check connectivity - Verify your internet connection if the AI uses a cloud service
3. Simplify the prompt - Start with basic prompts like "Summarize this test"
4. Check AI status - Help → About → AI Service Status

See: AI for Analysis for effective prompts.


Dashboard charts not interactive

Symptom: Can't zoom, drill down, or interact with charts.

Most common causes:

1. Using legacy reports instead of the dashboard
2. Browser component not initialized
3. JavaScript disabled
4. Large dataset causing performance issues

Fix:

1. Open the dashboard, not the report - Click the "Dashboard" button (top-right), not "Open Report"
2. Restart Load Tester - Reinitialize the browser component
3. Reduce data scope - Filter to a specific time range or pages for better performance
4. Use a supported browser component - Verify the Eclipse browser component is properly configured

See: Embedded Analytics Dashboard for interactive features.


"Dashboard service unavailable"

Symptom: Dashboard shows error about service not available.

Most common causes:

1. Embedded web server failed to start
2. Port already in use
3. Security software blocking the server
4. Insufficient permissions

Fix:

1. Check port availability - The default port 8888 might be in use by another application (the port-check sketch under "Dashboard won't open" applies here too)
2. Configure a different port - Help → Preferences → Dashboard → Port
3. Check firewall/antivirus - Allow Load Tester to run a local web server
4. Run as administrator - Some systems require elevated permissions


Data Accuracy Issues

Metrics seem incorrect or inconsistent

Symptom: Numbers don't make sense or conflict with other metrics.

Most common causes:

1. Misunderstanding what a metric measures
2. Time zone differences
3. Percentile confusion (95th vs. average)
4. Aggregation period mismatch

Fix:

1. Review metric definitions - See Understanding Metrics for exact definitions
2. Check time zones - Verify all timestamps use the same time zone
3. Understand percentiles - The 95th percentile is NOT the average; it's the value that 95% of samples fall below. A few slow outliers can leave the average low while pushing the 95th percentile far higher
4. Verify aggregation windows - User-level and time-based views aggregate differently

See: Understanding Metrics for detailed explanations.


"Response times don't match what users report"

Symptom: Load test shows fast times but users complain site is slow.

Most common causes:

1. Load test run from a different network location
2. Missing think times (unrealistic test)
3. Caching differences
4. Client-side rendering not measured
5. Testing different user workflows

Fix:

1. Test from user locations - Run cloud engines in the same regions as your users
2. Add realistic think times - See Load Test Concepts, and the sketch below
3. Match caching behavior - Disable caching in the test, or match user cache behavior
4. Measure client-side performance - Use RUM (Real User Monitoring) alongside load testing
5. Validate workflows - Ensure the test case matches actual user behavior

See: Performance Analysis Workflow for validation techniques.
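
On think times specifically: without pauses between pages, every virtual user hits the server as fast as it responds, which real users never do, so the load profile doesn't resemble production and the measured response times don't either. A conceptual sketch only; this is generic Python, not the product's scripting API, and fetch() is a stub:

```python
import random
import time

def fetch(page):
    """Stub standing in for the HTTP request a load tool would make."""
    print(f"requesting {page}")

def run_session(pages):
    # Without the sleep, this loop pounds the server back-to-back --
    # far more aggressively than any real user browsing the site.
    for page in pages:
        fetch(page)
        time.sleep(random.uniform(3, 8))  # randomized think time, 3-8 seconds

run_session(["/home", "/search", "/product/42", "/checkout"])
```

The exact pause distribution matters less than having one at all; where possible, use the think times captured in your recording rather than invented values.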


Configuration and Settings Issues

Can't change report settings

Symptom: Report settings dialog won't open or changes don't apply.

Most common causes:

1. Report is read-only (archived result)
2. Settings locked by an administrator
3. Eclipse preferences corrupted
4. Insufficient permissions

Fix:

1. Check result status - Archived results may be read-only
2. Verify permissions - Ensure you have write access to the workspace
3. Reset preferences - Help → Preferences → Report Settings → Restore Defaults
4. Check the workspace - Workspace corruption can cause this; try a fresh workspace


Analysis settings not persisting

Symptom: Settings reset every time report is opened.

Most common causes:

1. Changes not saved before closing
2. Workspace permissions issue
3. Settings stored per-result vs. globally
4. Preferences vs. report settings confusion

Fix:

1. Click "Apply" or "OK" - Don't just close the dialog without saving
2. Check workspace permissions - Ensure Eclipse can write to the workspace directory
3. Understand setting scope - Some settings are per-result; others are global preferences
4. Set defaults - Help → Preferences → Set defaults that apply to all new results


Getting Help

If your issue isn't listed here:

1. Check related troubleshooting topics:
   - Load Testing Issues - Test execution problems
   - Cloud & Engine Issues - Cloud infrastructure problems
   - Common Error Messages - Error code reference
2. Use the AI assistant:
   - Ask: "Why is [specific issue] happening?"
   - The AI can diagnose analysis problems based on your test data
3. Contact support:
   - Help → Submit Support Request
   - Include: diagnostic logs, a screenshot of the error, and a description of what you were doing
   - See Getting Support for details