ESXi memory analysis using batch ESXTOP for high memory consumption troubleshooting

Article ID: 404415

Products

VMware vSphere ESXi

Issue/Introduction

One or more of the following:

  • Sustained high memory use on an ESXi host: memory consumption remains elevated above normal levels for extended periods, often consistently above 80-90% in vCenter performance charts.
  • Virtual machines (VMs) on the affected host may show performance problems, such as slower application response times, increased disk activity, or guest operating system memory warnings; this memory pressure may indicate a need for memory allocation adjustments.
  • Memory consumption increases suddenly on specific ESXi hosts with no apparent configuration changes, jumping from the normal baseline to a significantly higher sustained level (for example, from 350 GB to 690 GB).
  • Memory demand stays elevated after the initial increase without returning to the previous baseline, and you need to validate whether the memory consumption represents a genuine issue requiring action.

Additional symptoms reported:

  • High memory consumption on one of the hosts
  • Sudden memory usage increases in monitoring charts
  • Host memory demand jumping from baseline levels and remaining elevated

Environment

ESXi 7.0 and newer

Cause

ESXi host memory pressure occurs when the combined memory demands of virtual machines exceed the host's available physical memory. This creates resource contention, and ESXi must reclaim memory using techniques such as ballooning, compression, and hypervisor swapping to satisfy VM memory requests, which can degrade virtual machine performance. A quick way to check whether reclamation is active is sketched below.
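
As an optional check from the ESXi shell, the interactive esxtop memory view shows whether reclamation is currently active. This is a minimal sketch; the counter line labels below are typical but can vary slightly between ESXi builds:

    # Start interactive esxtop, then press 'm' to switch to the memory view.
    # Non-zero current values on the MEMCTL/MB (ballooning), SWAP/MB
    # (hypervisor swapping), and ZIP/MB (compression) lines indicate that
    # ESXi is actively reclaiming memory from virtual machines.
    esxtop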

Resolution

To collect the diagnostic data needed for Broadcom support to analyze your ESXi host's memory consumption:

  1. Connect to the affected ESXi host using Secure Shell (SSH).

  2. Identify an available datastore with sufficient free space using the command:
    df -h
    Note: A minimum of 500 MB of free space is recommended for the data collection. An esxcli alternative for checking datastore free space is sketched below.
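
    As an alternative to df, esxcli lists each datastore's size and free space directly. A minimal sketch (column layout may vary by ESXi build):

    # List mounted filesystems, including VMFS datastores, with free space
    esxcli storage filesystem list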

  3. Execute the batch ESXTOP command to capture memory performance data:

    Note: Replace `<datastore_name>` with your actual datastore name or UUID.
    Adjust the capture_minutes and interval_seconds values to change how long the capture runs and how often samples are taken.
    # Set capture parameters
    capture_minutes=15
    interval_seconds=2
    datastore_path="/vmfs/volumes/<datastore_name>"

    # Calculate total samples needed
    samples_per_minute=$(expr 60 / ${interval_seconds})
    total_samples=$(expr ${capture_minutes} \* ${samples_per_minute})

    # Run ESXTOP batch capture
    esxtop -ba -d ${interval_seconds} -n ${total_samples} > "${datastore_path}"/$(hostname)_$(date -u +"%Y-%m-%dT%H%M%S")_esxtop_batch_all.csv
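
    If your SSH session might be interrupted during the capture, you can run the same command in the background. This is a minimal sketch, assuming nohup is available in the host's busybox shell (verify on your build):

    # Run the capture in the background so it survives a dropped session,
    # then confirm the output CSV file exists and is growing.
    nohup esxtop -ba -d ${interval_seconds} -n ${total_samples} > "${datastore_path}/$(hostname)_$(date -u +"%Y-%m-%dT%H%M%S")_esxtop_batch_all.csv" &
    ls -lh "${datastore_path}"/*_esxtop_batch_all.csv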
  4. Wait for the data collection to complete (15 minutes for the default configuration).

  5. Generate a log bundle for the ESXi host.
    See Collecting diagnostic information for ESX/ESXi hosts and vCenter Server using the vSphere Web Client.

  6. Locate the generated CSV file in the specified datastore directory.
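
    Batch esxtop CSV files can be large, so optionally compress the file before downloading it. A minimal sketch, assuming the busybox gzip available in the ESXi shell:

    # Confirm the capture file size, then compress it to speed up transfer
    ls -lh /vmfs/volumes/<datastore_name>/*_esxtop_batch_all.csv
    gzip /vmfs/volumes/<datastore_name>/*_esxtop_batch_all.csv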

  7. Download the CSV file from the datastore to your local system.
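
    You can download the file with the vSphere Client datastore browser, or with scp while SSH is enabled on the host. A minimal sketch (the hostname is a placeholder; the trailing wildcard also matches the file if you compressed it):

    # Run from your local system, not from the ESXi shell
    scp root@<esxi_host>:"/vmfs/volumes/<datastore_name>/*_esxtop_batch_all.csv*" .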

  8. Open a technical support case with Broadcom and attach the ESXTOP CSV file and ESXi host log bundle.

  9. Provide the following information in your support case (a sketch for gathering the version details follows this list):
    • ESXi host version and build number
    • vCenter Server version and build number
    • Timeline when the memory consumption issue began
    • Any recent environmental changes or VM deployments
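
    The ESXi version and build can be gathered from the host shell; the vCenter Server version and build are shown in the vSphere Client under Help > About. A minimal sketch for the host side:

    # Print the ESXi product version and build number
    vmware -vl
    # The same detail via esxcli
    esxcli system version get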

Additional Information