You observe high or excessive memory utilization by Microsoft SQL Server, which appears to be causing instability in the host operating system and potentially contributing to failures during Information Centric Analytics' (ICA) RiskFabric Processing and RiskFabric Intraday Processing jobs.
Release : 6.x
Component : Microsoft SQL Server
By default, SQL Server effectively has no limit on the maximum memory it is allowed to use: this is controlled through the Maximum server memory setting, whose default value is 2,147,483,647 MB (2 petabytes).
Per Microsoft's SQL Memory Management Architecture Guide:
SQL Server dynamically acquires and frees memory as required. Typically, an administrator does not have to specify how much memory should be allocated to SQL Server, although the option still exists and is required in some environments.
[...]
The default memory management behavior of the SQL Server Database Engine is to acquire as much memory as it needs without creating a memory shortage on the system. The SQL Server Database Engine does this by using the Memory Notification APIs in Microsoft Windows.
When SQL Server is using memory dynamically, it queries the system periodically to determine the amount of free memory. Maintaining this free memory prevents the operating system (OS) from paging. If less memory is free, SQL Server releases memory to the OS. If more memory is free, SQL Server may allocate more memory. SQL Server adds memory only when its workload requires more memory; a server at rest does not increase the size of its virtual address space.
High memory utilization, up to the maximum configured for SQL Server, is normal while ICA's processing jobs are running; unless the server has been over-provisioned with CPUs and RAM, this behavior is expected. For more information, please refer to the following documents:
Note that SQL Server is designed to use memory cooperatively with other services, but SQL Server Analysis Services (SSAS) is not. Consequently, when the two services are hosted on the same server and both are demanding resources, SQL Server may release a portion of its RAM to preserve system responsiveness, and SSAS will acquire that RAM if it needs it. Once the amount of memory SSAS requires exceeds its HardMemoryLimit, SSAS will begin closing connections to free up resources.
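SSAS memory limits are server properties, set in SQL Server Management Studio under Analysis Server Properties or directly in msmdsrv.ini; values below 100 are interpreted as a percentage of total physical RAM. A fragment reflecting the 45% guideline below might look like the following sketch (the HardMemoryLimit of 50 is an illustrative assumption; its default of 0 means midway between TotalMemoryLimit and total physical memory):

```xml
<!-- Illustrative msmdsrv.ini fragment; values below 100 are percentages of physical RAM -->
<Memory>
  <TotalMemoryLimit>45</TotalMemoryLimit>  <!-- 45% of physical RAM for SSAS -->
  <HardMemoryLimit>50</HardMemoryLimit>    <!-- assumed value; default is 0 -->
</Memory>
```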
Your DBA is responsible for managing and tuning memory and other settings as needed, but Broadcom provides some limited general guidelines in the Symantec ICA Administrator Guide for segregating memory allocations between SQL Server and SSAS when they are hosted on the same server:
"Configure SQL Server's maximum memory allocation to use 50% of the system RAM. Configure the TotalMemoryLimit of SSAS to 45%, leaving 5% for the OS."
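As a sketch of the 50% guideline, the following T-SQL applies it on a server with an assumed 128 GB (131,072 MB) of RAM; the value itself is illustrative, not prescribed:

```sql
-- Illustrative only: cap SQL Server at 50% of an assumed 128 GB of physical RAM.
-- 'max server memory (MB)' is an advanced option, so enable those first.
EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure N'max server memory (MB)', 65536;  -- 50% of 131,072 MB
RECONFIGURE;
```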
Configure the maximum server memory setting for the SQL Server instance hosting the RiskFabric relational database.*

* Broadcom cannot prescribe this value. Please refer to Microsoft's documentation for guidance in selecting the correct value for your environment.
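Before selecting a value, it can help to check how much physical memory the operating system reports. One way is the sys.dm_os_sys_memory dynamic management view:

```sql
-- Total and available physical memory as reported to SQL Server, in MB.
SELECT total_physical_memory_kb / 1024 AS total_physical_memory_mb,
       available_physical_memory_kb / 1024 AS available_physical_memory_mb
FROM sys.dm_os_sys_memory;
```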
The following query will return the current Maximum Server Memory value and the current amount of memory in use:
SELECT c.[value] AS "Maximum Server Memory",
c.value_in_use AS "Memory In Use"
FROM sys.configurations AS c
WHERE c.[name] = N'max server memory (MB)';
See also: Server memory configuration options in the Microsoft SQL Server documentation.