LazyWriter Stream write failure

Article ID: 269474

Updated On:

Products

Information Centric Analytics

Issue/Introduction

The Risk Fabric cube processing step of the RiskFabric Processing job fails and records the following errors in the SQL Server Agent log:

<Error ErrorCode="3238133764" Description="File system error: The following error occurred while writing to the file 'LazyWriter Stream': There is not enough space on the disk. ." Source="Microsoft Analysis Services" HelpFile="" />
<Error ErrorCode="3238133820" Description="File system error: The background thread running lazy writer encountered an I/O error. Physical file: <file-path>. Logical file: ." Source="Microsoft Analysis Services" HelpFile="" />

The drive referenced in the error appears to have ample free disk space when processing is not running. If cube processing is scripted to run sequentially rather than in parallel, this failure does not occur. The RiskFabric cube processing step is configured to process dimensions and measure groups in parallel by default but may be converted to sequential processing to address a known issue with certain versions of Microsoft SQL Server, as documented in the following KB article:

RiskFabric Processing hangs during cube processing
Article ID: 175826
https://knowledge.broadcom.com/external/article?articleId=175826

NOTE: URL last validated July 10, 2023
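
Because the extra disk consumption is transient, the shortfall may only be visible while the processing job is actually running. One way to watch it, sketched below under the assumption that the affected drive is D: (substitute the drive named in the error), is to poll free space from a small batch script in an elevated Command Prompt while the RiskFabric Processing job runs:

  @echo off
  rem Poll free space on the SSAS data/temp drive every 30 seconds.
  rem D: is an example drive letter; substitute the drive referenced in the
  rem LazyWriter error. Stop the loop with Ctrl+C.
  :loop
  fsutil volume diskfree D:
  timeout /t 30 /nobreak >nul
  goto loop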

Environment

Release : 6.x

Component : Microsoft SQL Server Analysis Services

Cause

If this failure occurs after restoring parallel cube processing per Broadcom KB article 199072, the processing task is using more temporary disk space than it did previously, even though the size of the RiskFabric database has remained approximately the same. Per Microsoft, parallel processing can create conditions that consume more temporary space than sequential processing would:

During processing, the aggregation buffer determines the amount of memory that is available to build aggregations for a given partition. If the aggregation buffer is too small, Analysis Services supplements the aggregation buffer with temporary files. Temporary files are created in the TempDir folder when memory is filled and data is sorted and written to disk. When all necessary files are created, they are merged together to the final destination. Using temporary files can potentially result in some performance degradation during processing. To monitor any temporary files used during processing, review MSOLAP:Proc Aggregations\Temp file bytes written/sec.

In addition, when processing multiple partitions in parallel or processing an entire cube in a single transaction, you must ensure that the total memory required does not exceed the value of the Memory\TotalMemoryLimit setting. If Analysis Services reaches the Memory\TotalMemoryLimit during processing, it does not allow the aggregation buffer to grow and may cause temporary files to be used during aggregation processing. Furthermore, if you have insufficient virtual address space for these simultaneous operations, you may receive out-of-memory errors. If you have insufficient physical memory, memory paging will occur. If processing in parallel and you have limited resources, consider doing less in parallel.

Under the default configuration, Analysis Services will throw an out-of-memory exception if you try to request too much memory during processing. It is possible to disable this error by setting the MemoryLimitErrorEnabled to false in the server properties. However, this may cause disk spill and slow down the processing operation.

If there is no way you can avoid spilling data to disk, you should at least make sure the TempDir folder and the page file are on a fast I/O system.

The white paper is published here:

https://www.microsoft.com/en-au/download/confirmation.aspx?id=17303

NOTE: URL last validated July 10, 2023
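
To confirm whether processing is spilling to temporary files, the performance counter mentioned in the excerpt above can be watched from a Command Prompt while the RiskFabric Processing job runs. The commands below are a sketch only; the counter object name varies by SSAS version and instance (it may appear as MSAS<version> rather than MSOLAP), so verify the exact name with typeperf -q or Performance Monitor before relying on it:

  rem Default SSAS instance, sampled every 5 seconds (stop with Ctrl+C)
  typeperf "\MSOLAP:Proc Aggregations\Temp file bytes written/sec" -si 5

  rem Named SSAS instance
  typeperf "\MSOLAP$<instance-name>:Proc Aggregations\Temp file bytes written/sec" -si 5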

Resolution

Increase the size of the disk identified in the error, or move the RiskFabric cube's data and temp directories to separate locations on the SQL Server Analysis Services (SSAS) server.

If you require assistance with increasing the size of a disk, contact your server administrator.

To change the location of the data and temp directories in SSAS, follow this procedure:

  1. Open SQL Server Management Studio (SSMS)
  2. Connect to the Analysis Services server hosting the RiskFabric cube
  3. In Object Explorer, right-click the name of the server and select Properties
    The Analysis Server Properties window will open
  4. Select the General page
  5. Enable the option labelled Show Advanced (All) Properties
  6. Modify the DataDir and/or TempDir properties as desired
  7. Click the OK button to commit the changes and close the Analysis Server Properties window
  8. Open a Command Prompt as an administrator and restart the SSAS service so the changes take effect
    1. If the RiskFabric cube is installed in the default instance of SSAS, run the following commands, pressing Enter after each:
      net stop "MSSQLServerOLAPService"
      net start "MSSQLServerOLAPService"
    2. If the RiskFabric cube is installed in a named instance of SSAS, run the following commands, pressing Enter after each:
      net stop "MSOLAP$<instance-name>"
      net start "MSOLAP$<instance-name>"