
PowerShell: Checking For Large SSAS String-Store Files

In a recent blog post, Why You Need to Routinely ProcessFull, I discussed an issue where we’d run into the 4GB file size limit on SSAS string store files.

In an ideal world, we’d fully reprocess the SSAS databases on a regular basis, which would eliminate this issue…and maybe even provide a (very) minor performance boost by keeping the database data files as compact as possible.  But unfortunately we don’t live in an ideal world, and the task of updating our ETL process to address this particular issue hadn’t quite made it to the top of the priority list when we ran into it again.


As an interim measure, I hacked together a PowerShell script that scans the SSAS database directory for large string-store files (.asstore, .bstore, .ksstore and .string).  If any files of these types are over the defined size limit, the script logs them to an output file and throws an error.

[code]
# ------------------------------------------------------------
# NAME: CheckSSASFileSize.ps1
# AUTHOR: Bill Anton
# DATE: 09/13/2013
# DESC: This script scans the directory for the SSAS database
#       and logs any string-store files over 500MB to the
#       specified output file. This is intended to be used in
#       a recurring job to catch large string-store files
#       (.asstore, .bstore, .ksstore and .string) before they
#       reach the 4GB limit.
# PARAMS: $db_name     name of SSAS database (used for error message)
#         $db_path     path to the root data directory for SSAS database
#         $output_file path to the CSV log of oversized files
#         $size        configures the filter for file size
#         $ext_list    limits scope of search to specified file extensions
# ------------------------------------------------------------
$db_name = "AdventureWorksDW2012"
$db_path = "C:\Program Files\Microsoft SQL Server\MSAS11.SQL2012_MD\OLAP\Data\AdventureWorksDW2012.0.db"
$output_file = "C:\Temp\SSAS_StringStore_List.csv"
$size = 500MB
$ext_list = "*.asstore", "*.bstore", "*.ksstore", "*.string"

$filelist = Get-ChildItem -Path $db_path -Recurse -ErrorAction "SilentlyContinue" -Include $ext_list |
    Where-Object { $_.GetType().Name -eq "FileInfo" } |
    Where-Object { $_.Length -gt $size } |
    Sort-Object -Property Length -Descending |
    Select-Object Name, @{Name="SizeInMB";Expression={$_.Length / 1MB}}, @{Name="Path";Expression={$_.Directory}}

IF ($filelist -ne $NULL)
{
    $filelist | Export-Csv $output_file -NoTypeInformation
    $error_msg = "String store files in SSAS database (" + $db_name + ") are over " + $size + ". Check the following file for details: " + $output_file
    Write-Error $error_msg -ErrorAction Stop
}
[/code]

This script can be run in a recurring SQL Server Agent job to send a notification if any files are approaching the 4GB limit, so we can schedule a full process after hours…you know, before regular processing errors out in the middle of business hours.
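If the instance hosts more than one SSAS database, the same check can be wrapped in a loop over every *.db folder under the data root. A minimal sketch (the data root path, size threshold, and warning format here are illustrative assumptions — adjust for your instance):

```powershell
# Assumed SSAS data root for this instance - adjust as needed
$data_root = "C:\Program Files\Microsoft SQL Server\MSAS11.SQL2012_MD\OLAP\Data"
$size = 500MB
$ext_list = "*.asstore", "*.bstore", "*.ksstore", "*.string"

# Each SSAS database lives in its own *.db folder under the data root
Get-ChildItem -Path $data_root |
    Where-Object { $_.PSIsContainer -and $_.Name -like "*.db" } |
    ForEach-Object {
        $db = $_
        Get-ChildItem -Path $db.FullName -Recurse -Include $ext_list -ErrorAction SilentlyContinue |
            Where-Object { $_.Length -gt $size } |
            ForEach-Object {
                Write-Warning ("{0}: {1} is {2:N0} MB" -f $db.Name, $_.Name, ($_.Length / 1MB))
            }
    }
```

Filtering on `PSIsContainer` (rather than the `-Directory` switch) keeps the sketch compatible with PowerShell 2.0, which was still common on SQL Server 2012-era boxes.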

Keep in mind that the script will require filesystem permissions to read the files in the SSAS directory and to write the output file, which contains the list of files over the limit. When testing on my lab system, I simply created a credential (based on my domain account), a proxy (based on the credential), and a PowerShell Agent Job Step that runs via the proxy.
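Before wiring up the Agent job, it's worth confirming from a PowerShell session running as the proxy's account that both permissions are in place. A quick sanity-check sketch (the paths are the same illustrative ones used in the script above):

```powershell
# Paths from the script above - adjust for your instance
$db_path = "C:\Program Files\Microsoft SQL Server\MSAS11.SQL2012_MD\OLAP\Data\AdventureWorksDW2012.0.db"
$output_file = "C:\Temp\SSAS_StringStore_List.csv"

# Read access: can this account see the SSAS data directory?
if (-not (Test-Path $db_path)) {
    Write-Error "Cannot read SSAS data path: $db_path"
}

# Write access: can this account create (and clean up) the output file?
try {
    "permission test" | Out-File $output_file -ErrorAction Stop
    Remove-Item $output_file
}
catch {
    Write-Error "Cannot write output file: $output_file"
}
```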


On a similar note, Bob Duffy (b | t) recently blogged about an Excel workbook he created which uses a bit of VBA to provide a more detailed view of the SSAS database file sizes.  You can read about it and grab a copy here.
