
SSAS Memory Configurations for Common Architectures

Earlier this week I was working with a client to configure the server-level memory configuration properties for one of their SSAS servers and the discussion drummed up enough thoughts to prompt this blog post.

Below are the primary SSAS server-level memory configuration properties (along with the descriptions from TechNet BoL) that we will focus on in this post.

LowMemoryLimit
Specifies the amount of memory allocated by Analysis Services at startup. When this limit is reached, the instance will start to slowly clear memory out of caches by closing expired sessions and unloading unused calculations. The server will not release memory below this limit. The default value is 65, which indicates the low memory limit is 65% of physical memory or the virtual address space, whichever is less.

TotalMemoryLimit
Defines a threshold that, when reached, causes the server to deallocate memory more aggressively. The default value is 80, which indicates 80% of physical memory or the virtual address space, whichever is less. Note that TotalMemoryLimit must always be less than HardMemoryLimit.

HardMemoryLimit
Specifies a memory threshold after which the instance aggressively terminates active user sessions to reduce memory usage. All terminated sessions will receive an error about being cancelled by memory pressure. The default value, zero (0), means the HardMemoryLimit will be set to a midway value between TotalMemoryLimit and the total physical memory of the system; if the physical memory of the system is larger than the virtual address space of the process, then virtual address space will be used instead to calculate HardMemoryLimit.

One important point to keep in mind when setting these values is that a number between 1 and 100 represents a percent of total system memory, while a number > 100 represents a static value in bytes – yes, bytes. Also, pay special attention to the default value for HardMemoryLimit – it’s a bit different 😉
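That interpretation rule is easy to sketch in a few lines of Python. The helper name below (`limit_in_bytes`) and the 64 GB server size are just illustrative assumptions for the example – SSAS does this resolution internally:

```python
def limit_in_bytes(setting, total_physical_bytes):
    """Resolve an SSAS memory-limit property to an absolute byte count:
    1-100 is a percent of total memory, anything larger is literal bytes."""
    if 0 < setting <= 100:
        return total_physical_bytes * setting // 100  # percentage of total
    return setting  # values > 100 are taken as literal bytes

GB = 1024 ** 3

# default LowMemoryLimit (65) on a hypothetical 64 GB server:
print(limit_in_bytes(65, 64 * GB))
# a static 45 GB limit expressed in bytes:
print(limit_in_bytes(45 * GB, 64 * GB))
```

This also makes the conversion pain obvious: a 45 GB limit has to be typed into the property window as 48318382080.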

All of these properties can be accessed and configured via SSMS from the General page of the Analysis Server Properties window shown below.

[Image: General page of the Analysis Server Properties window in SSMS]

Note: the General page of the Analysis Server Properties window is basically just a UI for the msmdsrv.ini file.

Ok – now that we’ve got the basics out of the way, let’s run through a few of the more common SSAS architecture scenarios and discuss the primary considerations in the context of server-level memory configuration.

Standalone SSAS Instance

This is by far the most common configuration for production SSAS deployments in large enterprises: 1 server completely dedicated to a single SSAS instance. For this scenario, leaving the primary server-level memory properties (LowMemoryLimit, TotalMemoryLimit, HardMemoryLimit) configured with defaults is fine. There’s very little risk – however, if the server is spec’d with a lot of physical memory (> 64 GB), then you might be leaving a bit too much on the table by not adjusting the defaults.

Below is a table showing the breakdown of the default SSAS memory configuration across a range of systems from 16 GB of memory all the way up to 512 GB.

[Image: table of the default SSAS memory limits across server sizes from 16 GB to 512 GB]

The OS+Misc section at the bottom of the table is where you want to focus your attention. This represents the amount of memory available for the OS and other miscellaneous applications that may be running on your server (e.g. anti-virus). The Low line is calculated based on the HardMemoryLimit (i.e. [Total Physical Memory] – [HardMemoryLimit]) and the High line is calculated based on the TotalMemoryLimit (i.e. [Total Physical Memory] – [TotalMemoryLimit]). Even on some of the most active systems, it’s rare to see the memory available for OS+Misc anywhere close to the Low line of the range – see this post by Richard Lees (b | t) explaining how overly-effective caching mechanisms (SSAS and the Windows file system) can prevent maximum memory consumption.
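The default-limit table can be reproduced from the property descriptions above: Low is 65%, Total is 80%, and Hard defaults to the midway point between Total and physical memory. A quick sketch (the function name is my own; the arithmetic follows the defaults):

```python
def default_limits_gb(total_gb):
    """Default SSAS limits in GB: Low = 65%, Total = 80%,
    Hard = midway between Total and total physical memory."""
    low = total_gb * 0.65
    total = total_gb * 0.80
    hard = (total + total_gb) / 2
    return low, total, hard

for gb in (16, 32, 64, 128, 256, 512):
    low, total, hard = default_limits_gb(gb)
    # OS+Misc range: Low line = physical - Hard, High line = physical - Total
    print(f"{gb:>3} GB server: Low={low:6.1f}  Total={total:6.1f}  "
          f"Hard={hard:6.1f}  OS+Misc={gb - hard:5.1f} to {gb - total:5.1f} GB")
```

Run it and the problem jumps out: on a 512 GB server the defaults leave somewhere between 51 and 102 GB sitting around for OS+Misc.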

Clearly, the amount of memory available for OS+Misc is complete overkill once the system configuration gets to 64 GB and above. At the 256 and 512 GB levels, it is outrageous! So if you’re configuring a standalone SSAS server with > 32 GB of memory, I strongly urge you to override the defaults.

Below is how I’d initially configure these properties for a standalone SSAS server with 64 GB of memory up to 512 GB of memory.

[Image: recommended static memory limits for standalone SSAS servers from 64 GB to 512 GB]

As you can see, I’m using static values (representing the limits in number of “bytes”). Despite the awkwardness of such large numbers (and having to whip out Excel to convert GB to bytes), I consider this good practice anytime you’re using non-default values for these properties. For one, it helps with precision – especially when you’re trying to back into a number. Another reason is that these large numbers stick out when viewing the Analysis Server Properties window – which is very helpful for admins since it makes the intention very clear.

SQL Server DB Instance + SSAS Instance

Another common architecture is one where both the SQL Server relational DB Engine and Analysis Services instances are running on the same server. This is more common in smaller environments where the resource requirements of the system do not call for a separate server (and additional SQL Server license – cha-ching!). In this scenario, it is absolutely crucial to override the defaults for both instances in order to prevent resource contention, memory pressure, and (in the worst case) memory exception errors.

The primary memory configuration properties for a SQL Server relational DB Engine are shown below – followed by the descriptions from TechNet BoL. These can be configured using the Server Properties window (or via the sp_configure command).

[Image: Memory page of the SQL Server Properties window]

Minimum server memory (in MB)
Specifies that SQL Server should start with at least the minimum amount of allocated memory and not release memory below this value. Set this value based on the size and activity of your instance of SQL Server. Always set the option to a reasonable value to ensure that the operating system does not request too much memory from SQL Server and inhibit Windows performance.

Maximum server memory (in MB)
Specifies the maximum amount of memory SQL Server can allocate when it starts and while it runs. This configuration option can be set to a specific value if you know there are multiple applications running at the same time as SQL Server and you want to guarantee that these applications have sufficient memory to run. If these other applications, such as Web or e-mail servers, request memory only as needed, then do not set the option, because SQL Server will release memory to them as needed. However, applications often use whatever memory is available when they start and do not request more if needed. If an application that behaves in this manner runs on the same computer at the same time as SQL Server, set the option to a value that guarantees that the memory required by the application is not allocated by SQL Server. The minimum amounts of memory you can specify for max server memory are 64 megabytes (MB) for 32-bit systems and 128 MB for 64-bit systems.

Unlike the previous architecture (Standalone SSAS Instance) – which only required us to consider the memory requirements for the OS and the Analysis Services instance – we now have to consider the impact of a SQL database instance being thrown into the mix.

From a memory management perspective, the SQL database instance can be described as “courteous” – it pays attention to low memory conditions on the system and will give up some of its memory stash (until it hits the Minimum server memory limit) to accommodate requests from other applications on the server. Analysis Services, on the other hand, can be described as “self-centered” – it only cares about its own memory consumption within the context of its own defined server-level limits.

The implication of these characteristics (“courteous” and “self-centered”) is that the SQL database instance – when left at the default memory settings – could become starved for memory if the SSAS instance is overly aggressive (since the minimum server memory threshold defaults to zero!).

Consider, for example, a single server with 128 GB of physical memory hosting a SQL database instance and a SSAS instance installed with default memory configurations:

[Image: default memory limits for a SQL database instance and SSAS instance sharing a 128 GB server]

It is quite possible for a heavy SSAS session (processing + query load) to approach the TotalMemoryLimit of 102 GB of memory. That would only leave ~20-23 GB of memory for the SQL database instance (leaving a few GB of memory for OS+misc). That may or may not be enough memory for the SQL database instance.

If you’re trying to accommodate a low-latency requirement where data must be refreshed every hour, then you’ll have ETL processes loading data into the relational data warehouse (SQL database instance) and processing operations to update the cube (SQL database instance + SSAS instance) in addition to the regular query load from reports and ad-hoc analysis (SSAS instance + possibly SQL database instance).

On the other hand, maybe your data-latency requirements are a bit more slack and you only need to batch process new data overnight while no users are hitting the system. In that case, your process is probably more serialized such that your SSAS instance is near idle while the ETL process is loading new data into the data warehouse. Then, after the ETL process completes, the SQL database instance sits (mostly) idle (with the exception of the SQL queries sent from SSAS to retrieve the data) while the cubes on the SSAS instance are being processed.

Note: the nightly batch processing scenario used to be the vast majority, but I’m starting to see more and more low data-latency requirements involving near-continuous processing.

So, when it comes to configuring the memory settings for an architecture where the SQL database instance and the SSAS instance reside on the same hardware, here’s how I recommend you approach it for a system where both instances will be fairly active at the same time throughout the day:

  1. Determine how much memory you need for the OS and any miscellaneous applications (e.g. 4 GB to 10 GB depending on the size of the server)
  2. Determine the minimum amount of memory you need for the SQL database instance (e.g. 32 GB)
  3. Set the HardMemoryLimit on the SSAS instance: (Total Physical Memory) – (Memory for OS) – (SQL database instance Min)

You can leave the SQL database instance Max memory setting at the default (2,147,483,647 MB – effectively unlimited) since SQL Server is good about sharing 😉. As for the LowMemoryLimit and TotalMemoryLimit settings on the SSAS instance, they can be set to reasonable values based on the HardMemoryLimit.

Below is an example of a reasonable configuration for a server with 128 GB of physical memory where you’re guaranteeing 6GB of memory for OS+Misc and ensuring the SQL database instance will have at least 32 GB of memory to work with:
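The three steps above can be sketched out for that 128 GB example. The HardMemoryLimit math comes straight from the post; the 85% and 70% back-off factors used to derive TotalMemoryLimit and LowMemoryLimit are my own illustrative assumptions for what “reasonable values based on the HardMemoryLimit” might look like:

```python
GB = 1024 ** 3

total_physical = 128 * GB
os_misc        = 6 * GB    # step 1: memory reserved for OS + misc apps
sql_min        = 32 * GB   # step 2: SQL database instance minimum

# step 3: HardMemoryLimit = physical - OS reserve - SQL minimum
hard  = total_physical - os_misc - sql_min
total = hard * 85 // 100   # TotalMemoryLimit backed off Hard (assumed 85%)
low   = hard * 70 // 100   # LowMemoryLimit backed off further (assumed 70%)

print(f"HardMemoryLimit  = {hard} bytes ({hard / GB:.0f} GB)")
print(f"TotalMemoryLimit = {total} bytes ({total / GB:.1f} GB)")
print(f"LowMemoryLimit   = {low} bytes ({low / GB:.1f} GB)")
```

The byte values printed here are what you would actually type into the Analysis Server Properties window.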

Multiple SSAS Instances

Having multiple SSAS instances running on the same hardware is not too common – but you will see this architecture used from time to time to minimize hardware and licensing costs for QA and Development environments.

Since we are back to dealing with just the OS requirements (in addition to the SSAS instances), the rules are similar to the first architecture (Standalone SSAS Instance) where we simply determine the amount of memory to reserve for the OS and then chop up the remainder based on the estimated needs of the SSAS instances. Depending on the use case, not all of the SSAS instances on the same box will have the same memory requirements (some more, some less) so it helps to know a bit about the intended use of each instance. Ultimately, we just want to ensure the SSAS instances aren’t stepping on each other’s toes …and that there’s enough room for the OS etc.

Below is a basic configuration for a single server with 128 GB of physical memory, 6 GB reserved for the OS, and the rest divided evenly between 2 instances:
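The even split described above is simple arithmetic – reserve the OS memory, then divide the remainder by the instance count. The helper name below is hypothetical; each instance’s share would become its HardMemoryLimit ceiling (in bytes, per the static-value convention):

```python
GB = 1024 ** 3

def per_instance_hard_limit(total_gb, os_reserve_gb, instance_count):
    """Evenly split the memory left after the OS reserve; returns bytes."""
    return (total_gb - os_reserve_gb) * GB // instance_count

hard = per_instance_hard_limit(128, 6, 2)
print(f"HardMemoryLimit per instance: {hard} bytes ({hard // GB} GB)")
```

For an uneven split (e.g. a busy QA instance next to a quiet DEV instance), you’d simply replace the even division with whatever ratio fits each instance’s workload.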

[Image: memory configuration for two SSAS instances evenly splitting the memory remaining after the OS reserve]

You could also go a bit more complicated than this if you desire. For example, if the SSAS instances on the box represent QA and DEV, and your development/release process is one where QA is only needed for the last week of a 4-week sprint, then you might have something like the following for the first 3 weeks of the sprint:

[Image: memory configuration favoring the DEV instance during weeks 1–3 of the sprint]

Then, as you enter the 4th week where QA is getting more heavily involved, or you’re doing some preproduction release testing, you might reconfigure settings to something like the following:

[Image: memory configuration shifted toward the QA instance for week 4 of the sprint]

Note: changes to SSAS server-level memory configuration properties do NOT require a server restart. You can confirm this by looking at the General page of the Analysis Server Properties window in SSMS and checking the restart column.

Wrap Up

In this post we covered some of the primary considerations for configuring the SSAS server-level memory properties. We then ran through a few of the more common architectures (involving SSAS) and discussed a few of the factors to take into account when configuring these properties. These examples were very high-level and light on the specifics – so if you have any questions, feel free to ask in the comments section.

Perhaps in a future post, I’ll add to this one and discuss some of the more advanced SSAS server-level memory properties (e.g. AggregationMemoryLimitMax, AggregationMemoryLimitMin, BufferMemoryLimit) in conjunction with some of the more complex architectures (e.g. Scale-Out Query/Processing servers).

 


12 thoughts on “SSAS Memory Configurations for Common Architectures”

  1. Klaas says:

    Thanks Bill,

    I’ve been looking for some guidance on this subject for a long time.
    I’m still not there, however, since our configuration with SQL,SSAS, SSIS and SSRS on one server with 64GB RAM probably isn’t what you call a common architecture. Soon I’ll probably be forced to add a tabular model to that collection.
    Any hints on how to configure memory settings on this?

    1. Bill says:

      That’s definitely a bit more tricky.

      A good chunk of the load for SSRS and SSIS will be shared by SSAS (assuming reports are against the SSAS) and the database engine…but it can really vary depending on the complexity and frequency of the ETL and the number of users/reports (throughput) for the reporting.

      So I would probably start with something like what’s described above in the “SQL Server DB Instance + SSAS Instance” scenario and pad a bit more for the OS+Misc to accommodate the additional SSRS application load (assuming SSRS installed in native mode). Then just monitor and adjust as necessary.

      1. Klaas says:

        Bill

        I think my SQL Server is not so into sharing. Cube processing failed because of not enough memory.
        On my 64GB machine, I set maxmemory to 49152MB, and for SSAS Hardmemory 51539607552, Totalmemory 45097156608, lowmemory 34359738368. The resource exhaustion detector ( never knew that exists) said at the time of failure the top 3 consumers were: sqlserver using 48GB, msmdsrv 18GB and SSIS 100MB. That’s more than 64GB, and clearly SQLServer wasn’t ‘courteous’ at all.
        The day after I set maxmemory to 24576MB and I haven’t had this problem since then.

  2. Nice post, Bill and one I’ll reference in the future I’m sure. I’ve always been semi-amused at the fact you have to specify memory settings in bytes. Like you’re monitoring the Analysis Server and thinking “One more byte ought to do the trick”!

    1. Bill says:

haha – glad you liked it. I think many agree the “byte by byte” increments are somewhat ridiculous. Even back in 2000 with the first release it didn’t make sense.

  3. Nice post, good job.

    Thomas

    1. Bill says:

      Thanks Thomas!

  4. Nitin says:

    This is simply knowledgeable post. Thanks.

  5. liujianyang says:

    Great post Bill – I have the exact scenario as you’ve described above 128G with SQL and SSAS but I am confused about how do you guarantee x amount of memory for misc + OS 6G if your SQL set to no limits? My understanding is that SQL will take all memory as it could on the server, plus how about SSAS?

    1. Bill says:

yes, SQL Server will take as much memory as it thinks it needs (using the default memory settings)… SSAS (default) memory settings are percentage based and it does a pretty good job of keeping its memory consumption below the high/hard limits.

      it might be easier had i not screwed up the last image in that section showing the recommended limits. The MAX value for SQL should read 32GB… NOT 2048 GB… and the LOW value for SQL can be anything under 32 GB… (just updated it)

      does that make more sense now?

  6. Kraaitje says:

    Thanks, this will help me setting up a new environment.
