This post covers several methods that can be used to process (i.e. load data into) an Azure AS tabular model. As you will see, not much has changed from the regular on-premises version – which is a very good thing, as it softens the learning curve.
Note: In these examples, I’m using a very basic model comprising 3 tables from the Adventure Works database running on a local (on-premises) VM – so in order for this to work (i.e. for the Azure AS model to process data from the on-premises SQL database) I first had to install the AzureASGateway.
Management Studio – GUI
Management Studio – JSON/TMSL
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "AdventureWorksAzureAS"
      }
    ]
  }
}
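The refresh can also be scoped more narrowly than the whole database – for example, down to a single table. Here’s a minimal variant of the command above; the table name (FactInternetSales) is just a placeholder from the Adventure Works sample, not necessarily one of the 3 tables in my model:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "AdventureWorksAzureAS",
        "table": "FactInternetSales"
      }
    ]
  }
}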
PowerShell
Again, same as it ever was… just remember to use the new syntax/arguments specific to 1200-compatibility-level models (i.e. RefreshType instead of ProcessType)… if you don’t, you will get an error.
Invoke-ProcessASDatabase `
    -Server "asazure://southcentralus.asazure.windows.net/ssas01" `
    -DatabaseName "AdventureWorksAzureAS" `
    -RefreshType Full

Invoke-ProcessASDatabase `
    -Server "asazure://southcentralus.asazure.windows.net/ssas01" `
    -DatabaseName "AdventureWorksAzureAS" `
    -RefreshType ClearValues
Or, alternatively, you can use the Invoke-ASCmd function with a TMSL command (like the one shown above in the Management Studio section)… which, in my opinion, isn’t quite as pretty, but may save some time if you’re already working with TMSL commands.
Invoke-ASCmd `
    -Server "asazure://southcentralus.asazure.windows.net/ssas01" `
    -Database "AdventureWorksAzureAS" `
    -Query "{ `"refresh`": { `"type`": `"full`", `"objects`": [ { `"database`": `"AdventureWorksAzureAS`" } ] } }"
.NET
In some cases it will make sense to use .NET (w/ TOM) – for example if you’re dealing with a complex partitioning scenario and already have the .NET skills in your toolbox.
using System;
using Microsoft.AnalysisServices.Tabular;

namespace ProcessASDatabase_CSharp
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                // connect to the Azure AS server
                Server as_svr = new Server();
                String as_conn_str = "Provider=MSOLAP;Data Source=asazure://southcentralus.asazure.windows.net/ssas01;";
                as_svr.Connect(as_conn_str);

                // request a full refresh of the model and commit the change
                Database as_db = as_svr.Databases.FindByName("AdventureWorksAzureAS");
                as_db.Model.RequestRefresh(RefreshType.Full);
                as_db.Model.SaveChanges();
            }
            catch (Exception e)
            {
                Console.Write(e.ToString());
                Console.ReadKey();
            }
        }
    }
}
For a more elaborate example (from someone who actually knows how to write .NET code) check out Christian Wade’s AsPartitionProcessing project over on GitHub.
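For reference, the kind of partition-level processing that TOM enables can also be expressed directly in TMSL. Here’s a sketch of a partition-scoped refresh – the table and partition names below are placeholders for illustration, not objects from the model above:

{
  "refresh": {
    "type": "dataOnly",
    "objects": [
      {
        "database": "AdventureWorksAzureAS",
        "table": "FactInternetSales",
        "partition": "FactInternetSales_2016"
      }
    ]
  }
}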
Quick Word on Scheduling
In more traditional scenarios, processing will continue to be handled the same as it always was… e.g. scheduled agent job (on-prem or Azure IaaS), SSIS, etc. However, for those who’ve already made a break for the cloud, there are a few new options worth considering.
Azure Data Factory
This is a cloud-based data integration service that automates the movement and transformation of data. The Data Factory service creates data integration solutions that can ingest data from various stores, transform and process the data, and publish the result data back to the data stores.
Long story short, if you’re building a complex data solution in Azure, you’ll most likely be using this service for orchestration. Processing an Azure AS database will hook right in just like any other step… .NET, Azure PowerShell, etc.
Here are a few references to help get you started:
- https://docs.microsoft.com/en-us/azure/data-factory/data-factory-introduction
- https://docs.microsoft.com/en-us/azure/data-factory/data-factory-sdks
- Orchestrate Azure Data Factory pipelines and other Azure Data Platform management tasks using Azure Automation
Azure Functions
This is a new – and very interesting – option that I saw for the first time in the Azure AS presentation at PASS Summit last month. It’s described as an event-based serverless compute experience to accelerate your development that can scale based on demand where you pay only for the resources you consume.
Check back in the coming weeks (once I’ve had more time to play/explore/learn). In the meantime, here are a few references to help you get started:
- https://functions.azure.com
- https://azure.microsoft.com/en-us/get-started/ (see very bottom: “Serverless Computing”)
24 replies on “Processing an Azure AS Database”
[…] Bill Anton shows how to process an Azure Analysis Services tabular model: […]
Hi, thank you for the article. When I try to run Invoke-ProcessASDatabase against Azure AS, I’m getting the following: Invoke-ProcessASDatabase : The value ‘ClaimsToken’ is not supported for the connection string property ‘Integrated Security’.
I tried to specify a Credential parameter but to no avail. Any advice please?
Are you running this from a PowerShell window on a workstation, or from something like Azure Automation?
Hi,
When I try to run Invoke-ProcessASDatabase against Azure AS, I’m getting the following: Invoke-ProcessASDatabase : The value ‘ClaimsToken’ is not supported for the connection string property ‘Integrated Security’
I am executing PowerShell window from Azure VM.
I tried to specify a Credential parameter but still same error. Any advice please?
Hi Mohan – can you show the full command syntax you’re using?
$Credential = Get-Credential
Invoke-ProcessASDatabase `
    -Server "asazure://test.asazure.windows.net/testservice" `
    -DatabaseName "Test_Demo" -Credential $Credential -RefreshType Full
and do you get an error if you try w/out the credential parameter?
and do you get an error if you try w/out the credential parameter?
Yes, I am getting the error.
How are you passing the Azure AS credential?
the credentials were cached ahead of time.
if an account isn’t already cached, you should get an Internet Explorer popup window when you run the Invoke-ProcessASDatabase command for the first time.
try the instructions over on Chris’ blog (https://blog.crossjoin.co.uk/2016/10/27/deleting-cached-logins-for-azure-analysis-services-in-sql-server-management-studio/) to remove cached credentials. Then, the next time you run the Invoke-ProcessASDatabase command, you should see the popup requesting Azure credentials.
Thanks for helping…Good post.
After installing the latest client libraries (MSOLAP (amd64), MSOLAP (x86), AMO, ADOMD), I am now able to access Azure AS using Azure PowerShell.
Thanks.
Great Post.
thanks – glad you liked it
Hi,
Have you had luck running the C# script using Azure Batch or Azure Functions? Or have you been running it on-premise?
Thanks
Mehmet
Hi Mehmet –
so far, I’ve only tested the C# code from an on-premises workstation. I did take a quick peek at Azure Functions but wasn’t able to fully wrap my head around it before burning through all my Azure credits for November. Leveraging Azure Functions is high on my to-do list for December.
While processing the Azure AS from PowerShell I am getting an error: “Invoke-ProcessASDatabase is not recognized as the name of a cmdlet.” Let me know if I need to import any libs.
Hi Anand,
Yes, you will need to import the SQLPS module / SQLAS cmdlets. This should be handled automagically during SQL Server / Analysis Services installation…
Thanks Bill, I am able to process the cube now after importing the module:
Install-module -Name SqlServer -Scope CurrentUser
https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-ps-module
Hi, I have a scenario where my tabular model has to be processed after completing an ETL run using Azure Data Factory – i.e. once the data load is done, then process the model.
How can I achieve this?
There’s probably a better way to do this… but one option is to use an ADF custom activity and some C# code.
Hi Bill, I have been using the PowerShell method to automatically refresh an Azure AS cube for quite some time now, via a SQL Server job on an Azure VM, but for the last 2 months it has been failing with the below error:
(No changes made to script – it suddenly stopped working, trying to debug it)
Correct the script and reschedule the job. The error information returned by PowerShell is: ‘Object reference not set to an instance of an object. ‘. Process Exit Code -1. The step failed.
The interesting point is that the same command works fine from a PowerShell window, but not via the SQL Server job. It used to work earlier, but not now for some reason…
Any thoughts? Can you also confirm if the powershell method still works for you.
Is it possible there were changes to the Azure resources?
Under what credentials was the script executing when run as a SQL Agent job? Does that account still have permissions to the model?
Can I use the SSIS “Analysis Services Processing” task to refresh an Azure Analysis Services model?
Please give a reply to the above comment.