Sending Users to the Geographically Nearest Azure Analysis Services


I’ve been preparing my Automating Azure Analysis Services talk for the May 19, 2018 SQL Saturday in Dallas. The session abstract is:

The elasticity of the cloud lets you provide a better user experience instead of breaking the bank running your solution at the highest scale 24 hours a day. This demo-heavy session will show how to automate Azure Analysis Services. We will look at solutions to the following questions: How do you schedule a refresh of the data in your Azure Analysis Services model? How do you pause the server on the weekends? Better yet, how can you pause it when the last user disconnects and start it automatically when the first user connects in the morning? How do you scale to meet peak demand each day? How can Azure Analysis Services load balance across servers in multiple countries so that the user connects to the geographically closest server? How do you extend your on-premises SSAS solution into Azure Analysis Services?

This blog post is about the load-balancing question in that abstract: how can Azure Analysis Services load balance across servers in multiple countries so that the user connects to the geographically closest server? In other words, it is about users from a multinational corporation connecting automatically to the geographically nearest Azure Analysis Services instance. Why? Performance. High availability. Cost optimization. Even data security involving national borders.


The Ingredients

Most Azure solutions are a bit like a soup pot where you toss in various Azure technologies as ingredients to create something greater than the sum of their parts. In this instance, we’ll mince up some Azure Traffic Manager, some Azure Web Apps, and some Azure Analysis Services to create a pretty tasty solution.



How Users Connect

In the diagram above, we have some worldwide users. Everyone will connect to Azure Analysis Services via the server name they were provided, which is a link:// URL. That server name could easily have been a vanity URL, also in link:// form, using custom DNS. This uses the server aliases feature of Azure Analysis Services and the latest client drivers. (Notice that Christian Wade mentioned the idea of geographic redirection in that article!)
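The alias mechanism is straightforward to reason about: the client driver fetches the link:// alias URL over HTTPS and treats the returned asazure:// string as the real server name. Here is a rough sketch of that resolution step (a simplification for illustration, not the driver's actual code; the URL handling and validation are my assumptions):

```python
# Sketch of how a link:// server alias gets resolved: fetch the alias
# URL over HTTPS and use the returned text as the asazure:// server name.
from urllib.parse import urlparse
from urllib.request import urlopen


def resolve_alias(server_name: str, fetch=lambda url: urlopen(url).read()) -> str:
    """Resolve a link:// alias to an asazure:// server name.

    fetch is injectable so the logic can be exercised without a network.
    """
    parts = urlparse(server_name)
    if parts.scheme != "link":
        return server_name  # already a real server name; nothing to resolve

    # link://host/path is fetched as https://host/path
    https_url = "https://" + parts.netloc + parts.path
    body = fetch(https_url).decode("utf-8").strip()
    if not body.startswith("asazure://"):
        raise ValueError("alias did not return an asazure:// server name")
    return body
```

This also makes clear why a valid SSL certificate matters (discussed further below): the resolution happens over HTTPS.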


When a user in North America connects, Azure Traffic Manager sends them to the Azure Analysis Services instance in South Central US because it determined that was geographically closest. But when a user in Asia connects, it determines that the Japan East instance of Azure AS is closest. This all happens automatically and transparently to the user. And when the Japan Azure AS instance and Japan web app are paused for the night, any Japanese users who are working late will end up connecting to the South Central US instance. So this solution can also add an extra layer of high availability or cost optimization.

How does this work?

Azure Traffic Manager is just a DNS-level service which understands user geography based upon IP addresses and which understands the health of your backend services. It pairs these two pieces of information together and decides where to route a user by returning the appropriate DNS response. In this instance, we tell Azure Traffic Manager to route based upon the “Performance” routing method, which redirects you to the geo that will have the lowest latency for you. However, review the other routing methods to determine which one best matches your needs.
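To make the decision concrete, here is a toy model of the Performance routing method (my illustration only, not how Traffic Manager is actually implemented): among the endpoints whose health probes pass, pick the one with the lowest latency for the caller's network.

```python
# Toy model of Performance routing: healthiest + lowest latency wins.
def route(endpoints: dict, latencies_ms: dict) -> str:
    """endpoints maps name -> healthy?; latencies_ms maps name -> latency."""
    healthy = [name for name, ok in endpoints.items() if ok]
    if not healthy:
        raise RuntimeError("no healthy endpoints to route to")
    return min(healthy, key=lambda name: latencies_ms[name])
```

Note how this model also captures the failover behavior described above: when the Japan East endpoint is paused for the night, its health probe fails and late-working Japanese users simply land on South Central US.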


You set up endpoints in Azure Traffic Manager to tell it where it can route traffic.


Azure Traffic Manager will monitor those two websites to ensure they are up.

You can use a global DNS lookup tool to resolve the DNS from servers around the world to test that your Azure Traffic Manager routing method is working as expected.


You can see that users in Malaysia, Japan, and China would get routed to the Japan East server while other users would end up in the South Central US server.

Azure Traffic Manager configuration is so easy I literally set it up from a web browser on my phone while I was at a dead stop in traffic.

Setting up the Web Apps

How were the web apps from above set up? In this instance, the web apps are very simple. Create two web apps, one per region. In order to integrate easily with Azure Traffic Manager, you need to choose at least the S1 Standard tier, as lower tiers don’t support Azure Traffic Manager.


Next, edit the website files via a Visual Studio project or other methods such as the in-browser App Service Editor which I will show below.



Unless you have more advanced needs, the code can be very simple. The Japan East web app should return the string asazure://<yourservername> for the Japan East server, while the South Central US web app should return asazure://<yourservername> for the South Central US server.
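For example, here is a minimal sketch of such a web app using Python's standard library (the demo used Azure App Service sites, so this is an illustrative equivalent; the server name below is a placeholder to substitute with your own):

```python
# Minimal per-region web app: returns the region's Azure Analysis
# Services server name as plain text. The name below is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

SERVER_NAME = b"asazure://southcentralus.asazure.windows.net/<yourservername>"


class AliasHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A 200 response both serves the alias and satisfies the
        # Traffic Manager health probe.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(SERVER_NAME)))
        self.end_headers()
        self.wfile.write(SERVER_NAME)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet


def make_server(port: int = 8080) -> HTTPServer:
    return HTTPServer(("0.0.0.0", port), AliasHandler)

# In production: make_server().serve_forever()
```

The Japan East copy of the app is identical except that SERVER_NAME holds the Japan East server's asazure:// name.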


The URL https://<yourwebsite> will work over SSL properly, but if you try to connect via the Traffic Manager URL (or your vanity URL) then you will get an SSL certificate error, since the SSL cert doesn’t match the URL. You need to obtain an SSL certificate for that URL and apply it to the website. Since this was just a demo, not a production environment, I used Let’s Encrypt to create a free SSL cert. Why do we need SSL? The link:// syntax uses HTTPS under the covers and will not work if there’s an invalid certificate.

What if I’m using Analysis Services on a VM?

Connecting to Azure Analysis Services is a little different because it connects over HTTPS and supports the link:// syntax. So how do you accomplish the same solution with Analysis Services on a VM? If you have already enabled msmdpump.dll for HTTP connectivity to Analysis Services, then just point Traffic Manager at those URLs. Assuming you haven’t enabled HTTP connectivity, or don’t want to, you can eliminate the web apps, point Azure Traffic Manager at the public IP of the SSAS server, and have it monitor port 2383 over TCP.
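That TCP monitoring boils down to a connect test: the endpoint counts as healthy if a TCP connection to port 2383 succeeds. A sketch of the equivalent check (illustrative only; Traffic Manager's actual probe logic is more involved):

```python
# Sketch of a TCP health probe like the one Traffic Manager performs
# against the SSAS VM's public IP on port 2383.
import socket


def is_healthy(host: str, port: int = 2383, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```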



What if I Want To Load Balance Across On-Premises SSAS and Azure Analysis Services?

Let’s say you have an on-premises SSAS Tabular install which is heavily used by users in your corporate office. But what if you need to burst to the cloud during busy month-end reporting? Or what if you need to put Azure Analysis Services in Japan for Japanese users but allow your corporate users to continue using their on-premises SSAS server? Well, I suspect this isn’t a supported scenario, since the versions of the SSAS engine on-premises vs. in Azure Analysis Services will be slightly different. But supportability doesn’t quash my curiosity (even if it’s not wise).

If Excel users are your target audience, then have the website return a dynamic ODC file that points to either the on-premises server or the cloud asazure:// server name depending on the location of the user.
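Here is a sketch of the server-selection piece of that idea (the corporate IP range, the host names, and keying off the client IP are all assumptions for illustration; the website would then merge the chosen server name into its ODC template):

```python
# Hypothetical helper: pick which Analysis Services server name to embed
# in the dynamically generated ODC file, based on where the request
# appears to originate. All names and ranges below are placeholders.
from ipaddress import ip_address, ip_network

CORP_NETWORK = ip_network("10.0.0.0/8")        # assumed on-premises address range
ONPREM_SERVER = "onpremssas.corp.example.com"  # on-premises SSAS Tabular
CLOUD_SERVER = "asazure://japaneast.asazure.windows.net/<yourservername>"


def pick_server(client_ip: str) -> str:
    """Corporate users get the on-premises server; everyone else gets Azure AS."""
    if ip_address(client_ip) in CORP_NETWORK:
        return ONPREM_SERVER
    return CLOUD_SERVER
```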

As for other client tools, I haven’t figured out an elegant way. Maybe you will!


I have personally experienced on-premises global projects where users who were near the corporate office enjoyed great SSAS performance while users in Europe or Asia experienced annoyingly sluggish performance due to the network latency and bandwidth required to reach US servers. This solution is a perfect fit for such global rollouts of Azure Analysis Services.