Using EffectiveUserName To Impersonate in SSAS

It's common to implement role-based security in your Analysis Services cube so that, say, Fred can only see sales from the east region and Frank can only see sales from the west region. Role-based security requires that connections to your cube identify the end user so the cube can secure itself. One way of tackling this is to implement Kerberos so that the credentials of the end user pass from the end user's laptop, to the report server, to the cube.

But what if your end user's laptop is on another company's domain? Or what if your end user's laptop is on a home workgroup? That laptop can't start a Kerberos ticket that will be valid on your domain. One option is to throw ISA Server 2006 into this mix, use it to present the user with a login form over HTTPS, then let ISA start the Kerberos ticket:

For either of the above diagrams, you would set up your data source in SSRS as follows:

We would definitely recommend the above approach, but we realize that ISA and Kerberos add a good deal of complexity to your infrastructure. So the point of this blog post is to mention a good trick for getting role-based security without requiring your end users' laptops to be able to start a Kerberos ticket.

Before I mention the main trick that's the focus of this post, I should mention that enabling basic auth over HTTPS in SSRS is one possibility to solve this problem. It's not as good of a user experience as the single sign-on login form you can customize with ISA Server, but it works.

A Checkbox in SSRS Many People Misunderstand

The following setup of a data source in SSRS will allow you to pass the end user's credentials to the cube without having a Kerberos ticket start at the end user's laptop, and without having ISA Server.


Because you stored credentials, SSRS will open the connection to SSAS using those domain\SSASAdminUser credentials. Because you checked the "Impersonate the authenticated user after a connection has been made to the data source" checkbox, behind the scenes SSRS is going to append the EffectiveUserName property into the connection string, specifying the end user who is running the report. The EffectiveUserName property tells SSAS to pretend to be the specified user so that the results of MDX queries use the role-based security setup for that end user.
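For illustration, the connection string SSRS builds behind the scenes looks something like the following (the server, database, and user names are placeholders, and the exact string SSRS generates may include additional properties):

```
Data Source=YourSSASServer;Initial Catalog=YourDB;EffectiveUserName=domain\FredTheEndUser
```

SSAS then evaluates all role-based security as if domain\FredTheEndUser had opened the connection himself.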

Only SSAS administrators can use the EffectiveUserName property, so the domain\SSASAdminUser must be an SSAS admin. The other requirement is that you must set up Kerberos on the SSAS server. To be clear, you need to set up the SPN on the SSAS server and you need to set up the AD configuration for delegation, but you don't need to set up Kerberos on any other servers besides the SSAS server. So this is a fairly quick way to get up and running without having your whole network set up for Kerberos.
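As a hedged sketch of the SPN piece (the host and account names are placeholders; a default SSAS instance registers under the MSOLAPSvc.3 service class):

```
setspn -S MSOLAPSvc.3/YourSSASServer.yourdomain.com yourdomain\SSASServiceAccount
```

The -S switch checks for duplicate SPNs before adding; on older domain controllers you may need -A instead.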

Since you're using a service account which is an SSAS admin, carefully validate that you've set up the data source correctly to use EffectiveUserName impersonation. You wouldn't want to show your users more data than they're allowed to see just because you forgot to check a box!

SSRS Subscriptions

What happens when an end user tries to create a new subscription on a report with the above data source setup? You get this annoying error message: "Subscriptions cannot be created because the credentials used to run the report are not stored, or if a linked report, the link is no longer valid." It says the credentials have to be stored. Uh. They are. But it's just complaining that we've checked the "impersonate the authenticated user" checkbox. My personal opinion is that this is an oversight. Feel free to vote for this Connect suggestion.

So we have to go for a workaround. The best workaround I could find is to use an expression-based connection string. So edit your report in BIDS, edit the data source, click the fx (expression) button, and enter the following expression-based connection string:

="Data Source=SSAS;Initial Catalog=YourDB;EffectiveUserName=" & User!UserID

After you deploy it, check that the data source has stored credentials. Don't check the "Impersonate the authenticated user" checkbox.

Now users can create their own subscriptions. When the subscriptions run, the user receives report data according to his role-based security.

Data-Driven Subscriptions

Now with your expression-based connection string, you try to create a data-driven subscription. And you get this error message: "Data-driven subscriptions cannot be created because the credentials used to run the report are not stored, the report is using user-defined parameter values, or if a linked report, the link is no longer valid." I believe it's complaining about the use of User!UserID.

You need to create a separate report for data-driven subscriptions. The expression-based connection string in the report needs to be changed to:

="Data Source=SSAS;Initial Catalog=YourDB;EffectiveUserName=" & Parameters!User.Value

Then you need to create a hidden parameter called User. The data-driven subscription then needs to pass the appropriate domain\username into that parameter.
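As a hedged sketch, the dataset behind the data-driven subscription might look like the following, where dbo.ReportSubscribers is a hypothetical table you maintain:

```sql
-- DomainUser feeds the hidden User parameter; Email feeds the delivery address
select Email, DomainUser
from dbo.ReportSubscribers
where ReportName = 'YourReport'
```

Each row produces one subscription delivery run under that row's DomainUser.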

Make sure end users don't have access to this report designed for data-driven subscriptions because if they figure out how to set the User parameter, they could run the report and see someone else's data.

The Dark Side of Expression-Based Connection Strings

Note that expression-based connection strings can't be used in shared data sources, so you have to manage this one report at a time. Plus, you can't even set the expression-based connection string from Report Manager. So you have to set it in the report at development time, which makes it a pain to deploy the same report to dev and prod without changing the RDL. Also, once you have changed your report to use an expression-based connection string, you can't open the query designer. So expression-based connection strings really get in the way of the proper development, deployment, and manageability experience.

So you might consider developing your reports with regular shared data sources. Then after you deploy the reports to the server, you can run .NET code like the RSExpressionBasedConnectionString sample code I worked up. Once you've got the sample code, edit the app.config pointing it at a folder in SSRS. It loops through all the reports in that folder, finds SSAS data sources, and changes them all to expression-based connection strings. You'll obviously need to tweak the code a good deal to have it do exactly what you want in your environment, but hopefully that's a start. I might know a few good consultants if you happen to need help customizing this for your needs. :-)

Teo Lachev also suggested to me that you could use an RDCE (Report Definition Customization Extension) to accomplish the same thing as that .NET code above. But just like with the .NET code, which has to run after the report has been deployed, you'd have to set each report to use the RDCE after it is deployed. Neither option is ideal, but both would work.


Knowing about that "Impersonate the authenticated user" checkbox is a handy trick to have in your bag. Knowing that it's using EffectiveUserName under the covers is very helpful because it lets you use an expression-based connection string to enable subscriptions. But expression-based connection strings are almost more pain than they are worth. So you might consider automating the setup of those expression-based connection strings with some code like what I have mentioned above.

SSRS – Multiple Row Grouping Elements to appear under a Single Column within a Table in a report

This is a guest post by a co-worker Ram Sangam.

In SSRS 2008, when you create multiple row groupings based on the dataset being consumed in the report, SSRS adds these grouping elements as separate columns in the report's table. In the example shown below (Fig.1), Adventure Works DW sales information is being retrieved by territory and product classification (using the database AdventureWorksDW and the tables FactResellerSales, DimProduct, DimProductSubCategory, DimProductCategory and DimSalesTerritory). The row groupings in the table are Geographical Group (GeoGroup), Geographical Region (GeoRegion), Product Category and Product Sub Category. By default, these groupings (Group, Region, Product Category and Product Sub Category) are added as separate columns, making the table wider.



At times, you may want these row grouping elements to be part of a single column within the table, giving it a tree view control experience (as shown below in Fig.2).



Implementing this look and feel requires creating the row groupings in the following sequence.

- Insert a table from the Toolbox onto the report.

- Drag and drop the data field “TotalSales” from the dataset defined for the report on to the Data section of the table that was inserted in the previous step. This sets the dataset property for the table that we will be working with in this example.

- By default, the table is created with 2 rows (a header and a detail) and 3 columns. Delete the 2 columns which are currently not used in the report.

- From the “Row Groups” pane at the bottom of the screen, right-click on the “=(Details)” row and select “Add Group->Parent Group…”. Create the row grouping based on the top-level field (“Geo Group” in this case), being sure to check the option “Add Group Header”, and click OK.

- For the “TotalSales” column, use “SUM(Fields!Totalsales.Value)” as its value.

- From the “Row Groups” pane at the bottom of the screen, right-click on the group we just created and select Group Properties to name the group appropriately for the report (in this case, “grpGeoGroup”).

- From the “Row Groups” pane at the bottom of the screen, right-click on the “grpGeoGroup” row and select “Add Group->Child Group…”. Create the row grouping based on the next level (“Geo Region” in this case), being sure to check the option “Add Group Header”, and click OK.

- For the “TotalSales” column, use “SUM(Fields!Totalsales.Value)” as its value.

- The cell above the new group that was added (in this case “Geo Region”) will be empty by default. Set the value of that cell to the parent group element (in this case “Geo Group”).

- Delete the leftmost column of the report, in this case the column holding the “Geo Group” information.

- Repeat these steps for the additional groups (namely Product Category and Product Sub Category) that are to be created.

- After creating all the groups, from the “Row Groups” pane at the bottom of the screen, right-click on the “=(Details)” group, select Group Properties, and change the visibility option to “Hide”.

- After creating all the groups, the report designer should look similar to the screenshot shown in Fig.3. (Note that in this example, I have added a color scheme and renamed the columns for easier reading.)

- To get the indentation at each level, use each grouping textbox's “TextBox Properties…->Alignment->Padding Options (Left)” setting to achieve the desired effect.

- To get the tree view click experience, use the Row Groups pane at the bottom of the screen and, for each row grouping, set the “Group Properties…->Visibility->Display can be toggled by this report item” option to refer to the parent row group's data element. For example, the Product Sub Category grouping uses the “Product Category” textbox as its toggle item.

- In cases where the dataset is served by Analysis Services and the MDX query returns subtotals in addition to the details, use “Aggregate()” instead of SUM for each of the group totals in the SSRS textbox. In addition, you will need to either delete the “=(Details)” group from the “Row Groups” pane or make the detail row cell separate from its parent group (in this case “Product Sub Category”). The latter is accomplished by right-clicking on the “Product Sub Category” cell and selecting “Split Cells”.


Long Navigation Parameters

I was shocked recently when my nice report that was executing in 2 seconds suddenly took 22 seconds to finish after I enabled navigation (i.e. drillthrough links) in every textbox in the report. The point of this post is to explain what’s going on and suggest workarounds to improve performance.

The Symptoms

This particular report was an MDX report in Reporting Services 2005. If your MDX reports are anything like mine, you end up having an obscene number of multi-select parameters to give the end user maximum flexibility. This report was no exception. In SQL reports, the parameter values are often integers. However, in MDX reports, the parameter values are usually MDX unique member names such as [Product].[Category].&[Bikes]. You can already see the problem. If you have many large multi-select parameters, the concatenated list of selected parameter values becomes large. In fact, for this report, that list ended up being about 30KB long. Multiply that times 3,000 textboxes in the report, and your speedy report is now a 90MB beast.
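To make the arithmetic concrete, here's a quick back-of-the-envelope calculation (the 30KB and 3,000 figures come from the report described above):

```python
param_list_bytes = 30 * 1024   # ~30KB of concatenated multi-select parameter values
textboxes = 3000               # navigation links on every textbox in the report
total_mb = param_list_bytes * textboxes / (1024 * 1024)
print(f"~{total_mb:.0f} MB of drillthrough links")  # roughly the 90MB figure above
```
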

Another interesting thing I noticed is that of the TimeDataRetrieval, TimeProcessing, and TimeRendering columns in ReportServer.dbo.ExecutionLog, the only one that changed when I turned on navigation was the TimeProcessing column.

Also, you might guess that shortening the name of your “Geography” report parameter to “G” would help, but in fact the length of the report parameter name does not matter (only the length of the selected parameter values does).

The final interesting thing I noticed is that the Drillthrough chunk in ReportServerTempDB was massive. Clearly, part of the expense of lots of huge drillthrough links was in constructing and saving that chunk to ReportServerTempDB. The following query shows you the chunks and their sizes:

select sd.SnapshotDataID
,c.ChunkName
,datalength(c.content) as ChunkBytes
from ReportServerTempDB.dbo.SnapshotData sd
join ReportServerTempDB.dbo.SessionData s
on s.SnapshotDataID = sd.SnapshotDataID
join ReportServerTempDB.dbo.ChunkData c
on c.SnapshotDataID = sd.SnapshotDataID
order by sd.CreatedDate, c.ChunkName

I did a little research into how drillthrough works in SSRS 2005. Here is my understanding based solely on observation of the HTML in Report Manager and watching SQL Profiler events against the ReportServer database. When a report is rendered in Report Manager, it does not pass the entire URL for each drillthrough link to the web browser. Instead, it captures the cell coordinates you clicked when you click a drillthrough link, then posts that event back to the server. The server then runs a few SQL queries against ReportServerTempDB to look up info from that ChunkData table. I was rather impressed with how efficient this lookup was, as it did not require reading the entire chunk. From what I can tell, that drillthrough chunk is some sort of binary structure that is internally indexed somehow. When the server goes to look up the full drillthrough URL for the link you just clicked, it grabs the first few segments of that drillthrough chunk (which I’m guessing is loading the “index” for the entire chunk) and then skips right to the segment of the chunk which has the info it needs. So my conclusion is that in SSRS 2005, they have done a good job optimizing the click of a drillthrough link. However, the up-front processing required to optimize that click is rather expensive.

Indications so far are that SSRS 2008 will improve this scenario. I tried this in the November CTP of SSRS 2008 and the report with the navigation rendered the first page 2x faster than on SSRS 2005 SP2. Interestingly enough, the time difference between a report with navigation and a report without navigation now shows up in SSRS 2008 under the TimeRendering column, not the TimeProcessing column. Also, the chunks are stored differently in SSRS 2008. Hopefully someone from Microsoft will fill us in on how they were able to achieve this performance improvement. (Update: Robert Bruckner from Microsoft responded here.)

Workaround #1

Short of waiting for SSRS 2008, what is the workaround? I wrote an MDX sproc called ASSP.KeysStrToSet. You can refer to all the technical details, get some usage examples, and download some sample reports at that link. But let me explain the workaround at a high level. There are two parts.

First, you need to find a way to shorten the parameter values. In general, this is pretty easy as you can just use the surrogate key (i.e. use “1” as the key instead of “[Product].[Product].&[1]”). For optimal performance, change the parameter datatype to Integer if possible. However, there are a couple of more advanced scenarios such as composite keys or nullable keys which add some complexity. The link to the Analysis Services Stored Procedure project above explains all these advanced scenarios and the solution.

Second, you need to figure out a way to let MDX accept those shortened parameter values. That’s where the ASSP.KeysStrToSet function comes into play. Here’s an example. The following two expressions return the same set:

StrToSet("{ [Product].[Product].&[1],[Product].[Product].&[2] }")
ASSP.KeysStrToSet("[Product].[Product]", "1,2")
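
To hook this up in an SSRS dataset, you would typically join the multi-select parameter values into the second argument. A hedged sketch (the @ProductKeys query parameter name and the Product report parameter are assumptions for illustration):

```
' In the MDX query, reference a query parameter:
'   ASSP.KeysStrToSet("[Product].[Product]", @ProductKeys)
' In the dataset's parameter mapping, build the comma-separated key list:
=Join(Parameters!Product.Value, ",")
```

The drillthrough links then carry the short key list rather than the full unique member names.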

How much improvement should you expect from this workaround? Our tests on reports with lots of navigation-enabled textboxes have shown anywhere from a 2x to an 8x performance improvement in the time it takes to render to the browser in Report Manager. Of course, that depends upon how many textboxes have navigation enabled and on how large the drillthrough links are for each textbox.

I suspect utilizing this workaround with SSRS 2008 will make the performance improvement even greater. (Unfortunately, I can’t test that theory on the November CTP at the moment because of a bug.)

If you agree with this approach, please vote for this feature suggestion so that KeysStrToSet functionality becomes available in a built-in MDX function.

Workaround #2

Another possible workaround is to add a “Show Drillthrough?” parameter that controls whether or not to enable drillthrough on the textboxes. Then in the “Jump to Report” property, put the following expression:
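
The screenshot is omitted here, but the expression would be along these lines (“TargetReport” and the ShowDrillthrough parameter name are placeholders):

```
=IIf(Parameters!ShowDrillthrough.Value, "TargetReport", Nothing)
```

Returning Nothing suppresses the drillthrough link entirely, so SSRS skips building it.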


In SSRS 2005 (it’s not necessary in SSRS 2008, apparently), you also have to wrap each parameter you hook up to the drillthrough report in the same way:
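
The screenshot is omitted here, but each parameter expression would be along these lines (Fields!Product.Value stands in for whichever value you pass to the drillthrough report):

```
=IIf(Parameters!ShowDrillthrough.Value, Fields!Product.Value, Nothing)
```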


You should probably use workaround #2 in conjunction with workaround #1 to achieve good performance when they flip the “Show Drillthrough?” parameter to True.

Note that you may have to make that “Show Drillthrough?” parameter a string to work around a bug.


Enabling drillthrough on every cell in a report can drastically slow report rendering if the drillthrough parameters are very long. However, you can adequately work around this problem by shortening the parameter values and dynamically disabling drillthrough when it is not needed.

Retrieving Cell Properties

The point of this post is to discuss how to retrieve cell properties. Specifically, I want to call attention to the ReturnCellProperties setting on an Analysis Services connection and command. I will also talk about how you can use cell properties in Reporting Services reports.

It must be said that Teo Lachev beat me to publishing this tip. I had already drafted most of this post and had a couple of extra things to say on the topic, so I’m publishing it anyway.

In an MDX query, you’ve got dimension properties and cell properties. Dimension properties are intrinsic member properties such as MEMBER_CAPTION in addition to member properties you define in your dimension (such as the Customer attribute having a member property called Email). Cell properties are properties such as VALUE, FORMATTED_VALUE, and FORMAT_STRING which are simply extra information about a cell.

In most cubes, an effort has been made to properly set the FORMAT_STRING for all measures so that percents are formatted as percents and currency is formatted as currency. (Sometimes, the calc script even customizes the FORMAT_STRING for particular slices of the cube.) Therefore, it’s important to know how to retrieve the FORMATTED_VALUE cell property so that you can fully leverage all the cube provides you.

When executing an MDX query, you can retrieve the results in two-dimensional (DataReader/DataSet) or multi-dimensional (CellSet) format. The CellSet object lets you easily access any cell properties the query returns. On the other hand, it is not obvious how it is possible to retrieve any cell property other than VALUE in a DataReader/DataSet.

Putting the ReturnCellProperties=True property on the connection string changes the columns you get in a DataReader/DataSet:

AdomdConnection conn = new AdomdConnection();
conn.ConnectionString =
"Data Source=(local);Initial Catalog=YourDB;ReturnCellProperties=True";

Similarly, setting the ReturnCellProperties property on a command object will produce the same effect:

AdomdCommand cmd = new AdomdCommand();
cmd.Properties.Add("ReturnCellProperties", true);

This is obviously relevant if you’ve got a custom .NET application that executes MDX queries because it allows you to retrieve cell properties. It’s also relevant if you’re using MDX queries in Reporting Services. If you’re using the OLE DB provider to connect to Analysis Services (which is less common in SSRS 2005), you can retrieve cell properties by adding ReturnCellProperties=True to the connection string, which causes the extra columns to show up. However, a “Microsoft SQL Server Analysis Services” data source in Reporting Services automatically sets the ReturnCellProperties property (and the DbpropMsmdFlattened2 property, which I won’t discuss) to true on the underlying command object. That data source also hides the extra columns from you when you’re previewing your dataset, so you just have to trust that they are there.
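
As a hedged sketch of putting this together in a custom .NET app (the server, database, and query are placeholders), you can see the extra columns by enumerating the reader's schema:

```csharp
using System;
using Microsoft.AnalysisServices.AdomdClient;

class CellPropertiesDemo
{
    static void Main()
    {
        using (var conn = new AdomdConnection(
            "Data Source=(local);Initial Catalog=YourDB;ReturnCellProperties=True"))
        {
            conn.Open();
            var cmd = new AdomdCommand(
                "select [Measures].[Internet Sales Amount] on 0, " +
                "[Product].[Category].Members on 1 from [Adventure Works]", conn);
            using (var rdr = cmd.ExecuteReader())
            {
                // With ReturnCellProperties=True, each cell contributes one column
                // per cell property (e.g. VALUE and FORMATTED_VALUE) instead of one.
                for (int i = 0; i < rdr.FieldCount; i++)
                    Console.WriteLine(rdr.GetName(i));
            }
        }
    }
}
```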

select [Measures].[Internet Sales Amount] on 0,
[Product].[Category].Members on 1
from [Adventure Works]

If your MDX query for your report does not specify a CELL PROPERTIES clause (as in the query above), then you get the VALUE and FORMATTED_VALUE properties by default in Reporting Services. Obviously, =Fields!FieldName.Value retrieves the VALUE cell property. You can retrieve the FORMATTED_VALUE cell property with the =Fields!FieldName.FormattedValue expression. More information on all the field properties you can use in SSRS expressions is available here.

select [Measures].[Internet Sales Amount] on 0,
[Product].[Category].Members on 1
from [Adventure Works]

If your MDX query for your report does specify a CELL PROPERTIES clause (as in the query above), you override the defaults. This is useful for instructing Reporting Services to retrieve fewer properties for performance reasons. (For instance, a best practice is to specify “CELL PROPERTIES VALUE” so you don’t retrieve FORMATTED_VALUE if you’re not going to use the FormattedValue expression in your report.) Overriding the default CELL PROPERTIES is also useful to pull back extra properties such as FORMAT_STRING. Then in a report you can retrieve the FORMAT_STRING for a cell by using the =Fields!FieldName("FORMAT_STRING") expression.

There is a known limitation such that you cannot use the SSRS Aggregate expression and FormattedValue together. This limits the usefulness of the FormattedValue property. A workaround is to format the textbox in Reporting Services instead of using the FormattedValue property. This formatting could be hardcoded, but sometimes measures have different format strings for different slices of the cube. So if the format string is not constant, you can make the SSRS formatting dynamic by putting a =Fields!FieldName("FORMAT_STRING") expression on the Format property of a textbox. For footers, it would need to be =First(Fields!FieldName("FORMAT_STRING")). Note that .NET format strings (which SSRS uses) aren’t quite the same as SSAS format strings, so you may need to write a small function in the Code tab of your report and do replacements such as changing Currency to c0.
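
As a hedged sketch of such a Code-tab function (the mappings shown are assumptions; extend them for the format strings your cube actually uses):

```vb
Function TranslateFormatString(ByVal ssasFormat As String) As String
    ' Map common SSAS named format strings to .NET equivalents.
    Select Case ssasFormat
        Case "Currency" : Return "c0"
        Case "Percent" : Return "p1"
        Case Else : Return ssasFormat
    End Select
End Function
```

You would then set the textbox Format property to something like =Code.TranslateFormatString(First(Fields!FieldName("FORMAT_STRING"))).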

As a side note, Darren Gosbell blogged about an easy way to see the connection string properties for SSAS, but unfortunately, ReturnCellProperties is one of several properties that don’t appear in that list. You have to look at this page to see all the available properties. Mosha says that you can use all the XMLA properties on the connection string.

Reporting Services SP2 and MDX Summary Rows

If you have built any Reporting Services reports which use an Analysis Services datasource, SQL Server 2005 SP2 contains a fairly major change to be aware of. I’m going to describe the way it worked before SP2, then I’ll describe the change.

Prior to SQL Server 2005 SP2

There’s been a good deal of frustration about how not all rows from an MDX query are shown as detail rows when used in an SSRS report. For instance, the following query returns two rows.

select [Measures].[Internet Sales Amount] on 0,
{[Product].[Category].[Bikes], [Product].[Category].[All]} on 1
from [Adventure Works]

But if used in an SSRS report, you only get one detail row because SSRS ignores any rows which contain an All member in the tuple defining that row. To explain that differently, the leftmost columns in the dataset originate from the “on rows” or “on 1” axis of your MDX query. If any of those contain an All member, that row will not be shown as a detail row. Notice how the All member is rendered as a null when previewing the dataset:


Of course, you can still use the summary rows from the MDX query by putting the following expression in a table/matrix group or in the header/footer:
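
The expression screenshot is missing here, but it is the Aggregate expression, along these lines (the field name matches the query above):

```
=Aggregate(Fields!Internet_Sales_Amount.Value)
```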


This behavior is sometimes helpful and sometimes annoying, especially for experienced MDX writers who expect the report to match their MDX query. It turns out that you now have an option with SP2.

With SQL Server 2005 SP2

If you reference any field from that dataset as the parameter for the Aggregate function in an expression, then the old behavior continues. But if you don’t, then all rows from the MDX query become detail rows in the report. This is on a per dataset basis, not a per table/matrix basis.

If your reports were built with correct usage of the Aggregate expression, they will work fine in SP2. The only way SP2 could break your report is if your MDX dataset contains summary rows which you never intended to be used. Obviously, that should never be the case, but it is easy to see how a developer who didn’t understand how it worked before SP2 could build a report that might break with SP2. For instance, let’s say you built a report off the query mentioned above and set up the Layout tab as follows:


When this report is run prior to SP2 it looks like:


When this same report is run with SP2 you get totally bogus numbers:

It’s worth carefully testing before you apply SP2 to your production servers.