I will be speaking on Aggregations in Power BI for the Irish Community Cafe / Dublin Power BI User Group

I am excited to be presenting on 06/22/20 at 9 am EDT on the topic of using aggregations in Power BI. Meetup info for the Dublin Power BI User Group and session details are below. #PowerBI #PBIUG

https://www.meetup.com/DublinPUG/events/270402348/

Accompanying older blog post here:

https://shabnamwatson.wordpress.com/2019/11/21/aggregations-in-power-bi/

Aggregations and Time Series calculations in Power BI

I have received a couple of questions about Aggregations in Power BI and whether they can be used to cover time series calculations such as Year to Date, Quarter to Date, and Month to Date. The answer is yes. Since time series calculations break down into calculations over a series of days, an aggregation table defined at the day level, with the basic summarization methods (min, max, sum, count) and the right relationship with a Date dimension, can answer Year to Date, Quarter to Date, and Month to Date calculations.
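
For example, a typical YTD measure (a quick sketch, using the same table and column names as the model in this post) is just a day-level sum evaluated over the set of dates that DATESYTD returns, which is exactly the kind of question a day-level aggregation table can answer:

YTD Sales Amount :=
CALCULATE (
    SUM ( 'Internet Sales'[Sales Amount] ),  -- a basic SUM at day grain, which a day-level aggregation can satisfy
    DATESYTD ( 'Date'[Date] )                -- expands to the set of dates from January 1 through the current date
)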

Let’s take a quick look at one such calculation and how it can be covered with an aggregation. I am going to use the same version of the AdventureWorks sample database and Power BI model that I used in my previous blog post on aggregations, with a few changes.

Sample Queries

My goal is to cover the following two queries, which both return Year to Date Sales Amount by Date, so that they can be answered from an aggregation table instead of the main Sales table:

The first query uses the DATESYTD function to generate a set of dates to sum Sales Amount over.

EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Date],
    "Sales Amount", 
    CALCULATE(SUM ( 'Internet Sales'[Sales Amount] ) ),
    "YTD Sales Amount",
    CALCULATE(SUM ( 'Internet Sales'[Sales Amount] ),
              ALL('Date'),
              DATESYTD('Date'[Date])
              )
)
ORDER BY 'Date'[Date]

The above query can also be written using the TOTALYTD function instead; it returns the same results.

EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Date],
    "Sales Amount", 
    CALCULATE(SUM ( 'Internet Sales'[Sales Amount] ) ),
    "YTD Sales Amount",
    TOTALYTD(SUM ( 'Internet Sales'[Sales Amount] ), 'Date'[Date])
)
ORDER BY 'Date'[Date]

If you were to graph these results, you would see something like this:

Creating the Aggregation table

I created an aggregation table at the day level in the AdventureWorks database using the following SQL code:

SELECT	fs.DateKey,
		SUM([Sales Amount]) AS [Sales Amount]
INTO dbo.Sales_Agg4_By_Day
FROM [dbo].[vFactInternetSales] fs
GROUP BY fs.DateKey

This table has 1,024 rows, which is much smaller than the original sales table with 60,398 rows.

Power BI Model

I imported this table into the model and connected it to the Date table on DateKey. Notice how in the model below, this relationship is “1 to many” with the aggregation table on the many side. I will explain this in more detail in the next section.

Relationship between the Sales_Agg4_By_Day and Date tables

When I first connected the Sales_Agg4_By_Day and Date tables, Power BI created a 1-to-1 relationship between the two tables with a bi-directional cross filter. I had to change this before I could configure the aggregation table; otherwise, the interface would not allow it and gave a series of warning messages, including:

An aggregation table cannot have a bidirectional filter.
An aggregation table cannot be referenced by a one to one relationship.

Table storage modes

I explained table storage modes in detail in my previous blog post on Aggregations. Here everything is the same as it was in that post with the exception of the Date table:

  • The Internet Sales and Sales Territory tables are in Direct Query storage mode.
  • The Product tables are in Dual storage mode.
  • The Sales_Agg4_By_Day table is in Import storage mode.

The Date dimension table is in Dual storage mode, which is a change from my previous post where it was left in Direct Query. Here, its storage mode must be set to Dual; otherwise, even after the Sales_Agg4_By_Day table is configured as an aggregation, it won’t be used to cover queries. This is because the relationship between the Sales_Agg4_By_Day and Date tables would be considered a weak relationship:
One side in Import, the other side in Direct Query.

It is interesting that in this case, Power BI won’t stop you from configuring the aggregation; it just won’t be used. If you use DAX Studio or another tool to check the usage of the aggregation, you will see a failed attempt at using the aggregation table, and the only message you get is that no mapping was found for the DateKey column, even though you have a relationship in place. There is no mention of a weak relationship in the reason for the miss, but the weak relationship is the main culprit here.

It is important to remember:

Aggregation hits based on relationships are only considered when the tables involved in the query have the same storage mode, or at least one side is set to Dual so that it can change its nature on demand.

Configuring the Sales_Agg4_By_Day table

Configuring the Sales_Agg4_By_Day table is easy. The following picture shows the details:

Since there is already a relationship between the Date and Sales_Agg4_By_Day tables, there is no need to fill out the row for DateKey in the aggregation properties window.

Profiling the sample queries

I used DAX Studio to connect to the Power BI model. I ran the sample queries after configuring the aggregation, with the Server Timings option selected, so that I could see the details of how each query was answered. (Alternatively, you could use SQL Server Profiler/Extended Events and run a trace against your Power BI model to monitor the same events, but that is a lot more work.)

Both queries were able to use the aggregation table instead of the main sales table. The following shows the details of the “Aggregate Rewrite Attempt” event. It shows that Power BI was smart enough to use the aggregation after it broke the TOTALYTD and DATESYTD functions down into a series of dates.

Conclusion

Time series functions that break down to a series of dates can be covered by an aggregation table defined at the day level.
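
By the same logic, QTD and MTD variations of the sample queries should be answerable from the same day-level aggregation table. Here is a sketch along the lines of the earlier queries (not one of the two queries profiled above) that uses DATESQTD and DATESMTD:

EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Date],
    "QTD Sales Amount",
    CALCULATE ( SUM ( 'Internet Sales'[Sales Amount] ), DATESQTD ( 'Date'[Date] ) ),  -- quarter to date
    "MTD Sales Amount",
    CALCULATE ( SUM ( 'Internet Sales'[Sales Amount] ), DATESMTD ( 'Date'[Date] ) )   -- month to date
)
ORDER BY 'Date'[Date]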

Connecting live from Power BI Desktop to SSAS generates this error: The Server you’re trying to connect to doesn’t have any models or you don’t have permissions to access them.

I recently ran into an error while connecting live from Power BI Desktop to a SSAS server. Everything was on-premises. There was no cloud component involved. I had full admin rights to the SSAS server and could see all databases and models from SSMS and other tools, but I was getting this error from Power BI Desktop.

The Server you’re trying to connect to doesn’t have any models or you don’t have permissions to access them.

If I explicitly specified the name of the database I wanted to connect to, then it would connect and show me the database contents, but if I did not specify the database name, I would get this error. The question is why? Obviously it wasn’t a permissions problem, because I could connect to that database if I specified its name, I was an admin on the SSAS server, and I could see that database and other databases from other front-end tools such as Excel and SSMS. You can get this exact same error because of permission issues, and in those cases it is a valid error. Here I am only exploring the case where all the correct permissions are in place and I still get the error.

The problem was …

The default database on that server had been created with a script as an empty shell database for someone to use later. It had stayed empty and had never been deployed to. This empty database was causing the problem. Here is an example of a simple JSON script that creates an empty SSAS database:

{
  "create": {
    "database": {
      "name": "test 3",
      "compatibilityLevel": 1200
    }
  }
}

Note: This exact same error can happen with SSAS Multidimensional if the first database created on the server has a cube that is in an unprocessed state.

Default Database on SSAS

The first database that is created on a SSAS server becomes the default database. There is no server property that you can set to modify this later; at least I could not find one through SSMS. I suspect this is stored in one of the metadata files, but I did not find it readily. On the other hand, as I will explain later, there is a SSAS property that can disable automatically setting the first database as the default database, but that does not give you a choice to set the default database explicitly. Changing this setting has its own implications that I will get into later in this post.

How I found out the cause of the error

I started a SQL Server Profiler trace on the SSAS server (you can use Extended Events as well) and attempted the same live connection from Power BI Desktop with different databases on the server. I noticed something very interesting. The query that Power BI Desktop submits to SSAS to discover the list of databases depends on the following:

  1. Whether the SSAS server is completely empty (no databases exist on the server).
  2. The state of the default database (empty, or unprocessed for Multidimensional).
  3. The AutoDetectDefaultInitialCatalog SSAS server setting.

Let’s explore these.

1) The server is completely empty.

In this case, there are no databases on the server. Power BI Desktop can connect live to this server successfully; I got an empty list of databases, as expected.

Query Profiler With Empty Server
Power BI Desktop can live connect to an empty server

In this case, SQL Server Profiler captured the following query from Power BI Desktop:

Select [CATALOG_NAME], [DESCRIPTION], [DATE_MODIFIED], [COMPATIBILITY_LEVEL]
from $System.DBSCHEMA_CATALOGS 
Query Profiler With Empty Server

Notice how this query was run against the server and that the DatabaseName column is empty in the trace shown above. This is possible because this is a server level query. It uses one of the SSAS DMVs (Dynamic Management Views) to get metadata. It returns a list of all databases on the server. You can run this query in SSMS in a DMX query window. Here is an example of what this query would have returned if there were databases on this server:

DMX Query 1 Results

To summarize, Power BI Desktop worked successfully in this case. There were no databases to show and no errors were generated. The reason I included this case is for you to see how the query changes in the next case.

2) The default database is empty.

Continuing with the same empty server, I scripted an empty database using the script shown previously. The first database created on a server becomes the default database on the server. This is what I could see at this point from SSMS on this server:

Empty Scripted Database From SSMS

I went to Power BI Desktop and attempted to live connect without specifying the database name. I got this error:

Main Error

The question is: What did adding an empty scripted database do that changed Power BI Desktop’s behavior?

Here is the DMX query that Power BI Desktop submitted this time, which is different from the one in case 1:

Select [CUBE_NAME] from $System.MDSCHEMA_CUBES where [CUBE_SOURCE] = 1
Query Profiler trace with one empty scripted database

Notice how in the Profiler trace, the Database is set to “test 3”.

If I run this query in SSMS, it returns no results because that database is empty. There are no models defined in it yet. If there were models in the database, I would have seen something like the following:

DMX Query 2 Results

Note that this DMV has existed since SSAS Multidimensional, hence the name Cube, but it works against both Tabular and Multidimensional SSAS databases, as long as they have a model/processed cube.

3) An empty default database and a non-empty database

To test this further, I added another database from a backup file of AdventureWorks. This is what SSMS was showing:

AdventureWorks added from backup, as seen in SSMS

I went back to Power BI Desktop and attempted to live connect as before without specifying a database name. Power BI Desktop still generated the same error, even though AdventureWorks was fully browsable from SSMS.

This is the query that Power BI Desktop submitted to SSAS, still just one query and different from the query in case 1:

Select [CUBE_NAME] from $System.MDSCHEMA_CUBES where [CUBE_SOURCE] = 1

Query Profiler trace with one empty scripted database and AdventureWorks added

Notice how this query still ran against the default database, “test 3”, the first database created on the server.

This is the exact case I was running into in the beginning. I had several databases with models in them that were fully browsable from SSMS and Excel. There was one empty database that happened to be the default database and was causing the error because the query ran against it and returned no results!

When the above query returns no values, Power BI Desktop shows the error message. This seems like a bug that will hopefully get fixed at some point.

Obviously, one way to get around this error is to either specify the database name you are trying to connect to or remove the empty database. In the latter case, another database becomes the default database. You can also delete the empty database and script it again. It will look exactly the same as before, except that it won’t be the default database. I tried this case to make sure this whole problem was really a default database issue and not the existence of an empty database.

4) A non-empty default database and an empty database

I deleted “test 3” and created it again with the same script as before. This time it was not the default database anymore. SSMS looked exactly as before:

AdventureWorks and the re-created empty database, as seen in SSMS

I went to Power BI Desktop and tried to live connect. This time it worked! This proved that the issue was not caused by the mere existence of an empty database; the problem was that the database was both empty and the default database at the same time. This is what the traces showed this time:

Profiler trace with one non-empty database as default and one empty database

Power BI Desktop submitted three DMX queries, two of which are identical and ran against AdventureWorks (the default database):

Select [CUBE_NAME] from $System.MDSCHEMA_CUBES where [CUBE_SOURCE] = 1

This was followed by this query:

Select [CATALOG_NAME], [DESCRIPTION], [DATE_MODIFIED], [COMPATIBILITY_LEVEL]
from $System.DBSCHEMA_CATALOGS 

This is exactly the same query that ran in case 1 (empty server). In case 2, this query did not run because the first query had returned no values and Power BI Desktop stopped and displayed the error message.

5) AutoDetectDefaultInitialCatalog

How does Power BI Desktop know which database is the default database? As mentioned before, the DMX queries were all submitted in the context of the default database, but how? It is not that Power BI Desktop knows which database is the default database; it is that it creates a connection to SSAS without specifying a database name (since I did not explicitly specify one) and lands on the default database.

This behavior of creating a connection and landing on the default database is similar to what happens in SSMS when you connect to a SSAS server. You always get connected to the default database, and you can change it by changing the name of the database in a drop-down box. This behavior is controlled by a setting in SSAS called AutoDetectDefaultInitialCatalog.

AutoDetectDefaultInitialCatalog is one of the Advanced server properties and it is set to True by default.

SSAS Server Properties AutoSetDefaultInitialCatalog

Warning!

Make sure you understand the implications of changing this setting, especially on a production database, before you change it!

When this setting is set to True and you connect to the server using SSMS, you get connected to the default database right away. You can see the default database in the drop-down to the left of the Execute query button.

New Query With AutoDetectDefaultDatabase

On the other hand, when AutoDetectDefaultInitialCatalog is set to False and you connect to the server using SSMS, you get a query window that is not connected to any database. You can see this in the drop-down to the left of the Execute query button.

SSAS Server Properties AutoSetDefaultInitialCatalog Set To False
New Query With AutoDetectDefaultDatabase Set To False

With this setting set to False, I repeated cases 2 and 3. This time everything worked, even with the empty default database.

AutoDetectDefaultInitialCatalog Set To False One Empty Database PBI Desktop

How did this work? Power BI Desktop only submitted the following query:

Select [CATALOG_NAME], [DESCRIPTION], [DATE_MODIFIED], [COMPATIBILITY_LEVEL]
from $System.DBSCHEMA_CATALOGS 
Query Profiler trace with one empty scripted database and AdventureWorks added, with AutoDetectDefaultInitialCatalog set to False

Note how this query ran against the server and no particular database. In this case, Power BI Desktop was able to show the list of all databases, even though the first database created (“test 3”) was still empty.

Please remember that changing this setting can have other implications, so I don’t recommend you change it permanently. I only did this for testing. The problem you may run into if you set this setting to False is that it can break any code that depends on the connection landing on the default database. I have seen this in the past at one client, where custom .NET code was getting metadata from a SSAS database without realizing that it was only getting results because that database happened to be the default database. Of course, this was not best practice; however, the point is that you should change a production server’s settings very carefully and be ready to work through unforeseen consequences.

Summary

Power BI Desktop, as of Sept 2019, has a small bug where you get an error if you try to live connect to a SSAS server and explore the databases on it without specifying a particular database.

The Server you’re trying to connect to doesn’t have any models or you don’t have permissions to access them.

The error happens when the default database is empty on SSAS Tabular, or when the default database is unprocessed on SSAS Multidimensional. You can get around this by removing and re-adding the empty/unprocessed database so that it is not the default database (assuming you have other good databases on that server). The other workaround is to change AutoDetectDefaultInitialCatalog to False, but this can have unforeseen implications for custom code connecting to the server, as explained previously.

Note: You can get this error under other valid circumstances as well. For example, if you truly do not have access to a server. In that case, the error is valid.

Analysis Services Processing SSIS Task in SSDT 2015/2017 fails when TargetServerVersion is set to SQL Server 2016

I ran into this problem a while ago at a client. They upgraded from Visual Studio 2013 to 2015 and the SSAS processing tasks started to error out immediately. The solution turned out to be setting the TargetServerVersion to anything but SQL Server 2016. In this case, it was set to 2014 and that fixed the error.

Recently I ran into this post https://twitter.com/SQLKohai/status/994335086425399297 by Matt Cushing (@SQLKohai) and decided to dig in more. Initially when I tested it, all was working fine. After I installed SSDT 2015 to test, I started getting the same error in SSDT 2017. I played around with the DLL and got SSDT 2017 to work with all TargetServerVersions again. In the end I managed to break it again after going through an uninstall and reinstall of all versions of SSDT. The reason I did the reinstall of SSDT was that I thought I might have had a broken registry entry that I was hoping the installation would fix. This did not work!

Before I get started I would like to remind you that:

As far as SSAS is concerned, the XMLA generated by the processing task works against SSAS 2012 to 2017. Unless you are worried about other tasks in your SSIS project, setting the TargetServerVersion property of the SSIS project to SQL Server 2017 or SQL Server 2014 solves the problem and does not make any difference for SSAS.

If you are interested in more details about the different things I tried, read on:

When I tried SSDT for Visual Studio 2017 (version 15.6.7), the problem seemed to have been fixed. The SSAS processing task worked with all TargetServerVersions.

Then I installed SSDT for Visual Studio 2015 to test the problem. The default setting for the TargetServerVersion in the latest version of SSDT for VS 2015 (14.0.23107.0) is SQL Server 2017. Once I changed it to SQL Server 2016, I ran into the same error others had reported. I got an error message right away if I clicked on an existing SSAS processing task, and I was not able to set up a new SSAS processing task.

Here is the error after double clicking on an existing task:

Method not found: ‘Microsoft.AnalysisServices.ErrorConfiguration Microsoft.AnalysisServices.Commands.BatchCommand.get_ErrorConfiguration()

SSIS_Error_2

If I run the package, it fails with the following error:

[Analysis Services Processing Task] Error: Could not load file or assembly ‘Microsoft.AnalysisServices.AdomdClientUI, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91’ or one of its dependencies. The system cannot find the file specified.

Doing some research, it seems like others had run into this problem as well. See below: https://www.sqlservercentral.com/Forums/PrintTopic1931084.aspx

The problem seems to happen because AdomdClientUI.dll has somehow become unregistered. Some were able to solve the problem by copying it into another folder where it was expected to be.

Then I searched for the DLL and found copies here:

C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\PrivateAssemblies\Business Intelligence Semantic Model

C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\CommonExtensions\Microsoft\BIShared

I decided to try registering the DLL before trying the copy solutions. I copied the DLL to the C:\Windows\System32 directory and tried to register it:

C:\WINDOWS\system32>regsvr32 Microsoft.AnalysisServices.AdomdClientUI.dll

I got this error, which makes sense in hindsight: regsvr32 only registers COM DLLs that export a DllRegisterServer entry point, and Microsoft.AnalysisServices.AdomdClientUI is a managed .NET assembly, so it cannot be registered this way:

Regsvr32 command and its error message

Here is the version information from the DLL:

AdomdClientUI DLL version information

Then I decided to use Process Monitor to see if I could find out where VS was looking for the file, as someone had suggested in the previous link.

For this part, I used SSDT 2017.

It is interesting to see that when I switch the TargetServerVersion, the DLL is accessed differently:

First, starting with TargetServerVersion = SQL Server 2017:

  1. When I double click on the package, there are no entries in ProcessMonitor for the dll.
  2. When I click on the task, there are no entries for the dll.
  3. When I run the task, a lot of entries are generated. Some show a Result of PATH NOT FOUND. Interestingly, the processing task runs successfully.

Process Monitor results when running the package in SSDT 2017 with TargetServerVersion 2017

Now switching the TargetServerVersion to SQL Server 2016:

Right after the switch: Notice the process name is DtsDebugHost.

Process Monitor results right after the switch (SSDT 2017, TargetServerVersion 2016)

After double clicking on the package: Note that the process name is devenv.exe.

Process Monitor results after double clicking on the package (SSDT 2017, TargetServerVersion 2016)

After double clicking on the package, I get the error message as before

SSIS_Error_2

but no entries show up in Process Monitor.

If I run the package it runs successfully!

While I cannot actually edit the existing processing task, I can add a new one:

Adding a new processing task in SSDT 2017 with TargetServerVersion 2016

Note the warning at the bottom. At this point, it won’t let me save it.

Copying the DLL over:

Source:

C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\CommonExtensions\Microsoft\BIShared

Copied To:

C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Tasks

I still got the same error message. This time the Process Monitor results were a little bit different:

Process Monitor results after the first file copy

Same results from SSDT 2015:
Process Monitor results after the first file copy (SSDT 2015)

The path that DtsDebugHost is looking for does not exist. I decided to create the gnarly-looking folders and see what happens:

I chose the first entry in Process Monitor. If you right-click on a row, you can get the properties and copy the path so that you can create the exact same folder. I copied the DLL from

C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\CommonExtensions\Microsoft\BIShared

to

  1. C:\WINDOWS\Microsoft.Net\assembly\GAC_32\Microsoft.AnalysisServices.AdomdClientUI\v4.0_13.0.0.0__89845dcd8080cc91
  2. C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.AdomdClientUI\v4.0_13.0.0.0__89845dcd8080cc91
  3. C:\WINDOWS\Microsoft.Net\assembly\GAC_32\Microsoft.AnalysisServices.AdomdClientUI\v4.0_14.0.0.0__89845dcd8080cc91
  4. C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.AdomdClientUI\v4.0_14.0.0.0__89845dcd8080cc91

and everything started to work fine for SSDT 2017! The processing task ran successfully! (These folders are part of the .NET Global Assembly Cache, so copying the DLL there amounts to a manual GAC install.)

Unfortunately I still got the same error from SSDT 2015.

Note: The folders may be different on your machine. Use Process Monitor to find out where DtsDebugHost is looking for the DLL.

Uninstalling SSDT 2017 and others…

It looks to me like there is still some kind of registry problem. Remember that SSDT 2017 was working fine until I installed SSDT 2015. I am going to uninstall SSDT 2017 and 2015 and any other remnants and start over!

This is what I had in Control Panel before I started uninstalling. I have four versions/flavors of SQL Server Data Tools! Time to clean up!

SQL Server Data Tools entries in Control Panel

First, I uninstalled Microsoft SQL Server Data Tools for Visual Studio 2017 (SSDT).

Then I uninstalled Microsoft SQL Server Data Tools – Visual Studio 2017.


Uninstalling SSDT for Visual Studio 2017
Interestingly, during this uninstall there is a mention of SQL Server 2016 AMO.

Then I uninstalled Microsoft SQL Server Data Tools – enu (14.0.61712.050), SQL Server Data Tools – Visual Studio 2015, and Prerequisites for SSDT.

Finally I uninstalled all versions of Visual Studio Tools for Applications. I think at some point I had installed this for a test.

 

Reinstalling

You can get SSDT for VS 2015 and VS 2017 from here:

https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017

I started with SSDT for VS 2015. Unfortunately, the error is back! I am at a loss with this one!

The same error in SSDT 2015

I installed SSDT for Visual Studio 2017. The same error is back for SSDT 2017. I can still run the package successfully, though.

Conclusion:

I believe this is some kind of bug in SSDT for Visual Studio 2015 but I don’t know how to fix it.

Some people (including myself, for the first round) were able to get around this problem by copying AdomdClientUI.dll to a couple of folders.

Ultimately, unless you need the TargetServerVersion to be set to SQL Server 2016 for other parts of your SSIS project, you can get around this problem by setting it to SQL Server 2017 or 2014. All these settings generate the same XMLA processing command for SSAS.