Dan English's BI Blog

Welcome to my random thoughts in regards to Business Intelligence, databases, and other technologies

Posts Tagged ‘ssrs’

Now Available: Visualizing Data with Microsoft Power View

Posted by denglishbi on June 5, 2012


I am very excited and proud to announce that there is a brand new book available on the market and now in print – Visualizing Data with Microsoft Power View.  When I got home after work yesterday there was a box waiting for me and the books were inside.  It is hard to believe that just over a year and a half ago I received the email from Brian Larson asking – “Ever want to be an author?”

Just got my #powerview book http://www.mhprofessional.com/product.php?isbn=0071780823 #msbi #SharePoint #sqlserver

Here is the outline for the book:

Part I – Power View

  • Chapter 1 – Getting Started
  • Chapter 2 – The Table Visualization
  • Chapter 3 – Additional Visualizations
  • Chapter 4 – Charting
  • Chapter 5 – Bringing Your Data to Life
  • Chapter 6 – Sharing Reports

Part II – Creating a BI Semantic Model (BISM)

  • Chapter 7 – BISM: Getting Started
  • Chapter 8 – Basic BI Semantic Model Design
  • Chapter 9 – BI Semantic Model: Additional Model Features
  • Chapter 10 – Using DAX Expressions
  • Chapter 11 – Deploying Tabular BI Semantic Models

Part III – Appendixes

  • Appendix A – Installing and Configuring the Sample Data
  • Appendix B – Creating a Virtual Learning Environment

The book is loaded with Learn By Doing exercises throughout, and it includes a DVD with the sample data, project files, Power View reports, and videos of all of the exercises.

There will be eBook versions made available as well, and there is even talk of an enhanced one that would be available in the iTunes Apple Store with the videos embedded throughout the chapters for viewing.

So what are you waiting for? Check it out now and let us know what you think – Visualizing Data with Microsoft Power View.

Here is an example of one of the videos for a Learn By Doing exercise:

Posted in Business Intelligence, SQL Server | 18 Comments »

Using Styles in Microsoft Power View

Posted by denglishbi on April 19, 2012


The other night I was working on some final chapter reviews of our upcoming book Visualizing Data with Microsoft Power View and I stumbled across something new in the RTM release of Microsoft Power View.  It is a new feature called Styles that provides theming capabilities for the views in your report files.  Let’s take a quick look at where this new feature is located and what it can do to your views.


Now when you launch Power View in your browser you will see a new tab in the ribbon called Styles next to the Home tab.  When you click on it you will see that you are provided with eight different accent color themes to choose from, as shown above.

So what do these look like?

Accent 1


Accent 2


Accent 3

(same as Accent 2 with exception of 1st color)


Accent 4


Accent 5


Accent 6

(same as Accent 5 with exception of 1st color)


Accent 7

(same as Accent 1 with slightly different blue color)


Accent 8


Now the one thing to note is that Styles will change the colors for the entire view, so this includes all of the items (tables, slicers, etc.), not just one particular visualization.


The other thing to note here is that if you have multiple views in your report file, each view can have its own accent color setup.


So when designing your reports you have some options for colors to select from.  Nothing custom like you have available in Reporting Services reports, but at least you have something, unlike, say, PerformancePoint.

So that is a quick intro and overview of the new Styles option in Power View, hope you enjoyed the quick tour.

If you are looking for more information and guidance don’t forget to check out our book that will be coming out very soon – Visualizing Data with Microsoft Power View.

Posted in Reporting Services, SharePoint | 4 Comments »

Visualizing Data with Microsoft Crescent

Posted by denglishbi on October 7, 2011


Yes, it is official, I am now an author.  Yesterday I noticed that the Visualizing Data with Microsoft Crescent book is now listed on Amazon.  I had been asked by Brian Larson if I wanted to collaborate on writing a book for the new report authoring tool and I of course immediately said ‘YES’. I am also one of the technical editors of his other two books, Reporting Services and Delivering Business Intelligence, for the new SQL11 release.  There are four of us from Superior Consulting Services working on the new Crescent book and so far it has been a very enjoyable experience.  I have been doing the blogging, tweeting, and presenting, so now it is on to something new and actually working towards a published book.

So some insight into the book: this book is going to cover the new report authoring tool ‘Crescent’ that is coming out in the next release, SQL11.  This is a new Silverlight browser-based reporting experience that will be part of the Reporting Services Integrated Mode configuration with SharePoint 2010.  Not only will the book cover the reporting experience, but we will also go over the Business Intelligence Semantic Model (BISM), which is the new terminology being actively talked about now with SQL11.  In addition, there will be samples and videos that will provide a very in-depth learning experience so that you will be able to get up to speed and familiar with both Crescent and BI Semantic Models.

Now you might know that ‘Crescent’ is the code name for this new feature in SQL11, and next week during the PASS 2011 Summit we should have an official name to associate with it, along with a few other SQL11 items (maybe even the product itself – 2011 or 2012?).  For the time being though this is the cover, but in the next week or so it should be updated to say ‘Visualizing Data with Microsoft …’.

So for the time being all I can say is thanks to Brian for the opportunity, thanks to my family for their understanding, and I hope that everyone will enjoy the book (and videos).  You can go ahead and pre-order your copy today Visualizing Data with Microsoft Crescent.

Posted in Reporting Services | 4 Comments »

Using Images in Tabular BI Semantic Models with Crescent

Posted by denglishbi on September 12, 2011


One of the really neat features of the new Reporting Services authoring tool code named ‘Crescent’ is that you can reference images directly from your model.  This can look really cool when you use the images in card visualizations, as part of a table, as tiles, and as slicers to use as filters within the report.  Here is a nice report that is part of the Hello World Picnic sample that came out with the CTP3 release of SQL Server ‘Denali’ (download from Project Crescent Samples, download images from Sample Images, and you can find information on the Crescent Samples for CTP3 in the readme here).


So this is really cool and provides a really nice visualization perspective to the report, both from what you are looking at and by adding a little color and life to the report.  When you are looking at a sales report for products, wouldn’t it be nice to see the image of the product or category that you are looking at?  So what better place to store this information than in your BI Semantic Model, which you can then leverage with your reporting tool like Crescent.  So how do you do this?

Well, in the current CTP3 there are a couple of different ways that you can go about this.  The first option is to store your images in, say, a SharePoint library and then provide the Image URL references in the model.  That is how the example above is done.  That report is actually leveraging a PowerPivot Tabular model that was built in Excel 2010 using the CTP3 release of the PowerPivot add-in, which you can download here.  The second option that you have right now is to store the images in a database table and reference them as base64-encoded strings.  Getting this option to work in CTP3 is by no means straightforward and took me quite a while to figure out.  Just using a base64-encoded string was a puzzle, and luckily Robert Bruckner provided the solution as to how to format the binary data in the database to this format.  There is a third option, which is simply referencing the binary data from the database, but currently the CTP3 release of Crescent will not recognize this field type in the model at all.

So let’s take a look at the two working options that we can use with the CTP3 release of Crescent along with the Tabular BI Semantic Models, using both a PowerPivot and a Tabular Analysis Services solution as examples.

Image URL

As I mentioned, the Hello World Picnic PowerPivot model uses this setup.  So I will highlight how this is done.

  • First you need to have images available for whatever you want to display.  The download link for the samples is available above and the instructions are provided in the information readme.
  • Next you need to place the images in a location that can be referenced as a URL in the model.  In this example they are placed in a document library within a folder.

images in sharepoint

images are stored in a SharePoint document library called Shared Documents buried down a couple of levels –> /Images/HelloWorldPicnic/

  • Then within your table in the PowerPivot Tabular model you can provide your URL to the image.  In this example they store the name of the image in the table and then use a calculated column to generate the Image URL.  In order to activate the option to set this as an Image URL in the Reporting Properties (which is on the Advanced tab – you might need to turn on Advanced Mode in the file settings first) you need to define the Table Identifier column in the table.  This means that you need to specify which column makes the table unique to identify each record (in this example it is the ItemId column).

table identifier

need to specify a table identifier – a column that defines the table and makes each record unique

image url

a calculated column is used here that includes the image name column from the table along with the rest of the string to the SharePoint location and concatenates everything together

  • Then once you have deployed your file to SharePoint you can create a Crescent report from the PowerPivot file, and you will now see the field in the Field List to select from and can start leveraging this in your report design.


the Items Name and Drawing fields are selected from the Field List which get added to a Table

So that one wasn’t too bad.  And this is a nice option because having the images in a document library location is a nice spot where end-users can maintain these as needed and swap them out.
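
A calculated column along these lines would produce the Image URL (just a sketch – the site URL and the image name column name here are placeholders, while the folder path comes from the example above):

="http://yoursite/Shared Documents/Images/HelloWorldPicnic/" & [ImageFileName]

The & operator concatenates the fixed SharePoint location with the image name stored in each row, giving Crescent a resolvable URL per record.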

Okay, let’s take a look at the other option that is currently available and that is storing the images in the database and using a base64-encoded string.

Base64-Encoded String

So what if the images are already stored and available in a database table? Well, there is a solution that allows you to do this, at least if you are using the Tabular Analysis Services project type.

  • Here we already have the images stored in a database table as a varbinary(max) data type – just a warning, I believe that the max size for the images is 25K; I don’t believe that has changed yet, so just be aware of that in case your images do not show up. In this example we will be using the AdventureWorksDWDenali CTP3 sample database, leveraging the product dimension data – you can download the sample databases and projects for Denali CTP3 here.


table contains a varbinary(max) column called LargePhoto that contains some images of our products

  • Once the table is imported into a Tabular Analysis Services project we need to modify the Table Properties for the product dimension and change the reference from Table Preview to Query Editor to generate the base64-encoded string that we will use in our model instead of the binary data that is already available in the LargePhoto column (binary data is currently not supported by Crescent, but will be in the next release).


now the key here is to add the logic to generate the base64-encoded string which is the following – thanks to Robert Bruckner for providing the logic:

cast(N'' as xml).value('xs:base64Binary(xs:hexBinary(sql:column("[dbo].[DimProduct].LargePhoto")))', 'varchar(MAX)') as LargePhoto64
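
In context, the full statement entered in the Query Editor would look something like this (a sketch – selecting all of the dimension columns with * is my assumption; include whatever columns your model needs):

SELECT
    [dbo].[DimProduct].*,
    CAST(N'' AS xml).value(
        'xs:base64Binary(xs:hexBinary(sql:column("[dbo].[DimProduct].LargePhoto")))',
        'varchar(MAX)') AS LargePhoto64
FROM [dbo].[DimProduct]

The hexBinary/base64Binary conversion through the xml value() method is what turns the raw varbinary bytes into the base64 text that the model can expose.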

  • Next we need to once again set the Table Identifier (Product Id column – in this example the ProductKey column has been renamed to Product Id) and then set the properties on the new LargePhoto64 column for Image URL to True along with Representative Image to True


set the Product Id as the Table Identifier (which is actually the ProductKey column)


set the Image URL and Representative Image properties to True for the LargePhoto64 column

  • The next step is the hack that we have to perform in the code view, because if you try to change the Image URL property back to False it will also set the Representative Image to False, and that is not what we want.  You can right-click on the bim file in the Solution Explorer and select ‘View Code’ or pick the option from the ‘View’ file menu.  Once you are in the code behind, do a search for LargePhoto64 – it will take about three tries to find the right spot, which contains the Type property setting that displays ImageURL.  We need to change this to just Image.


find the LargePhoto64 column where the Type property is located with ImageURL


now change the Type so that it does not include the valuens="ddl300" attribute and remove URL from the value so that it is just <Type>Image</Type>
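
In the code view the edit amounts to a change like this to the column definition (element shown in isolation):

<!-- before -->
<Type valuens="ddl300">ImageURL</Type>

<!-- after -->
<Type>Image</Type>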

  • Now you want to Save this change, then double-click on the bim file in the Solution Explorer to go back into the Design view and deploy the database to your Tabular instance of Analysis Services.
  • Set up the BISM Connection File to your model (see my previous posting if you are not familiar with how to set up data connections to a Tabular BI Semantic Model – Configuring SQL Server Denali Reporting Services SharePoint 2010 Integration), then click on the connection file in your SharePoint library and you will now be able to utilize the LargePhoto64 column as an image in your reports.


the Product Name and LargePhoto64 columns were selected from the Field List and added to a Table

So there you go, not too bad, huh?  You have two different options right now for using images in your models.  In the next release we should see direct support in Crescent for utilizing binary data directly instead of having to use the base64-encoded string method.  But for the time being this is what you can use.  My preference would be the Image URL; it just seems not only more straightforward, but also easier to maintain and update, especially from an end-user standpoint.


Posted in Reporting Services, SQL Server | 2 Comments »

Configuring SQL Server Denali Reporting Services SharePoint 2010 Integration

Posted by denglishbi on July 22, 2011


Update (7/22/2011): Added a note about an IE Security setting for SharePoint menus at the end of the post.  I will add a PDF of this posting either tonight or over the weekend if anyone is interested in that.

Update (7/24/2011): Added PDF download of the posting for printing, offline, and eReader availability at end of posting in the File Download section.

With the recent release of SQL Server Denali CTP3 there have been a few changes in the installation process.  One of those is how the SharePoint integration configuration works.  There was a new post yesterday in regards to this process: Getting up and running with SQL Server Denali for Business Intelligence (Crescent and PowerPivot in CTP3).  That post goes through the steps assuming that you have installed SharePoint 2010 first.  What if you are like me and installed all of the SQL Server components (SQL Server ‘Denali’ CTP3 Install Experience), including Reporting Services, before SharePoint 2010?  Well, the process is a little different, because since you didn’t have SharePoint installed first, the integration will not be completed by simply installing SharePoint 2010 next.  There are a few steps you can take, and I am going to go over the PowerShell option that I worked through with the help of some documentation I was provided from Microsoft during previous CTP testing.

So here are the steps that I went through to get this working (this was done on a Windows 2008 R2 VM):

  1. Installed all of the SQL Server Components (installed Multidimensional SSAS and just installed the SSRS and did not configure it) – SQL Server ‘Denali’ CTP3 Install Experience
  2. Next I went through the installation process again and installed an additional SSAS named instance in the Tabular mode.
  3. Then I installed SharePoint 2010:  pre-reqs and then SharePoint (but did not run the configuration wizard – unchecked the box and closed the process down after the install).  I had to actually run the pre-req install twice and had to do a reboot prior to the SharePoint install as I recall.
  4. Then I installed the two SharePoint 2010 SP1 files – one for Foundation and the other for Office Server (this is going to be required to get SharePoint to work with Denali for the database engine).  I believe a reboot was required after this as well. http://support.microsoft.com/kb/2510766
  5. I then did the SQL Server install again to install PowerPivot for SharePoint creating another named instance of SSAS for PowerPivot.
  6. The next step can be done a few different ways and I picked the new PowerPivot for SharePoint Configuration Tool (need to ensure your pass phrase meets the security requirements) – this option is available in the Tools page of SQL Installation or in the Configurations folder in your SQL Server install folder in the Start Menu. This process will configure the PowerPivot integration, activate it, and create a PowerPivot SharePoint Site Collection.
  7. I then installed the ADO.NET Data Services for .NET 3.5 SP1, which is required if you want to use PowerPivot against an Atom Feed in SharePoint.  A reboot is required after this install.
  8. Set up the SSRS service and application (since I installed SQL and SSRS prior to SharePoint, the integration with the add-in did not complete – because there was no SharePoint).
    1. Run the Microsoft SharePoint 2010 Management Shell as Administrator
    2. Type the following PowerShell command> Install-SPRSService
    3. Type the following PowerShell command> Install-SPRSServiceProxy
    4. Now to start the service type the following PowerShell Command> get-spserviceinstance -all |where {$_.TypeName -like “SQL Server Reporting*”} | Start-SPServiceInstance
    5. Verify that the service is started in SharePoint Central Administration –> Application Management –> Service Applications –> Manage Services on Server and look for the SQL Server Reporting Services Service to see that it has now been set up and is running
    6. Now we are going to create a new SQL Reporting Services Application in SharePoint Central Administration –> Application Management –> Service Applications –> Manage Service Applications and create a new SQL Server Reporting Services Application
    7. Specify a name for the Application, create a new application pool, and specify the database server name.  Down below you will also need to select the web application to associate this application with; I used Windows Authentication for the database authentication
    8. Now we should be all set and you should see the new application that you just created
  9. Now you are off to the races and can create a Reports Library in SharePoint and add the content types, including the new BISM Connection File used to create Crescent reports! You will need to install Silverlight to get the SharePoint Create window, and it will be needed for Crescent as well since it is a Silverlight application.  If you do not see the Reports Library option you will want to make sure that the Enterprise Features are enabled not only at the Site Collection level, but also at the Site level. I left out the Report Builder Model content type because from here on out it is all about the BI Semantic Model (BISM)
  10. Now you can create a new BISM Connection File and/or Report Data Source to get started with Crescent (shown below in the respective order). After you create these you will not see them in the library.  In order to see them you can go to the Library tab in the Ribbon and select the ‘All reports and dashboards’ option for the view; then you will see the connection and data source files you have created (by default it simply displays just the reports so it is not cluttered)
  11. Now you can launch Crescent from the context menu of the connection and data source files you have created.  In order to create these connection files and data sources I installed and deployed the Denali samples from CodePlex.  There is one issue with the Tabular project that you need to be aware of when you want to deploy it, which has been documented here – Problems with the SSAS Tabular Model Projects Denali CTP3 samples (according to the site the samples have been fixed, so if you haven’t downloaded them yet you should be okay).
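
The PowerShell portion of step 8 above can be summarized in one place (run from the Microsoft SharePoint 2010 Management Shell as Administrator):

# install the Reporting Services shared service and its service proxy
Install-SPRSService
Install-SPRSServiceProxy

# locate and start the SQL Server Reporting Services service instance
Get-SPServiceInstance -All |
    Where-Object { $_.TypeName -like "SQL Server Reporting*" } |
    Start-SPServiceInstance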

So there you go, you are off to the races.  You now have SSRS set up in the new SharePoint Integrated Mode, running as a SharePoint Application (no longer running as its own Windows Service – it is completely set up and managed by SharePoint now), and you are creating awesome reports with Crescent.


NOTE: If you are having an issue in say Step 9 and you are not getting menu options in SharePoint you need to go into your IE Security Settings and enable the Active Scripting.  Once you do that you should be able to use the menu options in SharePoint.


File Download:


Posted in Reporting Services, SQL Server | 12 Comments »

Using SharePoint List data for Reporting Services (SSRS) parameter source

Posted by denglishbi on April 13, 2011


UPDATE (8/11/2012): updated the link for the sample download files to point to my file on SkyDrive.

UPDATE (6/18/2011): added a PDF download option of the blog post.

UPDATE (6/15/2011): added file download which includes the Excel source file used to create SharePoint list and also two RDL files, the one in the demo and additional one that was created with multi value modification for the parameter.

When creating reports with Reporting Services (SSRS) you might find yourself having to reference data that is stored in SharePoint lists.  Another task you might need to do is leverage that same list data to filter the results that are displayed to the user in the report.  Let’s take a look at a scenario for this type of report request.  In order to get set up for this scenario we are going to do a little prep work first to get the data created for the example.  We will do this right after we go over the scenario.


A request comes in to create a report that provides a list of the stores and to provide a way for the users to filter the stores that they see on the report based on the location of the store; in this case they would like to filter the records by country.  The request states that the information needed for the report is currently being stored and maintained in a SharePoint list.

Just a heads up, in this example we will be using SharePoint 2010 along with Reporting Services (SSRS) 2008 R2.  SSRS 2008 R2 includes a new SharePoint List Connection Type.  You will see screenshots of this down below in the Solution part.

Prep Work

Okay, first we are going to generate the list in SharePoint that is going to be the source for our report where the store information is entered and maintained so that we can do this example.  We are going to do the following task:

  • Create custom Store Information SharePoint List by importing an Excel file as a source for the structure and initial data load

The Excel file that we are going to use for this list was originally created from the AdventureWorks2008R2 database by exporting the view [Sales].[vStoreWithAddresses].  You can download the AdventureWorks sample databases from CodePlex here.

  • From the SharePoint 2010 site select Site Actions –> View All Site Content.
  • From the All Site Content page click on the Create option at the top of the page to display the popup window for creating new content.  Under the All Categories section on the left-hand side select Blank & Custom, then select Import Spreadsheet, and finally click the Create button.
  • Enter a Name for the new list to maintain the store information, click Browse to locate the Excel file that contains the store information, click Open, and then click Import.
  • An Excel window will open up so that you can specify what you want to import.  Select the range of cells and then click Import.
  • The range of cells will be converted to a Table and you will see a progress meter at the bottom of the file stating ‘Adding Records’.
  • Once the import process is complete you will have a new list in your site with all of the store addresses from the Adventure Works database.


Solution

We will now create the report that will provide a list of stores and also provide a way to filter the records displayed to the end-user.  All of this data will be generated from the SharePoint list that we just set up in the Prep Work section above.

  • The first step is to create a Report Server Project that will contain our report and to add a new Report item called Store List.
  • Next, add a new Data Source to the report with the Type of Microsoft SharePoint List and the URL address of the site (in my case it is http://mybi10/), and then click OK.
  • Now add a new Dataset to the Data Source you just created, provide a name, click Query Designer, select the desired fields from the SharePoint list you created in the Prep Work, and then click OK twice to close out the Query Designer and the new Dataset dialog boxes.
  • Repeat the previous step, but this time only select the CountryRegionName field from the SharePoint List.  This is going to be the source for the Parameter we are going to set up next.
  • Add a new Parameter to the report and on the General page make sure you check the ‘Allow multiple values’ checkbox and also click the radio button for Hidden.
  • On the Available Values page click the radio button for ‘Get values from a query’, select the Dataset you created that contains just the CountryRegionName field, and then select CountryRegionName for the Value and Label field drop-downs.
  • On the Default Values page repeat the process that you just did for the Available Values, with the exception that there is no Label field option.  Once you have done this click OK to add the new Parameter.
  • Now if we go back and run the query for the CountryNames dataset we created, we will see that this is not a distinct list of Countries.  Since this is coming from a SharePoint list we need to do a bit of extra work to generate this distinct list.  Here is where the magic comes in, with a little help from the community and the Microsoft forums – “How to get distinct values of sharepoint column using SSRS”.
  • In the Report Properties go to the Code page and add some code that will be referenced to create an ArrayList containing our distinct list of values, sorted, with an “All” option added as well.  The code (also provided below in this posting) includes some additional logic to handle ampersands (&) in names, though in this case we do not have those, and I use a Vertical Bar (Pipe) for the delimiter – you need to be careful if your values contain commas, so I go with the bar instead.  Once you have the code in place click OK to exit the Report Properties window.
  • Now we need to set up an additional Parameter.  Add a new Parameter and on the General page enter a name and prompt text.
  • On the Available Values page click the radio button for ‘Specify values’, click the Add button, click the fx button, and enter the following expression for the value: =Split(Code.RemoveDups(JOIN(Parameters!CountryNames.Value, "|")), "|")  Now click OK and go to the next step.
  • Similar to the previous step, on the Default Values page click the ‘Specify values’ radio button, click Add, click the fx button, and enter the following expression for the value: =Split(Code.RemoveDups(JOIN(Parameters!CountryNames.Value, "|")), "|")(0)  This will return the first value in the ArrayList.  Now click OK twice to close the expression window and add the new Parameter to the report.
  • The next step is to wire the Parameter into the main dataset for the report.  In order to do that we need to access the properties of the dataset and add a filter.
  • Now we need to set up the filter Expression, using a formula so that we can handle the “All” value: =IIF(Parameters!CountryName.Value="All", "All", Fields!CountryRegionName.Value)
  • Next move down to the Value and enter another expression to account for the modification we did in the Code to handle the ampersand (&) logic: =Replace(Parameters!CountryName.Value,"&","&amp;")
  • Now that part is done; just create the desired report layout.  In this case I use a List data region, but you could use a Table.  I also included some additional logic to alternate the background color (green bar effect).  Don’t forget to add a sort to the data region as well.
  • Run the report and test out the filter.  Here is the report for Canada.  One thing to note is that we would want to update each field that might contain an ampersand with the REPLACE logic.  After that logic is added into the expression for the field we are in business.  Another option is to modify the placeholder for the data and enable the option for HTML support.  You could simply create another function in the Code to handle the ampersand logic for reusability as well.  Nice job ;)

Code Snippet used to remove duplicates from the dataset to present to the second Parameter defined in the report.  I also added in the Sort after the ArrayList is fully populated along with the additional All value added to the beginning of the pipe delimited string.

Public Shared Function RemoveDups(ByVal items As String) As String
    Dim noDups As New System.Collections.ArrayList()

    Dim SpStr As String()
    SpStr = Split(items, "|")

    ' add each value only once, converting any encoded ampersands back
    For i As Integer = 0 To UBound(SpStr)
        If Not noDups.Contains(SpStr(i).Replace("&amp;", "&").Trim()) Then
            noDups.Add(SpStr(i).Replace("&amp;", "&").Trim())
        End If
    Next

    ' sort the distinct values once the ArrayList is fully populated
    noDups.Sort()

    ' copy the distinct values into a string array and join with the pipe delimiter
    Dim uniqueItems As String() = New String(noDups.Count - 1) {}
    noDups.CopyTo(uniqueItems)

    Dim strRetValue As String = String.Join("|", uniqueItems)
    strRetValue = "All|" + strRetValue
    Return strRetValue
End Function
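
As a quick sanity check (hypothetical input shown), passing a duplicated, unsorted set of country values through the function returns a sorted, distinct, pipe-delimited list with the “All” option prepended:

=Code.RemoveDups("Canada|France|Canada|Germany|France")
' returns "All|Canada|France|Germany"

In the report itself the input comes from joining the hidden multi-value parameter, as shown in the expressions above.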


The challenge with this solution was getting the parameter set up the way we wanted, as a distinct list of values.  Since there can be more than one store in each country, the values were duplicated.  Luckily we can handle this with some code and an additional hidden parameter.  The other thing is dealing with SharePoint data and handling additional HTML markup.

Pretty fun and interesting huh?  Little challenges like this are always fun to work with and it is great once you figure out the workaround to get this all to display the way you want.


File Downloads:


Posted in Reporting Services | Tagged: , | 32 Comments »

Reporting Services (SSRS) OR Filter Logic

Posted by denglishbi on January 28, 2011

In designing a report, one of the typical requirements is to add parameters so users can filter the results displayed in the report.  For the most part these requests are to filter the data so that the report includes only the rows that match what is selected. 


Let's say that I am planning a sales meeting and want to include the local sales people along with some of my top performers.  I would create a report with two parameters: one to filter the report by state and the other based on the sales amount, like the following:


In the example above I am using the ever-famous Adventure Works 2008 R2 database, pulling data from the Sales.vSalesPerson view.  I reference the same view for the additional dataset that provides available values for the StateProvince parameter, performing a Group By aggregate on the Sales.vSalesPerson.StateProvinceName column.

I then set up the report to display the list of sales people that I would like to send an invite to for the meeting.  My criteria will be to include the sales people that live in Minnesota, because that is where I am planning on having the sales meeting, and the second criteria is to also include people that have had sales of at least $1 million.  I go ahead and set up the parameters with the dataset, create the report, and select the criteria.



One thing to point out here is that I am placing the filtering criteria in the Filters portion of the dataset rather than embedding it in the query.  In this case I am using a database view where I could embed this logic, but there are scenarios where you cannot, such as when you are referencing a stored procedure or a SharePoint list.


The report returned the results based on my criteria, but in this case what was returned was only a single sales person.  Not exactly what I wanted.  What I was looking for was to include the sales people that lived in Minnesota along with the people that had at least $1 million in sales.  The way that the report was setup it only returned results for people that met both criteria, not one or the other.

If I go back into the Filters section of the dataset properties I do not see an option to change the relationship of the filters from an ‘And’ to an ‘Or’ relationship.  Where is the logical operator selection?  Hmmm….  If I travel back in time a few years I believe that Report Builder 1.0 had an option to specify the logical operator between the parameters that I would use.


Working with a Filter Condition (Report Builder 1.0)

Logical Operators

As you add additional filter conditions to the filter area, a logical operator is displayed between each filter condition. Report Builder supports the And and Or logical operators. By default, the logical operator is And, but you can easily change it by clicking And and selecting Or.

Interesting.  It looks like this was left out of the newer versions of Report Builder.  So where do we go from here?  How about we head over to the Reporting Services forums and see what we can come up with?  After a few searches we come across this thread – How to filter a table with an “OR” condition.  It turns out you run into the same issue with data regions on the report if you try to add filters to them as well.  The solution for this issue is not as obvious as one would think, but there is a workaround based on the answer to the thread.


In order to resolve the fact that there is no option to select the logical operator between filters, they must be combined into a single filter option.  This means that the second filter will be removed and the OR statement will be placed into an expression in a single filter.  This same technique can be utilized on a data region within the report as well.  In the configuration of the filter we will set the expression as follows:
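Based on the dataset in this example, the combined filter expression would look something like the following (the field and parameter names are assumptions based on the Adventure Works view described above — adjust them to match your own dataset):

```
=IIF(Fields!StateProvinceName.Value = Parameters!StateProvince.Value
     OR Fields!SalesYTD.Value >= 1000000, True, False)
```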


And the Filters portion will now look like the following, with the value set to True and the expression having a data type of Boolean.


Now when we go and run the report the results contain not only the sales person from Minnesota, but also the other sales people that had sales of at least $1 million.


Now we have the complete list of 14 sales people that we can send an invite to attend the sales meeting.  The one person that lives in Minnesota and the other 13 that had at least $1 million in sales.


Arriving at a solution is not always as straightforward as one would assume, but there is typically a workaround to get at the desired results.  You might not have to go to the lengths that we did here if you are going against the database directly, but in some cases you might be using a stored procedure that you cannot modify, or a SharePoint data source where you cannot incorporate this logic directly.

If you are interested in taking a look at the RDL file, feel free to download it from my SkyDrive account here:


Posted in Reporting Services | Tagged: | 16 Comments »

Calculating Cumulative Values in Microsoft BI Tools

Posted by denglishbi on January 24, 2011

A common situation you will find yourself in when working on a business intelligence project is having to come up with a cumulative value (running value) calculation.  The challenge is that depending on what product you are working with, the calculation that you need is going to be different.  In this post I will compare how this can be done with T-SQL, Excel, SSIS, SSRS, MDX, and DAX.  Along the way I will provide some additional reference links to other options and content, and I will point out some differences in how the totals are being calculated so you will know what to expect with the results.

In these examples I will be working with the Contoso Retail Data Warehouse data sample.  These examples below are not necessarily going to be the optimal approaches, just showing how you can go about getting at the desired result.

SQL Server T-SQL

So let's start out by taking a quick look at how to calculate the cumulative sales over each of the years directly against the data in the database through SQL Server Management Studio.

SELECT d.CalendarYear,
       SUM(f.SalesAmount) SalesAmount,
       SUM(f.SalesAmount) + COALESCE((SELECT SUM(SalesAmount)
                        FROM FactSales fs JOIN DimDate dd ON fs.DateKey = dd.DateKey
                        WHERE dd.CalendarYear < d.CalendarYear),0) CumSalesAmount
FROM FactSales f
INNER JOIN DimDate d ON f.DateKey = d.DateKey
GROUP BY d.CalendarYear
ORDER BY d.CalendarYear

And the results for this are:

CalendarYear SalesAmount           CumSalesAmount
———— ——————— ———————
2007         4561940955.0212       4561940955.0212
2008         4111233534.6841       8673174489.7053
2009         3740483119.1823       12413657608.8876


Not too bad.  We have an inner query that returns the previous year(s) values and appends them onto the current year (if they exist).  If no value is returned then a zero is used in its place.  For additional examples and options take a look at this posting by Garth Wells – Calculating Running Totals.

Microsoft Excel

So now let's take a look at how this would be done if we were using Excel. 


There are a few different options you have here; in this example I enter a formula in the C2 cell of =SUM($B$2:B2) and then copy that formula down into cells C3 and C4.
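The same running-total idea can be sketched in a few lines of Python, just to illustrate the concept (the figures are the Contoso yearly totals from the T-SQL results above, rounded for readability):

```python
from itertools import accumulate

# Yearly Contoso sales (2007-2009) from the earlier query results, rounded
sales = [4561940955.02, 4111233534.68, 3740483119.18]

# accumulate() yields the running sum, mirroring =SUM($B$2:B2) copied down
cumulative = list(accumulate(sales))
for year, total in zip([2007, 2008, 2009], cumulative):
    print(year, round(total, 2))
```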

SQL Server Integration Services (SSIS)

Now we will take a quick look at how this could be done in the ETL process.  As we are pulling data from a source we can evaluate the values in the data flow and accumulate the values inside a script component.  In the source we will simply get the sales for each year and then append the values in the flow to each other to get at the same results displayed above.


The cumulative value is being performed in the script component with some code.  We have a variable setup that we are using to append the sales amount of each row in the data flow and then sending this value back out in a new output column called CumulativeSales.
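A sketch of what the body of that script component might look like is shown below (the Input0Buffer class and the column names are generated by SSIS from the data flow metadata, so yours may differ):

```vb
' SSIS script component (transformation) inside the data flow
Public Class ScriptMain
    Inherits UserComponent

    ' Running total, preserved across rows as the data flow executes
    Private cumulativeSales As Decimal = 0

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Add the current row's sales and emit it in the new output column
        cumulativeSales += Row.SalesAmount
        Row.CumulativeSales = cumulativeSales
    End Sub
End Class
```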


Nothing too fancy going on here.  I don’t have a destination in the flow, just added a derived column to get the data viewer included so that I could run the data flow to show the results.

SQL Server Reporting Services (SSRS)

Now let's switch over to Reporting Services.  In this example we will not compute the cumulative value in the dataset query; we will do the calculation in the report. 


In this example we leverage the RunningValue aggregate function that is available in Reporting Services.  The expression that we use in the Cumulative Sales column is =RunningValue(Fields!SalesAmount.Value,Sum,"DataSet1")


This function returns a running aggregate of the values in our specified dataset using the SUM function which is also specified in the expression.

SQL Server Analysis Services (SSAS) MDX

Now how about taking a look at Analysis Services.  How would we go about doing this calculation using MDX?

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM({null:[Date].[Calendar Year].CurrentMember},[Measures].[Sales Amount])
SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    NON EMPTY [Date].[Calendar Year].[Calendar Year].Members ON 1
FROM [Sales]

And the results of this are:


We can apply some filters into the MDX statement to remove the years that currently don’t have Sales Amount associated with them.

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM({null:[Date].[Calendar Year].CurrentMember},[Measures].[Sales Amount])
SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    NONEMPTY([Date].[Calendar Year].[Calendar Year].Members,[Measures].[Sales Amount]) ON 1
FROM [Sales]

This returns what we have seen in our other results:


There are additional ways of going about this calculation; for more options take a look at this recent posting by Amish Manubhai Shah – Various way to calculate running total from All Period to Currentmember. Now I want to show how this solution differs a bit from the other ones we have seen.  What happens if I apply a filter to the query to only display 2008 and 2009?  What would you expect?


Might not be what you would want, but then again maybe it is.  The Cumulative Sales calculation is summing up the values from the beginning of time (the [All] level).  So the first value that we see in 2008 includes the Sales Amount from 2007 as well (if there were sales in 2006 it would include those too, and so on). 

You can make a few modifications to the calculation and setup a named set to reference the items and come up with something along these lines.

WITH DYNAMIC SET [SelectedYears] AS Existing [Date].[Calendar Year].[Calendar Year].Members
 MEMBER MEASURES.[Cumulative Sales] AS
    SUM({EXISTS([Date].[Calendar YQMD].[Calendar Year].Members,
    [SelectedYears]).Item(0):[Date].[Calendar YQMD].CurrentMember},
    [Measures].[Sales Amount])
SELECT {Measures.[Sales Amount], Measures.[Cumulative Sales]} ON 0,
    [Date].[Calendar YQMD].[Calendar Year].Members ON 1
FROM [Sales]
WHERE {[Date].[Calendar Year].&[2008],[Date].[Calendar Year].&[2009]}

With the following results:


Here a dynamic set is used to get references to the members in the slicer (WHERE clause); the cumulative sales calculation grabs the first item from that set to come up with the reference to the 2008 year.  This becomes the starting point for the cumulative total instead of the beginning of time.

You might just want to look at year-to-date cumulative totals and be able to drilldown into a user defined hierarchy if you have one setup like Year-Quarter-Month-Date.  If we modify our calculation a bit we might end up with something like the following:

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM(YTD([Date].[Calendar YQMD].CurrentMember),[Measures].[Sales Amount])
SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    Hierarchize({[Date].[Calendar YQMD].[Calendar Year].Members,
    [Date].[Calendar YQMD].[Calendar Quarter].Members}) ON 1
FROM [Sales]
WHERE {[Date].[Calendar Year].&[2008],
    [Date].[Calendar Year].&[2009]}


With the following results:


Here we can see that our cumulative total stops at the year level, and we can see at the lower levels that the values are accumulating as expected, so the cumulative total at Q4 is the same as the yearly value.  What we have now is a YTD calculation.

PowerPivot for Excel 2010 DAX

We have one more product that I want to take a look at, and that is PowerPivot.  This is going to be fairly similar to the Analysis Services solution since it actually is Analysis Services behind the scenes; the only difference here is that we will be using DAX instead of MDX.

So for the first example we will setup a cumulative sales calculation just like we did with the first MDX example.  The calculation we use is as follows:
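A cumulative-total measure of this shape is typically written along the following lines (the table and column names assume the Contoso sample loaded into PowerPivot, so adjust them to match your model):

```
=CALCULATE(SUM(FactSales[SalesAmount]),
    DATESBETWEEN(DimDate[Datekey],
        FIRSTDATE(ALL(DimDate[Datekey])),
        LASTDATE(DimDate[Datekey])))
```

The ALL() inside FIRSTDATE is what anchors the running total to the beginning of all time.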





So we end up with the following:


You could go ahead and add some additional logic to evaluate the Sales Amount to determine if a value exists, but you get the general idea here.

Once again this is calculating the cumulative total from the beginning of all time, and that is done by using the ALL reference in the calculation for the FIRSTDATE.  If we filter the years and only display 2008 and 2009 we will see results similar to the MDX example, where the cumulative sales amount for 2008 includes the 2007 sales.


The one nice thing about PowerPivot is that this is in Excel, so if you want to do anything a little more custom or make any references to cells you can do that and use Excel functions as well.  As far as setting up a cumulative sales amount like the second MDX example, I can't determine an equivalent in DAX.  Maybe someone else has an idea if this can be done so that it only accumulates based on the Date values being evaluated.  If you do, please leave a comment.

You can do the YTD calculation in DAX, and here is the formula for that:
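In DAX a YTD measure like this is typically written with TOTALYTD (again assuming the Contoso table and column names):

```
=TOTALYTD(SUM(FactSales[SalesAmount]), DimDate[Datekey])
```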


And the results look like this if we break out the years by the quarters:


Once again we can see that the yearly total is equivalent to the Q4 for the same year.  For some additional information and explanation of the PowerPivot calculations take a look at a great post by Kasper de Jonge here Accumulate values (running value) over multiple years in PowerPivot using DAX.


Cumulative total (running value) calculations are handled differently within each of the Microsoft products.  We took a look at some examples of how this can be done in T-SQL, Excel, SSIS, SSRS, MDX, and DAX.  These are not the only options, so if you want to see more, take a look at the additional posts that I included as reference points in the different sections.  It is important to understand how the formulas need to be configured and also what the results are once the calculations and logic are put into place.  I think that one of the easiest ones, besides the Excel formula, would have to be SSRS with the RunningValue aggregate function.  I like how it is set up and how it evaluates everything based on the context that is being referenced.  So if we go back to the SSRS report and apply a filter to the tablix to remove 2007 from what is displayed, we would end up with the following:


Very simple and easy to understand, but that is just my personal opinion.

Posted in Analysis Services, Integration Services, PowerPivot, Reporting Services, SQL Server | Tagged: , , , , | 10 Comments »

Using Reporting Services Report (SSRS) with PerformancePoint Services (PPS)

Posted by denglishbi on December 31, 2010

[tweetmeme source=”denglishbi” only_single=”false”]

UPDATE (6/21/2011): Modified table layout of step-by-step walkthrough to be two columns instead of three. Added a downloaded PDF file as well.

Back in January I did a posting in regards to Using Excel Services Reports with PerformancePoint Server (PPS).  This has been a very popular posting and I thought I would add another one in regards to using Reporting Services (SSRS) reports with PerformancePoint (in PPS 2007 this type of report was called SQL Server Report).

Some of the reasons that you might want to include an SSRS report in your PPS dashboard would be to:

  • leverage an existing report created by an end-user
  • incorporate existing operational reports
  • use additional charting options – map, area, range, scatter, polar, bar (not column), funnel, 3D, sparklines, data bars
  • need more flexibility and control over reports, styles, colors, scales, etc.
  • join multiple data sources into a single report
  • combine relational and OLAP data into a single report

The example that I will be showing is using SSRS in SharePoint Integrated Mode, but you can also do this in Native Mode as well, you would just see a different setup screen when you are configuring the report in Dashboard Designer (a tad bit easier in my opinion configuring these in Native Mode – which is labeled as ‘Report Center’ mode in Dashboard Designer, confusing I know…).  I will also be using Report Builder 3.0 to create and deploy the report to the SharePoint site.

image image
Go to the Report Library in the SharePoint site, select Documents from the Ribbon, select New Document, and pick Report Builder Report. This will either launch Report Builder or ask you if you want to run and install the application if you haven't done so yet.
image image
Report Builder is a ClickOnce application, and by clicking Run you will install it. Once installed, the Report Builder application will start up.
image image
In this example we will build a Map. Reposition the map up a bit so it appears above the legends.
image image
A Bubble Map will be used to be able to analyze two metrics. A new dataset will need to be added that contains the spatial data.
image image
A new data source will be added connecting to the Contoso Retail DW SSAS database. Use the Sales cube, filter for the United States, set up the Fiscal YQM as a parameter, and pick State Province Name, Sales Amount, and Sales Total Cost.
image image
Use STATENAME and map this to the State Province Name field from the dataset. Pick a theme for the style, set up the bubble size to visualize Sales Amount, and the polygon color for Sales Total Cost.
image image
Set up the chart and legend titles and the polygon tooltip, remove the color legend, resize/reposition the map, and remove the default marker size. Save the report to the SharePoint library.
image image
Now we are going to add a new Report to our existing PerformancePoint Content library. This will launch Dashboard Designer, and like Report Builder you may be prompted to install it (this is also a ClickOnce application).
image image
If nothing launches, then you need to make a small adjustment in your IE security settings to enable 'Automatic prompting for file downloads'. Now we will create the new PerformancePoint Report.
image image
Use the SharePoint Integrated mode, specify the URLs for the Report Server and the RDL file, uncheck Show toolbar, and specify a name for the PPS report. Next we will create a filter that we can use with the report once it is displayed in the dashboard page.
image image
The filter we will create will be for the Fiscal YQM, and we will remove periods that don't have any Sales Amount. We will use a Tree style display and only allow a single selection.
image image
Name the filter and get ready to create the dashboard. Add a new Dashboard item.
image image
Name the dashboard item and page, add the filter, add the report, and remove the extra column (zone) on the page. Create a Connection (formerly a link in PPS 2007) between the filter and the report.
image image
The filter will connect to the DateFiscalYQM parameter on the report and will pass the Member Unique Name (an SSAS member value) to the report. Save the PPS content items and deploy the dashboard to the Dashboards library.
image image
Select the Master Page and whether or not you want to include the page navigation. Test out the filter and view the results with the deployed PPS dashboard.

My example here used the Contoso Retail DW sample data which is available from the Microsoft downloads here – Microsoft Contoso BI Demo Dataset for Retail Industry.  This is also using Reporting Services 2008 R2 which includes the new Map report item, Report Builder 3.0, PerformancePoint Services, and SharePoint 2010 Enterprise.

I did two other postings earlier in the year on the new Map report item that you can check out if you have questions about it:


Feel free to download the PPS Workspace file (ddwx) and the SSRS report (RDL) file from my SkyDrive which I have included in a zip file.

image image

You might find this posting useful if you want to reuse the workspace file – Migrating PerformancePoint 2010 Content to New Server.

Enjoy and Happy New Year!

Posted in PerformancePoint, Reporting Services | Tagged: , , | 17 Comments »

Project ‘Crescent’ first impression – simply amazing!

Posted by denglishbi on November 10, 2010

All I can say is 'wow'.  During yesterday's keynote, streamed live from the SQL PASS Summit, Amir Netz did a demonstration of the new reporting tool codenamed 'Crescent' that will be available in the next release of SQL Server, codenamed 'Denali'.  I was completely amazed at the visualizations and the functionality that was provided.  Being able to visualize the report instantly as you are creating it, with the interactivity that is provided out of the box, was incredible.  The storyboard feature with PowerPoint was very nice as well.  Can't wait until it is available to test drive, which hopefully will be in the CTP2 release.

If you haven’t heard about this product yet then take a look at Robert Bruckner’s posting here Glimpse at Project Crescent.  He has FAQ information along with links and additional information that you will find very helpful.  There is a brief video you can check out on BizIntelligence.tv – Data Visualization Done Right: Project Crescent and if you didn’t get to see the keynote yesterday you can head over to the SQL PASS site and watch the on-demand video recording (fast forward about 3/4 of the way through to get to Amir Netz’s presentation) – PASS Summit 2010 Day One Live Streaming Keynote.

Kudos to the product team and can’t wait to see how this evolves over the next year.  Very exciting news as always.

Posted in SQL Server | Tagged: , | Leave a Comment »