Dan English's BI Blog

Welcome to my random thoughts in regards to Business Intelligence, databases, and other technologies

Archive for January, 2011

Free SQL Server Community February 2011 Training

Posted by denglishbi on January 31, 2011

Head over to the Pragmatic Works site and take a look at all of the training that is available in February.  In February they have 11 business intelligence webinars lined up, presented by Brian Knight, Patrick LeBlanc, Shawn Harrison, Sherri McDonald, Jose Chinchilla, Mike Davis, and Devin Knight.  The technologies covered will be SSIS, BI xPress, SSRS, SSAS, and Task Factory.

  • February 1, Brian Knight – Data Cleansing in SSIS
  • February 2, Brian Knight – BI xPress Launch Party
  • February 3, Patrick LeBlanc – Change data capture with SSIS
  • February 8, Shawn Harrison – Building your first SSRS Dashboard
  • February 9, Brian Knight – Administering SSIS with BI xPress
  • February 10, Sherri McDonald – Introduction to SSRS
  • February 15, Jose Chinchilla – Building SSRS reports on your cube
  • February 16, Mike Davis – Developing BI solutions faster with BI xPress
  • February 17, Jose Chinchilla – Building SSRS reports on your cube
  • February 22, Devin Knight – SSAS Security
  • February 23, Devin Knight – Taking SSIS to the Next Level with Task Factory

If you miss a session they record them and you can check them out later on-demand in the webinar resource area.

It looks like the lineup for March is taking shape as well; they already have six sessions posted, so check out the online training schedule.

Enjoy!

Posted in SQL Server, Training | Tagged: | Leave a Comment »

Reporting Services (SSRS) OR Filter Logic

Posted by denglishbi on January 28, 2011

When designing a report, one of the typical requirements is to add parameters so that users can filter the results displayed in the report.  For the most part these requests are to filter the data so that the report only includes rows matching the selected values.

Scenario

Let’s say that I am planning a sales meeting and want to include the local sales people along with some of my top performers.  I would create a report with two parameters: one to filter the report by state and the other based on the sales amount, like the following:

image

In the example above I am using the ever famous Adventure Works 2008 R2 database pulling data from the Sales.vSalesPerson view.  I am referencing the same view for the additional dataset that I use to provide available values for the StateProvince parameter performing an aggregate ‘Grouped by’ on the Sales.vSalesPerson.StateProvinceName column.
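As a sketch, the parameter dataset just described might look something like the following T-SQL (the exact query is not shown in the post, so treat this as an assumption based on the description):

```sql
-- Hypothetical available-values dataset for the StateProvince parameter,
-- grouping on the state name in the same sales person view
SELECT StateProvinceName
FROM Sales.vSalesPerson
GROUP BY StateProvinceName
ORDER BY StateProvinceName;
```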

I then set up the report to display the list of sales people that I would like to send an invite to for the meeting.  My criteria will be to include the sales people that live in Minnesota, because that is where I am planning on having the sales meeting, and the second criteria is to also include people that have had sales of at least $1 million.  I go ahead and set up the parameters with the dataset, create the report, and select the criteria.

image

image

One thing to point out here is that I am placing the filtering criteria in the Filters portion of the dataset; it is not embedded in the query.  In this case I am using a database view where I could embed this logic, but there are scenarios where you cannot, such as when referencing a stored procedure or a SharePoint list.

Issue

The report returned results based on my criteria, but in this case only a single sales person was returned.  Not exactly what I wanted.  What I was looking for was to include the sales people that live in Minnesota along with the people that had at least $1 million in sales.  The way the report was set up, it only returned results for people that met both criteria, not one or the other.

If I go back into the Filters section of the dataset properties I do not see an option to change the relationship of the filters from an ‘And’ to an ‘Or’ relationship.  Where is the logical operator selection?  Hmmm….  If I travel back in time a few years I believe that Report Builder 1.0 had an option to specify the logical operator between the parameters that I would use.

image

Working with a Filter Condition (Report Builder 1.0)

Logical Operators

As you add additional filter conditions to the filter area, a logical operator is displayed between each filter condition. Report Builder supports the And and Or logical operators. By default, the logical operator is And, but you can easily change it by clicking And and selecting Or.

Interesting.  Looks like this was left out of the newer versions of Report Builder.  So where do we go from here?  How about we head over to the Reporting Services forums and see what we can come up with?  After a few searches we come across this thread – How to filter a table with an “OR” condition.  We find out that you would run into this same type of issue with data regions on the report if you try to add filters to them as well.  The solution for this issue is not as obvious as one would think, but there is a workaround based on the answer to the thread.

Solution

To work around the fact that there is no option to select the logical operator between filters, they must be combined into a single filter.  This means that the second filter will be removed and the OR logic will be placed into an expression in a single filter.  This same technique can be utilized on a data region within the report as well.  In the configuration of the filter we will set the expression as follows:
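As a rough sketch (the actual expression is only shown in the screenshot, so the field and parameter names here are assumptions), the combined filter expression might look like:

```vb
=Fields!StateProvinceName.Value = Parameters!StateProvince.Value
    Or Fields!SalesYTD.Value >= Parameters!SalesAmount.Value
```

With the expression's data type set to Boolean and the comparison value set to True, this single filter reproduces the OR relationship between the two original criteria.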

image

And the Filters portion will now look like the following, with the value set to True and the expression's data type set to Boolean.

image

Now when we go and run the report the results contain not only the sales person from Minnesota, but also the other sales people that had sales of at least $1 million.

image

Now we have the complete list of 14 sales people that we can invite to attend the sales meeting: the one person that lives in Minnesota and the other 13 that had at least $1 million in sales.

Conclusion

Arriving at a solution is not always as straightforward as one would assume, but there is typically a workaround to get at the desired results.  You might not have to go to the lengths that we did here if you are going against the database directly, but in some cases you might be using a stored procedure that you cannot modify, or a SharePoint data source against which you cannot incorporate this logic directly.

If you are interested in taking a look at the RDL file, feel free to download it from my SkyDrive account here:

image

Posted in Reporting Services | Tagged: | 16 Comments »

24 Hours of PASS Spring 2011

Posted by denglishbi on January 27, 2011

The next 24 Hours of PASS is coming up in March and the session lineup is set.  This event will be celebrating Women in Technology, and all of the speakers are women.  This is a free online event, so don’t miss out.

There will be three tracks – BI, DBA, and Dev.

Here are the sessions for the BI Track:

And from MN, Lara Rubbelke will also be presenting in the DBA track:

Check out the complete list of sessions broken out by track here – Session Listings by Track.

Don’t miss out on this great chance to hear about some great topics.

Posted in Training | Tagged: | 1 Comment »

Twin Cities SharePoint Saturday April 2011 – Call for Speakers

Posted by denglishbi on January 26, 2011

I just saw that the next SharePoint Saturday event is in the works for the Twin Cities.  Currently the call for speakers is open and you have until Friday, February 11.  The event will be on Saturday, April 9 in Bloomington, MN at the Normandale Community College.

SharePoint Saturday Twin Cities

I didn’t make it out to the event this past October, so I am not sure what the turnout was like.  There is a short video you can check out here.

Looks like it was a fun time, and as they state in the video:

All that and a bag of chips…for FREE at: SharePoint Saturday Twin Cities

So if you are interested in presenting at the event coming up in April, head over to the site to get the details and get your abstract in before February 11.

SharePoint Saturday Twin Cities site

Maybe I will see you there at the event.

Posted in Training | Tagged: , | Leave a Comment »

Calculating Cumulative Values in Microsoft BI Tools

Posted by denglishbi on January 24, 2011

In working on a business intelligence project a common situation you will find yourself in is having to come up with a cumulative value (running value) calculation.  The challenge that you will find is that depending on what product you are working with the calculation that you need is going to be different.  In this post I will compare how this can be done with T-SQL, Excel, SSIS, SSRS, MDX, and DAX.  Along the way I will provide some additional reference links to other options, content, and I will point out some differences in how the totals are being calculated so you will know what to expect with the results.

In these examples I will be working with the Contoso Retail Data Warehouse data sample.  These examples below are not necessarily going to be the optimal approaches, just showing how you can go about getting at the desired result.

SQL Server T-SQL

So let’s start out by taking a quick look at calculating the sales over each of the years directly against the data in the database through SQL Server Management Studio.

SELECT d.CalendarYear,
       SUM(f.SalesAmount) SalesAmount,
       SUM(f.SalesAmount) + COALESCE((SELECT SUM(SalesAmount)
                                      FROM FactSales fs
                                      JOIN DimDate dd ON fs.DateKey = dd.DateKey
                                      WHERE dd.CalendarYear < d.CalendarYear), 0) CumSalesAmount
FROM FactSales f
INNER JOIN DimDate d ON f.DateKey = d.DateKey
GROUP BY d.CalendarYear
ORDER BY d.CalendarYear

And the results for this are:

CalendarYear SalesAmount      CumSalesAmount
------------ ---------------- ----------------
2007         4561940955.0212  4561940955.0212
2008         4111233534.6841  8673174489.7053
2009         3740483119.1823  12413657608.8876

Not too bad.  We have an inner query that is being used to return the previous year(s) values and append them onto the current year (if they exist).  If no value is returned then a zero is used in its place.  For additional examples and options take a look at this posting by Garth Wells – Calculating Running Totals.

Microsoft Excel

So now let’s take a look at how this would be done if we were using Excel.

image

There are a few different options here; in this example I enter the formula =SUM($B$2:B2) in cell C2 and then copy that formula down into cells C3 and C4.

SQL Server Integration Services (SSIS)

Now we will take a quick look at how this could be done in the ETL process.  As we are pulling data from a source we can evaluate the values in the data flow and accumulate the values inside a script component.  In the source we will simply get the sales for each year and then append the values in the flow to each other to get at the same results displayed above.

image

The cumulative value is being performed in the script component with some code.  We have a variable setup that we are using to append the sales amount of each row in the data flow and then sending this value back out in a new output column called CumulativeSales.

image
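The original code is only visible in the screenshot, but a minimal sketch of the script component logic (the input and output column names are assumptions) would look something like this in C#:

```csharp
// Class-level variable that accumulates the sales amount across rows
private decimal cumulativeSales = 0;

// Called once for each row flowing through the data flow
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    cumulativeSales += Row.SalesAmount;
    Row.CumulativeSales = cumulativeSales; // new output column defined on the component
}
```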

Nothing too fancy going on here.  I don’t have a destination in the flow, just added a derived column to get the data viewer included so that I could run the data flow to show the results.

SQL Server Reporting Services (SSRS)

Now let’s switch over to Reporting Services.  In this example we will not calculate the cumulative value in the dataset query; we will do this calculation in the report.

image

In this example we leverage the RunningValue aggregate function that is available in Reporting Services.  The expression that we use in the Cumulative Sales column is =RunningValue(Fields!SalesAmount.Value, Sum, "DataSet1")

image

This function returns a running aggregate of the values in our specified dataset using the SUM function which is also specified in the expression.

SQL Server Analysis Services (SSAS) MDX

Now how about taking a look at Analysis Services.  How would we go about doing this calculation using MDX?

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM({null:[Date].[Calendar Year].CurrentMember},[Measures].[Sales Amount])

SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    NON EMPTY [Date].[Calendar Year].[Calendar Year].Members ON 1
FROM [Sales]

And the results of this are:

image

We can apply some filters into the MDX statement to remove the years that currently don’t have Sales Amount associated with them.

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM({null:[Date].[Calendar Year].CurrentMember},[Measures].[Sales Amount])

SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    NONEMPTY([Date].[Calendar Year].[Calendar Year].Members,[Measures].[Sales Amount]) ON 1
FROM [Sales]

This returns what we have seen in our other results:

image

There are additional ways of going about this calculation; for more options take a look at this recent posting by Amish Manubhai Shah – Various way to calculate running total from All Period to Currentmember.  Now I want to show how this solution differs a bit from the other ones we have seen.  What happens if I apply a filter to the query to only display 2008 and 2009?  What would you expect?

image

Might not be what you would want, but then again maybe it is.  The Cumulative Sales calculation is summing up the values from the beginning of time (the [All] level).  So the first value that we see in 2008 includes the Sales Amount from 2007 as well (and if there had been sales in 2006 it would include those too, and so on).

You can make a few modifications to the calculation and setup a named set to reference the items and come up with something along these lines.

WITH DYNAMIC SET [SelectedYears] AS
    Existing [Date].[Calendar Year].[Calendar Year].Members

MEMBER MEASURES.[Cumulative Sales] AS
    SUM({EXISTS([Date].[Calendar YQMD].[Calendar Year].Members,
        [SelectedYears]).Item(0):[Date].[Calendar YQMD].CurrentMember},
        [Measures].[Sales Amount])

SELECT {Measures.[Sales Amount], Measures.[Cumulative Sales]} ON 0,
    [Date].[Calendar YQMD].[Calendar Year].Members ON 1
FROM [Sales]
WHERE {[Date].[Calendar Year].&[2008],[Date].[Calendar Year].&[2009]}

With the following results:

image

Here a dynamic set is used to get references to the members in the slicer (WHERE clause), and this information is fed into the cumulative sales calculation, which grabs the first item to establish the 2008 year as the starting point for the cumulative total instead of the beginning of time.

You might just want to look at year-to-date cumulative totals and be able to drilldown into a user defined hierarchy if you have one setup like Year-Quarter-Month-Date.  If we modify our calculation a bit we might end up with something like the following:

WITH MEMBER [Measures].[Cumulative Sales] AS
    SUM(YTD([Date].[Calendar YQMD].CurrentMember),[Measures].[Sales Amount])

SELECT {[Measures].[Sales Amount],[Measures].[Cumulative Sales]} ON 0,
    Hierarchize({[Date].[Calendar YQMD].[Calendar Year].Members,
        [Date].[Calendar YQMD].[Calendar Quarter].Members}) ON 1
FROM [Sales]
WHERE {[Date].[Calendar Year].&[2008],
    [Date].[Calendar Year].&[2009]}

With the following results:

image

Here we can see that our cumulative total stops at the year level, and at the lower levels the values accumulate as expected, so the cumulative total at Q4 is the same as the yearly value.  What we have now is a YTD calculation.

PowerPivot for Excel 2010 DAX

We have one more product that I want to take a look at and that would be PowerPivot.  This is going to be fairly similar to the Analysis Services solution since it actually is Analysis Services behind the scenes, the only difference here is that we will be using DAX instead of MDX.

So for the first example we will setup a cumulative sales calculation just like we did with the first MDX example.  The calculation we use is as follows:

=CALCULATE(SUM(FactSales[SalesAmount]),
    DATESBETWEEN(DimDate[Datekey],
        FIRSTDATE(ALL(DimDate[Datekey])),
        LASTDATE(DimDate[Datekey])))

So we end up with the following:

image

You could go ahead and add some additional logic to evaluate the Sales Amount to determine if a value exists, but you get the general idea here.

Once again this is calculating the cumulative total from the beginning of all time and that is done by using the ALL reference in the calculation for the FIRSTDATE.  If we filter the years and only display 2008 and 2009 we will see the similar results like we did with MDX where the cumulative sales amount for 2008 includes the 2007 sales.

image

The one nice thing about PowerPivot is that it lives in Excel, so if you want to do anything a little more custom or make references to cells, you can do that and use Excel functions as well.  As far as setting up a cumulative sales amount like the second MDX example, I can’t determine an equivalent in DAX.  Maybe someone else has an idea of whether this can be done so that it only accumulates over the Date values being evaluated.  If you do, please leave a comment.

You can do the YTD calculation in DAX and here is formula for that:

=TOTALYTD(SUM(FactSales[SalesAmount]),DimDate[Datekey])

And the results look like this if we break out the years by the quarters:

image

Once again we can see that the yearly total is equivalent to the Q4 for the same year.  For some additional information and explanation of the PowerPivot calculations take a look at a great post by Kasper de Jonge here Accumulate values (running value) over multiple years in PowerPivot using DAX.

Conclusion

Cumulative total (running value) calculations are done differently within each of the Microsoft products.  We took a look at examples of how this can be done in T-SQL, Excel, SSIS, SSRS, MDX, and DAX.  These are not the only options, so if you want to see more, take a look at the additional posts that I included as reference points in the different sections.  It is important to understand how the formulas need to be configured and what the results are once the calculations and logic are put into place.  I think one of the easiest ones, besides the Excel formula, would have to be SSRS with the RunningValue aggregate function.  I like how it is set up and how it evaluates everything based on the context being referenced.  So if we go back to the SSRS report and apply a filter to the tablix to remove 2007 from what is displayed, we end up with the following:

image

Very simple and easy to understand, but that is just my personal opinion.

Posted in Analysis Services, Integration Services, PowerPivot, Reporting Services, SQL Server | Tagged: , , , , | 10 Comments »

Free SQL Server Community January 2011 Training

Posted by denglishbi on January 20, 2011

Head over to the Pragmatic Works site and take a look at some training that is available this month.  This month you can listen to some more great business intelligence content presented by Mike Davis, Adam Jorgensen, Brian Knight, and Joe Salvatore.  The technologies covered will be SSIS, PerformancePoint, and PowerPivot.

  • January 18, Mike Davis – SSIS Troubleshooting
  • January 20, Adam Jorgensen – 0 to Dashboard in PerformancePoint
  • January 25, Brian Knight – Introduction to PowerPivot
  • January 27, Joe Salvatore – Loading a fact table

If you miss a session they record them and you can check them out later on-demand in the webinar resource area.

Enjoy!

Posted in SQL Server, Training | Tagged: | Leave a Comment »

Minneapolis Tableau User Group January 19, 2011

Posted by denglishbi on January 14, 2011

Next week a new user group is starting up in the Twin Cities, organized by one of my friends, Alan Wernke.  If you are interested in data visualization then this is a user group you will want to check out.  This is the very first meeting, and the topics that will be presented are a case study with the Tableau 6.0 software along with tips & tricks.  Of course there will be time for networking as well.

image

Date: Jan. 19th – 3:00pm CST

Duration: 120 minutes

Location

Minneapolis College of Art and Design

2501 Stevens Avenue

Minneapolis, MN 55404

United States

See map: Google Maps

Proposed Agenda:

  • Introduction & Networking
  • Bike Share Program (NiceRideMn) Case Study – Tableau Software 6.0 in action!
  • Networking
  • Tableau 6.0 – Tips & Tricks
  • Q & A
  • More networking

If you are interested in visualizing your data and dashboard solutions then you will want to check out this event – Register Here.

If you are not familiar with Tableau you might want to check out a couple of areas on their site: Solutions and Visual Gallery.  There is also a white paper on their site by Derek Comingore, Microsoft SQL Server MVP, titled Self-Service BI Synergies with Microsoft PowerPivot.

Posted in Business Intelligence, Training | Tagged: , | Leave a Comment »

PASSMN January 25, 2011 Meeting

Posted by denglishbi on January 12, 2011

UPDATED (1/14/2011): updated with registration link and sponsor information.

The next Minnesota SQL Server User Group (PASSMN) meeting is coming up, and the date, time, and location have been changed from the usual.  The next meeting will be on Tue, Jan 25 from 5:00 to 7:00, and it will be in the 8000 building instead of the 8300.

The sponsor for this meeting is Benchmark Learning:

Benchmark Learning

“Error Logs and Deadlocks, Oh My!” and “Seeking SQL Server Secrets”

Agenda:

  • 4:30-5:00 : Registration / hospitality / networking
  • 5:00-5:15 : Announcements etc.
  • 5:15-6:00 : Jason Strate:  Error Logs and Deadlocks, Oh My!
  • 6:00-7:00 : Kalen Delaney:  Seeking SQL Server Secrets

Presentation:

Error Logs and Deadlocks, Oh My! (by Jason Strate, Digineer) – Tired of turning to the SQL Server error log to find out what’s happened? Would you rather just know about a deadlock when it occurs rather than later in the day when the manager asks you why HR’s bonus distribution application failed? Find out how to get this information and more delivered to your inbox so that you can head off issues instead of drowning in them. In this session we’ll look at Event Notifications and find out how to stop having to monitor SQL Server error logs today.

Seeking SQL Server Secrets (by Kalen Delaney, SQLearning) – Many experienced DBAs are aware that there are many undocumented commands and traceflags in SQL Server, and some DBAs may even keep track of all references to such options whenever they read about them in an article or blog posts, or hear about them at a conference. In this session, I’ll not only tell you about some of my favorite undocumented features, I’ll also tell you some of my tricks for discovering more undocumented secrets.

Sounds like some good topics, don’t miss out!

Posted in Training | Tagged: , | Leave a Comment »

Book Review: Microsoft SharePoint 2010 PerformancePoint Services Unleashed

Posted by denglishbi on January 10, 2011

I finally wrapped up my read of the Microsoft SharePoint 2010 PerformancePoint Services Unleashed book from SAMS publishing.  I was not provided the book for review, so I just want to state that up front.  I thought the authors did a great job going over the product, providing an overview, installation information, a component breakdown, dashboard development, PerformancePoint 2007 differences and migration, and custom development.  They even had a case study about Apples and Oranges, a broadcast company wanting to evaluate The Green Orange show.  I definitely like the idea of bringing in a case study; I am just not sure the theme was carried through the rest of the book to show how the product helps them monitor and analyze information about the show.

I liked that the book included Notes, Tips, Summary, and Best Practices in each chapter.  Getting some additional insight and tips from the authors was good to help explain and provide some additional guidance and reasoning.  I liked chapters 11 and 12 where the authors showed how you can interact with the API, perform custom development with the SDK, and manage the product leveraging PowerShell scripts.

If you are new to the PerformancePoint product and will be working with SharePoint 2010, I would definitely recommend picking up this book as a reference.  I think it does a good job providing examples and will prepare you for building dashboards with the product.  If the case study had been fully incorporated, along with some more dashboarding explanation and advanced PerformancePoint topics, that would have really completed the book.  The end chapters that I mentioned above did help, though.  I think the book could have been just a tad longer.

All-in-all I would give it a 3.5 out of 5 (five being the highest), and the price that you can get it for is definitely reasonable; plus I believe this is the only PerformancePoint 2010 book that is currently available.  There are a couple of items available for download (an Excel file and an SSAS backup) once you register the book on the Informit site.

Microsoft SharePoint 2010 PerformancePoint Services Unleashed (9780672330940): Tim Kashani, Ola Ekdahl, Kevin Beto, Rachel Vigier: Books

ISBN: 0672330946
ISBN-13: 9780672330940

And just an FYI: you can install and use Per User Identity authentication for security on a data source in a single-server environment (farm setup).  This does work; you do not need to do a standalone SharePoint install.

Posted in PerformancePoint | Tagged: , , | 1 Comment »

Using PowerPivot with PerformancePoint Services (PPS) 2010

Posted by denglishbi on January 3, 2011


UPDATE (6/20/2011): Modified the table layout of the step-by-step walkthrough to be two columns instead of three.  Added a downloadable PDF file as well.

UPDATE (2/24/2012): I have a blog post for using PowerPivot 2012 with PerformancePoint Services here – Using PerformancePoint Services (PPS) with PowerPivot SQL Server 2012 RC0.

Analysis Services was one of the primary data sources with the previous version of PerformancePoint to get the most functionality out of the product.  Now with the release of the 2010 product we have some new options and one of them is PowerPivot.  PowerPivot is also a multi-dimensional data source and in this post I will talk about what this source provides for functionality and what it does not compared to Analysis Services.

First of all you will need to have an environment that has PerformancePoint Services configured.  If you are not familiar with how to do that you can take a look at this great posting on the product team’s web site – Deploying PerformancePoint 2010 Soup to Nuts.  Once you have that setup then the other thing that you are going to need to do is have PowerPivot setup with SharePoint so that we will be able to utilize the files as data sources.  There are plenty of articles and material available in regards to this configuration, here is one of them – How to: Install PowerPivot for SharePoint on a New SharePoint Server.

Once you have your environment set up and ready, the next thing you will need to do is create the PowerPivot data source that you are going to leverage to create your dashboard.  In this example we are not going to be creating the file and leveraging the PowerPivot PivotCharts and PivotTables; we are simply going to set up a file that will be used to provide the data and measures needed for analysis.  I will be using it in place of having to implement an Analysis Services solution (except for the fact that PowerPivot really is an Analysis Services database under the hood).

Some of the reasons that we might want to use PowerPivot in a PPS dashboard would be because:

  • ability to leverage PPS analytical features like analytical grids & charts and decomposition tree
  • quickly integrate disparate data for analysis into a single source
  • leverage existing Excel Power User expertise, data, and files

For this example I will be leveraging the Contoso Retail BI Demo data – Microsoft Contoso BI Demo Dataset for Retail Industry.  I will also be using the following components to put this example together – Excel 2010 with PowerPivot add-in (this can be downloaded from here – PowerPivot Download), PowerPivot for SharePoint 2010, PerformancePoint Services, and SharePoint 2010 Enterprise.

Warning: This example does not include each step-by-step instruction and expects that you have some basic knowledge of PowerPivot, DAX, and PerformancePoint.

1. First, launch the PowerPivot Window from Excel 2010 to load data.
2. From the PowerPivot Window, select the option to load data from a SQL Server database.
3. Select the ContosoRetailDW SQL Server database.
4. Choose the option to select from a list of database tables.
5. In this example we will select DimDate, DimGeography, DimProduct, DimProductCategory, DimProductSubcategory, DimStore, FactSales, and FactSalesQuota.
6. For the two fact tables, use the Preview & Filter option to remove a few columns that are not needed for analysis during the load (the initial fact key column and the last three ETL columns).  Also add a filter on FactSalesQuota so that ScenarioKey only includes values of 2 (for Budget).
7. Now you can click Finish and watch the data load – over 5 million rows of data!
8. During the load process the relationships that were defined in the SQL database were retained, so we will not need to set up any additional relationships between the tables.
9. We might simplify the snowflaked model and pull the Product Category and Subcategory information into the Product table using DAX and the RELATED function.
10. To simplify the model a bit more we can remove columns and tables that are exposed to the PivotTable in Excel – remove the surrogate key, foreign key, and ETL columns, and in the related tables remove all columns.
11. Now create the PivotTable measures which will be needed in PerformancePoint to reference – I will comment on this more below.
12. With the measures added for PerformancePoint, the file can be published to SharePoint to be referenced.
13. File –> Save & Send –> Save to SharePoint –> Save As (this will be saved to a Trusted Excel Services location in SharePoint).
14. In Dashboard Designer a new Data Source will be created – go into the Data Connections library and add a new PerformancePoint Data Source (this will launch Dashboard Designer).
15. Create a Multi-Dimensional Analysis Services data source and use a connection string pointing to the Excel file in SharePoint – “DATA SOURCE = http://mybi10/Workbooks/ContosoSalesAnalysis.xlsx” – and the cube name is “Sandbox”.
16. If you receive an error message when picking the cube, verify the connection string URL and/or verify that the SQL Server Browser service is running.
17. Now create a couple of KPIs for Sales and Gross Margin information (you will need to use the Add Lists option in the Workspace portion of the Ribbon to add the PerformancePoint Content list).  Make sure to change the Calculation option from Default to Data Value.
18. Set up a Scorecard and add a dimension column (ProductCategory) to analyze.
19. Create an Analytical Chart (you need to save the data source, if you haven’t done so yet, to create the analytical reports).  To apply formatting to the measures, switch to the Grid report type and change the formatting to $ and %.  Also filter empty axis and series.
20. Add another Analytical Chart.  Apply appropriate formatting and filter to axis and series.
21. Add an Analytical Grid.  Adjust the settings in the View options in the Ribbon so that the column and row headings are not truncated.
22. Create a couple of Filters, one for FiscalYear and another for RegionCountry (an MDX Query was used along with the Tree view setup for each filter).
23. Create and set up the Dashboard.
24. Connect the filters to the Dashboard items.
25. Save all of the content and then deploy the Dashboard to SharePoint.
26. View and test the dashboard page.
27. View the Decomposition Tree (requires Silverlight).

A couple of items that I wanted to discuss briefly regarding this example are the reasoning for setting up measures in the PowerPivot PivotTable and for setting the KPI Calculation to Data Value.  External tools require that you set up measures in the PowerPivot file in order to reference them.  Whether you are in PerformancePoint or Reporting Services, you need measures defined to be able to reference and utilize them for analysis.  PowerPivot also does not currently have the ability to pass formatting through, so you have to apply formatting in the tools referencing it.  You can reference items placed in the Values section of the PivotTables, but you can’t guarantee these will always exist, so I recommend setting up measures for every metric that you want to evaluate and utilize with external tools.
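For example, the measures created in the PowerPivot window might look something like the following DAX (the measure names and the TotalCost column are assumptions based on the Contoso schema; they are not shown in the original screenshots):

```dax
Sales Amt := SUM(FactSales[SalesAmount])
Total Cost := SUM(FactSales[TotalCost])
Gross Margin := [Sales Amt] - [Total Cost]
Gross Margin % := [Gross Margin] / [Sales Amt]
```

Each of these then shows up as a proper measure that Dashboard Designer can reference when it connects to the workbook.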

The other item that I wanted to mention was the KPI Calculation setting.  This was changed to Data Value so that we would be able to leverage the Decomposition Tree option in the right-click context menu.  If this were left as Default Value, it would have been disabled.  You currently cannot use the Show Details option, since drillthrough is not available in the current PowerPivot release, but you can use the Decomposition Tree for analysis.

For the most part PowerPivot provides very similar functionality to Analysis Services.  A few items that you don’t get are Show Details (drillthrough), Named Set references, measure formatting (along with formatting configurations such as colors), and user-defined hierarchies.

For a good reference point for this type of a setup take a look at this white paper – PerformancePoint Services and PowerPivot for Excel (white paper).

This white paper covers the basics of integration between PerformancePoint Services in Microsoft SharePoint Server 2010 and Microsoft SQL Server 2008 R2 PowerPivot for Microsoft Excel 2010. It describes the basics of utilizing a PowerPivot model in PerformancePoint Services and expands on issues and scope of using these applications together.

Well that was a really brief and fast example of using PowerPivot with PerformancePoint, but you can see how quickly you can build out a fairly advanced dashboard that provides insight and analytical functionality to the end-users.

If you liked this posting you might also be interested in my other two that are similar for Excel Services and Reporting Services:

Enjoy!

Download:

image

Posted in PerformancePoint, PowerPivot | Tagged: , , | 23 Comments »