I have recently finished MOC 50429 SharePoint 2010 Business Intelligence with Lior Cohen Amikam, which I found very enlightening. I thought I'd share with you some excerpts from my summaries throughout the course. It's a mix of ideas for what to do with SharePoint's BI abilities and short descriptions of what they are. This is the second post in the series of posts I am making from the material of the course.
PerformancePoint Dashboard Designer is the place where we connect the different parts of the dashboard to one another. We build the different reports (scorecards, filters) and then use the Designer to connect them all in one dashboard. The dashboard is only a place to link content that was saved in the different PerformancePoint folders (be it for reports, scorecards etc.).
The minute we save a change to the content (the report or the scorecard shown in the dashboard, for instance), we needn't update anything in the dashboard – the change will show up in it automatically.
If we change something in the dashboard itself (adding a new filter to the dashboard, for instance), the user will see it only after the dashboard is redeployed.
After deploying the dashboard, we'll get an .aspx page with web parts of the PerformancePoint content type.
Each deploy of the dashboard overwrites any customizations we made directly in SharePoint between deploys (which is why we should keep such customizations in the CSS or master page that the dashboard page is connected to).
We can even use a filter to let the user decide which columns to show in the report.
We'll dedicate each dashboard to a different subject (a separate one for HR, Finance etc.). Within each dashboard, we can use different tabs for different levels of granularity and detail.
When we work with the Dashboard Designer, we can save a workspace locally. We can use that as a backup of our dashboard design, or to help us move the dashboard between environments.
I can open a workspace from my local PC and then choose to mark the differences between workspaces, to see how my copy differs from the one on the server. I can also choose "compare item" to see the differences in more detail. If I want, I can save my version over the one on the server, or I can click on an added item and choose to remove it from the workspace on my PC, or add the item from the server to it. If we keep versioning on the server, then we can always go back to an earlier version.
PerformancePoint runs a query for each value it needs to calculate. The query isn't necessarily the optimal query (a bit like how the Report Builder 1.0 query isn't the best T-SQL you could write). That means that calculating a scorecard with KPIs can be very expensive performance-wise (a lot of values to calculate). So if your cube is big and not built according to best practices, and the scorecard's query isn't optimal, you will find the scorecard taking a long time to load or refresh.
When we create a scorecard, we can create a few goals for each row – how we stand in comparison to last year, how much did we sell in comparison to our forecast, how much did this department sell in comparison to a fellow department etc.
We can also base one target on the calculation of another target.
If we name a column in a table filter, key, index etc., SharePoint won't prevent us from building a filter on it, but the filter won't work. Avoid using reserved words in your column names (a column called country_key, on the other hand, won't be a problem).
We can also define a filter on any other tabular data source like a SharePoint list, an Excel file etc.
Using a Time Intelligence filter will give us a calendar. Connecting a filter to a time dimension will give us a tree view instead.
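Behind a Time Intelligence filter are short time formulas rather than fixed dates. A few examples of the kind of expressions used (quoted from memory of PerformancePoint's Simple Time Period Specification syntax, so treat them as illustrative and check the documentation for the exact form):

```
Day              -- today
Month-1          -- last month
YearToDate       -- the year so far
Day-6:Day        -- a rolling window of the last seven days
```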
If we deploy from PerformancePoint to a "regular" page, we'll delete whatever content was on that page. That means that if I had a text filter on that page, it'll be deleted. Maintaining a page that mixes dashboard content and regular content can be problematic, so we should consider adding the PerformancePoint web parts to the page manually, alongside the other web parts.
I have recently finished MOC 50429 SharePoint 2010 Business Intelligence with Lior Cohen Amikam, which I found very enlightening. I thought I'd share with you some excerpts from my summaries throughout the course. It's a mix of ideas for what to do with SharePoint's BI abilities and short descriptions of what they are. This is the first post in the series of posts I am making from the material of the course.
We can leverage the user’s personal page to show his photo (a personal touch), his history of reports and his documents (for easy access).
A site with ready-made BI parts bundled into one place. We can also add web parts and libraries of other types to this site.
In the very same way, we can put a KPI and a graph outside of a PerformancePoint page.
Excel Services enables us to manage one Excel file shared among all the users on SharePoint, without a specific place on the NAS.
We can upload an Excel 2010 file, and users can edit it in the Excel Web App without having Excel 2010 on their own PCs.
If I download an Excel file from SharePoint, change something in it and save it, it's automatically saved back to SharePoint.
We can define a name for each part of the report in Excel and use just that part. For instance, when we create a report in Excel with a table, a chart and a parameter – we can give a name for each one of them. Then, we can choose in SharePoint which part to expose from all of those parts (just the parameter and the chart for instance) and not necessarily all the Excel sheet.
A 3-D graph in Excel will look like a 2-D graph in SharePoint.
You cannot use an Excel macro in SharePoint; you have to turn it into a UDF.
We can define in the site level the amount of time a query can run in Excel and that way limit the resources it can take.
We can use the new Excel Services REST API in SharePoint 2010 to insert a "live" chart into Word or PowerPoint.
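As a rough illustration, the REST URL for a chart follows a pattern like the one below (the server name, document path and chart name here are made up; check the Excel Services REST API documentation for the exact form):

```
http://myserver/_vti_bin/ExcelRest.aspx/Shared%20Documents/Sales.xlsx/Model/Charts('SalesChart')?$format=image
```

Pasting such a URL as a linked picture in Word or PowerPoint gives a chart image that refreshes from the workbook stored on SharePoint.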
In the end, I had to finish my questions quickly, as Mr. Hathi did not stay till the end of the convention, but I did feel like I had quite a few questions to ask:
- There were a lot of new features in PowerPivot v2. A lot!
And as PowerPivot and Tabular SSAS are becoming a bigger part of Microsoft BI, I couldn't help but think to myself – how can I, as a user, influence its changes?
- Microsoft continues to empower SharePoint through its SQL Server 2012 BI offering. I remember that in my interview with Mr. Donald Farmer, we discussed the obstacles that this connection creates for small to medium sized businesses, but I didn't feel much of a change. I understand now that BI with Azure is becoming the solution, through SSRS in Azure, SSIS in Azure and hopefully (soon) SSAS in the Azure cloud. In essence, I think what Microsoft is doing with Hadoop on Azure and BI is also a sign of things to come.
- Another point I was interested in understanding was Data Mining. In SQL Server 2005 it felt like Microsoft was giving Data Mining a big push forward: it was part of SSAS, we had a Data Mining Add-in for Excel 2007, and it had a team blog. Since then, Jamie MacLennan and Bogdan Crivat have moved to Predixion, and development on Data Mining seems to have stopped at Microsoft. I was wondering what this means for the current users of DM.
- Last but not least, Microsoft has also greatly changed the pricing for SQL Server 2012. They have moved from processor-based to core-based licensing, making it at times more expensive for their customers. I remember hearing some talk about it even in the BI User Group meeting which introduced SQL Server 2012. The move also sparked a word from their competitor QlikView. And so, I wanted to know what opportunities Microsoft saw in this big change.
And now, to the video:
(Please visit the blog for the video as I cannot embed it for the RSS feed)
I cannot finish this post without a few thanks: first of all to Ori Weinroth Product Marketing Manager at Microsoft Israel that helped me greatly in getting the meeting with Mr. Hathi. Secondly, to my trusted camera man, sound technician and good friend who calms me down – my brother Adlai Maschiach. Thirdly to Kidi who helped with the video. Also to Nir who found the newspaper article and showed it to me, urging me to do the interview. And last but not least, to Mr. Hathi who took the time to sit down for the interview :)
I hope you liked it and next time, if I have more time to prepare, I hope to include your questions as well!
OfficeWriter is an application that works with Reporting Services in two ways:
1. It allows you to design a report in Word or Excel and render it to an RDL
2. It allows you to take data from an RDL report and use it in a Word or Excel document
I believe just those two points hold quite a lot within them, but we'll get to that shortly.
I decided to try my hand at creating a report on Adventure Works data:
I won't go through how to build a report using OfficeWriter, as that's explained on their site in much detail. I would say it's a matter of half an hour at most of playing with the application for you to learn it.
So, let's jump straight into what I liked and didn't like in OfficeWriter 8. I will refer mostly to using OfficeWriter in Excel though the pros and cons also apply to using it with Word:
- You can export your SSRS report to Excel 2007 and Word 2007 onwards even if you're using SQL Server 2005 (no need to upgrade to SQL Server 2012 for that).
- The data showing in your Excel or Word report will be refreshed every time you press the refresh button allowing you to always be on top of your most current data. You can add fields or parameters to the SSRS report that’s the data source for your user’s Excel or Word report and they’ll show up automatically. That's rather than rendering a new Excel or Word report every time the data changes.
- Your customers can actually connect directly to the updated data in your report without having to use PowerPivot on Excel 2010. That means they get refreshed data without having to move to Excel 2010 and without having to learn DAX to know how to manipulate the data. In my own report, I used the Excel year function to show just the year of the due date.
- OfficeWriter also offers you some SSRS functions in Excel, if you need to extend your abilities for the Excel report. That means that if a customer created a report in Excel and needs a complex calculation that he can't do on his own, you as a developer can take that (Excel-as-RDL) report and do it for him.
- Every user of OfficeWriter has to publish his or her reports to the Report Server before they can view them. That means that all the reports your users create with OfficeWriter will necessarily be saved on the Report Server. Now, that doesn't give you all the capabilities that SharePoint has for managing organizational data, but it's definitely a good aid in helping you manage those reports and the security on them. You can use the Report Server to give each user a folder of his own with specific security rights. You can also view the reports your users created by exporting the SSRS report that's in the folder to OfficeWriter Excel.
- The SSRS report your customers use as a data source for their own report has to have all its fields showing if they want to look at it as an SSRS report. Otherwise, the Reporting Services report won't show the values for the fields that are hidden in the original data source. The Excel and Word versions will render correctly.
Now all of this applies for OfficeWriter v8 which was out when I downloaded the application, but they have just now released OfficeWriter v8.1 and are offering a 15% discount if you like them on Facebook :)
So, in conclusion:
I have to be frank with you, if you're a company that's going to buy SQL Server 2012 and has SharePoint 2010 Enterprise Edition along with Office 2010 and PowerPivot, then maybe OfficeWriter is just a "nice to have" for you (though most companies I saw that have all of that, also have some "nice to have" too :)). But if perhaps you don't have all of that and you're interested in a solution to: your export to xlsx, docx (or even xls and doc), enhancing calculations in customers reports etc. etc. well, I suggest you give OfficeWriter for SSRS a look!
PASS is offering 24 Hours of PASS with 24 consecutive technical webcasts – all free, requiring only registration. The date to mark is March 21st 2012, depending on your time zone. This time around includes closed captions in 15 different languages. The sessions in 24 Hours of PASS are very SQL Server 2012 oriented and very varied in topic, with something for everyone, ranging from Tier-1 BI in the Age of Bees and Elephants with Denny Lee, to SQL Server 2012 Memory Management with Neil Hambly, to What to Look For in SQL Server 2012 Execution Plans with Grant Fritchey, and more... So give it a look!
SQL Server 2012 was released last weekend (and yes, I know this means I'm late to blog about it!). Along with it were released the Data Mining Add-in for Excel 2010 (with a 64-bit version) and PowerPivot v2.0. Whether your company just bought SQL Server 2012, or you downloaded the Express Edition or the Evaluation, you could probably still use some help. The "Introducing Microsoft SQL Server 2012 (second DRAFT preview)" eBook was released in January and is free to download – enjoy!
If you select a non – key granularity attribute, the server will not be able to aggregate data properly unless you make sure that all other attributes are directly or indirectly related to it by specifying them as related attributes
Now, I knew I had connected the attributes in the Period level to each other when I defined the user hierarchy (best practices and a little squiggly line in SSAS 2008 onwards recommend it :)):
(The squiggly line I still have is for not defining a default member for the year)
So I processed the cube with the new fact table connected to the year level. After the process finished, I looked straight away at what I got:
The year level gave me the desired number:
Opening to the intermediate level gave me the sum for the year level:
And so did the month level:
So actually I didn't need to define any scope on the lower levels, as Analysis Services already knew to give the children in the user hierarchy the value of their parent. Analysis Services is one smart cookie – which is why I love her so much :)
As the release of SQL Server 2012 and PowerPivot v2 is getting closer and closer, knowing DAX becomes more of a necessity than just an option. Via Kasper de Jonge's blog I learned of a nice 30-minute introduction to DAX basics. You can download an example workbook and try your hand at creating a measure formula, working up to adding filter context to your formula. The post also had a link to the TechNet Wiki page for the DAX Resource Center, which has loads of articles, downloads and videos, along with links to other blogs and sites with more resources.
So you have all the resources you need to start learning DAX – now make those 30 minutes to actually do it :)
I was in the BI User Group meeting today (and if you're in Israel, why didn't you come too?) and heard something quite exciting. The speaker was Yossi Elkayam from Microsoft Israel, and what he mentioned twice was that the release date for SQL Server 2012 Denali is March 2012. He said it while stating that next month's meeting will preview the new release scheduled for the month after that. He also mentioned that PowerPivot version 2 will be released close to the release of SQL Server 2012 Denali. This was really exciting news for me, because I remember hearing in one of the BI User Group's meetings that the release was scheduled for the first quarter of 2012, but that was put forward as a hopeful date rather than a firm one.
So upon hearing that the release is planned for March 2012 my first thought was that it would probably be on March 8th. That's because I was planning on going on vacation then. Somehow it made perfect sense in my mind that Microsoft would release the new version of SQL Server while I'm away and unable to read all the news and blog about it! Seems I wasn't too far away from the truth…
Microsoft is organizing a very big online event to launch SQL Server 2012 on the 7th of March! Both Ted Kummert, Corporate Vice President, Business Platform Division, and Quentin Clark, Corporate Vice President, Database Systems Group, are going to give keynote speeches. And if they're going to be there, I suggest you enlist as well ;)
So hurry up, the clock is ticking on the release of SQL Server 2012 Denali and Power Pivot v2
A short and sweet tip on using Report Builder 1.0
So, my users are still using Report Builder 1.0. I recently discovered that using a filter on an automatically calculated field there doesn't work. I tried running a report where I applied a filter on a year field generated from a certain date. In essence, I filtered on year of creation date = 2011. The report took ages to run, so I took a look at it in the Profiler to see the query that was run. Apparently, it wasn't even running the filter correctly.
Changing the filter to creation date on or after 01/01/2011 ran fast and, more importantly, the query was correct :)
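In relational terms this is the classic sargability pitfall: wrapping a column in a function (like extracting the year) hides it from any index on that column, while the equivalent range filter lets the optimizer seek the index. A small self-contained sketch, using SQLite and made-up table and column names in place of the real data source, shows the two equivalent filters:

```python
import sqlite3

# SQLite stands in for SQL Server here; the principle carries over to the
# T-SQL that Report Builder generates.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (file_id INTEGER, creation_date TEXT)")
conn.execute("CREATE INDEX ix_creation ON files (creation_date)")
conn.executemany(
    "INSERT INTO files VALUES (?, ?)",
    [(1, "2010-06-15"), (2, "2011-03-01"), (3, "2011-11-30"), (4, "2012-01-02")],
)

# Filter on a calculated "year of the date" field: correct, but not sargable.
slow = conn.execute(
    "SELECT file_id FROM files"
    " WHERE CAST(strftime('%Y', creation_date) AS INTEGER) = 2011"
).fetchall()

# Equivalent range filter on the raw column: the index can be used directly.
fast = conn.execute(
    "SELECT file_id FROM files"
    " WHERE creation_date >= '2011-01-01' AND creation_date < '2012-01-01'"
).fetchall()

assert slow == fast  # both return files 2 and 3

# The query plan for the range version shows an index search, not a full scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT file_id FROM files"
    " WHERE creation_date >= '2011-01-01' AND creation_date < '2012-01-01'"
).fetchone()
print(plan[-1])
```

On a table of any real size the difference between the two plans is exactly the difference between a report that takes ages and one that returns immediately.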
I don't know if you've noticed, but I haven't been blogging all that much lately (I know, blog, I know – but that doesn't mean I don't love you!). So you may gather that if I do blog, it must be about something that seems important to me.
The SQL Server Team has recently blogged about an initiative they're doing with the Pragmatic Works Foundation. Pragmatic Works Foundation is a non-profit that provides free technical training to veterans, the jobless, and underemployed. In 2012 they're doing a course for returning veterans. The SQL Server team will donate $50 for each story published about how the #SQLFamily has helped you (please visit the link for more details). Please send yours to firstname.lastname@example.org.
So I can talk the talk, but can I walk the walk?
The #SQLFamily for me is a lot of pieces forming a very big picture. It's articles about various topics in Simple-Talk which I read to better know SSIS and SSRS. It's people in the forums helping me find a solution to a problem I'm stuck with, or learn how to use a pre - released version of SQL Server or have a place to ask questions about the integration of SharePoint and BI. It's also a chance for me to meet up and talk with my peers at the BI User Group Meetings and see what great uses there can be for the technology (and for that thank you to Itay Braun and Ronen Chenn for organizing the meetings). It's where I also got to give a presentation about the Parent Child dimension and hear from other developers what they did with it in their work.
#SQLFamily for me is just like a family in the sense that it lets you learn and grow, supporting you all the way.
The news nowadays is filled with talk of a coming recession and economic crisis – no better time than now to give back.
Microsoft has recently announced the SQL Server 2012 editions and licensing. The main news is the new BI edition released with SQL Server 2012, which is positioned between the Standard edition and the Enterprise edition. You can read an SQL Server 2012 licensing overview on the Adatis blog, which lists the pros and cons of the new edition and payment method. Amongst other things, they point out that the Business Intelligence edition strips away Advanced Security (advanced auditing, transparent data encryption) and provides a cut-down, basic (as opposed to advanced) level of High Availability.
I felt like I read mostly good feedback from different bloggers on the matter, coming from Chris Webb and also from Teo Lachev. So it stood out for me when I saw that on Facebook Donald Farmer wrote in his status: "new licensing for SQL Server BI Edition feels like an early Christmas present from MSFT to #qlikview - thanks!" (with a link to the Adatis post on the matter).
Just in case you're new to the field of BI, you should know that up till January of 2011 Mr. Farmer was widely perceived as the face of Microsoft BI. But in January 2011 he chose to leave Microsoft for PowerPivot's biggest competition – QlikTech. I value Mr. Farmer a very great deal, and so I was more than intrigued to hear his full opinion on the matter. He wrote me:
"I think I understand what MSFT are doing, but I think it is a mistake for them. Standard edition was a great way for companies to start out on the BI journey. Now BI Edition is good, but more expensive (although still competitive). But it does not include SharePoint, which is crazy - collaboration is critical to modern BI, and the latest MSFT tools like PowerPivot & Power View need it. Enterprise SharePoint is a big cost. Of course I believe they are getting ready for a cloud offering at a lower cost. But cloud is a different mode of working and we at QLIK see quite slow demand for cloud BI because not enough data lives in the cloud yet."
(Please note: the added links were put in by me; Mr. Farmer had only given me the basic text.)
The remark about the absence of SharePoint in the licensing offer was also expressed in a comment left on Mr. Chris Webb's blog by Mr. James Snape. In SQL Server 2012 there is a growing connection between SQL Server and SharePoint. Microsoft doesn't just enable you to integrate SQL Server with SharePoint, but also develops SQL Server 2012 features that work only on the SharePoint platform.
I can't help but wonder what Amir Netz, chief architect for Microsoft's BI offering, is thinking about the new licensing options for SQL Server 2012...
So where are you in this debate? Happy about the new edition or dissatisfied? Sound off in the comments for this post!
I found myself needing to define Model Item Security on a Report Builder 1.0 model I made some time ago. I followed the guidelines detailed on the Model Item Security page (you can also read more about Securing Models). First of all, I would like to call to your attention that Model Item Security works only if your user isn't a Content Manager or Publisher (as noted in this article about security in Report Builder 1.0; please note that it describes the process as it worked in SQL Server 2005, which has changed for SQL Server 2008 R2). Also, as stated in the same article, you cannot create a subscription to a report which has Model Item Security in it (also mentioned in a post about report subscriptions by Bob Meyers).
Still, I found I had a problem with implementing the item security. I needed to show my users the total of files, but not the file ID. File ID is one of the Identifying Attributes of the entity. The user interface in the Report Manager would only let me hide File ID (and subsequently all its derivatives) without stating that Total file ID should be shown. So, I thought that maybe if I make the automatically generated Total of files as an attribute all of its own (and not as a variation of file ID) then that should work. Not so. After I turned the property of VariationOf from "File ID" to "None" for the Total files I tried to deploy the Model and got:
The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version.
Now, the server was fine, but apparently disconnecting an automatic aggregate from its source attribute can cause such an error message. I made it a variation of File ID again.
I also found that hiding an identifying attribute of an entity hides the entire entity(!) from the relevant user.
So, as you may gather from this post, this caused me some woe... so what's the solution? As offered to me by our wonderful DBA Boris, I set the property Hidden = True for File ID and Total of files. I then created two independent, unconnected expressions: one equal to File ID and one equal to Total of files. I could then hide each one independently, without hiding the entire entity.
Now, I don't know if you've noticed but this is me writing in 2011 about Report Builder 1.0 which was released around 2006. Report Builder 1.0 isn't really supported nowadays by Microsoft as far as I know. But I keep working with it, and by blog statistics, you're still reading about it quite a bit here!
So, it's not my Birthday, but I still wanted to make a wish... if Microsoft is reading this (and I'm under the impression you are), then I would very much appreciate it if you could start developing us a wizard. You made a very simple wizard that converted SSAS cubes from 2005 to 2008 or 2008 R2. I would REALLY appreciate such a wizard from Report Builder 1.0 to a PowerPivot Model (or a Report Builder 4.0 if planned).
Not too long ago my boss Yaniv got a telephone call from one of our customers saying that he thought a cube was missing some values. The thing is, at first that didn't sound logical to us, because usually we would get an error on that. We expected the process of the cube to fail, or something of that sort. But not this time. Now, there are a few reasons why rows might be missing from a cube, as recently blogged by Richard Lees. But our reason was slightly different.
We were using a view in the DSV. The view in the DB was a union of the original dimension table and a referential-integrity check of the dimension against the relevant fact table. That way we could ensure the process wouldn't fail. What we found out was that the cube process indeed didn't fail, but we didn't see all the values added through the union, just the original values. What was even weirder was that turning the view into a table in the DB and using that in the DSV gave us back those missing keys. Also, using a named query in the DSV of the form Select * From X also worked in bringing those values back. But adding another field to that query, such as last_name + ' ' + first_name as Z, again made the added keys disappear…
So yes, I know the conclusion is to do it all in the ETL phase and get a readymade table and not a view (I know, I know!), but just in case you also have a view in your DSV with a union clause in it, then maybe you should run a few checks…
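For reference, here is a small self-contained sketch of the union-view pattern described above, with SQLite standing in for SQL Server and invented table and column names: the view exposes the real dimension rows plus a stub row for every orphan key in the fact table, so processing the dimension against it cannot hit a missing key.

```python
import sqlite3

# The dimension table is missing a key that the fact table uses; the view
# patches the gap with an 'Unknown' stub row per orphan key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER, customer_name TEXT);
CREATE TABLE fact_sales   (customer_key INTEGER, amount REAL);

INSERT INTO dim_customer VALUES (1, 'Adams'), (2, 'Baker');
-- customer_key 3 exists in the fact table but not in the dimension:
INSERT INTO fact_sales VALUES (1, 10.0), (2, 20.0), (3, 30.0);

-- Original dimension rows, UNIONed with a stub for each orphan fact key.
CREATE VIEW v_dim_customer AS
    SELECT customer_key, customer_name FROM dim_customer
    UNION
    SELECT DISTINCT f.customer_key, 'Unknown' FROM fact_sales f
    WHERE f.customer_key NOT IN (SELECT customer_key FROM dim_customer);
""")

rows = conn.execute(
    "SELECT customer_key, customer_name FROM v_dim_customer"
    " ORDER BY customer_key"
).fetchall()
print(rows)  # → [(1, 'Adams'), (2, 'Baker'), (3, 'Unknown')]
```

The point of the post stands, though: materializing this result as a real table in the ETL phase is safer than hoping the DSV will read the view the same way every time.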
We're currently using Panorama Nova View 6.2 in the municipality and we had a presentation of the upcoming version - Panorama Necto. So what's new?
Well, first of all – now there's just one entry point to Panorama, both for the user and the developer. That's one entry point instead of one each for the Flash, Java, desktop and dashboard clients. We're still using the desktop environment in Panorama 6.2, and we've found more than once that things work in the desktop but for some reason don't show on the web. This should put an end to that problem.
Pages in the book and dashboard are now replaced by workboards.
On entry, you can choose one of the following views:
1. My recent workboards - the last workboards you worked on.
2. My favorite workboards - the workboards you liked (by clicking on the relevant button).
3. My relevant workboards - Panorama recommends you look at other workboards according to your likes and the likes people in your team gave to different workboards.
4. My recent discussions - a discussion between different workers of the same team on a certain workboard.
5. My recent subscriptions - in case you subscribed to a certain workboard.
Under "My recent workboards", "My favorite workboards" and "My relevant workboards" you can review workboards in a gallery quite similar to PowerPivot Gallery in SharePoint 2010. Other similarities to SharePoint 2010 also include a "like" button and a commenting option you have for each workboard.
"My recent subscriptions" gives me direct access to private workboards of other users (in case they chose to share those workboards with me and I have permission to the data). No need to send a page to be saved. Instead, you can give your co - worker a direct link and discuss online a workboard you created.
Components can be moved and resized directly on the viewing screen. There's no need to constantly move between a design area and a viewing area to do that. You can maximize and minimize each component on the workboard to either concentrate on it or give other things more space.
Graphs can also be changed on the fly.
Smart Report (the print out version of the view) got redesigned.
You can leave a comment per member in a report or per data cell. (A comment is left only per a specific workboard).
One click insight - a new automatic exception that marks a difference between the current period and the one before it. We can choose the size of the difference we want to check with a slider.
There is also a button to filter just those cells that have a marked exception.
Wherever we see an exception marked by the one-click insight, we can also click the little red triangle marking it for "cause and effect". That gives four matrices with the four causes Panorama suggests as explanations for that exception.
The social bar - Panorama is bringing the social feel into analysis. You can either choose to look at all the users which are allowed to watch that workboard or just a list of friends. You can then organize a sort of "meeting" online by picking a few people and connecting them to a certain member or cell in the workboard.
The superboard – according to Panorama, it offers you a screen with "users who viewed this workboard also viewed" recommendations along with "similar workboards". A bit like Amazon offering you other products according to the product you just chose.
Navigation control – before, you would have to write code to develop a button that would enable you to create a filter for the entire page. Now, Panorama offers you a chance to create it with just a few clicks!
A search box enables you to search a specific member without even opening the relevant attribute or dimension.
Like the previous version, this one also enables you to connect to the UDC model (Panorama's connection to a relational data layer) and PowerPivot which was deployed to SharePoint. I understood that Panorama is also working on creating all of this against SQL 11's BISM.
So, a few words to summarize:
I feel that quite a few of the new features have already been presented in SharePoint 2010. So if you do have SharePoint 2010 already installed in your organization or you're about to install it - then that's not an advantage for you.
Still, I really like the fact that components can be moved and resized directly on the viewing screen and the navigation control, making a filter just a matter of a few clicks. Most importantly, I really appreciated that now it's just one environment, for the user and the developer (no more desktop vs. web).
I think the main forte of Panorama is still its OLAP views. If you're using them as a cube viewer then you will most likely also enjoy incorporating them in your dashboard. Even in SharePoint 2010, you still get pretty static reports and scorecards in PerformancePoint 2010 (though visually beautiful). But if you want to give the user the ability to play with the data, you're still going to use Panorama.
In a recent convention I went to, most of the attendants were using Panorama Dashboard (the current version) and were pretty satisfied with it. And it's not because they didn't have SharePoint. The new Panorama has a lot to offer, and you can choose what parts of it you'd like to leverage. Though I don't think our organization will use all of these capabilities through Panorama rather than through SharePoint, I still think we'll be thrilled to use Panorama Necto, so go check out the demos for yourself.