Friday, February 20, 2009

Exporting to a text file from Reporting Services

(Personal Reference)

A client recently asked me how one could add an export-to-text-file option in Reporting Services. I won’t go through the business reasons for this, but this is how I suggested they go about it.

Add the following text:

<Extension Name="TXT" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering">
    <OverrideNames>
        <Name Language="en-US">TXT (Pipe Delimited Text File)</Name>
    </OverrideNames>
    <Configuration>
        <DeviceInfo>
            <FieldDelimiter>|</FieldDelimiter>
            <Extension>TXT</Extension>
            <Encoding>ASCII</Encoding>
            <NoHeader>true</NoHeader>
        </DeviceInfo>
    </Configuration>
</Extension>

to the SSRS config file, rsreportserver.config, usually located in:

C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportServer. The extension will need to be included with the other rendering extensions in that file, between the <Render> tags.
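
For context, the new extension sits alongside the existing rendering extensions, roughly as follows (a sketch only; surrounding entries are abbreviated and the exact set of extensions will vary by installation):

<Configuration>
    ...
    <Extensions>
        <Render>
            <Extension Name="XML" Type="..." />
            <Extension Name="CSV" Type="..." />
            <!-- Add the new TXT extension here -->
            <Extension Name="TXT" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering">
                ...
            </Extension>
        </Render>
    </Extensions>
    ...
</Configuration>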

Happy Deving!

Blogging about blogging…

Having recently decided to start blogging again I was on a hunt for a decent blog stat counter. I have used StatCounter in the past but it only allows the last 100 hits (without shelling out fun tokens) to be saved. This was not going to be sufficient to allow for analysis. I was told by a mate to try out Google Analytics and I must say I am very impressed! It allows for all types of analysis and reporting. Go check it out if you are blogging. It is really amazing!

Tuesday, February 17, 2009

Analysis Services Partition Processing

Recently, while trawling the internet, I noticed that there are not a lot of SSIS examples detailing how to process Analysis Services cubes, especially if you are using Analysis Services partitioning. So I thought I would add an approach I have been using over the last few years.

While reading this you are probably wondering what all the fuss is about. Why not just use the Analysis Services Processing Task? This is certainly an option when processing dimensions, full cubes, or measure groups with small data sets. The task is very useful and does provide a logging framework with a lot of detail. However, it is not dynamic. If the partitioning strategy is by month, for example, and the requirement is to process just those partitions with new data, then a more dynamic approach may be needed.

This is the scenario at which this post is aimed. The approach does add more code, and therefore possibly more maintenance, but it may be a necessity when encountering larger data sets.

So let's look at the example mentioned above, where we are partitioning by month. We could have a measure group with n monthly partitions and a need to process only the last two, as they could have new data. In this case, we may need a new Analysis Services partition or two to represent these new months. (I am not going to add the code that creates new partitions in this blog, to avoid complexity. I will save this for another blog!)

In this approach I am going to create the relevant XMLA scripts and then use the SSIS Analysis Services Execute DDL Task to execute them. I have seen many people attempt this approach by creating their own XMLA scripts. However, anyone who has tried this before will agree that the string manipulation required can be a real headache. So having a bit of a development background myself I prefer to allow the AMO (Analysis Management Objects) object model to do the hard work for me.

So onto the SSIS package…

First, I declare a package variable: strXMLAScript (String)

Then add a Script Task with the following code:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports AMO = Microsoft.AnalysisServices

Note: You will need to add the relevant AMO reference to the Script Task before being able to use it. However, when adding a reference, the relevant component, i.e. Analysis Management Objects, may not be available in the list. It can be made available by copying Microsoft.AnalysisServices.dll from “\Program Files\Microsoft SQL Server\90\SDK\Assemblies\” into the relevant framework folder, e.g. “\Windows\Microsoft.NET\Framework\v2.0.50727”.

Then declare the script variables and read the task name variable for event firing:

Dim amoServer As AMO.Server, amoMeasureGroup As AMO.MeasureGroup, amoPartition As AMO.Partition
Dim strXMLAScript As String, strTaskName As String, oVariables As Variables = Dts.Variables

Dts.VariableDispenser.LockOneForRead("System::TaskName", oVariables)
strTaskName = oVariables.Item("System::TaskName").Value.ToString


Then initialize the relevant Analysis Services objects:



amoServer = New AMO.Server()
amoServer.Connect("Data Source=LOCALHOST;Initial Catalog=Adventure Works DW;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;")
amoMeasureGroup = amoServer.Databases.FindByName(amoServer.ConnectionInfo.Catalog.ToString).Cubes.FindByName("Adventure Works").MeasureGroups.FindByName("Reseller Sales")


Then set the server property CaptureXml. (This makes sure that actions executed against the server during this session are not actually carried out; instead, the XMLA for each action is captured and made available.)


amoServer.CaptureXml = True

Once this is done, the search criteria for finding the correct partitions need to be implemented. This will obviously be site specific, but I usually name the partitions by measure group and then yyyyMM. So the relevant partitions I am looking for are Reseller Sales 200901 and Reseller Sales 200902.


' Loop through the measure group's partitions
For Each amoPartition In amoMeasureGroup.Partitions
    Select Case amoPartition.Name.ToUpper
        Case amoMeasureGroup.Name.ToUpper & " " & DateAdd(DateInterval.Month, -1, Now.Date).ToString("yyyyMM"), _
             amoMeasureGroup.Name.ToUpper & " " & Now.Date.ToString("yyyyMM")

            amoPartition.Process(AMO.ProcessType.ProcessFull) ' Could be any process type required

            ' Report the partitions that have been added to the process list
            Dts.Events.FireInformation(0, strTaskName, "Added partition - " & amoPartition.Name & _
                " to process list.", String.Empty, 0, True)
    End Select
Next



Note: Even once the loop is complete, no processing has been done; only the XMLA script has been captured. Also note that I usually add an information event in the loop so that my logging framework can easily highlight the partitions that have been added to the process list. I have found this information very useful when debugging.


We are now able to capture the XMLA script that the AMO object model has generated, setting the ProcessInTransaction and ProcessInParallel properties as required. The package variable (strXMLAScript) is then populated with the XMLA string.


' Set the in-transaction and in-parallel options and capture the script
strXMLAScript = amoServer.ConcatenateCaptureLog(True, True)

' Write to the package variable
Dts.VariableDispenser.LockOneForWrite("User::strXMLAScript", oVariables)
oVariables.Item("User::strXMLAScript").Value = strXMLAScript
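
For illustration, the captured script will look roughly like the following (a sketch only; the actual IDs are generated from the object IDs in the database, which may differ from the display names shown here):

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" Transaction="true">
    <Parallel>
        <Process>
            <Object>
                <DatabaseID>Adventure Works DW</DatabaseID>
                <CubeID>Adventure Works</CubeID>
                <MeasureGroupID>Reseller Sales</MeasureGroupID>
                <PartitionID>Reseller Sales 200901</PartitionID>
            </Object>
            <Type>ProcessFull</Type>
        </Process>
        <!-- ...one Process element per partition added to the process list... -->
    </Parallel>
</Batch>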


The AMO server object does allow for the execution of the XMLA script once it has all been captured. This can be done with the following method:


amoServer.ExecuteCaptureLog(True, True)


However, this will require that you write a lot more code to capture all the possible errors and/or warnings that may occur. I prefer to let the Analysis Services Execute DDL Task do all this work for me. It also makes sure that all the relevant events that may need to be captured, i.e. OnProgress, OnInformation, OnWarning and OnError, are easily picked up by your logging framework.
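
To give an idea of what that extra code would involve, here is a minimal sketch (not part of the package above) that executes the capture log directly and walks the returned results, firing the errors and warnings manually:

' Minimal sketch: execute the captured XMLA directly and inspect the results.
' ExecuteCaptureLog returns an XmlaResultCollection whose messages must be
' checked by hand - this is the work the Execute DDL Task does for you.
Dim oResults As AMO.XmlaResultCollection = amoServer.ExecuteCaptureLog(True, True)
Dim oResult As AMO.XmlaResult
Dim oMessage As AMO.XmlaMessage

For Each oResult In oResults
    For Each oMessage In oResult.Messages
        If TypeOf oMessage Is AMO.XmlaError Then
            Dts.Events.FireError(0, strTaskName, oMessage.Description, String.Empty, 0)
        Else
            Dts.Events.FireWarning(0, strTaskName, oMessage.Description, String.Empty, 0)
        End If
    Next
Next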


So all that is needed now is to set the Analysis Services Execute DDL Task's source property to the variable strXMLAScript, and we are ready to process.


The following shows the execution results with all the possible events that could be captured by your logging framework:


[Screenshot: SSIS package execution results]


This is clearly a very simple example, but it can easily be extended to accommodate most, if not all, of your partition processing needs.


Hope you find this helpful!


Happy Deving!


(I have added a zip file to SkyDrive with the sample package and xmla to add the required partitions to the Adventure Works DW Cube. It can be found here).

Tuesday, February 3, 2009

A new blog

An ex-colleague and close mate has started a new blog, which I have recently added to my Google Reader. You can find it here. His opinions and humour are definitely worth a read!