Friday, February 20, 2009

Exporting to a text file from Reporting Services

(Personal Reference)

A client recently asked me how to add a text-file export option to Reporting Services. I won’t go through the business reasons for this, but here is how I suggested they go about it.

Add the following text:

<Extension Name="TXT" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering">
    <OverrideNames>
        <Name Language="en-US">TXT (Pipe Delimited Text File)</Name>
    </OverrideNames>
    <Configuration>
        <DeviceInfo>
            <FieldDelimiter>|</FieldDelimiter>
            <Extension>TXT</Extension>
            <Encoding>ASCII</Encoding>
            <NoHeader>true</NoHeader>
        </DeviceInfo>
    </Configuration>
</Extension>

to the SSRS config file, rsreportserver.config, usually located in:

C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportServer. The entry needs to sit alongside the other rendering extensions in that file, between the <Render> tags.
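For context, the relevant part of rsreportserver.config looks roughly like this (the exact set of existing entries will vary by installation; treat this as a placement sketch, not the full file):

<Extensions>
    <Render>
        <Extension Name="XML" Type="..." />
        <Extension Name="CSV" Type="..." />
        <!-- add the new TXT extension here, alongside the others -->
    </Render>
</Extensions>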

Happy Deving!

Blogging about blogging…

Having recently decided to start blogging again, I went hunting for a decent blog stat counter. I have used StatCounter in the past, but it only saves the last 100 hits (without shelling out fun tokens), which was not going to be sufficient for any real analysis. A mate told me to try out Google Analytics and I must say I am very impressed! It allows for all types of analysis and reporting. Go check it out if you are blogging. It is really amazing!

Tuesday, February 17, 2009

Analysis Services Partition Processing

Recently, while trawling the internet, I noticed that there are not many SSIS examples detailing how to process Analysis Services cubes, especially if you are using Analysis Services partitioning. So I thought I would share an approach I have been using over the last few years.

While reading this you are probably wondering what all the fuss is about. Why not just use the Analysis Services Processing Task? That is definitely an option when processing dimensions, full cubes, or measure groups with small data sets; the task is very useful and provides a logging framework with a lot of detail. However, it is not dynamic. If the partitioning strategy is by month, for example, and the requirement is to process only those partitions with new data, then a more dynamic approach may be needed.

That is the scenario this post is aimed at. This approach does add more code, and therefore possibly more maintenance, but it may be a necessity when dealing with larger data sets.

So let's take the example mentioned above, where we are partitioning by month. We could have a measure group with n monthly partitions and a need to process only the last two, as they could have new data. In this case we may also need a new Analysis Services partition or two to represent these new months. (To avoid complexity I am not including the code that creates new partitions in this post. I will save that for another blog!)

In this approach I am going to create the relevant XMLA scripts and then use the SSIS Analysis Services Execute DDL Task to execute them. I have seen many people attempt this by building their own XMLA scripts, but anyone who has tried that will agree the string manipulation required can be a real headache. Having a bit of a development background myself, I prefer to let the AMO (Analysis Management Objects) object model do the hard work for me.

So onto the SSIS package…

First I declare a package variable: strXMLAScript (string)

Then add a Script Task containing the following code:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports AMO = Microsoft.AnalysisServices

Note: You will need to add the relevant AMO reference to the script task before you can use it. When adding a reference, the relevant component, i.e. Analysis Management Objects, may not be listed. It can be made available by copying Microsoft.AnalysisServices.dll from “\Program Files\Microsoft SQL Server\90\sdk\Assemblies\Microsoft.AnalysisServices.DLL” into the relevant framework folder version, e.g. “\windows\Microsoft.NET\Framework\v2.0.50727”.

Then declare script variables and set the task variable for event firing:

' Declare the AMO objects and working variables
Dim amoServer As AMO.Server, amoMeasureGroup As AMO.MeasureGroup, amoPartition As AMO.Partition
Dim strXMLAScript As String, strTaskName As String, oVariables As Variables = Dts.Variables

' Read the task name so that fired events identify their source
Dts.VariableDispenser.LockOneForRead("System::TaskName", oVariables)
strTaskName = oVariables.Item("System::TaskName").Value.ToString


Then initialize the relevant Analysis Services objects:



' Connect to the Analysis Services server and find the target measure group
amoServer = New AMO.Server()
amoServer.Connect("Data Source=LOCALHOST;Initial Catalog=Adventure Works DW;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;")
amoMeasureGroup = amoServer.Databases.FindByName(amoServer.ConnectionInfo.Catalog.ToString).Cubes.FindByName("Adventure Works").MeasureGroups.FindByName("Reseller Sales")


Then set the server property CaptureXml. (This ensures that actions executed against the server during this session are not actually carried out; instead, the XMLA for each action is captured.)


amoServer.CaptureXml = True

Once this is done, the search criteria for finding the correct partitions need to be implemented. This will obviously be site-specific, but I usually name partitions using the measure group name followed by yyyyMM, so the partitions I am looking for here are Reseller Sales 200901 and Reseller Sales 200902.


' Loop through the measure group's partitions
For Each amoPartition In amoMeasureGroup.Partitions
    Select Case amoPartition.Name.ToUpper
        Case amoMeasureGroup.Name.ToUpper & " " & DateAdd(DateInterval.Month, -1, Now.Date).ToString("yyyyMM"), _
             amoMeasureGroup.Name.ToUpper & " " & Now.Date.ToString("yyyyMM")

            ' Could be any process type required
            amoPartition.Process(AMO.ProcessType.ProcessFull)

            ' Report the partitions that have been added to the process list
            Dts.Events.FireInformation(0, strTaskName, "Added partition - " & amoPartition.Name & _
                " to process list.", String.Empty, 0, True)
    End Select
Next



Note: Even when the loop is complete, no processing has been done; only the XMLA script has been captured. Also note that I usually fire an information event in the loop so my logging framework can easily highlight the partitions that have been added to the process list. I have found this information very useful when debugging.


We are now able to capture the XMLA script that the AMO object model has generated, setting the ProcessInTransaction and ProcessInParallel properties as required, and populate the package variable (strXMLAScript) with the XMLA string.


' Set the in-transaction and in-parallel properties and capture the script
strXMLAScript = amoServer.ConcatenateCaptureLog(True, True)

' Write to the package variable
Dts.VariableDispenser.LockOneForWrite("User::strXMLAScript", oVariables)
oVariables.Item("User::strXMLAScript").Value = strXMLAScript


The AMO server object does allow for the execution of the XMLA script once it has all been captured. This can be done with the following method:


amoServer.ExecuteCaptureLog(True, True)


However, this requires that you write a lot more code to capture all the possible errors and/or warnings that may occur. I prefer to let the Analysis Services Execute DDL Task do that work for me. It also makes sure that all the relevant events, i.e. OnProgress, OnInformation, OnWarning and OnError, are easily captured by your logging framework.
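For comparison, here is a minimal sketch of what executing the capture log directly might look like, surfacing the errors and warnings yourself (based on my reading of the AMO XmlaResultCollection types; verify against the documentation before relying on it):

' Execute the captured XMLA directly and surface any errors or warnings
Dim oResults As AMO.XmlaResultCollection = amoServer.ExecuteCaptureLog(True, True)
Dim oResult As AMO.XmlaResult, oMessage As AMO.XmlaMessage

For Each oResult In oResults
    For Each oMessage In oResult.Messages
        If TypeOf oMessage Is AMO.XmlaError Then
            Dts.Events.FireError(0, strTaskName, oMessage.Description, String.Empty, 0)
        Else
            Dts.Events.FireWarning(0, strTaskName, oMessage.Description, String.Empty, 0)
        End If
    Next
Next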


So all that is needed now is to set the Analysis Services Execute DDL Task's source to the variable strXMLAScript, and we are ready to process.


The following shows the execution results, with all the possible events that could be captured by your logging framework:

[Screenshot: package execution results]


This is clearly a very simple example, but it can easily be extended to accommodate most, if not all, of your partition processing needs.


Hope you find this helpful!


Happy Deving!


(I have added a zip file to SkyDrive with the sample package and the XMLA to add the required partitions to the Adventure Works DW cube. It can be found here.)

Tuesday, February 3, 2009

A new blog

An ex-colleague and close mate has started a new blog, which I have recently added to my Google Reader. You can find it here. His opinions and humour definitely deserve a read!

Thursday, August 21, 2008

Stored Procedure - Parameter sniffing

I spent a fair amount of time yesterday on a stored procedure that was performing very badly. Once I had done all the usual checks, I decided to run the SQL contained in the sp directly and was very confused to see that it was a lot quicker: the sp was taking over 3 minutes to execute while the raw SQL query was taking 2 seconds.

I ended up finding out why from this link and this one.
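For reference, the classic workaround is to copy the parameter into a local variable (or add OPTION (RECOMPILE) on SQL Server 2005 and later) so the plan is not compiled around the sniffed value. A minimal sketch, using a hypothetical procedure and table:

CREATE PROCEDURE dbo.uspGetOrdersByCustomer
    @CustomerID INT
AS
BEGIN
    -- Copying the parameter into a local variable stops the optimizer
    -- building the plan around the specific value that was sniffed
    DECLARE @LocalCustomerID INT
    SET @LocalCustomerID = @CustomerID

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @LocalCustomerID
    -- Alternatively: WHERE CustomerID = @CustomerID OPTION (RECOMPILE)
END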

Let's hope this parameter sniffing issue doesn't catch you out for as long as it did me.

Happy deving!

Friday, October 19, 2007

The RWC Final

It is a huge weekend for South African rugby. Let's hope the guys can hold it together and take the cup home. Go Bokke!

Friday, July 27, 2007

SSIS : Running a package through a SQL Agent Job

On the project I am currently working on, I have set up a few of my SSIS packages to execute under a global executor job run by the SQL Agent. The other evening, at closing time, a colleague of mine needed to execute this job. He used remote desktop to get onto the relevant server and manually started the job. He then logged off the server, packed up and went home, happy in the knowledge that the job would have run by the time we returned in the morning. However, in the morning we were greeted with a failed job and the following message: "The job was stopped prior to completion by (Unknown)".

After investigating further I found the following explanation:

http://support.microsoft.com/kb/922527

A bit of an interesting one. Definitely worth knowing if you are like me and run batch jobs at night, hoping to have them completed by morning.

Wednesday, April 25, 2007

SSIS : Package Configurations

There are many advantages to setting up configurations for your SSIS packages. If you have configuration values set up throughout your packages you can avoid unnecessary retesting and deployment processes later on. It also makes the process of redeploying to multiple environments (Dev, UAT, SysTest, and Prod) a lot easier.

On a recent project my colleagues and I decided that it would be very useful to implement package configurations in our SSIS packages. After a little investigation we realised that the out-of-the-box package configuration, although useful, would not provide what we required.

Our requirements were as follows:

• Multiple filtering options - because a package can be used in multiple systems and multiple environments (Development, UAT, Production).
• System and package variables - some variables (e.g. the warehouse connection string) may be system-wide, whereas others may be unique to a specific package.
• Encryption of sensitive information - some variables may contain sensitive information that we wanted encrypted in the database.
• Config audit log - an audit log of what the configuration values were at the time the package executed.
• Config audit trail - an audit trail of who has changed the config values.
• Simple implementation - an easy way to roll the config process out across all existing and future packages.

Database
I am sure most SSIS implementations use an Audit database of some kind to capture package and task execution logs. (See a future blog on SSIS logging.) This is a good place to store your configuration information, as it goes hand-in-hand with your logging information.

As we are storing this information in a database that can be moved between servers with varying users, it is advisable to set up its connection string using the out-of-the-box SSIS configuration functionality.

The database contains the configuration tables, i.e. PackageConfig, PackageAudit and PackageConfigLog.

The PackageConfig table uniquely identifies each package through its package GUID, and holds each of the package variables, their types and values, along with a flag indicating whether the value should be encrypted. You should also include multiple columns to filter your configuration values by, e.g. System, Environment, Warehouse Process etc.

The PackageAudit table contains all the columns from the PackageConfig table that need to be audited, as well as columns for the user who made the change and the time of the change. A trigger on the main table adds a new row to the audit table whenever a row is added or changed.

The PackageConfigLog table contains the columns from the PackageConfig table that allow the administrator to see which configuration values were used during each execution. Trust me, without this table you can be testing, get unusual results, spend many hours hunting down what you might have done wrong, and in the end find out that one of your colleagues changed a config value that was vital to your test. If you had checked this table first, life could have been a lot less painful.
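A minimal sketch of what the PackageConfig table and its audit trigger might look like (the table and column names here are my illustration, not a prescribed schema):

CREATE TABLE dbo.PackageConfig (
    PackageGUID    UNIQUEIDENTIFIER NOT NULL,
    VariableName   NVARCHAR(255)    NOT NULL,
    VariableType   NVARCHAR(50)     NOT NULL,
    VariableValue  NVARCHAR(MAX)    NOT NULL,
    IsEncrypted    BIT              NOT NULL DEFAULT 0,
    SystemName     NVARCHAR(50)     NOT NULL,  -- filter columns
    Environment    NVARCHAR(50)     NOT NULL,
    CONSTRAINT PK_PackageConfig
        PRIMARY KEY (PackageGUID, VariableName, SystemName, Environment)
)
GO

CREATE TRIGGER dbo.trPackageConfigAudit ON dbo.PackageConfig
AFTER INSERT, UPDATE
AS
BEGIN
    -- Copy the changed rows to the audit table, recording who made
    -- the change and when
    INSERT INTO dbo.PackageAudit
        (PackageGUID, VariableName, VariableValue, ChangedBy, ChangedOn)
    SELECT PackageGUID, VariableName, VariableValue, SUSER_SNAME(), GETDATE()
    FROM inserted
END
GO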

SSIS
Inside your package you will need a task to overwrite the variable values defined in the package. You may want to write a custom task for this or just use a script task. As this task will be used in all your packages, I would suggest the custom task option, as changing a script once it has been implemented in hundreds of packages can be a real nightmare.

I have included a simplified version of the code below:


' Open the connection and set up the stored procedure command
oSQLConn = New SqlConnection(strConnectionString)
oSQLConn.Open()
oSQLComm = New SqlCommand("uspGetConfigInfo", oSQLConn)
oSQLComm.CommandType = CommandType.StoredProcedure

' Add parameters to the SQL command
Call ParametersGetConfig()

' Execute and return a data reader (column 0 = name, 1 = value, 2 = data type)
oSQLReader = oSQLComm.ExecuteReader()

' Loop through the reader
While oSQLReader.Read()
    If Dts.VariableDispenser.Contains(oSQLReader.GetString(0)) Then
        ' Lock the variable for write
        Dts.VariableDispenser.LockOneForWrite(oSQLReader.GetString(0), oVariables)

        ' Check the variable is the correct type
        If oVariables.Item(oSQLReader.GetString(0)).DataType.ToString = oSQLReader.GetString(2) Then
            oVariables.Item(oSQLReader.GetString(0)).Value = Convert.ChangeType(oSQLReader.GetValue(1), _
                oVariables.Item(oSQLReader.GetString(0)).DataType)

            ' Report information
            Dts.Events.FireInformation(0, strComponent, "The variable - " & oSQLReader.GetString(0) & _
                " has been updated.", String.Empty, 0, True)

        Else ' Report incorrect data type
            Dts.Events.FireWarning(0, strComponent, "The variable - " & oSQLReader.GetString(0) & _
                " has a different data type than that defined in the config database.", String.Empty, 0)
        End If
    Else ' Report missing variable
        Dts.Events.FireWarning(0, strComponent, "The variable - " & oSQLReader.GetString(0) & _
            " is missing.", String.Empty, 0)
    End If
End While

' Close the SQL data reader
oSQLReader.Close()

A stored procedure (uspGetConfigInfo) gets the variable information from the table discussed earlier, finds the relevant variables, verifies that the variable types are correct and then replaces the values in the package. Each change is reported to the package through the component event namespace and can be captured by your chosen logging mechanism in the OnInformation event handler. The config log information could be implemented using a similar process, in which the valid variables (with the correct data type) are written to the PackageConfigLog table.

Encryption
Using SQL Server encryption you can encrypt variable values in your PackageConfig table and decrypt them in the uspGetConfigInfo stored procedure. Ensure that the decrypted values are not written to the package config log, as that would obviously defeat the whole purpose of the encryption process.
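As a rough illustration of the SQL Server 2005 mechanism (the key name and password are mine, and this assumes encrypted values are stored in a VARBINARY(MAX) column rather than the NVARCHAR(MAX) of the sketch above; see EncryptByKey/DecryptByKey in Books Online):

-- One-off setup (a database master key may also be required)
CREATE SYMMETRIC KEY ConfigKey
    WITH ALGORITHM = TRIPLE_DES
    ENCRYPTION BY PASSWORD = 'StrongPasswordHere!'
GO

-- Inside uspGetConfigInfo (sketch): decrypt values flagged as encrypted
OPEN SYMMETRIC KEY ConfigKey DECRYPTION BY PASSWORD = 'StrongPasswordHere!'

SELECT  VariableName,
        VariableType,
        CASE WHEN IsEncrypted = 1
             THEN CONVERT(NVARCHAR(MAX), DecryptByKey(VariableValue))
             ELSE CONVERT(NVARCHAR(MAX), VariableValue)
        END AS VariableValue
FROM    dbo.PackageConfig

CLOSE SYMMETRIC KEY ConfigKey
GO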

Wednesday, April 18, 2007

Hall shines in Barbados

What an amazing spell of bowling from Andrew Hall yesterday! The most underrated player in South African cricket showing his stuff. Good on you mate, good on you!

Thursday, March 29, 2007

It has been a while!

Wow, it has been a long time since I last blogged. Apologies for that, but I have been on holiday in Cape Town. While sitting on the beach getting a healthy dose of sun, sea and sand, the geek in me was thinking about what my next few blogs should involve.

After a second or two of thought I decided that the following would be nice to share:

1) SSIS : Package Configurations
2) SSIS : Package and Task logging
3) SSIS : Creating and processing Analysis Services partitions
4) SSMS : A sample SSMS add-in

So while I get down to work on these, I hope you will occasionally check in to see my progress.