SSRS Cancel Report Still Running In The Background

Some people cancel an SSRS report that is taking too long by simply closing Visual Studio or their browser, then reopening it and trying to view the report again. In the meantime, the query behind the previous report attempt is still running on the database. This takes up quite a bit of resources, and people assume their query has been cancelled, but this is not the case.

This is a common problem with users accessing reports. You can stop the query from running in the background using the KILL command, although you will need a DBA to do this for you unless you have permission to do it yourself.
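As a sketch of what the DBA would run: you can find the orphaned query with the sys.dm_exec_requests DMV (or sp_who2) and then terminate it. The session id 57 below is just a placeholder; use whatever id the first query returns.

```sql
-- Find long-running requests and the statement each one is executing
SELECT r.session_id,
       r.status,
       r.total_elapsed_time / 1000 AS elapsed_seconds,
       t.text AS running_sql
FROM   sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.session_id <> @@SPID
ORDER BY r.total_elapsed_time DESC;

-- Once you have identified the orphaned report query, terminate it.
-- 57 is a placeholder; substitute the session_id returned above.
KILL 57;
```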

Hope this helps others who are struggling with the same issue 🙂

SSIS Email Task Delay Validation

Recently I was trying to send a file via Email in SSIS.

For the connection to the file I was using a variable, as the file name changes every day.

This caused an issue because SSIS validates the whole package before the file is generated, so the package would fail.

The simplest solution was to delay validation on the Send Mail Task. You can do this in the Properties window by setting DelayValidation to True.
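The same setting is persisted in the package source. As a rough, hand-trimmed sketch (only the relevant attributes of the task element are shown, the rest of the .dtsx is omitted), it looks something like:

```xml
<!-- Fragment of a .dtsx package: DelayValidation persisted on the task -->
<DTS:Executable DTS:ObjectName="Send Mail Task"
                DTS:DelayValidation="True">
  <!-- task configuration omitted -->
</DTS:Executable>
```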

Hope this helps you!

MDS “Service Unavailable” Error

After restarting a server that hosts MDS, I could no longer access MDS via the browser. This seemed very strange, as I had restarted the server multiple times before and it had never done this.

Turns out the Application Pool for MDS had stopped. So I started it up, and then it would stop again, frustrating me further.

Then it occurred to me that the pool was running under my domain account and my password had changed. I went into the Advanced Settings of the MDS Application Pool and updated my credentials under Identity.

And there we go, it works and I can access MDS again via the browser!

Hope this helps someone else with this error.

Limiting SSIS to a Single File Type

A while ago I was having some issues with SSIS and moving files. There was a hidden file called thumbs.db.

When SSIS ran the move file task it would fail, because it can’t place duplicate files in the same location, and the thumbs.db file was the culprit. Even though it was hidden as a Windows file, SSIS still saw it and tried to move it.

This is where limiting SSIS to a specific file type saves you from this problem.

To do this you filter on the Foreach Loop Container. You select the container and then you click on Properties.

In the Properties window, right at the top, you will see FileSpec. There you can set the file type you would like to filter on, for example *.csv.

And that’s how you limit SSIS to a specific file type!

SSIS Error – the path for isserverexec.exe cannot be found

I have recently been working on getting multiple SQL Server instances on one machine to work. When I uninstalled an old one it affected the latest one, and all of a sudden my ETL jobs failed. It did not make sense at all, seeing as I had removed a different instance.

I got the following error when trying to run my ETL jobs or deploy them:
“the path for isserverexec.exe cannot be found”

Solutions online recommend going into the registry, which didn’t work for me.

I ended up running the install media again and choosing to add features to an existing installation. And there it was: Integration Services was not installed for the instance that was throwing the error.

I installed SSIS on the instance that needed it, everything worked perfectly after that, and the jobs ran as per usual.

Hope this helps someone else in the future.

Indexing Temp Tables

While busy with query performance tuning, trying to see whether a query would run better using a CTE (Common Table Expression) or temp tables and comparing times, I realised I had forgotten to think about adding indexes on my temp tables, which could improve performance.

I then came across Brent Ozar’s post on Indexing Temp Tables and found it extremely useful and thought I would share it here.

Most people don’t think about indexing temp tables when they are trying to optimise their queries. Which is a shame, because I see people sticking fairly large amounts of data into temp tables. On the rare occasion that I do see them indexed, it’s a nonclustered index on a column or two. The optimiser promptly ignores this index while you select 10 columns and join 10,000 rows to another temp table with another ignored nonclustered index on it.

Now, not every temp table needs a clustered index. I don’t even have a rule of thumb here, because it depends on so many things, like

  • Are you joining it to other tables?
  • What does the execution plan look like?
  • Is this even part of the stored procedure that needs tuning?

Assuming that all three things lead us to index temp tables, let’s create some and see how they react to different indexing strategies.

SELECT TOP 100000 *
INTO    #TempUsers
FROM    dbo.Users AS u;

SELECT TOP 100000 p.*
INTO    #TempPosts
FROM    dbo.Posts AS p
JOIN #TempUsers AS tu ON p.OwnerUserId = tu.Id;

SELECT TOP 100000 c.*
INTO    #TempComments
FROM    dbo.Comments AS c
JOIN #TempUsers AS tu ON c.UserId = tu.Id
JOIN #TempPosts AS tp ON c.PostId = tp.Id;

SELECT  *
FROM    #TempUsers AS tu
JOIN    #TempPosts AS tp    ON tu.Id = tp.OwnerUserId
JOIN    #TempComments AS tc ON tc.UserId = tu.Id
                            AND tc.PostId = tp.Id;

We have three temp tables with 100k rows in them. I joined them a bit to make sure we get good correlation in the data.

If we get the plan for the select statement, this is our query plan: a bunch of table scans and hash joins. SQL is lazy. I don’t blame it one bit.

I’m so parallel…

After frequent user complaints and careful plan analysis, we decide to add some indexes. Let’s start with nonclustered indexes.

CREATE NONCLUSTERED INDEX ix_tempusers ON #TempUsers (Id);
CREATE NONCLUSTERED INDEX ix_tempposts ON #TempPosts (Id, OwnerUserId);
CREATE NONCLUSTERED INDEX ix_tempposts2 ON #TempPosts (OwnerUserId, Id);
CREATE NONCLUSTERED INDEX ix_tempcomments ON #TempComments (UserId, PostId);

I’ve gone ahead and added two indexes to #TempPosts so the optimizer has a choice if joining in another order is more efficient.

How’d we do?

Not so hot…

Not quite there. Two table scans and an index seek followed by a key lookup that SQL estimates will execute 62k times.

Let’s try that again with clustered indexes. We can only give #TempPosts one, so I picked at random. In real life, you might want to do your job!

CREATE CLUSTERED INDEX cx_tempposts ON #TempPosts (Id, OwnerUserId);
CREATE CLUSTERED INDEX cx_tempusers ON #TempUsers (Id);
CREATE CLUSTERED INDEX cx_tempcomments ON #TempComments (UserId, PostId);

Our indexes get used and everyone is happy. Well, sort of. TempDB isn’t happy, but then it never is. It’s like your liver, or Oscar the Grouch.

Used and abused
When starting to index temp tables, I usually start with a clustered index, and potentially add nonclustered indexes later if performance isn’t where I want it to be.

You can do yourself a lot of favors by only bringing columns into the temp table that you need. If it’s still a lot, there’s a much higher chance SQL will judge a table scan cheaper than using a narrow nonclustered index along with a lookup.
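As a sketch of that advice, assuming the same Stack Overflow sample database as the demo above (the #TopUsers table name and column list are mine, chosen for illustration):

```sql
-- Bring over only the columns the rest of the procedure actually uses,
-- instead of SELECT * — the temp table stays narrow and cheap to scan.
SELECT  Id,
        DisplayName,
        Reputation
INTO    #TopUsers
FROM    dbo.Users;

-- Cluster on the column the later joins will seek on.
CREATE CLUSTERED INDEX cx_topusers ON #TopUsers (Id);
```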

Remember to test this carefully. You want to make sure that adding the indexes doesn’t introduce more overhead than they alleviate, and that they actually make your query better.
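A simple way to run that comparison, if you are testing in SSMS, is to wrap each candidate version of the query in timing and I/O statistics and compare the output in the Messages tab:

```sql
-- Measure the cost of each run so the indexed and unindexed
-- versions of the temp tables can be compared fairly.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- ... run the SELECT that joins the temp tables here ...

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```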

Global Azure Bootcamp 2017 – Cape Town

Welcome to Global Azure Bootcamp! All around the world user groups and communities want to learn about Azure and Cloud Computing! On April 22, 2017, all communities will come together once again in the fifth great Global Azure Bootcamp event! Please join the Cape Town Azure User Group for a jam packed day!

Register now! 

SQLCape April Meetup

Good Day SQL Professionals!

This month’s meeting is on Azure SQL DB for the SQL Pro.

In this session we will look at what Azure SQL DB means to the SQL Pro moving from on-prem to Azure: what the differences are, and why planning is essential and it is not a case of lift and shift.

Just a few reminders for upcoming events

Don’t forget about the Azure Bootcamp taking place on the 22nd April at the Microsoft Cape Town Offices. You can register here:

On the 9th September we have SQL Saturday taking place! Don’t forget to register now!

Hope to see you all there!

The SQLCape Team

PowerBI Campaign/Brand Management for Twitter

PowerBI has released a fantastic template for Twitter called Campaign/Brand Management for Twitter. After following the “wizard” you will have a complete report filled with sophisticated analytics.

In order to set this up, all you need is a Twitter account and an Azure account. If you would like to play with this, you can set up a trial Azure account.

To get started you can go to the following link.
This will open the following page. Select Install to start the “wizard”.

Next you will get a summary of what is needed to complete this Twitter template.

On the Login page you will connect to Azure. If you do not have a subscription there is a link to set up a free Azure trial. You will then select whether you have a personal or work Azure account.

If you are not sure what to put in the Azure Tenant section, you can find this in the Azure portal. If you hover over your icon in the top right-hand corner it will show you some details; it is the Domain details that go in the Azure Tenant field.

Once you click Connect to Azure you will need to accept that the Business Platform Solution Template can access your Azure account.

Once accepted, you will get a confirmation page that everything was successfully validated, and you can click Next to progress to the Source page.

On the Source page you will specify the database name to store the Twitter data, a username and a password. Make sure the password contains upper case, lower case and a special character.

Next you will need to check for availability; sometimes the database name you choose is already taken on another Azure server.

Once validated, you will get the below screen and can select Next to go on to the Twitter page.

On this page you will connect to Twitter using your actual Twitter account. You can get data about any Twitter account with this; it is not limited to your own just because you are logging in. In this demo I take a look at the @MSPowerBI Twitter account.

Complete your Twitter login details and authorize the Azure app to have access.

Once authorised you will get a screen to show you that your Twitter account has been validated and you can move onto the next page.

On the Search Terms page you can enter the Twitter handles you would like to look at, or the keywords you are interested in. You can do both by separating the two with the word OR.

Once validated, you move on to the Twitter Handles page. This is where you can search all the Twitter handles you are interested in, separating them with a space. For this I took a look at @MSPowerBI and @Azure.

After this you reach the Summary page. Once you click on run it starts to process all the data and build the report.

You then move onto the Deploy page where you can see the progress of the data being extracted and compiled in a report.

Once complete, you will get a notification and can download your pbix file to view on your desktop.

Windows Update KB2919355 Not Applicable

When trying to install SQL Server, Windows Update KB2919355 is a prerequisite, and on a new server it is not always installed; the SQL Server install stops until this is rectified.

Occasionally, when you try to install the update, you get the error that KB2919355 is not applicable, yet the update is not installed. After searching the internet high and low, a solution was found.

If you install Windows Update KB2975061 first and restart the machine, then install KB2919355 and restart again, the SQL Server prerequisite check will pass and you can continue with your install.
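As a sketch, assuming you have downloaded the two .msu packages from Microsoft (the file names below are placeholders for whatever the download pages give you for your Windows version and architecture), the sequence from an elevated command prompt looks like:

```shell
:: Install the prerequisite update first, then reboot.
wusa.exe Windows8.1-KB2975061-x64.msu /quiet /norestart
shutdown /r /t 0

:: After the reboot, install KB2919355 and reboot again.
wusa.exe Windows8.1-KB2919355-x64.msu /quiet /norestart
shutdown /r /t 0
```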

Hope this solution helps you and you get less grey hair and frustration trying to figure it out.