SQL Server 2017 – TRIM, at last!

There is loads of good stuff in SQL Server 2017 (running on Linux, Graph Databases, Python), but sometimes it is the small things in the daily T-SQL grind that really make the difference. The TRIM function is one of these.

Cleaning up dirty data from source systems (well, I say source systems, but most likely xls, xlsx, csv and tsv files), you're going to come across leading or trailing spaces, the ghosts in the string!

So, for example, if we had ‘   Some string from some funky csv export   ‘ with 3 leading and 3 trailing spaces, we had to do this in SQL Server 2016 and below:

DECLARE @str AS VARCHAR(50)
SET @str = '   Some string from some funky csv export   '

-- LTRIM strips the leading spaces, RTRIM strips the trailing ones
SELECT RTRIM(LTRIM(@str)) AS TrimExample

This can be wrapped up in a UDF so you could have your own TRIM function, but it would look untidy with a schema name in front of it, blah!
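If you did go down that route, the wrapper itself is only a few lines. A rough sketch, assuming a udf schema already exists to hold it (so it matches the call below):

-- Illustrative only: hides the two built-ins behind one name
CREATE FUNCTION udf.TRIM (@input VARCHAR(8000))
RETURNS VARCHAR(8000)
AS
BEGIN
    -- Strip leading spaces, then trailing spaces
    RETURN RTRIM(LTRIM(@input));
END;
GO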

DECLARE @str AS VARCHAR(50)
SET @str = '   Some string from some funky csv export   '
SELECT udf.TRIM(@str) AS TrimExample

But now you can just do this:

DECLARE @str AS VARCHAR(50)
SET @str = '   Some string from some funky csv export   '
SELECT TRIM(@str) AS TrimExample

No more having to recall which is your left and which is your right; nice and tidy! Worth upgrading just for this function, IMHO.
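As a bonus, and going from my reading of the 2017 syntax (so worth double-checking against the docs), TRIM can also strip characters other than spaces if you name them up front:

DECLARE @padded AS VARCHAR(50)
SET @padded = '###Some string from some funky csv export###'

-- Removes the listed characters from both ends, not just spaces
SELECT TRIM('#' FROM @padded) AS TrimCharsExample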

Power BI Custom Visuals – My top picks part two

If you missed part one of ‘Power BI Custom Visuals – my top picks’, which was basically a love letter to the OK Viz team, check it out now.

So this is the second part of my top picks, looking at the visuals I tend to use when creating Power BI reports. As mentioned in part one, there are about 12 that I use. I tend to be fairly conservative in using custom visuals (there are quite a number now) and side with functional rather than flashy or gimmicky choices, as the flashy ones tend to kick in the attentive process, are not as intuitive to use, and ultimately distract from the information that you wish to show. Anyway, here are the final few.

SUMX vs DISTINCT COUNT

During preparation for delivering a Power BI training session for a client, I was looking at the Tabular Data Model that was their data source, and I was struggling with a long-running query. It was taking 1 minute and 24 seconds to return its values after chewing through 2 million rows of data. So I had a go at optimising it… and got it down to 6 seconds.

Mind Blown

Wow! When the query ran in that short a time, I went ‘No way… that’s not right, what have I done wrong?’ But no, after checking, it was right.

Let’s start at the beginning…