(scottish) sql bob blog


Some thoughts from your typical data shepherd / data plumber / data dance teacher sort of person.


sql bob
Twitter handle - @SQL_Bob

Yes, I could well be dyslexic, deal with it ;-)


Create dynamic text with M (Power Query Formula Language)
This post is wholly inspired by this post (https://blog.crossjoin.co.uk/2016/04/25/dynamic-chart-titles-in-power-bi/) by Chris Webb about creating dynamic chart titles.  We have a set of reports which are the same for each client, so when we are asked to create a set of reports for a new client, the process is relatively straightforward.  One frustration is that we had to change the client name in a text box at the top of each page.  I am lazy, so as a programmer I like the computer to do the work for me.

So first I created a variable in the query window, see Figure 1.


Figure 1 – Select 'Blank Query' from the new source options

When the blank query window opens, replace the text 'Query' with the name of the variable, then enter the value that will be returned from the variable.  See Figure 2 for details.


Figure 2 – Showing location of the variable name and where value should be placed.

In Figure 3 I have entered the client name that will be used in this example, 'Rainbows', and changed the variable name to 'ClientName'.  Once those details are completed, click on the 'Close & Apply' button.


Figure 3 – Showing the client name and variable name.
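Behind the scenes this 'variable' is simply a query whose whole body is a single text value. If you open the 'Advanced Editor' for the new query, you should see something along these lines (a sketch using the example value from this post):

```m
let
    // the text value returned by the 'ClientName' query
    Source = "Rainbows"
in
    Source
```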

Once returned to the main PowerBI desktop window, find the newly created dataset; in this example it is called 'ClientName', with a field also called 'ClientName'.  Now that the variable has been surfaced it's time to use it.  So let's create a new measure: expand the dataset, click once on the ellipsis and select 'New measure' from the list, see Figure 4.


Figure 4 – Creating a new measure

When the new measure dialog box appears, in this example I created a measure called 'TicketsTextBox', as I want the measure to appear in a box containing text, which is the reason for my naming convention.  The measure I used is as follows:

TicketsTextBox = CONCATENATEX(ClientName,[ClientName]) & "- Tickets"  
 
In figure 5 you can see the newly created measure.


Figure 5 – showing the measure 'TicketsTextBox'

The next step is to get this measure onto the report canvas, so insert a card visualization on the report canvas.  Then drag the measure to the values section in the visualization configuration section.  Click on the paint brush icon, expand the 'Data label' section and adjust the size of the label to the desired size.  Then turn off the 'Category Label'.  These steps are shown in the video below.

Now you have a card with a dynamic title; if you want to change the client name, all you have to do is change the text in the variable. A sample PowerBI report file, which contains the report shown above, can be downloaded here.


M (formerly known as Power Query Formula Language), PowerBI and parameterising data sources - part two

In the previous post (see here) I set up the blank datasets as parameters storing the values of the database server and name.  Whilst attempting to create a suitable dataset for a report for an internal customer, I was frustrated that I could not do it using Power Query.  Yet on the other hand I knew that I could write some T-SQL that would give me the dataset I wanted very easily.  So I wondered: was it possible to combine the two?

So let's return to the Sql.Database function, which has this very nice example:

Sql.Database("localhost", "Northwind", [Query="select * from Customers"])

The last setting is "Query as text - Provide a query text that is run on the server to return values". It's almost like someone thought it would be a good idea to allow a user to run a SQL query. A few minutes later I had the query I wanted to run, and it looked something like this:

DECLARE @IssueType AS VARCHAR(10) = 'Problem';
DECLARE @ClientName AS NVARCHAR(100) = 'RainbowPainters';
SELECT  DATEPART(MONTH, [Issue_Created]) AS Month_Number
       ,DATENAME(MONTH, [Issue_Created]) AS Month_Name
       ,DATENAME(YEAR, [Issue_Created]) AS Year_Name
       ,COUNT(DATEPART(MONTH, [Issue_Created])) AS Total
FROM    [dbo].[IssueData] AS ID
WHERE   [Issue_type_name] = @IssueType
        AND [ClientName] = @ClientName
GROUP BY DATEPART(MONTH, [Issue_Created])
        ,DATENAME(MONTH, [Issue_Created])
        ,DATENAME(YEAR, [Issue_Created]);
Next I went back to the dataset, opened the 'Advanced Editor' and added the query; you can see it in Figure 1 below.

Figure 1 - Advanced Editor window with the new query in place

To make it a bit easier to read, below is the actual text from the 'Advanced Editor' window above.  All that has been added is the [Query = "..."] option containing the query text.
let
    Source = Sql.Database(DatabaseServer, DatabaseName,
        [Query = "DECLARE @IssueType AS VARCHAR(10) = 'Problem'; DECLARE @ClientName AS NVARCHAR(100) = 'RainbowPainters'; SELECT DATEPART(MONTH, [Issue_Created]) AS Month_Number ,DATENAME(MONTH, [Issue_Created]) AS Month_Name ,DATENAME(YEAR, [Issue_Created]) AS Year_Name ,COUNT(DATEPART(MONTH, [Issue_Created])) AS Total FROM [dbo].[IssueData] AS ID WHERE [Issue_type_name] = @IssueType AND [ClientName] = @ClientName GROUP BY DATEPART(MONTH, [Issue_Created]) ,DATENAME(MONTH, [Issue_Created]) ,DATENAME(YEAR, [Issue_Created]);"])
in
    Source

When I refreshed the dataset, the data appeared :-)  Job done?  No; as often happens, one client gets something and then everybody wants what they have.  So I was asked if we could create the same set of reports for another customer.  There were two choices at this point: I could just change the hard-coded client name, or I could parameterise the client name.  Having heard of this before, I knew about passing parameters from Excel to filter datasets, see the following posts.

https://blog.oraylis.de/2013/05/using-dynamic-parameter-values-in-power-query-queries/
http://blog.crossjoin.co.uk/2015/11/24/using-parameter-tables-to-control-data-refresh-in-power-bi/

Having looked at the blog posts, which declare and then use parameters, this looked like a simple concept. Kudos to the Power BI team for that.  The first step was to add an additional dataset as a parameter.  So I now had three datasets as parameters, the latest one being 'ClientName', which stores the client name.


Figure 2 - List of datasets containing parameters

So having looked at the blog posts above, I realised that I just needed to declare a variable which would hold the query text, and pass the client name parameter into that query text.  Following some trial and error, I ended up with the query text below:

let
pQuery =
"DECLARE @IssueType AS VARCHAR(10) = 'Problem';
DECLARE @ClientName AS NVARCHAR(100) = '" & ClientName & "';
SELECT  DATEPART(MONTH, [Issue_Created]) AS Month_Number
       ,DATENAME(MONTH, [Issue_Created]) AS Month_Name
       ,DATENAME(YEAR, [Issue_Created]) AS Year_Name
       ,COUNT(DATEPART(MONTH, [Issue_Created])) AS Total
FROM    [dbo].[IssueData] AS ID
WHERE   [Issue_type_name] = @IssueType
        AND [ClientName] = @ClientName
GROUP BY DATEPART(MONTH, [Issue_Created])
        ,DATENAME(MONTH, [Issue_Created])
        ,DATENAME(YEAR, [Issue_Created]);",
    Source = Sql.Database(DatabaseServer, DatabaseName,[Query=pQuery])

in

    Source


Figure 3 - showing the query text in the Advanced Editor Window

Much to my delight this syntax works as expected.  So I now have a way to specify parameters which can be passed to SQL Server.  From reading about M (Power Query Formula Language) there's more to learn. For the moment I am happy with the new-found knowledge, whilst I play with some of the other technologies offered by PowerBI.
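One caveat worth noting when building query text by string concatenation: if a client name ever contains a single quote, the generated T-SQL will break. A minimal defensive sketch, using M's Text.Replace to double up any embedded quotes before the value is spliced into the DECLARE (the client name here is a hypothetical example):

```m
let
    // hypothetical client name containing a single quote
    ClientName = "O'Connor Painters",
    // double any embedded single quotes so the generated T-SQL stays valid
    SafeClientName = Text.Replace(ClientName, "'", "''"),
    // the declaration that would be spliced into the query text
    Declaration = "DECLARE @ClientName AS NVARCHAR(100) = '" & SafeClientName & "';"
in
    Declaration
```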

The only downside is this only works in PowerBI Desktop :-( 


M (formerly known as Power Query Formula Language), PowerBI and parameterising data sources - part one

One of the attractions of being a programmer was that I could find ways to make routine tasks easier and quicker.  As our company and clients have discovered the joy of Power BI, and the reports I have created, more clients are asking for reports.  Some of these are the same reports, to ensure everyone has the same view of their data.

Scenario
There are two databases I populate with data; one of these is on my machine, the other is an Azure database.  As you would expect, I use the local copy to test out various changes to the database.  Then, using the Redgate tool SQL Compare, I can script any changes and apply them to the Azure database.  It is possible to change the data source for each dataset easily using the following method.

From the main window, on the 'Home' tab, click once on the 'Edit Queries' button, see Figure 1.



Figure 1 - showing the location of the 'Edit Queries' button

This will open the 'Edit Queries' window.  Now to see where the data source for a selected dataset is defined: first select a dataset, in the example shown in Figure 2 it is 'DimDate'.  Next click on the 'Advanced Editor' button, see Figure 2.


Figure 2 - Showing location of 'Advanced Editor' button

When the Advanced Editor window opens it will look something like the one shown in Figure 3 below.



Figure 3 - Advanced Editor window, with M (Power Query Formula Language) query showing the Database Server Name and Database Name

The syntax of the statement is relatively simple.  In this case we are looking at the Sql.Database function; the syntax looks like this:

Source = Sql.Database( "Name of Database Server", "Name Of Database")

This is the text shown in Figure 3

let
    Source = Sql.Database("localhost", "DBdata"),
    dbo_DimDate = Source{[Schema="dbo",Item="DimDate"]}[Data]
in
    dbo_DimDate

To change the data source to point at the Azure database, the only change that has to be made is as follows:

let
    Source = Sql.Database("AzureDatabase", "DBdata"),
    dbo_DimDate = Source{[Schema="dbo",Item="DimDate"]}[Data]
in
    dbo_DimDate


Job done.  This was nice, until you have to do it several times; personally I got bored and wondered if there was a better way.  I am glad you asked if there is a better way, let me show you.

Having seen some examples where the data source was an Excel spreadsheet, I was aware that it was possible to supply parameters to change the data source from one spreadsheet to another.
The first step is to set up the parameters.  Open the Query Editor window, click on the 'New Source' button and from the list click on 'Blank Query' (see Figure 4).


Figure 4 - Creating a new blank query

Figure 5 shows what the query window will look like when it first opens.  Change the name of the query to 'DatabaseServer', then enter the name of the database server into the text box.  Once you have entered the database server name, click once on the tick to save your changes.


Figure 5 - Shows where to make changes to the new query that will store the connection parameter.

Having set up the 'DatabaseServer' dataset (parameter value), repeat the process and create a 'DatabaseName' dataset (parameter value).  Figure 6 shows what your query editor window would look like if you entered the same details shown in this example.


Figure 6 - Showing the two datasets in place containing the parameters, showing the completed 'DatabaseServer' dataset.

So you should now have two additional datasets:

  * DatabaseServer - containing the value 'AzureDatabase' or the name of the database server you are going to connect to.
  * DatabaseName - containing the value 'DBData' or the name of the database you are going to connect to.
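Each of these parameter datasets is just a blank query returning a single text value. Opened in the 'Advanced Editor', the two queries should look something like this (using the example values above):

```m
// DatabaseServer
let
    Source = "AzureDatabase"
in
    Source

// DatabaseName
let
    Source = "DBdata"
in
    Source
```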

If we look at the advanced properties of the dataset 'DimDate' the data source would look like the one shown below.

let
    Source = Sql.Database("AzureDatabase", "DBdata"),
    dbo_DimDate = Source{[Schema="dbo",Item="DimDate"]}[Data]
in
    dbo_DimDate


The next step is to replace the database server name and database name with the names of the datasets that have just been set up.  So the data source will look like the one below.

 let
    Source = Sql.Database(DatabaseServer, DatabaseName),
    dbo_DimDate = Source{[Schema="dbo",Item="DimDate"]}[Data]
in
    dbo_DimDate

The advanced query window should look like the one shown in Figure 7 below.  When the data source is queried, the values of 'DatabaseServer' and 'DatabaseName' are replaced by the values from the relevant dataset.  In this PowerBI desktop report I just replaced all the hard-coded instances of the database server name and database name with the relevant parameter.  This allows me to switch between the two databases with the ease I wanted, though only in PowerBI desktop :-(


Figure 7 - Showing the final M (Power Query Formula Language) text (data source)

If you click on this link you can download a copy of a .PBIX file with the two datasets shown above.

There is more you can do with this; you will need to wait till part two of this blog post for more details.


Power BI online user credentials and multiple accounts
Whilst "publishing" some reports for some of our customers from PowerBI desktop to PowerBI online, I made the assumption that the data credentials used in PowerBI desktop would be the same ones used in PowerBI online.

Scenario 
We wanted to share PowerBI reports with some of our customers. It is not currently possible to allow anyone from outwith our Azure AD group to securely access the PowerBI reports. So the only other solution was to get our Office 365 admin to create an account for each customer. All the customers will connect to the same database to extract their data. On the database an account was created for each customer, so that when they connected to the database each customer's account could only access that customer's data.

First step - Find credentials in PowerBI Desktop 

Open PowerBI desktop; at the top left-hand side click once with the left mouse button on the menu icon.  Then from the menu that appears, go to the 'Options and settings' menu item and click once with the left mouse button (see Figure 1).

 
Figure 1 - Showing the 'Options and settings' menu item. 
 
In the 'Data Source Settings' window select the data source (3), that is to say where the data is extracted from. Next press the 'Edit' button (4); this will open the window shown in Figure 2.

Figure 2 - First data source settings window.

In the second 'Data Source Settings' window click on the 'Edit' button.


Figure 3 - Second data source settings window

Now you can get the details, e.g. the username and password used to connect to the data source, if these are required.
 
Figure 4 - Data source credentials window
 
So we now have the following credentials:
Username = [email protected] 
Password = JoeB123 
N.B. We will be using these details later. 

Tip if using multiple accounts with PowerBI online  
The next step would be to publish your reports to the PowerBI service at PowerBI.com.  If you have multiple accounts which you use to publish your reports to a PowerBI service, then one tip is to log out of the account and log back in, as there is nothing in the PowerBI desktop application which indicates which account is logged in.

Adding credentials in PowerBI Service (Aka PowerBi.com) 
One quick tip here is to open PowerBI.com using Internet Explorer; at the time of writing, the process shown next did not reliably render in other browsers.  Log into PowerBI.com; once successfully logged in, go to the top right-hand side of the window.  Click with the left mouse button on the settings icon, and from the menu click on the 'Settings' menu item.

 
Once the 'Settings' window opens click on the 'Datasets' tab 

 

 Once in 'Settings->Datasets' click once with left hand mouse button on the triangle to the left of 'Data source credentials' 



 Once the 'Data source credentials' have been expanded click on the 'Edit credential' link. 

 

Once the credentials configuration window opens, in the drop-down box below the authentication method select the 'Basic' authentication method.

 

 When you select the 'Basic' Authentication method, text boxes will appear at the bottom of the form. Enter the username and password, then click on the 'Sign In' button. 



If the username and password have been accepted, you will see a message appear in the top right-hand corner of the PowerBI window, indicating the data source has been successfully updated.




Ordering months in Power BI desktop

Whilst preparing a simple dataset to use with a stacked bar chart in PowerBI, I included both the month number and name.  The dataset is shown below.

Month_Number  Month_Name  Total  Type
1             January     61     Created
1             January     63     Resolution
5             May         14     Created
5             May         8      Resolution
6             June        24     Created
6             June        26     Resolution
7             July        33     Created
7             July        36     Resolution
8             August      51     Created
8             August      48     Resolution
9             September   126    Created
9             September   109    Resolution
10            October     147    Created
10            October     152    Resolution
11            November    98     Created
11            November    88     Resolution
12            December    65     Created
12            December    77     Resolution

The dataset was created in PowerBI desktop; the next step was to plot the data on a clustered column chart.  The month name was the value I wanted to appear on the axis, but there was a little challenge with the way the data was being displayed: the month names were being displayed in alphabetical order (see Figure 2).

Figure 2 - Power BI chart after placing data on the canvas

What I wanted was the month names sorted according to the month numbers, so I knew I had missed something.  The first step was to go to the data tab to have a look at the data.  So I clicked on the dataset and selected the 'Modeling' tab (see Figure 3).


Figure 3 - Data view 'Modeling' tab selected the 'Month_Name' column selected

Everything looked the way I expected.  Then, with the 'Month_Name' column selected, I clicked on 'Sort By Column'.  This was set to sort by 'Month_Name', so it would be sorted alphabetically; I changed it to sort by the 'Month_Number' column, see Figure 4.

Figure 4 - The first figure shows the default setting for the 'Month_Name' column, the second figure shows the amended selection to sort the 'Month_Name'Column by the 'Month_Number' column

Then I returned to the report view to review the change, see Figure 5.


Figure 5 - The clustered bar chart report showing the Month Names sorted in the order of the 'Month_Number' column

Success, the months were ordered as I wanted them.  Next I'm off to resolve my next challenge, of which there is never a shortage.


Monitoring replication status using Nagios (using PowerShell script and TSQL)

For various reasons which I have now forgotten, I set up transactional replication for some clients.  The result of this is that I am the caretaker of transactional replication for two of our clients; what a lucky person I am!

T-SQL code used to check Transaction replication 
Once I got the process up and running (a very, very long and stressful story), I realised that I would have to monitor these processes.  Following some googling, this article with some T-SQL written by SQLSoldier was found here.  This worked for me, and I used it to monitor replication by running the script and checking the results manually.
 
Nagios output 
Our SysAdmin uses a tool called Nagios to monitor our IT estate.  So they suggested that a script could be written to monitor the replication process and send an alert if anything needed to be looked at.  This sounded like an excellent idea, but how to do it?  The approach that was arrived at involved using a PowerShell script which would run a SQL query, examine the results, then respond to Nagios with the following values:

0 - all OK, nothing to see here
1 - something happening, nothing to worry about (just now)
2 - yes, there is something that really needs some attention

See this page for more guidance.

Stored procedure 

Next we decided to use PowerShell to return the results.  Following some googling we found this page http://www.madeiradata.com/cross-server-replication-health-check-using-powershell/ which runs a SQL script and returns a data set.  The first challenge was that the T-SQL script from SQLSoldier was rather long for my first PowerShell script; I wanted something smaller.  So I created a stored procedure based on the script, and placed it in the Distribution database on the replicated database server.  Doing this had two benefits: first, less T-SQL in the PowerShell script; second, changing one of the parameters means it returns different amounts of data.  The stored procedure takes the following parameters:

@Publisher  - The name of the publisher database server instance 
@PublisherDB - The name of the publisher database 
@NagiosOutput - Y = only output the return code and a short error message (max of 80 characters), N = output all results

The following script was used to check the results that would be returned to Nagios. 
USE [distribution]; 
DECLARE @Publisher AS sysname 
DECLARE @PublisherDB AS sysname 
DECLARE @NagiosOutput AS Char(1) 
SET @Publisher = 'ReplPublisherServerName' 
SET @PublisherDB = 'ReplPublisherDBName' 
SET @NagiosOutput = 'Y' 
EXEC [dbo].[Check_Replication] @Publisher, @PublisherDB, @NagiosOutput; 

Note: the account connecting to the database from Nagios will require execute permission on the stored procedure, otherwise it cannot run it.  The code for the stored procedure is here.

PowerShell script 
I adapted the PowerShell script found here to run the stored procedure.  When the PowerShell script was run by Nagios there was no 'return code' returned (this is what Nagios expects).  We did find the solution on this page, inserted the function ExitWithCode, and made a few other changes.  The resulting PowerShell script is below:

## Beginning of Monitor
## Connection string with server variable; the distribution database name is 'Distribution'
$con = "server=127.0.0.1;database=Distribution;Integrated Security=sspi"

## Begin SQL query
$cmd = "SET NOCOUNT ON;"
$cmd = $cmd + " USE [distribution];"
$cmd = $cmd + " DECLARE @Publisher AS sysname;"
$cmd = $cmd + " DECLARE @PublisherDB AS sysname;"
$cmd = $cmd + " DECLARE @NagiosOutput AS CHAR(1);"
$cmd = $cmd + " SET @Publisher = 'ReplPublisherServerName';"
$cmd = $cmd + " SET @PublisherDB = 'ReplPublisherDBName';"
$cmd = $cmd + " SET @NagiosOutput = 'Y';"
$cmd = $cmd + " EXEC [dbo].[Check_Replication] @Publisher, @PublisherDB, @NagiosOutput;"

## Creating DataSet object
$set = New-Object System.Data.DataSet
## Running query
$da = New-Object System.Data.SqlClient.SqlDataAdapter ($cmd, $con)
## Filling DataSet with results
$da.Fill($set) | Out-Null
## Creating Table object and inserting DataSet
$dt = New-Object System.Data.DataTable
$dt = $set.Tables[0]

## Loop over each row in the DataSet
foreach ($row in $set.Tables[0].Rows)
{
    ## Write out the 2nd column, which contains the message text
    Write-Host $row[1].ToString()
    ## The 1st column contains the return code expected by Nagios
    $exitcode = [int]$row[0]
}

## The 'exit code' function below was adapted from:
## http://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
## SysAdmin, 2015-Nov
function ExitWithCode
{
    param
    (
        $exitcode
    )
    $host.SetShouldExit($exitcode)
    exit
}

ExitWithCode $exitcode
 
The output from the stored procedure when @NagiosOutput = 'Y' is very short.  Note that Nagios only permits a maximum of 80 characters to be returned.  Hence the sample output, if you run the T-SQL above, will look like this:

No issues
Error Code  ErrorMessage
0           Replication OK

Issue(s) requiring attention
Error Code  ErrorMessage
2           TNotRepl=45 CNotRepl=4 Latency=5 Status=In progress

Nagios expects an error message of a maximum of 80 characters, which is the reason for the brevity of the error message.  The parts of the error message are:

TNotRepl=45 - 'Transactions not replicated': the total number of transactions queued to be applied to the subscriber database.
CNotRepl=4 - 'Commands not replicated': the total number of commands queued to be applied to the subscriber database.
Latency=5 - the time taken from when a transaction is applied to the publisher database until the same transaction is applied to the subscriber database (in seconds).
Status=In progress - the current status of the replication process.

This error message is primarily intended to give some guidance as to what might be happening with the replication process.  It is not intended to give any guidance on the underlying reason causing the issue.  All that is required is that the Nagios process shows there is something wrong; whatever the reason, it requires some form of human intervention.  Once an error condition has been detected, the issue will be handed to me to resolve.  At least now I do not have to check the process periodically; I just have to wait for a message from our sysadmin.


Setting Firewall rules in new style Azure portal for SQL server

Since this one caught me out and a few other people it seemed like a good idea to blog about this.

Some time ago the Azure portal got a makeover, which I have to say makes the portal look very pretty.  Whilst setting up a proof of concept for PowerBI, I thought it was also a good time to look at setting up an Azure SQL database.  The last time I did this was in the previous portal.  So I duly set up the database; one of the last steps was to allow the IP address of my work organisation access to the database.  So I googled and found this page - https://azure.microsoft.com/en-gb/documentation/articles/sql-database-configure-firewall-settings/

The Wrong Way
Open the new style Azure Portal 
Click on browse (1)
From the list of blades click once on SQL databases (Figure 1)

Figure 1 - Opening the SQL databases blade

Click on the 'SQL databases' blade and this will open a list of all the SQL databases that are in your account (Figure 2).


Figure 2 - List of SQL databases showing area where to click to show settings

Clicking on the ellipsis will cause a new menu window to appear (Figure 3)


Figure 3 - Clicking on the ellipsis to get the settings link

Click once on the settings link this will then open the settings window (Figure 4)


Figure 4 - Settings for the database 'JiraData'

As can be seen in Figure 4, there are no firewall settings.

The Right Way

Open the new style Azure Portal 
Click on browse (1)
From the list of blades click once on SQL servers (Figure 5)


Figure 5 - Opening the SQL servers blade

Click on the SQL Servers blade and this will open a list of all the SQL servers in your account (Figure 6)


Figure 6 - List of SQL servers

Move the mouse over the row for the SQL server whose settings you require to see.  Click once with the left mouse button on the selected SQL server; in Figure 6 this is the SQL server 'kjsgbci03i'.  This will then open the Settings window on the far right-hand side (see Figure 7).


Figure 7 - Settings page for SQL server

Click once on the firewall row and this will open the 'Firewall settings' window (see Figure 8)


Figure 8 - Firewall settings window

Final Note

Splitting the settings for SQL servers and SQL databases does make sense; the same logic is used within SSMS.  That said, in the old style portal the link to the page for setting allowed IP addresses is under 'SQL databases' (see Figure 9), hence why I must have looked at the SQL databases blade first.  At least that's my excuse and I am sticking to it ;->


Figure 9 - SQL server page with link to Manage allowed IP addresses (Firewall rules) in the old style Azure portal.

 


Why should programmers be more like plumbers, electricians or mechanics

Over the years I have worked with many different people, and observed many people at work.  When working in a mechanical garage I was impressed by the range of tools used by the time-served mechanics.  When a plumber and an electrician came to fit a new kitchen, I was always curious about their range of tools for different jobs.

Today I had a relatively simple task when completing a data migration, which I was doing using SSIS.  Most of the work uses T-SQL when possible.  In this case, though, I was asked to convert a string value from the source, for one specific control on one form, to Camel Case.  My first thought was to play to my strengths and use T-SQL, so a quick google found this Stack Overflow posting -> http://stackoverflow.com/questions/5164201/is-there-any-sql-server-built-in-function-to-convert-string-in-camel-case

Ok, yes it was possible to use T-SQL; the phrase that ran through my mind at that point was "when you have a hammer everything looks like a nail".  Yes, I could create the function and call it from the query etc....
SSIS is the most wonderful toolbox though (pun intended if you noticed it).  One of its tools is the script task.  Since I only had to convert one control (there is a large number of them!), why not use some code?  Would that be easier?  Googling led to this Stack Overflow posting -> http://stackoverflow.com/questions/1206019/converting-string-to-title-case-in-c-sharp (This article filled in the gaps for me: https://msdn.microsoft.com/en-us/library/system.globalization.textinfo.totitlecase.aspx)

My code 
public void Main()
{
    string fieldName = (string)Dts.Variables["User::str_Field_ID"].Value;
    string stringValue = (string)Dts.Variables["User::str_NNLFP_VALUE_TXT"].Value;
    if (fieldName == "XYZ014")
    {
        // Title-case the value using the en-GB culture
        TextInfo textInfo = new CultureInfo("en-GB", false).TextInfo;
        stringValue = textInfo.ToTitleCase(stringValue);
        Dts.Variables["User::str_NNLFP_VALUE_TXT"].Value = stringValue;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}


Yes, it has taken me time and effort to learn C#; that said, it's an excellent addition to my toolbox.  Yes, I could have done the same in T-SQL, but it would probably have taken longer.  Right tool for the right job?  That, I am sure, is a matter of opinion and/or debate.  Speaking personally, for this specific project, yes I believe so.  It has encouraged me to keep adding to my own personal IT toolbox......


Win 10 Pro and SQL Server 2016 SSRS report mgr

Ok, long story short: I downloaded and installed SQL Server 2016 CTP3 on a Windows 10 Pro x64 virtual machine which was set up for testing purposes. Yes, I know that was a silly idea; what the heck, living dangerously is fun sometimes! The OS was a standard install and it's standalone, as in not connected to a domain or any clever stuff. When prompted to enter a user account, I used a personal Microsoft account I have.  Next I installed SQL Server 2016 CTP3, nothing fancy, and chose the native SSRS install.

(If you are just interested in the list of steps I followed, they are at the bottom of this posting.)

Next I logged in as the user account that was used during setup, which is a Microsoft account. Now I wanted to play with SSRS, so I opened the default browser (Microsoft Edge), entered the url “localhost/reportserver” and waited. Next I got a dialog box asking me to enter my user credentials.



So I entered my Microsoft account details, and eventually ended up with the error message below.



Yes, I know that I should have remembered: an admin account is required to grant permission in Report Manager to the account you are connecting with. “Doh!”  The next step was to see if I could run Microsoft Edge (the default browser) as Administrator; no, that was not possible.  As can be seen from the screenshot below, it was possible to run the command prompt as Administrator, but not Microsoft Edge, which did not make sense.

So off to Google and found this link http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/

So following the instructions, I enabled the administrator account and set the password. Just as an aside, I loved the fact that when you type in the password, nothing appears on the screen. Yes, I expected some stars at least! So the password was duly set for the Administrator account. The next step was to switch accounts to the Administrator account. At this point I thought “simple: run Microsoft Edge when logged in as the Administrator and set the privileges in SSRS Report Manager”. I got the error message below.



Again, Google to the rescue: I found this page - http://www.virtualizationhowto.com/2015/07/windows-10-edge-opened-builtin-administrator-account/. I followed the instructions, restarted Windows, and was able to open Microsoft Edge as Administrator. "Yippee!!!" I typed in the URL “localhost/reportserver”, then eventually got the following error message.
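If you would rather script that change than click through the policy editor, my understanding is that the setting the article describes is the UAC policy “Admin Approval Mode for the Built-in Administrator account”, which maps to the registry value below. That mapping is an assumption on my part, so verify against the article before relying on it:

```shell
:: Equivalent registry change (run elevated, then restart Windows)
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" ^
    /v FilterAdministratorToken /t REG_DWORD /d 1 /f
```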


During an earlier search I had found this page: http://www.windowscentral.com/how-find-internet-explorer-windows-10-if-you-really-need-it. I had tried it before making the change from http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/. So I entered ‘Internet’ into the search box and chose to run Internet Explorer as Administrator.



Success!!! (See screenshot below.) As the saying goes, “a long way for a short cut”, but it works!



So I set about making the relevant changes in Report Manager, setting up the Microsoft account I normally log in with as Content Manager, etc. That all seemed to go as expected. Next I switched back to the Microsoft account and thought I would just open Microsoft Edge, since the account had now been set up and I did not need to run as Administrator. This did not work; I got the error message below. So I typed 'Internet' into the search box, and Internet Explorer was one of the applications I could choose to run; the result can be seen below. It worked as expected.



Is there an easier way? If so, please let me know; this was not the most fun learning journey, and I'm always open to learning.

So what steps did I follow?

1) http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/

Used the instructions to enable the built-in Administrator account and set its password.
Switched accounts from the Microsoft account to the Administrator account.

2) http://www.virtualizationhowto.com/2015/07/windows-10-edge-opened-builtin-administrator-account/
Used these instructions to make the relevant change to the security policy.

3) http://www.windowscentral.com/how-find-internet-explorer-windows-10-if-you-really-need-it
Used these instructions to find Internet Explorer and open it using the "Run as Administrator" option.

4) Opened SSRS Report Manager (running as the local Administrator) and set the relevant permissions for the Microsoft account.

5) Switched from the Administrator account back to the Microsoft account, searched for Internet Explorer, browsed to "localhost/reports" and was able to see SSRS Report Manager.


A picture can be worth > 1000 words
It's been an interesting time in my new role, which I have been in for about 12 months now. The product I get to work with is fantastic in so many ways for the end user. The way it is configured allows it to be changed and moulded to fit the user's wishes (within reason). The downside to this is that there is a lot of complexity.

The team I work with is responsible for migrating data from our clients' existing systems to our system. Speaking for myself, this can be a bit of a Rubik's Cube puzzle of which bit of data goes where and how, which, as a data geek, can be fun. It's taken me a while to understand both the product and the data model that supports it, and I am still learning every day!

Whilst working on a migration for one client, we received the data as a large number of spreadsheets. The information was spread over several of them, and the mapping of the data had been done by my colleague. During a regular telephone conference with the client, we realised that the client was not completely clear on how the data was being mapped from the spreadsheets to the application. The spreadsheet view was in a form they, as the client, understood and trusted (think comfort blanket), whereas the application was still new, shiny, complicated and cold. So my colleague took some screenshots of the spreadsheets and of the places in the application the data was being mapped to. Using Google Diagrams, they drew lines showing where each value on the spreadsheet was placed in the application. This was then passed to the client to review.

During the next telephone conference, the client was delighted with this simple diagram. What my colleague had done was delight and reassure the client at the same time. They received praise from both the client and our own team for what we saw as a simple task.

As I write this post I am creating some diagrams for another client we are working with. One of their requests was for a data dictionary for the views we provide for reporting. The data dictionary was to include primary and foreign keys, including which tables the foreign keys referenced. I was tasked with this bit of work, which I duly delivered to the client, and it went down well. The client then asked could we do some entity relationship diagrams, with the object names and foreign keys.

At first I did not think this was going to be of much benefit; all the required information was in the data dictionary, after all. Once I had completed the first one, I have to say my mind was changed. Even though I had a good grasp of the data model, mapping the data dictionary to the ERD was not as easy and simple as I first thought. More than that, firstly I was enjoying the process, and secondly I was learning as I went along. Another of our regular meetings came round, and as I had completed rough drafts of two diagrams, I presented them.

During the updates I presented the two diagrams, explaining that they were intended primarily for non-technical users who might be required to do some work on reporting. Much to my surprise the client was delighted, and said that these would prove to be very useful to all users.

The takeaway for me is that even the simplest, scruffiest diagram (back of a paper napkin) can communicate so much more than we might appreciate. As adults we spend much of our time complicating things verbally. We should, from time to time, get the crayons out and just draw lines, circles and shapes. It might be possible to explain something in words, yet I am reminded of the simple diagram that has firmly cemented SQL joins in my mind (http://blog.codinghorror.com/a-visual-explanation-of-sql-joins/). To this day, when I am thinking of a left or right join, that diagram pops into my head. This says to me that what I need to remember is not how I communicate something, but that I communicate it in a way the client can readily (or instantly) understand. Sometimes a picture is worth more than a thousand words.