(scottish) sql bob blog


Some thoughts of your typical data shepherd / data plumber / data dance teacher sort of person.


Import sessionize.com JSON data into SQL Server - Part 2
In the first blog post the JSON string from the sessionize.com API was placed into the table dbo.JsonData.  The first check to run is to see if the data is valid JSON, using the ISJSON() function.  Running the query shown in Figure 1 shows that the data contained in the table dbo.JsonData is valid JSON.
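If you want a feel for what ISJSON() is checking, here is a rough Python analogue (a sketch of mine, not the query from Figure 1; note that T-SQL's ISJSON() behaves differently for bare scalar values):

```python
import json

def is_json(text):
    """Rough analogue of T-SQL's ISJSON(): 1 if the text parses as JSON, else 0."""
    try:
        json.loads(text)
        return 1
    except (ValueError, TypeError):
        return 0

print(is_json('{"sessions": []}'))  # 1
print(is_json('not json'))          # 0
```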

Figure 1 – The function ISJSON() returns 1 showing the data in the field [Data] is valid JSON

Now that the JSON data is in the SQL database, let's see about doing something useful with it.  In SQL Server 2016 a number of new functions were added that allow the querying and manipulation of JSON data.  Having done some research, I found this blog post - https://visakhm.blogspot.com/2016/07/whats-new-in-sql-2016-native-json_13.html.  Using code from that post I was able to extract the data from the JSON string supplied by the sessionize.com API.

Before querying the data I need to explain one concept which is crucial for extracting data from structured JSON: the path.  In the example shown in Figure 2 below, the path of the 'title' key/value pair is sessions[0].title, which points to the key/value pair 'title: "DAX Gotchas"' (see Figure 2).

Figure 2 – JSON data showing the sessions node and the first speaker node. 

In the JSON object returned from the sessionize.com API there is a node for each session, numbered 0 through 29.  Within each node there are a number of key/value pairs, e.g. 'id: "117469"'.  These paths, nodes and arrays, e.g. []speakers and []categoryItems, are what the TSQL is going to extract values from.
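To make the path idea concrete, here is the same navigation in plain Python against a hand-made fragment (only the id and title values are quoted in this post; the rest of the shape is my assumption about the payload):

```python
import json

# A hand-made fragment shaped like the sessionize payload described above.
payload = json.loads("""
{
  "sessions": [
    { "id": "117469", "title": "DAX Gotchas",
      "speakers": [], "categoryItems": [] }
  ]
}
""")

# The path sessions[0].title: node 0 of the sessions array, key 'title'.
title = payload["sessions"][0]["title"]
print(title)  # DAX Gotchas
```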

Enough with all that waffling about JSON objects, let's write some proper TSQL.  In the next example we are going to use a function called OPENJSON(), which is only available in SQL Server 2016 onwards.  In this example we are going to provide OPENJSON() with two arguments: @AllJson, which contains the JSON object and must be of datatype NVARCHAR, and the path.  The way I think about the path is that it specifies the node or array that I want to return from @AllJson.  The other function that we will use is JSON_VALUE().  This function also accepts two parameters: an expression, which is a variable or field name containing JSON data, and the path.  The way I think about the path is that it specifies the node or array that I want to return from the JSON data (yes, I said that already, just wanted to see if you are paying attention ;->).

That’s a lot of words, so let's look at some TSQL in Figure 3 below.

Figure 3 – The JSON data from the sessions node returned as a result set in SSMS

When we look at Figure 3 we notice that the first row of the data is the same as the data shown in Figure 2.  In essence, FROM OPENJSON(@AllJson, ‘$.sessions’) returns a dataset consisting of three fields: Key, Value, and Type.  The field Value contains the JSON object for each of the 30 session nodes.  Next the JSON_VALUE() function takes that JSON and extracts the value for one key/value pair.  This is done by specifying the key of the 'Key:Value' pair, so in the case of title the path ‘$.title’ is supplied for the path parameter.  Since there is only one 'Key:Value' pair where the key = title, its value is returned by JSON_VALUE() in the field ‘SessionTitle’.

Looking at Figure 2, there is a corresponding Key:Value pair in the []speakers array.  So sessions[0].id is “117469”, and the matching lookup value, speakers[7].sessions, is also “117469”.  The two values and their locations in the JSON object are shown in Figure 4 below.
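The lookup runs in both directions, which is easier to see in a few lines of Python (the session id mirrors the post; the speaker id 'spk-7' and name are invented placeholders):

```python
# Sessions carry a list of speaker ids; speakers carry a list of session ids.
sessions = [{"id": "117469", "title": "DAX Gotchas", "speakers": ["spk-7"]}]
speakers = [{"id": "spk-7", "fullName": "A. Speaker", "sessions": ["117469"]}]

# session -> speakers: follow the ids held in the session's []speakers array
speakers_for_session = [sp for sp in speakers if sp["id"] in sessions[0]["speakers"]]

# speaker -> sessions: follow the ids held in the speaker's []sessions array
sessions_for_speaker = [se for se in sessions if se["id"] in speakers[0]["sessions"]]

print(speakers_for_session[0]["fullName"])  # A. Speaker
print(sessions_for_speaker[0]["title"])     # DAX Gotchas
```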


Figure 4 – Showing the lookup values for both sessions to speakers and vice versa. 

So we know that we want to get access to the data in the []speakers array, as this contains the list of speakerIDs for each session.  How is this done?  Well, I found an answer in this blog post - https://visakhm.blogspot.com/2016/07/whats-new-in-sql-2016-native-json_13.html.  Below in Figure 5 is the TSQL and result set.

Figure 5 – Updated query to return the speakerID from the []speakers array.

All we have done in the query shown in Figure 5 is add a CROSS APPLY with a simple SELECT statement.  Now the speakerID is returned.  Note that if there is more than one speakerID, as in the case of sessionID 117615 (which has two awesome speakers), the query returns two rows, each with a different speakerID, which is just what we wanted.
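What the CROSS APPLY is doing, one output row per session/speaker pair, can be sketched as a nested loop in Python (the session ids are from the post; the speaker ids are invented):

```python
# 117615 has two speakers, as mentioned above.
sessions = [
    {"id": "117469", "speakers": ["spk-1"]},
    {"id": "117615", "speakers": ["spk-2", "spk-3"]},
]

# The CROSS APPLY effect: one output row per session/speaker pair.
rows = [(se["id"], spk) for se in sessions for spk in se["speakers"]]

for session_id, speaker_id in rows:
    print(session_id, speaker_id)
# 117469 spk-1
# 117615 spk-2
# 117615 spk-3
```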

Next let's have a look at returning data for the speakers node.  Below in Figure 6 is the TSQL to return some data from the []speakers array.

Figure 6 – TSQL query to return data from the []speakers array

Looking at the query inside the CROSS APPLY 

SELECT Value  FROM OPENJSON(s.Value, '$.links') 
WHERE Value LIKE '%Twitter%' 

There are a couple of things that are worth looking at.  First, it is possible to use a WHERE clause on the columns returned by the OPENJSON() function.  The reason for using the WHERE clause is that the links node can contain more than one type of link.  During development some of the speakers had a LinkedIn profile, which they then removed 🙁.
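The spirit of that WHERE clause, substring-matching the whole JSON value rather than one specific key, looks like this in Python (the link shapes here are my guess at the []links array):

```python
import json

# An invented []links array; the real shape on sessionize may differ.
links = [
    {"title": "Twitter", "url": "https://twitter.com/someone"},
    {"title": "Company Site", "url": "https://example.com"},
]

# The analogue of WHERE Value LIKE '%Twitter%': keep rows whose JSON text
# contains the word 'Twitter' anywhere.
twitter_links = [lnk for lnk in links if "Twitter" in json.dumps(lnk)]

print(len(twitter_links))  # 1
```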

So by now I am sure you are saying “show me the money”.  After some work I created a query which extracts the session, speaker and room information, then returns it as a single result set, as shown in Figure 7 below.


Figure 7 – Result set with Session, Speaker and room details 


If you want to have a try yourself and play with the code, the Python code is in an Azure Data Studio notebook here.

If you have not run the Python code to import the data, then I have created an Azure Data Studio notebook containing the code to create the database and other tasks.  That notebook can be found here.

 

Last, but very much not least, why did I spend so much effort to get all the data out of the sessionize API?  The end goal was to supply the data to SQL Server Report Builder (download from here https://www.microsoft.com/en-us/download/details.aspx?id=53613).  This standalone tool will allow you to build an SSRS report.  Using this tool I created a report which, when run, outputs pages that look like the one shown in Figure 8 below.

Figure 8 – Data finally published on the SSRS report


Import sessionize.com JSON data into SQL Server - Part 1
TL;DR: in the next few blog posts I will be showing how to

  • Export JSON data from a sessionize.com API URL
  • Insert the JSON data into a SQL Server 2016 database
  • Generate a dataset from the imported JSON data
  • Display the data on an SSRS report
I love Azure Data Studio, and with the addition of the notebook feature it's just soooo much more wonderful.  For me this feature makes the product even more amazing.  My first introduction to notebooks was Jupyter notebooks, when I did some courses on Python; the courses required coursework to be submitted as Jupyter notebooks.  So when the Azure Data Studio team announced that they would support notebooks I was very excited.  Even better, the notebooks support SQL, Python, and several other languages, so this was a great opportunity for me to look at using Python and SQL.

During one of the discussions about organising Data Scotland it was suggested we create some cards with the session details.  Each card would include the speaker's photograph, session title, room name, and Twitter handle.  The required information is already stored on the sessionize.com website.

So what am I going to show in the following blog posts?  

  • Call the sessionize API and grab the data returned as a JSON string
  • Place the data into SQL Server, using some TSQL code with a bit of Python magic
  • Query the JSON data in SQL Server using TSQL
  • Return a recordset from the JSON data that can be read by SSRS Report Builder

So that’s what I am going to show, next let’s look at what is required to do it.

Ingredients

  • SQL Server 2016 database instance
  • SQL Server 2016 database (compatibility level 130)
  • Azure Data Studio with Python installed

First, download and install Azure Data Studio; you can download the program from here.

Once you have installed Azure Data Studio, open the application.  From the ‘File’ menu select ‘New Notebook’ (see Figure 1 below).

Figure 1 – File menu showing where to find the ‘New Notebook’ menu item. 


This will open a new notebook (yippee!).  This might not sound very exciting yet, however it is!  When a new notebook opens the kernel must be set.  The way that I think about this is that it sets the language which will be run in the notebook, and it defaults to SQL.  What we want to run is Python 3, so from the list of kernels available select ‘Python 3’.


Figure 2 – selecting the Kernel (programming language) that will be run in the notebook.

Once ‘Python 3’ has been selected, if Python is not set up and installed then Azure Data Studio will prompt you to set up and configure Python for notebooks.  A screen will open, as we can see in Figure 3.  For this blog post I accepted the default location and clicked on the ‘Install’ button.


Figure 3 – Install and configure python for use in Azure Data Studio

If everything has gone to plan, then you should see something that looks like Figure 4.


Figure 4 – installation of Python going as planned

Installing Python can take some time, so it might be a good idea to get a hot beverage or do something else until it has finished installing.

Figure 5 – Installation of python is now completed successfully

In sessionize.com it is possible to create different APIs to output data; in this example the data is output as JSON.  It is possible to select different parts of the data to be output, and in this example ‘All Data’ is selected.  Selecting the data in sessionize.com is beyond the scope of this blog post, though it is very easy to do.

The last step is to get the URL that will be called in the code; this can be seen in Figure 6 below.

Figure 6 - API/Embed screen in sessionize.com for Data Scotland 2019.

In Figure 6a (yes, I forgot to include this until a later edit) are the columns that are output from sessionize.com for the API endpoint used.


Figure 6a - Settings for the API endpoint used in this blog post.

OK, enough setting up, let's write some code.  To get access to other libraries in Python, the command used is import <library name>.  In this example there are four libraries which are imported.  If you run the code shown in Figure 7 you might get the error message shown.

Figure 7 – Error message if the package for the library being imported is not installed.

If you do see this error message then all you need to do is install the required package.  In Figure 7, at the top left hand side, there is a button titled ‘Install Packages’.  Click on that button and the terminal window will open (see Figure 8).  The command that installs the ‘pyodbc’ library is ‘.\python.exe -m pip install pyodbc’; type the command into the terminal window and press enter.


Figure 8 – Entering the command to install the ‘pyodbc’ package in the terminal window.

Hopefully the ‘pyodbc’ package will install without any challenges.  If, like me, you are not so lucky and you get the error message shown in Figure 9, then it is quite easy to fix.

Figure 9 – Error message stating PIP (Pip Installs Packages) requires to be upgraded to install ‘pyodbc’ package
If you get the error message shown in Figure 9 then enter the following command at the prompt: ‘.\python.exe -m pip install --upgrade pip’.  If everything goes well you will see a message like the one shown in Figure 10.

Figure 10 – Successfully upgraded PIP to v 18.

Once the new version of pip has been installed, restart Azure Data Studio.  Then open a notebook, select Python 3 as the kernel language, click on ‘Install Packages’ and install the ‘pyodbc’ library (see Figure 8).  Once ‘pyodbc’ has been installed, it is time to run the Python script.

The Python script will do the following:

1 - call the API and get the returned JSON string into a dict object, which is then cast to a string object.

2 - open a connection to a SQL database and run a SQL script to create the table if it does not exist.

3 - insert the JSON string into a field in the table.

Below is the Python script that is used.  Much of the credit must go to the various websites which I have added references to in the script.  In Figure 11 we can see the script that is used.  All that is required to change is the URL for the sessionize.com API and the user credentials in the connection string; otherwise this is the script I used.


Figure 11 - Python script in Azure Data Studio Notebook to import Json in SQL server 2016

The Azure Data Studio Notebook that is shown in Figure 11 can be downloaded from here.
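If you cannot open the notebook, the bones of the script look something like the sketch below.  The driver name, server, credentials, and API URL are all placeholders of mine, and the database call is left commented out; the real notebook has the full version.

```python
import json
import urllib.request

def fetch_json(url):
    """Call the API and return the whole payload as one string
    (parsed into a dict first, then cast back, as step 1 describes)."""
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return json.dumps(data)

def build_insert(table, json_text):
    """Parameterised INSERT for pyodbc: the JSON goes into one NVARCHAR field."""
    return (f"INSERT INTO {table} (Data) VALUES (?)", (json_text,))

# The notebook then does roughly (placeholders throughout):
#   import pyodbc
#   conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
#                         "SERVER=<server>;DATABASE=<db>;UID=<user>;PWD=<pwd>")
#   sql, params = build_insert("dbo.JsonData", fetch_json("<sessionize api url>"))
#   conn.cursor().execute(sql, params)
#   conn.commit()

sql, params = build_insert("dbo.JsonData", json.dumps({"sessions": []}))
print(sql)  # INSERT INTO dbo.JsonData (Data) VALUES (?)
```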

In the next blog post we will look at how to work with the JSON data in SQL Server.


How to find SQL Server Instance IP address
Quite often I need to get some information about a SQL Server instance; this morning it was the IP address for a SQL Server instance.  Having googled for this more times than I care to remember, this time I thought I would put the information in a blog post.


Method 1 – TSQL 
SELECT   ConnectionProperty('net_transport')          AS 'net_transport'
      , ConnectionProperty('protocol_type')          AS 'protocol_type'
      , ConnectionProperty('auth_scheme')            AS 'auth_scheme'
      , ConnectionProperty('local_net_address')      AS 'local_net_address'
      , ConnectionProperty('local_tcp_port')          AS 'local_tcp_port'
      , ConnectionProperty('client_net_address')      AS 'client_net_address'
      , ConnectionProperty('physical_net_transport') AS 'physical_net_transport'

Local_net_address was the field that I required.

This post was inspired by - https://www.sqlservercentral.com/forums/reply/1519373
Further information on ConnectionProperty can be found here.

Method 2 – DBA Tools

Test-DbaConnection <InstanceName>  see -> https://docs.dbatools.io/#Test-DbaConnection

The output includes a property IPAddress which is what I wanted.   If you have not looked at DBATools then you should, it is an amazing, open source project, and a genuine time saver.

If there are any other methods which I have not thought of, please let me know.


Monitoring replication status using Nagios (using PowerShell script and TSQL)

For various reasons which I have now forgotten, I set up transactional replication for some clients.  The result of this is that I am the caretaker of transactional replication for two of our clients; what a lucky person I am!

T-SQL code used to check Transaction replication 
Once I got the process up and running (a very, very long and stressful story), I realised that I would have to monitor these processes.  Following some googling, this article with some TSQL written by SQLSoldier was found here.  This worked for me, and I used it to monitor replication by running the script and checking the results manually.
 
Nagios output 
Our SysAdmin uses a tool called Nagios to monitor our IT estate.  So they suggested that a script could be written to monitor the replication process and send an alert if anything needed to be looked at.  This sounded like an excellent idea, but how to do it?  The approach arrived at involved a PowerShell script which would run a SQL query, examine the results, then respond to Nagios with one of the following values:

0 - all OK, nothing to see here
1 - something happening, nothing to worry about (just now)
2 - yes, there is something that really needs some attention

See this page for more guidance.

Stored procedure 

Next we decided to use PowerShell to return the results.  Following some googling we found this page http://www.madeiradata.com/cross-server-replication-health-check-using-powershell/ which runs a SQL script and returns a data set.  The first challenge was that the TSQL script from SQLSoldier was rather long for my first PowerShell script; I wanted something smaller.  So I created a stored procedure based on the script, and placed it in the Distribution database on the replicated database server.  Doing this had two reasons: first, less TSQL in the PowerShell script; second, changing one of the parameters means it returns different amounts of data.  The stored procedure takes the following parameters:

@Publisher  - The name of the publisher database server instance 
@PublisherDB - The name of the publisher database 
@NagiosOutput - Y = only output the return code and a short error message (max of 80 characters), N = output all results

The following script was used to check the results that would be returned to Nagios. 
USE [distribution]; 
DECLARE @Publisher AS sysname 
DECLARE @PublisherDB AS sysname 
DECLARE @NagiosOutput AS Char(1) 
SET @Publisher = 'ReplPublisherServerName' 
SET @PublisherDB = 'ReplPublisherDBName' 
SET @NagiosOutput = 'Y' 
EXEC [dbo].[Check_Replication] @Publisher, @PublisherDB, @NagiosOutput; 

Note that the account connecting to the database from Nagios will require EXECUTE permission on the stored procedure, otherwise it cannot run it.  The code for the stored procedure is here.

PowerShell script 
I adapted the PowerShell script found here to run the stored procedure.  When the PowerShell script was run by Nagios there was no 'Return Code' returned (this is what Nagios expects).  We found the solution on this page, inserted the function ExitWithCode, and made a few other changes.  The resulting PowerShell script is below -

## Beginning of Monitor
## Connection string with server variable; the distribution database name is 'Distribution'
$con = "server=127.0.0.1;database=Distribution;Integrated Security=sspi"

## Begin SQL query
$cmd = "SET NOCOUNT ON; "
$cmd = $cmd + " USE [distribution];"
$cmd = $cmd + " DECLARE @Publisher  AS sysname"
$cmd = $cmd + " DECLARE @PublisherDB AS sysname"
$cmd = $cmd + " DECLARE @NagiosOutput AS Char(1)"
$cmd = $cmd + " SET @Publisher = 'ReplPublisherServerName' "
$cmd = $cmd + " SET @PublisherDB = 'ReplPublisherDBName' "
$cmd = $cmd + " SET @NagiosOutput = 'Y'"
$cmd = $cmd + " EXEC [dbo].[Check_Replication] @Publisher, @PublisherDB, @NagiosOutput;"

## The 'exit code' function below was adapted from:
## http://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
## SysAdmin, 2015-Nov
function ExitWithCode
{
    param
    (
        $exitcode
    )
    $host.SetShouldExit($exitcode)
    exit
}

## Creating DataSet object
$set = new-object system.data.dataset
## Running query
$da = new-object System.Data.SqlClient.SqlDataAdapter ($cmd, $con)
## Filling DataSet with results
$da.fill($set) | out-null
## Creating Table object and inserting DataSet
$dt = new-object System.Data.DataTable
$dt = $set.Tables[0]

## Loop over each row in the DataSet
foreach ($row in $set.Tables[0].Rows)
{
    ## Write out the 2nd column, which contains the message text
    write-host $row[1].ToString()
    ## The 1st column contains the return code for Nagios
    $exitcode = $row[0].ToString()
}

## Hand the return code back to Nagios
ExitWithCode ([int]$exitcode)
 
The output from the stored procedure is very short; note that Nagios only permits a maximum of 80 characters to be returned.  Sample output from running the TSQL looks like this:

No issues
Error Code  ErrorMessage
0           Replication OK

Issue(s) requiring attention
Error Code  ErrorMessage
2           TNotRepl=45 CNotRepl=4 Latency=5 Status=In progress

Nagios expects an error message of 80 characters maximum, which is the reason for the brevity of the error message.  The parts of the message are -

TNotRepl=45 - 'Transactions not replicated', the total number of transactions queued to be applied to the subscriber database.
CNotRepl=4 - 'Commands not replicated', the total number of commands queued to be applied to the subscriber database.
Latency=5 - the time (in seconds) from when a transaction is applied to the publisher database until the same transaction is applied to the subscriber database.
Status=In progress - the current status of the replication process.
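Because the message is just name=value tokens with a free-text status on the end, it is easy to pull apart again if you ever want to post-process it.  A quick Python sketch (the parsing helper is mine, not part of the Nagios setup):

```python
def parse_status(line):
    """Split the short Nagios message back into fields.
    Status comes last and may contain spaces, so peel it off first."""
    head, _, status = line.partition("Status=")
    fields = dict(token.split("=") for token in head.split())
    fields["Status"] = status
    return fields

parsed = parse_status("TNotRepl=45 CNotRepl=4 Latency=5 Status=In progress")
print(parsed["Latency"])  # 5
```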

This error message is primarily intended to give some guidance as to what might be happening with the replication process.  It is not intended to give any guidance on the underlying reason causing the issue; all that is required is that the Nagios process shows that there is something wrong.  Whatever the reason, it requires some form of human intervention.  Once an error condition has been detected the issue will be handed to me to resolve.  At least now I do not have to check the process periodically; I just have to wait for a message from our sysadmin.


Setting Firewall rules in new style Azure portal for SQL server

Since this one caught me out and a few other people it seemed like a good idea to blog about this.

Some time ago the Azure portal got a makeover, which I have to say makes the portal look very pretty.  Whilst setting up a proof of concept for Power BI I thought it was also a good time to look at setting up an Azure SQL database.  The last time I did this was in the previous portal.  So I duly set up the database; one of the last steps was to allow the IP address of my work organisation access to the database.  So I googled and found this page - https://azure.microsoft.com/en-gb/documentation/articles/sql-database-configure-firewall-settings/

The Wrong Way
Open the new style Azure Portal 
Click on browse (1)
From the list of blades click once on SQL databases (Figure 1)

Figure 1 - Opening the SQL databases blade

Click on the 'SQL databases' blade; this will open a list of all the SQL databases that are in your account (Figure 2)


Figure 2 - List of SQL databases showing area where to click to show settings

Clicking on the ellipsis will cause a new menu window to appear (Figure 3)


Figure 3 - Clicking on the ellipsis to get the settings link

Click once on the settings link; this will then open the settings window (Figure 4)


Figure 4 - Settings for the database 'JiraData'

As can be seen in Figure 4, there are no firewall settings.

The Right Way

Open the new style Azure Portal 
Click on browse (1)
From the list of blades click once on SQL servers (Figure 5)


Figure 5 - Opening the SQL servers blade

Click on the SQL Servers blade and this will open a list of all the SQL servers in your account (Figure 6)


Figure 6 - List of SQL servers

Move the mouse over the row for the SQL server that you want to see the settings for, and click once with the left mouse button.  In Figure 6 this is the SQL server 'kjsgbci03i'.  This will then open the Settings window on the far right hand side (see Figure 7)


Figure 7 - Settings page for SQL server

Click once on the firewall row and this will open the 'Firewall settings' window (see Figure 8)


Figure 8 - Firewall settings window

Final Note

Splitting the settings between SQL servers and SQL databases does make sense; the same logic is used within SSMS.  That said, in the old style portal the link to the page for setting allowed IP addresses is under 'SQL databases' (see Figure 9), hence why I must have looked at the SQL databases blade first.  At least that's my excuse and I am sticking to it ;->


Figure 9 - SQL server page with link to manage allowed IP addresses (firewall rules) in the old style Azure portal.

 


Why should programmers be more like plumbers, electricians or mechanics?

Over the years I have worked with many different people, and observed many people at work.  When working in a mechanical garage I was impressed with the range of tools used by the time-served mechanics.  When a plumber and an electrician came to fit a new kitchen, I was always curious about the range of tools for different jobs.

Today I had a relatively simple task when completing a data migration, which I was doing using SSIS.  Most of the work uses TSQL where possible.  In this case though I was asked to convert a string value from the source, for one specific control on one form, to camel case.  First thought: play to my strengths and use TSQL, so a quick google found this Stack Overflow posting -> http://stackoverflow.com/questions/5164201/is-there-any-sql-server-built-in-function-to-convert-string-in-camel-case

OK, yes, it was possible to use TSQL; the phrase that ran through my mind at that point was "when you have a hammer everything looks like a nail".  Yes, I could create the function and call it from the query etc....
SSIS is the most wonderful toolbox though (pun intended if you noticed it), and one of its tools is the Script Task.  Since I only had to convert one control (there is a large number!) why not use some code?  Would that be easier?  Googling led to this Stack Overflow posting -> http://stackoverflow.com/questions/1206019/converting-string-to-title-case-in-c-sharp (this article filled in the gaps for me: https://msdn.microsoft.com/en-us/library/system.globalization.textinfo.totitlecase.aspx)

My code
public void Main()
{
    string fieldName = (string)Dts.Variables["User::str_Field_ID"].Value;
    string stringValue = (string)Dts.Variables["User::str_NNLFP_VALUE_TXT"].Value;
    if (fieldName == "XYZ014")
    {
        // Requires 'using System.Globalization;' at the top of the script
        TextInfo textInfo = new CultureInfo("en-GB", false).TextInfo;
        stringValue = textInfo.ToTitleCase(stringValue);
        Dts.Variables["User::str_NNLFP_VALUE_TXT"].Value = stringValue;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}

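For comparison, here is the same conversion sketched in Python (a rough equivalent of my own; .NET's TextInfo.ToTitleCase and this sketch differ on edge cases such as apostrophes and acronyms):

```python
def to_title_case(text):
    # Upper-case the first letter of each space-separated word,
    # lower-casing the rest (roughly what TextInfo.ToTitleCase does).
    return " ".join(w[:1].upper() + w[1:].lower() for w in text.split(" "))

print(to_title_case("dax gotchas"))  # Dax Gotchas
```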

Yes, it has taken me time and effort to learn C#; that said, it's an excellent addition to my toolbox.  Yes, I could have done the same in TSQL, but it would probably have taken longer.  Right tool for the right job?  That, I am sure, is a matter of opinion and/or debate.  Speaking personally, for this specific project, yes I believe so.  It has encouraged me to keep adding to my own personal IT toolbox......


Win 10 Pro and SQL Server 2016 SSRS report mgr

OK, long story short: I downloaded and installed SQL Server 2016 CTP3 on a Windows 10 Pro x64 virtual machine which was set up for testing purposes.  Yes, I know that was a silly idea; what the heck, living dangerously is fun sometimes!  The OS was a standard install and it's standalone, as in not connected to a domain or anything clever.  When prompted to enter a user account, I used a personal Microsoft account I have.  Next I installed SQL Server 2016 CTP3, nothing fancy, and chose the native SSRS install.

(If you are just interested in the list of steps I followed, they are at the bottom of this posting)

Next I logged in as the user account that was used during setup; this is a Microsoft account.  Now I wanted to play with SSRS, so I opened the default browser (Microsoft Edge), entered the URL “localhost/reportserver” and waited.  Next I got a dialog box asking me to enter my user credentials.



So I entered my Microsoft account details, and eventually ended up with the error message below.



Yes, I know that I should have remembered: an admin account is required to grant permission in Report Manager to the account you are connecting with.  “Doh!”  The next step was to see if I could run Microsoft Edge (the default browser) as Administrator; no, that was not possible.  As can be seen from the screenshot below, it was possible to run the command prompt as Administrator, but not Microsoft Edge, which did not make sense.

So off to Google and found this link http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/

So, following the instructions, I enabled the Administrator account and set the password.  Just as an aside, I loved the fact that when you type in the password nothing appears on the screen; yes, I expected some stars at least!  So, password duly set for the Administrator account, the next step was to switch accounts to the Administrator account.  At this point I thought “simple: run Microsoft Edge when logged in as the Administrator and set the privileges in SSRS Report Manager”.  I got the error message below.



Again Google to the rescue, and I found this page - http://www.virtualizationhowto.com/2015/07/windows-10-edge-opened-builtin-administrator-account/.  I followed the instructions, restarted Windows, and was able to run Microsoft Edge as Administrator, "Yippee!!!"  I typed in the URL “localhost/reportserver”, then eventually got the following error message.


During an earlier search I had found this page http://www.windowscentral.com/how-find-internet-explorer-windows-10-if-you-really-need-it (I had tried this before making the change described in http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/).  So I entered ‘Internet’ into the search box and selected to run Internet Explorer as Administrator.



Success!!!!!! (see screenshot below).  As the saying goes, “a long way for a short cut”, however it works!



So I set about making the relevant changes in Report Manager, setting up the Microsoft user account I normally log in with as Content Manager etc.  That all seemed to go as expected.  Next I switched accounts back to the Microsoft account and thought I would just open Microsoft Edge, since the account had been set up and I did not need to run it as Administrator.  This did not work; I got the error message below.  So I typed ‘Internet’ into the search box, and Internet Explorer was one of the applications I was able to choose to run; the result can be seen below.  It worked as expected.



Is there an easier way?  If so, please let me know; this was not the most fun learning journey, and I’m always open to learning.

So what steps did I follow?

1) http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/

Used the instructions to enable the Administrator account and set its password.
Switched accounts from the Microsoft account to the Administrator account.

2) http://www.virtualizationhowto.com/2015/07/windows-10-edge-opened-builtin-administrator-account/
Used these instructions to make the relevant change to the security policy.

3) http://www.windowscentral.com/how-find-internet-explorer-windows-10-if-you-really-need-it
Used these instructions to find Internet Explorer and open it using the "Run as Administrator" option.

4) Opened SSRS Report Manager (running as local Administrator) and set the relevant permissions for the Microsoft account.

5) Switched accounts from the Administrator account to the Microsoft account.  Searched for Internet Explorer, then browsed to "localhost/reports" and was able to see SSRS Report Manager.