So what am I going to show in the following blog posts?
That’s what I am going to show; next, let’s look at what is required to do it.
First, download and install Azure Data Studio; the program can be downloaded from here.
Once you have installed Azure Data Studio, open the application. In the menu, find ‘File’ and click it, then select ‘New Notebook’ (see Figure 1 below).
This will open a new notebook (yippee!!). This might not sound very exciting yet, however it is! When a new notebook opens, the Kernel must be set. The way I think about this is that it sets the language which will be run in the notebook; it defaults to SQL. What we want to run is Python 3. From the list of available Kernels, select ‘Python 3’; this will set the language that will be run in the notebook.
Figure 2 – selecting the Kernel (programming language) that will be run in the notebook.
Once ‘Python 3’ has been selected, if Python is not already set up and installed, Azure Data Studio will prompt you to set up and configure Python for Notebooks. A screen will open, as we can see in Figure 3. For this blog post I accepted the default location and clicked the ‘Install’ button.
Figure 3 – Install and configure Python for use in Azure Data Studio
If everything has gone to plan, then you should see something that looks like Figure 4.
Figure 4 – installation of Python going as planned
Installing Python can take some time, so it might be a good idea to get a hot beverage or do something else until it has finished installing.
Figure 5 – Installation of Python is now completed successfully
On sessionize.com it is possible to create different APIs to output data; in this example the data is output as JSON. It is possible to select different parts of the data to be output, and in this example ‘All Data’ is selected. Selecting the data on sessionize.com is beyond the scope of this blog post, but it is very easy to do.
The last step is to get the URL that will be called in the code; this can be seen in Figure 6 below.
Figure 6 - API / Embed screen in Sessionize.com for Data Scotland 2019.
Figure 6a (yes, I forgot to include this until a later edit) shows the columns that are output from Sessionize.com for the API endpoint used.
Figure 6a - Settings for the available API endpoint used in this blog post.
OK, enough setting up, let’s write some code. To get access to other libraries in Python, the command used is import <library name>. In this example there are four libraries which are imported. If you run the code shown in Figure 7 you might get the error message shown.
Figure 7 – Error message if the package for the library being imported is not installed.
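For context, the import statements at the top of the script look something like this. This is only a sketch: the exact four libraries are the ones shown in Figure 7, so the names below are assumptions.

import urllib.request  # call the sessionize.com API over HTTP (assumed)
import json            # parse the JSON string that is returned (assumed)
import ssl             # handle the HTTPS context if required (assumed)
import pyodbc          # connect to SQL Server (this one is used later in the post)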
If you do see the error message shown in Figure 7, then all you need to do is install the required package. In Figure 7, at the top left-hand side, there is a button titled ‘Install Packages’. Click on that button and the terminal window will open (see Figure 8). The command that installs the ‘pyodbc’ library is ‘.\python.exe -m pip install pyodbc’; type the command into the terminal window and press Enter.
Figure 8 – Entering the command to install the ‘pyodbc’ package in the terminal window.
Hopefully the ‘pyodbc’ package will install without any challenges. If, like me, you are not so lucky and you get the error message shown in Figure 9, then this is quite easy to fix.
Figure 9 – Error message stating PIP (Pip Installs Packages) needs to be upgraded to install the ‘pyodbc’ package
If you get the error message shown in Figure 9, then enter the following command at the prompt: ‘.\python.exe -m pip install --upgrade pip’. If everything goes well you will see a message like the one shown in Figure 10.
Figure 10 – Successfully upgraded PIP to v18.
Once the new version of PIP has been installed, restart Azure Data Studio. Then open a notebook, select Python 3 as the kernel language, click on ‘Install Packages’ and install the ‘pyodbc’ library (see Figure 8). Once ‘pyodbc’ has been installed, it is time to run the Python script.
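A quick way to confirm the package installed correctly is to run a single cell with the import and the version number; pyodbc exposes its version as pyodbc.version.

import pyodbc          # should now import without an error
print(pyodbc.version)  # prints the installed pyodbc version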
The Python script will do the following:
1 - call the API, read the returned JSON string into a dict object, and then cast it to a string object
2 - open a connection to a SQL Server database and run a SQL script to create the table if it does not exist
3 - insert the JSON string into a field in the table
Below is the Python script that is used. Much of the credit must go to the various websites which I have added as references in the script. In Figure 11 we can see the script that is used. All that is required to change is the URL for the sessionize.com API and the user credentials in the connection string; otherwise this is the script I used.
Figure 11 - Python script in an Azure Data Studio Notebook to import JSON into SQL Server 2016
The Azure Data Studio Notebook that is shown in Figure 11 can be downloaded from here.
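If you cannot read the script in Figure 11, the sketch below follows the same three steps. It is not the exact script from the figure: the API URL, connection string, table name, and column names are all placeholders that you would need to change.

import json
import urllib.request
import pyodbc

# Placeholder URL - replace with the API endpoint from sessionize.com (Figure 6)
url = "https://sessionize.com/api/v2/<your-api-id>/view/All"

# 1 - call the API, read the returned JSON into a dict, then cast it to a string
with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))
json_string = json.dumps(data)

# 2 - open a connection to SQL Server and create the table if it does not exist
# Placeholder connection string - replace the server, database and credentials
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>;DATABASE=<database>;UID=<user>;PWD=<password>"
)
cursor = conn.cursor()
cursor.execute(
    "IF OBJECT_ID('dbo.SessionizeData', 'U') IS NULL "
    "CREATE TABLE dbo.SessionizeData "
    "(Id INT IDENTITY(1,1) PRIMARY KEY, JsonData NVARCHAR(MAX) NOT NULL);"
)

# 3 - insert the JSON string into a field in the table
cursor.execute("INSERT INTO dbo.SessionizeData (JsonData) VALUES (?)", json_string)
conn.commit()
conn.close()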
In the next blog post we will look at how to work with the JSON data in SQL Server.
Method 1 – T-SQL
SELECT ConnectionProperty('net_transport') AS 'net_transport'
, ConnectionProperty('protocol_type') AS 'protocol_type'
, ConnectionProperty('auth_scheme') AS 'auth_scheme'
, ConnectionProperty('local_net_address') AS 'local_net_address'
, ConnectionProperty('local_tcp_port') AS 'local_tcp_port'
, ConnectionProperty('client_net_address') AS 'client_net_address'
, ConnectionProperty('physical_net_transport') AS 'physical_net_transport'
Local_net_address was the field that I required.
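If you want the same answer from a script, the query can also be run with pyodbc; here is a minimal sketch, assuming a trusted connection and a placeholder server name.

import pyodbc

# Placeholder connection string - replace the server name
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<instance>;DATABASE=master;Trusted_Connection=yes"
)
row = conn.cursor().execute(
    "SELECT CONNECTIONPROPERTY('local_net_address') AS local_net_address"
).fetchone()
print(row.local_net_address)  # the server IP address for this connection
conn.close()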
This post was inspired by - https://www.sqlservercentral.com/forums/reply/1519373
Further information on ConnectionProperty can be found here.
Method 2 – dbatools
Test-DbaConnection <InstanceName> (see https://docs.dbatools.io/#Test-DbaConnection)
The output includes a property, IPAddress, which is what I wanted. If you have not looked at dbatools then you should; it is an amazing open-source project and a genuine time saver.
If there are any other methods which I have not thought of, please let me know.
For various reasons which I have now forgotten, I set up transactional replication for some clients. The result of this is I am the caretaker of transactional replication for two of our clients; what a lucky person I am!
T-SQL code used to check transactional replication
Once I got the process up and running (a very, very long and stressful story), I realised that I would have to monitor these processes. Following some googling, I found this article with some T-SQL written by SQLSoldier here. This worked for me, and I used it to monitor replication by running the script and checking the results manually.
Our SysAdmin uses a tool called Nagios to monitor our IT estate, so they suggested that a script could be written to monitor the replication process and send an alert if anything needed to be looked at. This sounded like an excellent idea, but how to do it? The approach that was arrived at involved using a PowerShell script which would run a SQL query, examine the results, and then respond to Nagios with one of its standard status values (0 = OK, 1 = Warning, 2 = Critical, 3 = Unknown).
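To make that contract concrete, here is a minimal sketch of such a check, written in Python rather than the PowerShell actually used; the stored procedure name, the result column, and the latency thresholds are all hypothetical placeholders.

import sys
import pyodbc

# Standard Nagios plugin exit codes
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

try:
    # Placeholder connection string - replace the server name
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<distributor>;DATABASE=distribution;Trusted_Connection=yes"
    )
    # Hypothetical stored procedure based on the SQLSoldier script
    row = conn.cursor().execute("EXEC dbo.usp_CheckReplication").fetchone()
    conn.close()
except pyodbc.Error as exc:
    print(f"UNKNOWN - could not query the distributor: {exc}")
    sys.exit(UNKNOWN)

# Hypothetical result column: replication latency in seconds
if row.LatencySeconds >= 600:
    print(f"CRITICAL - replication latency is {row.LatencySeconds}s")
    sys.exit(CRITICAL)
elif row.LatencySeconds >= 300:
    print(f"WARNING - replication latency is {row.LatencySeconds}s")
    sys.exit(WARNING)
print("OK - replication is healthy")
sys.exit(OK)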
Next we decided to use PowerShell to return the results. Following some googling we found this page http://www.madeiradata.com/cross-server-replication-health-check-using-powershell/ which runs a SQL script and returns a data set. The first challenge was that the T-SQL script from SQLSoldier was rather long for my first PowerShell script; I wanted something smaller. So I created a stored procedure based on the script and placed it in the Distribution database on the replicated database server. Doing this had two reasons: first, less T-SQL in the PowerShell script; second, changing one of the parameters means it returns different amounts of data. The stored procedure takes the following parameters:
The error message is primarily intended to give some guidance as to what might be happening with the replication process. It is not intended to give any guidance on the underlying reason that is causing the issue. All that is required is that the Nagios process shows that there is something wrong; whatever the reason, it requires some form of human intervention. Once an error condition has been detected, the issue will be handed to me to resolve. At least now I do not have to check the process periodically; I just have to wait for a message from our sysadmin.