Whilst nothing can beat experience, having a certificate can also improve your chances when looking for a new job. Imagine an interviewer looking at two otherwise evenly matched candidates, one with a certificate and one without. Personally speaking, if I were the person making the decision, the candidate with the certificate is the one I would recommend for the job. By studying for and passing the exam, that candidate has demonstrated initiative, self-motivation, and a genuine enthusiasm for technology. At least, that's my opinion. A colleague I work with told me that one of the reasons they were offered a job was that they had studied for and passed two exams relevant to the role they applied for.
Craig Porteous spoke to some recruiters he knows and asked them about their view of certification, so the thoughts below come directly from people who make hiring decisions.
What weight do you put on certifications when hiring?
1) A lot. With two otherwise comparable candidates, the one with certifications wins in my mind.
2) A lot; I think it shows that candidates are focused on developing their career.
3) For a technical role I see it as essential.
Do you encourage the pursuit of certifications by your team?
Do you see any downsides/negative aspects to certifications?
1) No, none at all.
2) Some of the accreditations could be more hands-on focused.
3) Cost (and retaining skilled-up workers).
With staff who have completed certifications, do you see any differences in working practice etc to those who haven’t?
1) Yes. People working towards certifications are more engaged with technology and tend to apply their learning in the work environment, sharing their knowledge and improving the overall team dynamic leading to improved productivity.
2) It really depends on the individual, so I don't think it's a fair comparison. A better way to look at it is how doing accreditations adds value to that person in regard to technical ability and confidence.
3) The last example was an infrastructure type. We put him on an SCCM course. A month later our SCCM world was upgraded. A year later he left and now heads up SCCM at Dell SecureWorks in the US. For me it was a benefit: I got an SCCM upgrade from a capable engineer. They got a badge they could use to get their career upgraded. Win-win.
One reason I have heard for not doing any certification is the cost. My personal point of view is that I am investing in myself. If I learn a new skill or pick up some new techniques, then yes, the company I am working for will benefit, which is excellent news for them. If I choose to move to another role with a different company, then those skills transfer with me. The skills I have invested both my time and (very often, more importantly) my money in are mine. So my choice is to invest in myself, as I believe the return on investment (ROI) is excellent.
So you have decided to study for a certification; what resources are available? The following list is just a set of suggestions, based largely on my experience and that of some others.
If you are planning to take the data platform exams, then I would strongly suggest investing in the books for the relevant exam; for example, there is one for the 70-461 exam. The book covers all the topics that could be questioned in the exam, and an accompanying CD has an electronic copy of the book and practice exam questions. One series of books which I used whilst studying for the 70-461 exam (Querying Microsoft SQL Server 2012/14) was the series titled "2 Joes". The purpose of those books is to take a complete beginner through all the skills required to pass the exam. Personally, I found them the best explanation of how to query XML data using T-SQL. They were, for me, worth the investment.
CBT Nuggets - an excellent resource; the videos I viewed for my 70-462 exam were really helpful. The training package I signed up for included exam questions, which I also found excellent. It is more expensive than other sites, but you have access to all the courses. Really worth considering if you want to maximize your study time.
Udemy - with this provider you purchase one course at a time, which gives you "lifetime access" to the course. When the courses are on sale, the prices are low. The quality of the courses can vary, so have a look at the reviews before purchasing. Personally, I found the 70-463 course covered the basics well; on the other hand, it did not go into sufficient detail for the exam questions.
Pluralsight - this is a well-known training site (OK, I have heard the name in quite a few places), and there are lists of videos for specific certifications. The courses I have watched were of a very high quality.
Microsoft Virtual Academy - it's free, which is not always a recommendation. That said, the quality of the courses I have viewed has been excellent. For the exams, search for the "Jump Start" videos, which are a really good jumping-off point for your studying, with excellent hints and tips from people who have passed the exams and who train people for them.
SQL Bits - apart from being one of the best SQL conferences in Europe, the organizers have given back to the SQL community by recording some of the sessions and making them available for FREE on their website. Finding what you are looking for might take a bit of searching, but most topics will have at least one video.
YouTube - there are a lot of videos on a wide variety of topics, though it will take some searching to find the topic you are looking for. On the downside, the quality of the content is variable, ranging from the excellent to the not so good.
Blog posts - again, this will require some searching, but in my experience it has been worth the time and energy. One author, whilst studying for the 70-463 exam, blogged about what she was learning as she went along.
Passing the exams
What is required to pass the exams? Practice, and lots of it! One of the keys, borne out by other people who have passed the exams, is practice exams. The practice questions will not be exactly like the real exam questions, but they will get you into the right way of thinking: reading each question carefully to see what is actually being asked, and examining the multiple-choice options for the correct answer. The best way to get practice is to get hold of practice exam questions and take the exam under exam conditions. There are a number of providers which will sell you practice exams, such as MeasureUp, CBT Nuggets and others. They will not be exactly the same as the questions you face in the exam room, but they will give you practice at answering them, and if you pay attention to the score at the end of a practice exam, you will also be able to see where you need to improve.
The questions are quite deliberately designed to test your knowledge; well, you would not want them to be too easy, would you? This article from Pluralsight has some excellent examples of the format of the questions you will be answering in the exam. If you are really interested in how the questions are constructed and the methodology behind them, this video from Pluralsight has an interview with someone who designs the exams.
All that remains to say is: if you have decided to study for a certification, good luck and happy studying.
OK, long story short: I downloaded and installed SQL Server 2016 CTP3 on a Windows 10 Pro x64 virtual machine which was set up for testing purposes. Yes, I know that was a silly idea; what the heck, living dangerously is fun sometimes! The OS was a standard install and the machine is standalone, as in not connected to a domain or anything clever. When prompted to enter a user account, I used a personal Microsoft account. Next I installed SQL Server 2016 CTP3, nothing fancy, and chose the native SSRS install.
(If you are just interested in the list of steps I followed, they are at the bottom of this post.)
Next I logged in with the user account that was used during setup, which is a Microsoft account. I wanted to play with SSRS, so I opened the default browser (Microsoft Edge), entered the URL “localhost/reportserver” and waited. I then got a dialog box asking me to enter my user credentials.
So I entered my Microsoft account details and eventually ended up with the error message below.
Yes, I know, I should have remembered: an admin account is required to grant permissions in Report Manager to the account you are connecting with. “Doh!” The next step was to see if I could run Microsoft Edge (the default browser) as Administrator; no, that was not possible. As can be seen from the screenshot below, it was possible to run Command Prompt as Administrator, but not Microsoft Edge, which did not make sense to me.
Again, Google to the rescue: I found this page - http://www.virtualizationhowto.com/2015/07/windows-10-edge-opened-builtin-administrator-account/. I followed the instructions to open Microsoft Edge, restarted Windows, and was then able to run Microsoft Edge as Administrator. "Yippee!!!" I typed in the URL “localhost/reportserver”, then eventually got the following error message.
During an earlier search I had found this page: http://www.windowscentral.com/how-find-internet-explorer-windows-10-if-you-really-need-it. I had tried this before making the change described on this page: http://www.ghacks.net/2014/11/12/how-to-enable-the-hidden-windows-10-administrator-account/. So I entered 'Internet' into the search box and selected the option to run Internet Explorer as Administrator.
Success!!! (see screenshot below). As the saying goes, “a long way for a shortcut”, however it works!
So I set about making the relevant changes in Report Manager, setting up the Microsoft user account I normally log in with as Content Manager etc. That all seemed to go as expected. Next I switched back to the Microsoft account and thought I would just open Microsoft Edge, since the account had been set up and I should not need to run it as Administrator. This did not work; I got the error message below. So I typed 'Internet' into the search box, Internet Explorer was one of the applications I was able to choose, and the result can be seen below. It worked as expected.
Is there an easier way? If so, please let me know; this was not the most fun learning journey, and I’m always open to learning.
So what steps did I follow?
1) Used these instructions to enable the Administrator account and set its password.
2) Switched from the Microsoft account to the Administrator account.
3) Used these instructions to make the relevant change to the security policy.
4) Used these instructions to find Internet Explorer and open it using the "Run as Administrator" option.
5) Opened SSRS Report Manager (running as the local Administrator) and set the relevant permissions for the Microsoft account.
6) Switched from the Administrator account back to the Microsoft account, searched for Internet Explorer, browsed to "localhost/reports" and was able to see SSRS Report Manager.
Figure 1 – The ISJSON() function returns 1, showing that the data in the [Data] field is valid JSON
Now that the JSON data is in the SQL database, let's see about doing something useful with it. SQL Server 2016 added a number of new functions that allow the querying and manipulation of JSON data. Having done some research, I found this blog post - https://visakhm.blogspot.com/2016/07/whats-new-in-sql-2016-native-json_13.html. Using the code in that post, I was able to extract the data from the JSON string supplied by the API from the sessionize.com website.
Before querying the data, I need to explain one concept which is crucial for extracting data from structured JSON: the path. In the example in Figure 2 below, the path of the 'title' key:value pair is as follows:
sessions.0.title, which would have the key:value pair 'title: “DAX Gotchas”' (see Figure 2).
Figure 2 – JSON data showing the sessions node and the first speaker node.
In the JSON object returned from the sessionize.com API there are a number of nodes, one for each session, numbered 0 through 29. Within each node there are a number of key:value pairs, e.g. 'id: “117469”'. The paths, nodes and arrays (e.g. speakers and categoryItems) are what the T-SQL is going to extract values from.
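To make the path idea concrete, here is a small Python sketch (Python, since the import scripts for this project are in Python). The payload below is a hypothetical fragment shaped like the sessionize.com response, not the real data:

```python
import json

# Hypothetical fragment shaped like the sessionize.com payload;
# the ids and titles here are illustrative, not the real API data.
all_json = json.loads("""
{
  "sessions": [
    {"id": "117469", "title": "DAX Gotchas", "speakers": ["guid-1"]},
    {"id": "117615", "title": "Second Session", "speakers": ["guid-2", "guid-3"]}
  ]
}
""")

# The path sessions.0.title walks: root -> "sessions" array -> node 0 -> "title" key
title = all_json["sessions"][0]["title"]
print(title)  # DAX Gotchas
```

The dots in the path are just successive hops into the object: a key name selects a member, a number selects an element of an array.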
Enough waffling about JSON objects; let's write some proper T-SQL. In the next example we are going to use a function called OPENJSON(), which is only available in SQL Server 2016 or later. In this example we are going to provide OPENJSON() with two arguments: @AllJson, which contains the JSON object and must be of datatype NVARCHAR(), and the path. The way I think about the path is that it specifies the node or array that I want to return from @AllJson. The other function we will use is JSON_VALUE(). This function also accepts two parameters: an expression, which is a variable or field name containing JSON data, and the path. The way I think about the path is that it specifies the node or array that I want to return from the JSON data (yes, I said that already; I just wanted to see if you were paying attention ;->).
That’s a lot of words, so let's look at some T-SQL in Figure 3 below.
Figure 3 – The JSON data from the sessions node returned as a result set in SSMS
When we look at Figure 3, we will notice that the first row of the data is the same as the data shown in Figure 2. In essence, FROM OPENJSON(@AllJson, ‘$.sessions’) returns a dataset consisting of three fields: Key, Value, and Type. The Value field contains the JSON object for each of the 30 session nodes. The JSON_VALUE() function then takes that JSON and extracts the value for one key:value pair, which is done by specifying the key of the pair. So in the case of title, the path ‘$.title’ is supplied for the path parameter. Since there is only one key:value pair where the key is title, its value is returned from the JSON_VALUE() function, in the field ‘SessionTitle’.
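For anyone more comfortable in Python than T-SQL, the shape of that OPENJSON() output can be sketched like this. This is a rough analogy of the Key/Value/Type rows, not the real implementation, and the sample document is made up:

```python
import json

def open_json(doc, path="$"):
    """Rough analogy of T-SQL OPENJSON: return (key, value, type) rows
    for the node that the path points at."""
    node = json.loads(doc)
    for part in path.lstrip("$").strip(".").split("."):
        if part:  # drill into each path segment, e.g. "sessions"
            node = node[part]
    items = enumerate(node) if isinstance(node, list) else node.items()
    rows = []
    for key, value in items:
        kind = ("array" if isinstance(value, list)
                else "object" if isinstance(value, dict)
                else "scalar")
        # OPENJSON returns nested values as JSON text, so re-serialise them
        rows.append((str(key), json.dumps(value), kind))
    return rows

doc = '{"sessions": [{"title": "DAX Gotchas"}, {"title": "Second"}]}'
rows = open_json(doc, "$.sessions")
print(rows[0])  # ('0', '{"title": "DAX Gotchas"}', 'object')
```

Each session node comes back as one row, with the whole node serialised into the value column, which is exactly why JSON_VALUE() is then needed to pull a single scalar out of it.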
Looking at Figure 2, there is a corresponding key:value pair in the speakers array: where sessions.id has the value “117469”, the matching speakers entry has a sessions value of “117469”. The two values and their locations in the JSON object are shown in Figure 4 below.
Figure 4 – Showing the lookup values for both sessions to speakers and vice versa.
So we know that we want access to the data in the speakers array, as it contains the list of speaker IDs for each session. How is this done? Well, I found an answer in this blog post - https://visakhm.blogspot.com/2016/07/whats-new-in-sql-2016-native-json_13.html. Below in Figure 5 is the T-SQL and its result set.
Figure 5 – Updated query to return the speakerID from the speakers array.
All we have done in the query shown in Figure 5 is add a CROSS APPLY with a simple SELECT statement. Now the speaker ID is returned. Note that if there is more than one speaker ID, as in the case of session ID 117615 (which has two awesome speakers), the query returns two rows, one for each speaker ID, which is just what we wanted.
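The one-row-per-speaker flattening that CROSS APPLY gives us can be pictured in Python like this (the session data here is a made-up miniature of the real payload):

```python
# Made-up miniature of the sessions data; real ids come from the API.
sessions = [
    {"id": "117469", "title": "DAX Gotchas", "speakers": ["a1"]},
    {"id": "117615", "title": "Two-Speaker Session", "speakers": ["b2", "c3"]},
]

# CROSS APPLY-style flattening: one output row per (session, speaker) pair.
rows = [(s["id"], s["title"], speaker)
        for s in sessions
        for speaker in s["speakers"]]

for row in rows:
    print(row)
# The two-speaker session produces two rows, just like the T-SQL query.
```

A session with an empty speakers array would simply contribute no rows, mirroring how CROSS APPLY drops outer rows with no matching inner rows.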
Next let's have a look at returning data from the speakers node. Below in Figure 6 is the T-SQL to return some data from the speakers array.
Figure 6 – TSQL query to return data from the speakers array
Looking at the query inside the CROSS APPLY
SELECT Value FROM OPENJSON(s.Value, '$.links')
WHERE Value LIKE '%Twitter%'
There are a couple of things worth looking at. First, it is possible to use a WHERE clause on the columns returned by the OPENJSON() function. The reason for using the WHERE clause is that the links node can contain more than one type of link; during development some of the speakers had a LinkedIn profile, which they then removed 🙁.
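In Python terms, that WHERE clause is just a filter over the links array (again using a hypothetical speaker record for illustration):

```python
# Hypothetical speaker links; the real entries come from the speakers array.
links = [
    {"title": "Twitter",  "url": "https://twitter.com/example"},
    {"title": "LinkedIn", "url": "https://linkedin.com/in/example"},
    {"title": "Blog",     "url": "https://example.com"},
]

# Equivalent of WHERE Value LIKE '%Twitter%': keep only the Twitter link.
twitter_links = [link["url"] for link in links if "Twitter" in link["title"]]
print(twitter_links)  # ['https://twitter.com/example']
```

Because the filter runs per speaker, a speaker with no Twitter entry simply yields an empty list, which is why the vanishing LinkedIn profiles did not break the query.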
So by now I am sure you are saying “show me the money”. After some work I created a query which extracts the session, speaker and room information and returns it as a single result set, as shown in Figure 7 below.
Figure 7 – Result set with Session, Speaker and room details
If you want to have a go yourself and play with the code, then:
The T-SQL source code is in this Azure Data Studio notebook here.
The Python code is in this Azure Data Studio notebook here.
If you have not run the Python code to import the data, then I have created an Azure Data Studio notebook containing the code to create the database and perform the other tasks. The notebook can be found here.
Last, but very much not least: why did I spend so much effort getting all the data out of the sessionize API? The end goal was to supply the data to SQL Server Report Builder (download from here: https://www.microsoft.com/en-us/download/details.aspx?id=53613). This standalone tool allows you to build an SSRS report. Using it, I created a report which, when run, outputs pages that look like the one shown in Figure 8 below.
Figure 8 – Data finally published in the SSRS report
I've been working at my current company for over two years now. During that time, on my own initiative, I decided to review the BI market to see what tool(s) the company should be looking at adopting. There are quite a few restrictions: data privacy, for one, as our clients are very cautious about their data, so it must be an on-premises server (and yes, I have asked lots of questions about this). Also, our clients are mostly non-profits or charities, so the budget is a massive consideration.
Power BI has been my tool of choice for reporting. It was used for a POC (proof of concept) project reporting service desk incidents to our clients. It was fantastic: we had a Pro account and shared the reports with our clients. We loved it, the clients loved it. Over the next two years, I invested time, energy and effort working on other POC projects, at the same time showing the relevant directors why we should look at Power BI for future development. The deal-breaker was an on-premises server; there was no negotiation on that point. Then at the start of 2017, exciting news: an on-premises server was coming. Which version would get it, how much would it cost, could we use it? Answers from Microsoft: zero, zilch, nada, nothing, a brick-wall impression.
So we waited, and waited, and waited. Then: Power BI Premium. By the time I had digested the news, it felt like someone had kicked me black and blue. There is no point in even approaching our Managing Director about a minimum of £3k per month for this project; our budget is not even in the same country, let alone the same ballpark. Next, sharing reports with free accounts using Power BI Pro, at least for some of our clients, has gone. Now you have it, now you don't.
Your company might be a large enterprise, in which case these costs are reasonable; we are not a large enterprise. So, in essence, over two years' investment of my time has gone down the drain, and it is time to start again. I am now in the process of contacting clients from the POC project to show them how to access the reports, as they can no longer use their own Power BI accounts. Disappointed would be a mild word to describe my feelings.
Very recently Tableau announced a price change. Long story short, my line manager saw the new pricing structure, complete with an on-premises server at a per-user cost of $35 per month; Power BI cannot compete with that deal. What will most likely happen now is that my company becomes a Tableau customer: Microsoft's loss. The tools released at the data summit (June 2017) place Power BI toe to toe with Tableau, but sadly I believe Power BI will likely lose in the long run due to the pricing currently in place. Whilst I understand the business logic and reasoning behind pursuing this model, Microsoft has demonstrated very clearly that they do not understand this market in the way Tableau seems to, which is reflected in the pricing structures. Great for me, another toolset to add to my CV. As I see it, Tableau leaves Power BI dead in the water for customers like my company. There is NO competition; Tableau has this market to itself, which is bad news for me as a customer.
Microsoft has got it right before, and yes, I will stand up, shout, cheerlead and applaud when they do get it right, as my tweet to James Phillips and the Power BI team expressed. When Microsoft gets it wrong, I need to be just as vocal, and I believe they have got the pricing wrong in a big way, at least from where I am standing. Yes, I will continue to let people know about Power BI, but it will not be with the same enthusiasm, and that makes me sad :-(
Last but not least, a more personal public apology to Chris Webb (@Technitrain), who was on the receiving end of my rant via Twitter regarding pricing. Sorry Chris, my bad.
My latest adventure was speaking at DataRelay in Leeds and Nottingham. As is the way with my submissions, the session selected was not the one that I expected to be chosen: it was titled “How to be awesome at finding your next job”.
Speaking is very much outside my comfort zone, yet something that I really enjoy. Why do it then? Well, when I see someone in a session whose body language says they understand what I am trying to say, that feeling is just an absolute joy. As I have learned from others in the community, so I want to pass my knowledge on to others; that is one of my primary reasons for speaking. Somehow, somewhere, someone will be helped to do something. From the few brief conversations I had during DataRelay, I might have achieved that goal. If you came to my session, thank you for coming along; hopefully you got something from it. As I said, if you want to chat about the session or ask questions, then please do. When I do a session, it's because I am passionate about the topic and have so much to say. I can also listen if you need someone just to talk things over.
What did I get from DataRelay? More than I expected, though not in the ways I thought I might. I was able to get to a few sessions, and there was time to see how DataRelay do things as an event (very slick, BTW), get to know more people in the community, make some new friends and meet some old ones. At least one session I attended used Slack / Microsoft Teams in a way I had never dreamed of, yet the possibilities are amazing and mind-blowing. Speaking to the sponsors was one of my goals; thank you to the two gentlemen from Microsoft who took the time to answer my questions. The time, effort and genuine enthusiasm are much appreciated; it's nice to speak to a real person who is knowledgeable and passionate about their area of expertise. Over the two days I took quite a few notes, which I am excited to share with my team back in Glasgow.
It will take me a little time to recover; the long train journey back from DataRelay will help. Like some of us in the community, it takes me energy to be around lots of people, but it's worth every single moment of the effort, especially when I see a room full of people to whom I can try to show just the one little nugget of information that makes a difference to something in their lives.
This post is being written as the train travels back to sunny Glasgow. Tomorrow morning I will be back in the office with my new company, Eyecademy, who supported and encouraged me to go and speak at DataRelay, which I appreciate more than anyone at Eyecademy might realise. Thank you for the encouragement and support.
What’s next? Next year I will be speaking at Scottish Summit on the 29th of February 2020 about Soft Skills for Success. Speaking in your home city is both exciting and challenging, so I will need to put some more work into the session to tailor it to the event and audience. There are a few other submissions to other conferences which I am waiting to hear about, and some other conferences have caught my eye, so I need to consider where and when I would like to speak. Hopefully, once I know more, I’ll update the blog.
The tweet says to me that changing your self-talk can make a difference.
More than once I have considered speaking about this and some other topics; hopefully I will submit a session on them to a conference one day soon…
When I speak in my sessions, I have one clear goal, and it is what I focus on when preparing and then delivering them: that one person will hear what I say and be helped by it. That is also the goal of this post, that one person will be helped, even if just by realising that it's not only you that feels this way, and that you have the power to change.
This post is just some reflections on my experience of speaking as part of the newcomer track at SQL Grillen 2018.
Pre SQL Grillen
The first part of this post is being written on Tuesday 19th June 2018, only 3 days before the session. My current feelings are, to say the least, rather nervous, with a considerable feeling of imposter syndrome on top. At this time my thoughts and feelings are these:
Rehearsals - without a doubt, going over the presentation multiple times has helped so much more than anticipated. At this point the words I am going to use have been repeated many times and are in my mind; as each slide comes up, there is little doubt in my mind about what I'm going to say or how.
Less is more - not all of the material that came out of my research into the topic has made it into the presentation. What made it in is just what is required to get my point across, which has upsides and downsides. It should make the session better and more focused, and if people ask questions after the session then other examples and illustrations will spring to mind. On the downside, part of me feels my audience is not getting everything that I want to get across. Then again, there are only 60 minutes, which is more than enough for most people.
Imposter syndrome - not sure what can be said about this. It seems natural; there are many presenters who feel the same. Right now, the best strategy seems to be to focus on the presentation. The goal of the presentation is to help one person take one thing away from the session. Who that person is, at this moment I do not know, but if just one person takes one thing away, I will count that as a win. That person might just be me, which is also good.
Mentor - SQL Grillen did an awesome job with the newcomers' track, assigning each person a mentor. I was very lucky to be assigned Cathrine Wilhelmsen as mine. Her insights and attention to detail were invaluable in so many ways: Cathrine made excellent suggestions, helped me to see the presentation from the point of view of a non-native English speaker, and, most importantly, was just generally very encouraging :->
Post SQL Grillen
Phew! OMG! That was soooo scary! Can I try that again?
Back in sunny Glasgow, I am now looking to see what lessons I can learn, as other thoughts spring to mind.
Bunny in the headlights - it's easy to forget that as a speaker I felt front and centre; that is to say, everyone can see you and knows you're a speaker, or at least that's how I saw it. Even better (or worse), each of the new speakers was given an orange apron to wear, while the other speakers had different colours. For me it was a strange feeling, not in a bad way, more that I am usually part of the audience. On reflection, it's not a bad thing; it's all part of the learning experience.
Rehearsals – this really worked for me. I was able to sit at the speakers' table, check my equipment worked, run over my presentation quickly, and that was me ready to go. Doing so many rehearsals (and not having any demos) meant that I knew what I was going to say, and all the notes I needed were in the slide deck. Sitting at the speakers' desk was scary, with so many people there who I have seen speak before. At least I was able to make it to a session before I was due to present, which allowed me to relax and listen to the awesome trio of Rob Sewell (@sqldbawithbeard), Chrissy LeMaire (@cl) and Cláudio Silva (@ClaudioESSilva) talk about the new dbachecks module.
The presentation - nervous? YES! Waiting for the session to start was the worst part. Having seen some advice from Brent Ozar, I had some music playing (that only I could hear); the only thing was I had to resist dancing around. Knowing the presentation allowed me to concentrate on other things.
Audience – I made sure I spoke to the whole audience, front row to back and both sides, making eye contact with everyone and watching their body language to see if my points hit home.
Pace - at some points my pace was a little faster than it should have been, but I felt able to vary it according to the material and audience reactions.
Body language - both my own, to ensure I got my points across, and, more importantly, the body language of the audience. Were they looking at the slide or looking at me? Did they react how I expected?
The audience - thinking about this afterwards, there were so many more people than I would even have dared hope for; my guess is about 30. Some of them I recognized: my colleagues from Scotland, Craig Porteous and Paul Broadwith, of course Cathrine :->, and Grant Fritchey, aka "The Scary DBA" (yes, really!). Some things seemed to work really well, like the acronyms game and my alternative job description; yes, you had to be there to get the point.
Feedback - for me this was the hardest part. The best I had expected was something like "Meh".
What I did not expect was people saying how well I had done. Grant Fritchey, who attended my session, congratulated me on my presentation, even tweeting about it as well. Then Alexander Arvidsson also congratulated me on the presentation; his kind and encouraging words can be found in this blog post. Cathrine was very generous with her compliments and encouraged me to review the feedback, which was complimentary and insightful.
Finishing - this needs more rehearsing, so that the presentation finishes on more of a high, at least from my point of view.
Timing – instead of using a stopwatch, I used a countdown timer. At several points I was trying to see how much time had elapsed, as my notes had the elapsed time marked at key points, and the countdown timer did not make that easy to see.
Hard work – over the years I have been fortunate enough to see many people speak who make it look so easy. Having done it now, it's like a swan: graceful and elegant as it glides across the water's surface, yet hidden away underneath are the webbed feet, working really hard all the time. That's my experience of presenting; making it look easy requires a lot of hard work, which remains unseen, the way it should be.
The last point is to thank the SQL Grillen team: William Durkin, who does an amazing job of making everyone feel welcome, and Ben Weissman, for creating the newcomers' track and picking its speakers. There are, as I know, so many more in the SQL Grillen team; thank you to all.
There are some ideas being considered. Where, when and what? Who knows; watch this space.