Regain sysadmin rights to a SQL instance without stopping its service

I came across a situation a few weeks ago where there was a set of SQL servers, some with single and others with multiple instances, which were missing the required domain groups for management.

Worse, I had no permissions on the server, and without these I was unable to log in to the SQL instance. Another issue was that under no circumstances could any of the instances be stopped.

After a bit of head bashing I remembered that, in a default installation, the Local System account is a system administrator in SQL Server, and it’s not one I remove after installation.

If I could write a script to add the required groups, and run that script in a Local System context, then maybe I could restore the required permissions.

I came up with the script below. The only caveat is that Local System must be present as an instance-level login, and have sysadmin rights.

It’s a bit of a hack, but it served its purpose. 🙂

In the script, two text blocks are created.

The first one, assigned to the $task variable, is the XML for the scheduled task we are going to create.

The second one, assigned to $code, is the actual PowerShell code that the scheduled task will run. This code contains a function for executing the required SQL commands, plus code which iterates through each instance on the server and calls that function with the SQL needed to add the required permissions to each instance. The script writes these text blocks out to new files (in my case on the D drive).
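A sketch of what the $code block could contain is below. The function name, the file path, and the domain group are all placeholders, and the T-SQL uses ALTER SERVER ROLE, which requires SQL Server 2012 or later (older versions would use sp_addsrvrolemember instead).

```powershell
# Hypothetical sketch of the $code variable: the script the task runs as
# Local System. Invoke-Sql and DOMAIN\SQLAdmins are illustrative names.
$code = @'
function Invoke-Sql {
    param([string]$Instance, [string]$Query)
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=$Instance;Integrated Security=True"
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = $Query
    [void]$cmd.ExecuteNonQuery()
    $conn.Close()
}

# Enumerate the instances installed on this server from the registry.
$instances = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances
foreach ($i in $instances) {
    $server = if ($i -eq 'MSSQLSERVER') { '.' } else { ".\$i" }
    # Re-create the missing management group and grant it sysadmin.
    Invoke-Sql -Instance $server -Query @"
IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'DOMAIN\SQLAdmins')
    CREATE LOGIN [DOMAIN\SQLAdmins] FROM WINDOWS;
ALTER SERVER ROLE sysadmin ADD MEMBER [DOMAIN\SQLAdmins];
"@
}
'@
$code | Out-File -FilePath 'D:\Restore-SqlAccess.ps1'
```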

Then, schtasks.exe is used to create the scheduled task, using the new XML file we’ve just created as the input source. Amongst other settings, the XML specifies that the task runs in the Local System context, and also the command and arguments to be executed. The arguments point to the second file just created, the .ps1 file. schtasks.exe is then used again to trigger the job, and we wait until the status of the job changes from running to anything else. Once the task has completed, or failed, it is removed from Task Scheduler. Finally, the script removes the files it has just created so no footprint is left on the system at all. Sounds a bit ‘naughtyish’, but it’s really just to keep things tidy!
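That create/run/wait/clean-up sequence could be sketched as below. The task name and file paths are assumptions matching the earlier sketches.

```powershell
# Create and trigger the task (task name and paths are illustrative).
schtasks.exe /Create /TN 'RestoreSqlAccess' /XML 'D:\Restore-SqlAccess.xml'
schtasks.exe /Run /TN 'RestoreSqlAccess'

# Poll until the task status is no longer 'Running'.
do {
    Start-Sleep -Seconds 2
    $status = (schtasks.exe /Query /TN 'RestoreSqlAccess' /FO LIST /V |
        Select-String '^Status:').ToString()
} while ($status -match 'Running')

# Remove the task and the temporary files so no footprint is left.
schtasks.exe /Delete /TN 'RestoreSqlAccess' /F
Remove-Item 'D:\Restore-SqlAccess.xml', 'D:\Restore-SqlAccess.ps1'
```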

The code can be run remotely by specifying it as a scriptblock in an Invoke-Command against a remote system, if so desired.
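For example, assuming remoting is enabled on the target (server name is a placeholder):

```powershell
# Run the whole script on a remote server via PowerShell remoting.
$fix = {
    # ...the full script from this post goes here...
}
Invoke-Command -ComputerName 'SQLSERVER01' -ScriptBlock $fix
```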

NB: You should also be able to use this for adding single domain accounts, as the format of the SQL command will be exactly the same. Just replace or add the required details.

You can find the source code for this at my repo.




Getting MKV Stream Data Information with PowerShell and the MKVToolNix Toolkit

I often want to get information about an MKV file, usually to find out if it has one or more subtitle tracks. MKVToolNix is my toolset of choice for this. Automation of this process turned out to be relatively straightforward with PowerShell (naturally!) and one of their tools, mkvinfo.

Before we go into the cmdlet details, you will need to download and install the MKVToolNix toolset if you do not already have it. You can get it by visiting the site of its author, Moritz Bunkus.

A word of warning. We’re using ‘Prayer Based Parsing’. If a future revision of mkvinfo changes the format of its output, there’s a good chance our script will cease to work. I’m pretty certain more RegEx-aware gurus will be able to tighten the parsing a bit to lessen the chance of this, but it’s still something to think about.

Looking at the code, ‘FullName’ is defined as an alias for Path in the cmdlet, to allow the use of pipeline output from cmdlets such as Get-ChildItem. That way, track information from multiple files can be obtained quite simply.

Also remember to change the path in the code below to where your mkvinfo.exe file exists.
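To give a flavour of the approach, here is a cut-down sketch of such a function; it is not the author’s actual code, and the exact strings mkvinfo prints vary between versions, so the regular expressions may need adjusting.

```powershell
# Illustrative sketch of a Get-MKVInfo-style function using 'prayer based
# parsing' of mkvinfo output. Adjust $mkvinfo to your mkvinfo.exe location.
function Get-MKVInfo {
    [CmdletBinding()]
    param(
        # FullName alias allows piping in from Get-ChildItem.
        [Parameter(Mandatory, ValueFromPipelineByPropertyName)]
        [Alias('FullName')]
        [string]$Path
    )
    process {
        $mkvinfo = 'C:\Program Files\MKVToolNix\mkvinfo.exe'
        $output  = & $mkvinfo $Path
        $track   = $null
        foreach ($line in $output) {
            switch -Regex ($line) {
                # A new track block starts; emit the previous one, if any.
                '\+ (A track|Track)\s*$' {
                    if ($track) { $track }
                    $track = [pscustomobject]@{
                        File = $Path; Number = $null; Type = $null
                        Codec = $null; Language = $null
                    }
                }
                'Track number: (\d+)' { $track.Number   = [int]$Matches[1] }
                'Track type: (\w+)'   { $track.Type     = $Matches[1] }
                'Codec ID: (\S+)'     { $track.Codec    = $Matches[1] }
                'Language: (\w+)'     { $track.Language = $Matches[1] }
            }
        }
        if ($track) { $track }   # emit the final track
    }
}
```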

Once you’ve loaded the function into memory, it can be used as simply as follows:

An example, also showing how we can combine it with Get-ChildItem, is below.
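Something along these lines (the folder path is just an example, and this assumes the function shape described above):

```powershell
# Get track info for every MKV under a folder, keeping only subtitle tracks.
Get-ChildItem -Path 'D:\Movies' -Filter *.mkv -Recurse |
    Get-MKVInfo |
    Where-Object { $_.Type -eq 'subtitles' }
```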

In the next post, we’ll make use of another MKVToolNix tool and PowerShell to allow us to extract subtitle files from MKV files.

Any feedback, comments, errata always welcome. 🙂


URL Shortening

Love it or loathe it, URL shortening has been with us a while now and can certainly be handy. TinyURL is one such company offering this service. Nicely for us, we do not need to register in order to use their API, and nicer still, we can use it simply by requesting a URL in a standard format.

Before we see how we can use PowerShell to automate this process, let’s take a look at the format of URL that we need to use with TinyURL.

Where targetaddress refers to the URL that you wish to shorten.
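If memory serves, TinyURL’s unauthenticated create endpoint takes the form below; do check the current service documentation, as such endpoints can change.

```
http://tinyurl.com/api-create.php?url=targetaddress
```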

And that’s it.

Let’s say we wanted to share a link containing information about this year’s PowerShell Summit Europe event in Stockholm. The full-length URL for this is:

If we wanted to get the TinyURL equivalent of this, we’d use the following URL, pasting it into the address bar of our browser.


To make this happen via PowerShell, Invoke-WebRequest is our friend. All we need to do is provide the required address via the Uri parameter, and the Content property of the returned HtmlWebResponseObject will contain its shortened equivalent.

So for the case above, we’d be using a command (note the pipeline symbol) of this type:
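Hedging slightly on the exact endpoint, the command would look something like this (the target URL here is a stand-in, not the actual Summit address):

```powershell
# Shorten a URL via TinyURL's create API; the Content property of the
# response holds the shortened address.
$target = 'https://powershell.org/summit/'
Invoke-WebRequest -Uri "http://tinyurl.com/api-create.php?url=$target" |
    Select-Object -ExpandProperty Content
```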

And we can expect to get:


I’ve put together a cmdlet called Get-TinyURL for doing this. At its simplest, you can run it with the Uri parameter, and it will return a PSObject containing the original full address and its shortened equivalent.



It’s also been bulked out a bit to give some extra functionality, such as being able to read from and write to the clipboard if we want. With both options enabled, we can copy a full address into the clipboard, run the cmdlet, and automatically have the shortened URL available for pasting wherever we want it next.
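A minimal sketch of how such a cmdlet could be shaped is below; this is my illustration of the idea, not the author’s actual code, and the clipboard switches are assumed names (Get-Clipboard and Set-Clipboard require PowerShell 5.0 or later).

```powershell
# Sketch of a Get-TinyURL-style cmdlet with optional clipboard integration.
function Get-TinyURL {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)]
        [string]$Uri,
        [switch]$FromClipboard,   # read the full URL from the clipboard
        [switch]$ToClipboard      # copy the shortened URL to the clipboard
    )
    process {
        if ($FromClipboard) { $Uri = Get-Clipboard }
        $short = (Invoke-WebRequest -Uri "http://tinyurl.com/api-create.php?url=$Uri").Content
        if ($ToClipboard) { $short | Set-Clipboard }
        [pscustomobject]@{ FullUrl = $Uri; TinyUrl = $short }
    }
}
```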

Navigate to desired URL and copy it to the clipboard

Run the required command

Paste where required

The code used is listed below, and will also be posted on GitHub in due course.