PowerShell and Twilio: SMS

Twilio is a cloud-based messaging service; it can do everything from sending SMS messages to acting as the basis of an entirely cloud-based virtual call centre.  It's pretty powerful stuff.

My requirements are fairly basic though: I just want to be able to send SMS messages from a PowerShell script to alert engineers when things go wrong.  The first thing to do is head over to Twilio and set up a test account, which is free and will allow you to send messages to yourself.

All interaction with Twilio is via their REST API, and the first method I have for interacting with it uses the official Twilio C# DLLs; the instructions to download them are here. Once you have the DLLs, pop them in the same directory as the script you are going to be running.  Here's a function that makes use of them, along with a sample call:

function invoke-twilioSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken,
    [Parameter(Mandatory=$true)][String]$dllPath
)
{
    Add-Type -path "$dllPath\RestSharp.dll"
    Add-Type -path "$dllPath\Twilio.Api.dll"

    $twilio = new-object Twilio.TwilioRestClient($AccountSid,$authToken)
    $msg = $twilio.SendSmsMessage($fromTel, $toTel, $message)
}

invoke-twilioSMS -AccountSid "<AccountSid>" `
                 -authToken "<authToken>" -message "<message>" `
                 -fromTel "<fromTel>" -toTel "<toTel>" `
                 -dllPath "<scriptPath>"

The problem with this method is that it's awkward to get hold of the DLLs, and I find there is something clunky about having to use DLLs to call a REST API.  So in method 2, I make use of Invoke-RestMethod (which arrived in PowerShell 3.0) to talk to the REST API directly.

function invoke-twilioRESTSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken
)
{
    # Build a URI
    $URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"

    # encode authorization header
    $secureAuthToken = ConvertTo-SecureString $authToken -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($AccountSid,$secureAuthToken)

    # content
    $postData = "From=$fromTel&To=$toTel&Body=$message"

    # Fire Request
    $msg = Invoke-RestMethod -Uri $URI -Body $postData -Credential $credential -Method "POST" -ContentType "application/x-www-form-urlencoded"
}

invoke-twilioRESTSMS -AccountSid "<AccountSid>" `
                     -authToken "<authToken>" -message "<message>" `
                     -fromTel "<fromTel>" -toTel "<toTel>"

Method 2 is my preferred way of doing things for something as simple as sending an SMS.

There is still one further way to send a message using PowerShell and Twilio, and this one is for those who are on PowerShell 2.0 (so can't use Invoke-RestMethod) and don't want to use the DLLs: we can still build a request from scratch using the System.Net.WebRequest object:

function post-twilioSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken
)
{
    # System.Web is needed for HttpUtility
    Add-Type -AssemblyName System.Web

    # Build a URI
    $URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"
    $requestUri = new-object Uri ($URI)

    # Create the request and specify attributes of the request.
    $request = [System.Net.WebRequest]::Create($requestUri)

    # encode authorization header
    $authText = $AccountSid + ":" + $authToken
    $authUTF8 = [System.Text.Encoding]::UTF8.GetBytes($authText)
    $auth64 = [System.Convert]::ToBase64String($authUTF8)

    # Define the required headers
    $request.Method = "POST"
    $request.Headers.Add("Authorization: Basic $auth64");
    $request.Accept = "application/json, application/xml, text/json, text/x-json, text/javascript, text/xml"
    $request.ContentType = "application/x-www-form-urlencoded"

    # content
    $fromTel = [System.Web.HttpUtility]::UrlEncode($fromTel)
    $toTel = [System.Web.HttpUtility]::UrlEncode($toTel)
    $message = [System.Web.HttpUtility]::UrlEncode($message)
    $postData = "From=$fromTel&To=$toTel&Body=$message"
    $request.ContentLength = $postData.Length

    # Stream Bytes
    $postBytes = [System.Text.Encoding]::ascii.GetBytes($postData)
    $requestStream = $request.GetRequestStream()
    $requestStream.Write($postBytes, 0, $postBytes.length)
    $requestStream.flush()
    $requestStream.Close()

    # Fire Request
    $response = $request.GetResponse()

    # Output Response
    $responseStream = $response.GetResponseStream()
    $responseReader = New-Object System.IO.StreamReader $responseStream
    $returnedResponse = $responseReader.ReadToEnd()
    $response.close()
}

post-twilioSMS -AccountSid "<AccountSid>" `
               -authToken "<authToken>" -message "<message>" `
               -fromTel "<fromTel>" -toTel "<toTel>"

Method 3 works just fine.  It lacks the simplicity of method 2, but it does give you very granular control over what the web request is doing, which I have found very useful when working with other REST APIs that aren't quite as well behaved as Twilio's.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND CONFER NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK


Parse IIS log files with PowerShell

I recently got asked if there was an easy way to find out the average time-taken for an IIS instance to complete a request.  This information is available in the IIS log files; you just need to parse it out.  Now there are many IIS log parsers available on the internet, but I thought “I wonder how easily I could do that in PowerShell”; it turns out very easily!

The first thing is to define the path to your log file:

$IISLogPath = "C:\Temp\sample.log"

Next we need to find the headers that are available in this particular log file.  First load the file, pick out the headers (always on the 4th line) using ‘split’ to separate the headers delimited by white space, and then get rid of the “#Fields: “ prefix from the headers.

Note I’ve used [System.IO.File]::ReadAllLines to load the file as it’s a lot faster than Get-Content; this makes a big difference if you’re iterating through a lot of files!
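
If you want to see the difference for yourself, a quick comparison with Measure-Command looks something like this (the log path is just an example):

```powershell
# time both approaches against the same file (path is a placeholder)
$IISLogPath = "C:\Temp\sample.log"

$dotNetTime = Measure-Command { [System.IO.File]::ReadAllLines($IISLogPath) }
$cmdletTime = Measure-Command { Get-Content $IISLogPath }

"{0:N0} ms vs {1:N0} ms" -f $dotNetTime.TotalMilliseconds, $cmdletTime.TotalMilliseconds
```

The gap widens as the files get bigger, because Get-Content wraps every line in pipeline overhead while ReadAllLines just returns a string array.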

$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}

Now we need to actually import the file, which is nice and simple: as we’ve already got the headers we can just use Import-Csv to do the work for us, and then do a little bit of clean-up, removing any comment lines which start with a #.

$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}

Finally, let’s collect all the time-taken values into an array.  Note I had to do a little bit of hoop jumping to get the “-“ not to be interpreted by PowerShell.

$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

So putting it all together and we get this:

$IISLogPath = "C:\Temp\sample.log"

$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}

$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}

$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

Once you’ve got the array of time-taken values, you can ask questions like “what was the average and max time-taken?”, but I’ll leave that bit up to you!
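
For instance, Measure-Object will do the arithmetic for you; a small sketch (note the values need casting to integers first, because Import-Csv reads everything as strings):

```powershell
# time-taken comes back as strings, so cast to int before doing maths
$stats = $timeTaken | foreach {[int]$_} | Measure-Object -Average -Maximum

"Average time-taken: $($stats.Average) ms"
"Max time-taken:     $($stats.Maximum) ms"
```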

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND CONFER NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

DAC SQL Azure Import Export Service PowerShell Client Module

SQL Azure offers a hosted service to import/export databases between SQL Azure and Azure Blob storage; essentially they have put up a REST API and you can fire commands at it.  There is even a Codeplex project with example SQL DAC client implementations.

When I recently attempted to automate exports of a number of databases we host in Azure, I grabbed a copy of the client, wrapped it in PowerShell and thought: job done.  That’s where I ran into issue number one: the exe randomly hangs for me.

C# is not one of my strong points, so I decided that attempting to debug the C# source probably wasn’t a good idea; instead I decided to re-implement the client in PowerShell.

So after a lot of coffee, detective work (the REST API isn’t very well documented currently) and digging around in the example source code, I’ve put together a PowerShell module implementing the three main features: Export, Import and Status.

Usage

I’ve attempted to keep the command switches as close as I could to the Codeplex project’s, so if you’re switching from one to the other you should be able to figure out what’s going on very quickly.

start-DacExport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key>

start-DacImport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key> -size <inGB> -edition <web/business>

get-DacJobStatus -s <server> -u <username> -p <password> [-requestid <GUID> -withInfo]

Both start-DacExport and start-DacImport will return the GUID of the job, which you can then use with get-DacJobStatus.  get-DacJobStatus will return an XML object containing the job’s status information; this is great if you are using the function in your own script, but if you just want to print the results to screen, make sure you use –withInfo and the XML will be sent to the console instead.
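
As a rough sketch of how the pieces fit together in a script, you could kick off an export and poll for completion like this (server, credentials and blob details are all placeholders; the property path follows the shape of the status XML the service returns):

```powershell
# kick off the export and capture the job GUID (all values below are placeholders)
$requestId = start-DacExport -s "myserver.database.windows.net" -d "myDatabase" `
                             -u "myUser" -p "myPassword" `
                             -bloburl "http://myblob.blob.core.windows.net/exports/myDatabase.bacpac" `
                             -blobaccesskey "<key>"

# poll the job status once a minute until it finishes one way or the other
do
{
    Start-Sleep -Seconds 60
    $status = get-DacJobStatus -s "myserver.database.windows.net" -u "myUser" `
                               -p "myPassword" -requestid $requestId
} while ($status.ArrayOfStatusInfo.StatusInfo.Status -notmatch "Completed|Failed")
```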

Installation Instructions

  • Create a new folder called DacIESvcPS in your modules directory, e.g. C:\Windows\System32\WindowsPowerShell\v1.0\Modules\DacIESvcPS
  • Download the latest version of the PSM1 file from https://github.com/stevenaskwith/DacIESvcPS into the new directory
  • Launch a PowerShell console and run
    • import-module DacIESvcPS
  • To confirm the module loaded correctly run
    • Get-Command -Module DacIESvcPS

You should get something like this:

[Screenshot: Get-Command output listing the DacIESvcPS functions]

An example of the output of get-DacJobStatus –withInfo looks like this:

db3prod-dacsvc.azure.com

<?xml version="1.0" encoding="ibm850"?>
<ArrayOfStatusInfo xmlns="http://schemas.datacontract.org/2004/07/Microsoft.SqlServer.Management.Dac.ServiceTypes" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <StatusInfo>
    <BlobUri>http://myExportBlob.blob.core.windows.net/sqlexports/someDatabase.bacpac</BlobUri>
    <DatabaseName>someDatabase</DatabaseName>
    <ErrorMessage />
    <LastModifiedTime>2012-03-22T10:18:57.1864719Z</LastModifiedTime>
    <QueuedTime>2012-03-22T10:16:03.7488387Z</QueuedTime>
    <RequestId>2bbbf314-3ec5-4f7c-afbd-ba219a61954b</RequestId>
    <RequestType>Import</RequestType>
    <ServerName>eccaps1fj1.database.windows.net</ServerName>
    <Status>Completed</Status>
  </StatusInfo>
</ArrayOfStatusInfo>

I would love to hear back from anyone who uses this in the field.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND CONFER NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

What Focal Lengths do I take photos at? Part 2 – Charting

So in my previous post I showed how you can leverage PowerShell and the .NET Framework to analyse your photo collection and get some meaningful statistics from it; in my example, figuring out the frequency distribution of the focal lengths at which you take your photographs.

The results are presented in a simple hash table, which does the job but doesn’t make the results very easy to visualise; presenting the data in a chart would make the output a lot easier to understand.  You could just copy and paste the output into Excel and create a chart from the data, but I thought there must be a way to do this within PowerShell; so after a quick dig and one very helpful blog post later, this is what I came up with:

Changes

Once your script grows beyond 100 lines it starts to get really hard to keep track of what’s going on.  Functions are great at wrapping up bits of repeatable and reusable code, but I find they are also good for breaking your script down into easy-to-manage sections.  Finally, I coordinate the script via control logic in the ‘main’ at the end of the script, which is around 15 lines of code including comments, but very readable!

The two functions (and parameters) I have added are:

  • get-FocalLengths <files>
    • returns the frequency distribution in a hashtable
  • createChart <hashtable of values> <title> <x-axis title> <y-axis title>
    • Takes a hashtable (which doesn’t have to contain focal lengths), creates a chart, displays it on screen and writes it to a file
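
I won’t reproduce the whole of createChart here (the full source is on GitHub), but the core of turning a hashtable into a chart looks roughly like this; a simplified sketch, not the exact function:

```powershell
Add-Type -AssemblyName System.Windows.Forms.DataVisualization

# build a chart object and give it a single chart area
$chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
$chart.Width = 800
$chart.Height = 600
$chartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
$chart.ChartAreas.Add($chartArea)
[void]$chart.Titles.Add("Focal Length Frequency")

# one column per hashtable entry, sorted by focal length
$series = $chart.Series.Add("FocalLengths")
$series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Column
$imageHash.GetEnumerator() | Sort-Object Name | foreach {
    [void]$series.Points.AddXY($_.Name, $_.Value)
}

# write the chart out as a PNG (path is just an example)
$chart.SaveImage("C:\Temp\focalLengths.png", "PNG")
```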

Part of the createChart function requires you to load some additional .NET assemblies.  These are not part of the standard .NET 3.5 SP1 package but a Microsoft add-on, so if you don’t have them installed on your machine when you run the code, we need to handle that error.

When the script loads the assembly, the output of that operation is captured in the $assembly variable.  If $assembly is null, then nothing was loaded, i.e. you don’t have the “System.Windows.Forms.DataVisualization” assembly on your PC.  In this case my error-handling code launches Internet Explorer at the download page for the assembly and exits the script.

$assembly = [Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms.DataVisualization")

if($assembly -eq $null)
{
    Write-Host "Charting module not installed, please install it"

    # launch IE and navigate to the correct page
    $ie = New-Object -ComObject InternetExplorer.Application
    $ie.Navigate("http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=14422")
    $ie.Visible = $true
    break
}

Source

I’ve moved the source code for this project into GitHub and you can download it here:

https://github.com/stevenaskwith/Focal-Lengths/blob/master/get-FocalLengths.ps1 (click the “RAW” button and “Save As” to grab a copy of the ps1).

Usage

Call the script in the same way as the previous version:

.\get-Focallengths -Path "c:\images" -fileType ".png" -model "Canon EOS 7D"

or drop the parameters you don’t need and let the script use its defaults, e.g.

.\get-Focallengths -Path "c:\images"

Summary

Firing the script against my personal photo collection resulted in the nice graph shown below.  It’s quite obvious now that I take my photos either zoomed all the way in or all the way out.  Now I’ve just got to pick where to spend my money…

THIS POSTING IS PROVIDED “AS IS” AND CONFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

What Focal Lengths do I take photos at?

I’ve been working with PowerShell for some time and have recently been encouraged to share some of the work I’ve been doing.  So I thought I’d start with a fun one.

I have an SLR camera and I’m considering buying a new lens, but what I’d like to know before I purchase one is: what focal lengths do I take my photos at?  More importantly, what focal length do I take most of my photos at, so I can focus my money on a lens that I’m going to use a lot!  Here’s what I came up with:

What the script will do:

  • Find all the files in a directory structure with a given file extension
  • Load each image and gather up its focal length and the type of camera that took it
  • If the camera type matches what we are looking for, add the focal length to the running score
  • Print the frequency distribution of focal lengths to screen

Script

# setup default parameters if none were specified
param([string]$path = (get-location),     # current path
      [string]$fileType = ".jpg",         # search for .jpg
      [string]$model = "Canon EOS 550D")  # default camera type

# Clear the Screen
clear

##### Assemblies

# load the .NET Assembly we will be using
Add-Type -AssemblyName System.Drawing

##### Constants

$filter = "*" + $fileType # clean up the search filter
$Encode = new-object System.Text.ASCIIEncoding

# find all the image files we are interested in
$files = get-childitem -recurse $path -filter $filter
# and how many we found
$totalFiles = $files.count

##### Variables

$image = $null
$imageHash = @{}
$i = 0
$focalLength = $null

##### Main

# if some files were returned
if ($files -ne $null)
{
    foreach ($file in $files)
    {
        # load image by statically calling a method from .NET
        $image = [System.Drawing.Imaging.Metafile]::FromFile($file.FullName)

        # try to get the EXIF data (silently fail if the data can't be found)
        # http://www.sno.phy.queensu.ca/~phil/exiftool/TagNames/EXIF.html
        try
        {
            # get the Focal Length from the Metadata code 37386
            $focalLength = $image.GetPropertyItem(37386).Value[0]
            # get model data from the Metadata code 272
            $modelByte = $image.GetPropertyItem(272)
            # convert the model data to a String from a Byte Array
            $imageModel = $Encode.GetString($modelByte.Value)
            # unload image
            $image.Dispose()
        }
        catch
        {
            # do nothing with the catch
        }

        # if the file contained both a focalLength and a modelName
        if(($focalLength -ne $null) -and ($imageModel -eq $model))
        {
            if($imageHash.containsKey($focalLength))
            {
                # increment count by 1 if focal length is already in hash table
                $count = $imageHash.Get_Item($focalLength)
                $count++
                $imageHash.Set_Item($focalLength,$count)
            }
            else
            {
                # Add focal length to Hash Table if it doesn't exist
                $imageHash.add($focalLength,1)
            }
        }

        # Calculate the current percentage complete
        $i++
        $percentComplete = [math]::round((($i/$totalFiles) * 100), 0)
        # Update that lovely percentage bar...
        Write-Progress -Activity:"Loading Focal Lengths" -status "$i of $totalFiles Complete:" -PercentComplete $percentComplete
    }

    # print results in ascending order of focal length
    $imageHash.GetEnumerator() | Sort-Object Name
}
else
{
    Write-Host "No files found"
}

Usage

Copy the script into a .ps1 file and call it like so:

.\get-Focallengths -Path "c:\images" -fileType ".png" -model "Canon EOS 7D"

or drop the parameters you don’t need and let the script use its defaults, e.g.

.\get-Focallengths -Path "c:\images"

Summary

I started out thinking this script was going to be trivial, but it’s turned into quite a good demonstration of a lot of PowerShell’s abilities, specifically its ability to access the .NET Framework.

THIS POSTING IS PROVIDED “AS IS” AND CONFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK