Encrypting Secrets using PowerShell and AWS KMS

AWS Key Management Service (KMS) is Amazon's managed service that makes it easy to create and control the encryption keys you use to encrypt data.  A lot of AWS services natively integrate with KMS, e.g. S3, but I wanted to use a KMS key to encrypt a secret (e.g. a password) that I could store inside a configuration file and decrypt when required.

To do this I created two PowerShell functions, one for encryption and one for decryption, that I can embed in scripts.  The encryption function securely transfers your plaintext to KMS; KMS encrypts the data and returns an encrypted memory stream, which I convert to a base64 string to make it easy to store in text/XML/JSON.

The decrypt function takes a previously encrypted base64 string, converts it and sends it to KMS to decrypt (note you don’t have to tell KMS which key is required to decrypt), and KMS returns a plaintext memory stream which I convert back to a UTF8-encoded string.

I generally use these functions during userdata execution (boot time) on an AWS EC2 instance to decrypt secrets that I need to configure the instance and/or applications, but you could use them on any Windows machine.  To support the use of IAM Roles on EC2 instances, I have made the access/secret key parameters optional, i.e. if you don’t pass an access/secret key the function will attempt to use the privileges provided by the IAM role applied to the EC2 instance, assuming you are running the function on EC2.

Function to Encrypt

function Invoke-KMSEncryptText
(
	[Parameter(Mandatory=$true,Position=1,HelpMessage='PlainText to Encrypt')]
	[string]$plainText,
	[Parameter(Mandatory=$true,Position=2,HelpMessage='GUID of Encryption Key in KMS')]
	[string]$keyID,
	[Parameter(Mandatory=$true,Position=3)]
	[string]$region,
	[Parameter(Position=4)]
	[string]$AccessKey,
	[Parameter(Position=5)]
	[string]$SecretKey
)
{
	# convert the plaintext to a UTF8 byte array and wrap it in a memory stream for KMS
	[byte[]]$byteArray = [System.Text.Encoding]::UTF8.GetBytes($plainText)
	$memoryStream = New-Object System.IO.MemoryStream($byteArray,0,$byteArray.Length)
	# splat the common parameters; only add credentials if they were supplied, otherwise the IAM role is used
	$splat = @{Plaintext=$memoryStream; KeyId=$keyID; Region=$Region;}
	if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey;}}
	if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey;}}
	# encrypt with KMS and base64 encode the returned ciphertext blob for easy storage
	$encryptedMemoryStream = Invoke-KMSEncrypt @splat
	$base64encrypted = [System.Convert]::ToBase64String($encryptedMemoryStream.CiphertextBlob.ToArray())
	return $base64encrypted
}

Function to Decrypt

function Invoke-KMSDecryptText
(
	[Parameter(Mandatory=$true,Position=1,HelpMessage='CipherText base64 string to decrypt')]
	[string]$cipherText,
	[Parameter(Mandatory=$true,Position=2)]
	[string]$region,
	[Parameter(Position=3)]
	[string]$AccessKey,
	[Parameter(Position=4)]
	[string]$SecretKey
)
{
	# convert the base64 cipher text back to bytes and wrap them in a memory stream for KMS
	$encryptedBytes = [System.Convert]::FromBase64String($cipherText)
	$encryptedMemoryStreamToDecrypt = New-Object System.IO.MemoryStream($encryptedBytes,0,$encryptedBytes.Length)
	# splat the common parameters; only add credentials if they were supplied, otherwise the IAM role is used
	$splat = @{CiphertextBlob=$encryptedMemoryStreamToDecrypt; Region=$Region;}
	if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey;}}
	if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey;}}
	# decrypt with KMS and convert the returned plaintext stream back to a UTF8 string
	$decryptedMemoryStream = Invoke-KMSDecrypt @splat
	$plainText = [System.Text.Encoding]::UTF8.GetString($decryptedMemoryStream.Plaintext.ToArray())
	return $plainText
}

Below is some sample code that makes use of the functions; simply fill in the access/secret keys, the KMS master key you want to use for encryption and the region where the key is stored.  Obviously you should consider handling your plaintext more securely than I am here, but this serves as a simple test.

Import-Module awspowershell
# set your credentials to access AWS, key you want to encrypt with, and the region the key is stored
$AccessKey = ''
$SecretKey = ''
$Region = 'eu-west-1'
$keyID = ''
$plainText = 'Secret'

# Encrypt some plain text and write to host
$cipherText = Invoke-KMSEncryptText -plainText $plainText -keyID $keyID -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-host $cipherText

# Decrypt the cipher text and write to host
$plainText = Invoke-KMSDecryptText -cipherText $cipherText -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-host $plainText
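
In practice I don't write the values to the host; I store the ciphertext in a configuration file at build time and decrypt it during userdata execution.  Below is a minimal sketch of that pattern (the JSON path and property name are placeholders; it assumes PowerShell 3.0+ for the JSON cmdlets and that the instance's IAM role is allowed to call KMS Decrypt):

# At build/deploy time: encrypt the secret and write it into a JSON config file (hypothetical path)
$cipherText = Invoke-KMSEncryptText -plainText 'Secret' -keyID $keyID -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
@{ ServicePassword = $cipherText } | ConvertTo-Json | Set-Content 'C:\config\app.json'

# At boot (userdata): read the config back and decrypt using the instance's IAM role, so no keys are passed
$config = Get-Content 'C:\config\app.json' -Raw | ConvertFrom-Json
$servicePassword = Invoke-KMSDecryptText -cipherText $config.ServicePassword -Region $Region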

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

PowerShell and Twilio: SMS

Twilio is a cloud-based messaging service; it can do everything from sending SMS messages to being the basis of an entirely cloud-based virtual call centre.  It’s pretty powerful stuff.

My requirements are fairly basic though; I just want to be able to send SMS messages from a PowerShell script to engineers to alert them when things go wrong.  The first thing to do is head over to Twilio and set yourself up with a test account, which is free and will allow you to send messages to yourself.

All interaction with Twilio is via their REST API, and the first method I have for interacting with it uses the official Twilio C# DLLs; the instructions to download them are here.  Once you have the DLLs, pop them in the same directory as the script you are going to be running.  Here’s a function to make use of them, along with a sample call:

function invoke-twilioSMS
(
	[Parameter(Mandatory=$true)][String]$AccountSid,
	[Parameter(Mandatory=$true)][String]$message,
	[Parameter(Mandatory=$true)][String]$fromTel,
	[Parameter(Mandatory=$true)][String]$toTel,
	[Parameter(Mandatory=$true)][String]$authToken,
	[Parameter(Mandatory=$true)][String]$dllPath
)
{
	# load the Twilio client and its RestSharp dependency
	Add-Type -path "$dllPath\RestSharp.dll"
	Add-Type -path "$dllPath\Twilio.Api.dll"
	# create the client and send the message
	$twilio = new-object Twilio.TwilioRestClient($AccountSid,$authToken)
	$msg = $twilio.SendSmsMessage($fromTel, $toTel, $message)
}

invoke-twilioSMS -AccountSid "<AccountSid>" `
	-authToken "<authToken>" -message "<message>" `
	-fromTel "<fromTel>" -toTel "<toTel>" `
	-dllPath "<scriptPath>"

The problem with this method is that it’s awkward to get hold of the DLLs, and I find there is something clunky about having to use DLLs to call a REST API.  So in method 2, I make use of Invoke-RestMethod (which arrived in PowerShell 3.0) to talk to the REST API directly.

function invoke-twilioRESTSMS
(
	[Parameter(Mandatory=$true)][String]$AccountSid,
	[Parameter(Mandatory=$true)][String]$message,
	[Parameter(Mandatory=$true)][String]$fromTel,
	[Parameter(Mandatory=$true)][String]$toTel,
	[Parameter(Mandatory=$true)][String]$authToken
)
{
	# Build a URI
	$URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"
	# encode authorization header
	$secureAuthToken = ConvertTo-SecureString $authToken -AsPlainText -Force
	$credential = New-Object System.Management.Automation.PSCredential($AccountSid,$secureAuthToken)
	# content
	$postData = "From=$fromTel&To=$toTel&Body=$message"
	# Fire Request
	$msg = Invoke-RestMethod -Uri $URI -Body $postData -Credential $credential -Method "POST" -ContentType "application/x-www-form-urlencoded"
}

invoke-twilioRESTSMS -AccountSid "<AccountSid>" `
	-authToken "<authToken>" -message "<message>" `
	-fromTel "<fromTel>" -toTel "<toTel>"

Method 2 is my preferred way of doing things for something as simple as sending an SMS.

There is still one further way to send a message using PowerShell and Twilio, and this addresses those who are using PowerShell 2.0 (so can’t use Invoke-RestMethod) and don’t want to use the DLLs; we can still build a request from scratch using the System.Net.WebRequest object:

function post-twilioSMS
(
	[Parameter(Mandatory=$true)][String]$AccountSid,
	[Parameter(Mandatory=$true)][String]$message,
	[Parameter(Mandatory=$true)][String]$fromTel,
	[Parameter(Mandatory=$true)][String]$toTel,
	[Parameter(Mandatory=$true)][String]$authToken
)
{
	# System.Web is needed for UrlEncode and isn't loaded by default
	Add-Type -AssemblyName System.Web
	# Build a URI
	$URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"
	$requestUri = new-object Uri ($URI)
	# Create the request and specify attributes of the request.
	$request = [System.Net.WebRequest]::Create($requestUri)
	# encode authorization header
	$authText = $AccountSid + ":" + $authToken
	$authUTF8 = [System.Text.Encoding]::UTF8.GetBytes($authText)
	$auth64 = [System.Convert]::ToBase64String($authUTF8)
	# Define the required headers
	$request.Method = "POST"
	$request.Headers.Add("Authorization: Basic $auth64")
	$request.Accept = "application/json, application/xml, text/json, text/x-json, text/javascript, text/xml"
	$request.ContentType = "application/x-www-form-urlencoded"
	# content
	$fromTel = [System.Web.HttpUtility]::UrlEncode($fromTel)
	$toTel = [System.Web.HttpUtility]::UrlEncode($toTel)
	$message = [System.Web.HttpUtility]::UrlEncode($message)
	$postData = "From=$fromTel&To=$toTel&Body=$message"
	$request.ContentLength = $postData.Length
	# Stream Bytes
	$postBytes = [System.Text.Encoding]::ascii.GetBytes($postData)
	$requestStream = $request.GetRequestStream()
	$requestStream.Write($postBytes, 0,$postBytes.length)
	$requestStream.flush()
	$requestStream.Close()
	# Fire Request
	$response = $request.GetResponse()
	# Output Response
	$responseStream = $response.GetResponseStream()
	$responseReader = New-Object System.IO.StreamReader $responseStream
	$returnedResponse = $responseReader.ReadToEnd()
	$response.close()
}

post-twilioSMS -AccountSid "<AccountSid>" `
	-authToken "<authToken>" -message "<message>" `
	-fromTel "<fromTel>" -toTel "<toTel>"

Method 3 works just fine; it lacks the simplicity of Method 2, but it does give you very granular control over what the web request is doing, which I have found very useful when working with other REST APIs that aren’t quite as well behaved as Twilio’s.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Filtered Azure Blob to Blob Copy

I recently had the job of copying tens of thousands of IIS log files, each one at least 100MB, from one Azure Storage account to another.  Using something simple like CloudBerry to copy the files just wasn’t going to cut it, as it copies each file down to the local client first and then pushes it back into Azure, which is not efficient at all.

A quick bit of digging and I discovered that the Azure PowerShell cmdlet Start-AzureStorageBlobCopy allows you to trigger a copy from Azure to Azure, which runs very quickly; it will even allow you to copy an entire container from one storage account to another.  What it won’t allow you to do is pass in a filter so that it only copies the files matching that filter.

So here’s a function that I wrote to get that functionality, with some progress bars and timers for added effect:)

Function Start-AzureStorageBlobContainerCopy
(
	[Parameter(Mandatory=$true)][String]$srcStorageAccountName,
	[Parameter(Mandatory=$true)][String]$destStorageAccountName,
	[Parameter(Mandatory=$true)][String]$SrcStorageAccountKey,
	[Parameter(Mandatory=$true)][String]$DestStorageAccountKey,
	[Parameter(Mandatory=$true)][String]$SrcContainer,
	[Parameter(Mandatory=$true)][String]$DestContainer,
	[String]$filter = ""
)
{
	Import-Module Azure
	# build a storage context for each account
	$srcContext = New-AzureStorageContext -StorageAccountName $srcStorageAccountName -StorageAccountKey $SrcStorageAccountKey
	$destContext = New-AzureStorageContext -StorageAccountName $destStorageAccountName -StorageAccountKey $DestStorageAccountKey
	# index the source container, applying the filter if one was supplied
	$timeTaken = measure-command{
		if ($filter -ne "")
		{
			$blobs = Get-AzureStorageBlob -Container $SrcContainer -Context $srcContext | ? {$_.name -match $filter}
		}
		else
		{
			$blobs = Get-AzureStorageBlob -Container $SrcContainer -Context $srcContext
		}
	}
	Write-host "Total Time to index $timeTaken" -BackgroundColor Black -ForegroundColor Green
	# trigger a server-side copy for each blob
	$i = 0
	$timeTaken = measure-command{
		foreach ($blob in $blobs)
		{
			$i++
			$percentComplete = [int](($i / $blobs.Count) * 100)
			Write-Progress -Activity:"Copying..." -Status:"Copied $i of $($blobs.Count) : $($percentComplete)%" -PercentComplete:$percentComplete
			$copyInfo = Start-AzureStorageBlobCopy -ICloudBlob $blob.ICloudBlob -Context $srcContext -DestContainer $DestContainer -DestContext $destContext -Force
			Write-host (get-date) $copyInfo.name
		}
	}
	write-host
	Write-host "Total Time $timeTaken" -BackgroundColor Black -ForegroundColor Green
}

Start-AzureStorageBlobContainerCopy -srcStorageAccountName "<src Storage>" -SrcStorageAccountKey "<src key>" -SrcContainer "<src Container>" `
	-destStorageAccountName "<dest Storage>" -DestStorageAccountKey "<dest key>" -DestContainer "<dest Container>" `
	-filter "<filter>"

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Switching garbage collection on in an Azure Worker role

Whilst working on an issue with Microsoft on one of our production environments, we came across the fact that an Azure Worker role, by default, has its garbage collection set to workstation mode rather than server mode.  If you are using medium or larger (hence multi-processor) instances, you could see a performance benefit by switching to server mode.

Unfortunately the Azure tooling does not currently allow you to configure this setting directly, so you have to do it in a roundabout fashion by creating a startup task that performs the change as the instance boots.

First, define a startup task in the Service Definition of your worker role:

<WorkerRole name="WorkerRole1" vmsize="Medium">
  <Startup>
    <Task commandLine="startup.cmd" executionContext="elevated" taskType="simple" />
  </Startup>

Now create a “startup.cmd” in the root of your worker role that will be used to kick off the PowerShell script that modifies the config file:

@echo off
powershell -command "Set-ExecutionPolicy RemoteSigned"
powershell .\setServerGC.ps1 2>> err.out

And finally create the “setServerGC.ps1” file in the root of your worker role; this is the file that will actually make the modifications.

# Load up the XML
$configFile = "${env:RoleRoot}\base\x64\WaWorkerHost.exe.config"
[xml]$waXML = Get-Content $configFile

if (($waXML.configuration.runtime.gcServer -eq $null) -and ($waXML.configuration.runtime.gcConcurrent -eq $null))
{
	# Modify XML
	$gcServerEl = $waXML.CreateElement('gcServer')
	$gcConcurrentrEl = $waXML.CreateElement('gcConcurrent')
	$gcServerAtt = $waXML.CreateAttribute("enabled")
	$gcServerAtt.Value = "true"
	$gcConcurrentrAtt = $waXML.CreateAttribute("enabled")
	$gcConcurrentrAtt.Value = "true"
	$gcServerEl.Attributes.Append($gcServerAtt) | Out-Null
	$gcConcurrentrEl.Attributes.Append($gcConcurrentrAtt) | Out-Null
	$waXML.configuration.runtime.appendChild($gcServerEl) | Out-Null
	$waXML.configuration.runtime.appendChild($gcConcurrentrEl) | Out-Null
	$waXML.Save($configFile)

	# Restart WaWorkerHost.Exe
	Get-Process | ? {$_.name -match "WaHostBootstrapper"} | Stop-Process -Force
	Get-Process | ? {$_.name -match "WaWorkerHost"} | Stop-Process -Force
}
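
For reference, once the script has run, the runtime section of WaWorkerHost.exe.config should end up looking roughly like this (only the relevant fragment is shown):

<configuration>
  <runtime>
    <!-- added by setServerGC.ps1 -->
    <gcServer enabled="true" />
    <gcConcurrent enabled="true" />
  </runtime>
</configuration>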

We saw a significant performance boost on the role we deployed this on, but your mileage will vary depending on your workload.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Simple Azure Storage Queue Monitor

If you need to monitor the length of a queue in Azure, you can use the Azure PowerShell CmdLets to help you out.

Below is a sample ticker script that uses the Azure cmdlets (so make sure you have them installed); it polls the configured queue every 10 seconds.

clear
Import-Module Azure

$cert = Get-Item cert:\currentuser\my\<cert thumbprint>   # management cert
$subID = "<subscription ID>"                              # Subscription ID
$storageAccount = "<storage account>"                     # storage account where queue lives
$queueName = "<queueName>"                                # Queue you're interested in
$interval = 10                                            # Time between ticks

Set-AzureSubscription -SubscriptionID $subID -Certificate $cert `
	-SubscriptionName "CurrentSubscription" `
	-CurrentStorageAccount $storageAccount
Select-AzureSubscription -SubscriptionName "CurrentSubscription"

# do forever loop
do
{
	# measure how long it takes to run the command
	$timeTaken = Measure-Command{
		# get the queue info
		$queueInfo1 = Get-AzureStorageQueue -Name $queueName
		# write it to screen
		Write-Host (Get-Date) $queueInfo1.ApproximateMessageCount
	}
	# take the time taken to run the command off the interval time
	$totalTimeToWait = New-TimeSpan -Seconds $interval
	$timeToWait = $totalTimeToWait - $timeTaken
	# go to sleep
	sleep ($timeToWait.TotalSeconds)
}while($true)

Parse IIS log files with PowerShell

I recently got asked if there was an easy way to find out the average time-taken for an IIS instance to complete a request.  This information is available in the IIS log files, you just need to parse it out.  Now there are many IIS log parsers available on the internet, but I thought “I wonder how easily I could do that in PowerShell”; it turns out very easily!

First thing is to define the path to your log file:

$IISLogPath = "C:\Temp\sample.log"

Next we need to find the headers that are available in this particular log file.  First load the file, pick the headers out (always on the 4th line) using ‘split’ to separate the headers delimited by white space, and then get rid of the “#Fields: ” prefix from the headers.

Note I’ve used [System.IO.File]::ReadAllLines to load the file as it’s a lot faster than Get-Content; this makes a big difference if you’re iterating through a lot of files!
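
If you want to see the difference for yourself, a quick and unscientific comparison with Measure-Command looks something like this (it just times each approach against the same log file):

# rough timing of each approach; on large files ReadAllLines should come out well ahead
(Measure-Command { [System.IO.File]::ReadAllLines($IISLogPath) }).TotalMilliseconds
(Measure-Command { Get-Content $IISLogPath }).TotalMilliseconds

Anyway, back to pulling the headers out: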

$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}

Now we need to actually import the file, which is nice and simple; as we’ve already got the headers we can just use Import-Csv to do the work for us, and then do a little bit of clean-up, removing any comment lines which start with a #.

$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}

Finally, let’s collect all the time-taken values into an array.  Note I have had to do a little bit of hoop jumping to get the “-” to not be interpreted by PowerShell.

$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

So putting it all together and we get this:

$IISLogPath = "C:\Temp\sample.log"
$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}
$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}
$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

Once you’ve got the array of time-taken values, you can ask questions like “what was the average and max time-taken?”, but I’ll leave that bit up to you!
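
If you want a starting point, something like this would give you the average and maximum (a quick sketch; it assumes time-taken is logged in milliseconds, which is the IIS default):

# time-taken comes back as strings, so cast to integers before doing the maths
$stats = $timeTaken | foreach {[int]$_} | Measure-Object -Average -Maximum
Write-Host ("Average: {0} ms  Max: {1} ms" -f [math]::Round($stats.Average,2), $stats.Maximum)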

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

DAC SQL Azure Import Export Service PowerShell Client Module

SQL Azure offers a hosted service to import/export databases between SQL Azure and Azure Blob storage; essentially they have put up a REST API that you can fire commands at.  There is even a Codeplex project with SQL DAC example client implementations.

When I recently attempted to automate exports of a number of databases we host in Azure, I grabbed a copy of the client, wrapped it in PowerShell and thought: job done.  That’s where I ran into issue number one: the exe randomly hangs for me.

C# is not one of my strong points, so I decided that attempting to debug the C# source probably wasn’t a good idea; instead I decided to re-implement the client in PowerShell.

So after a lot of coffee, detective work (the REST API isn’t very well documented currently) and digging around in the example source code, I’ve put together a PowerShell module implementing the three main features: Export, Import and Status.

Usage

I’ve attempted to keep the command switches as close as I could to the Codeplex project’s, so if you’re switching from one to the other you should be able to figure out what’s going on very quickly.

start-DacExport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key>
start-DacImport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key> -size <inGB> -edition <web/business>
get-DacJobStatus -s <server> -u <username> -p <password> [-requestid <GUID> -withInfo]

Both start-DacExport and start-DacImport will return the GUID of the job, which you can then use with get-DacJobStatus.  get-DacJobStatus will return an XML object containing the job’s status information; this is great if you are using the function in your own script, but if you just want to print the results to screen make sure you use -withInfo and the XML will be sent to the console instead.
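
As an example of how the two fit together, here’s a rough sketch of kicking off an export and polling until it completes (the server, credentials and blob details are placeholders; the 30-second interval and the XML navigation are my assumptions based on the sample output further down):

# kick off an export and capture the request GUID
$requestId = start-DacExport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key>
# poll the job every 30 seconds until it reports Completed
do
{
	Start-Sleep -Seconds 30
	$statusInfo = get-DacJobStatus -s <server> -u <username> -p <password> -requestid $requestId
	$status = $statusInfo.ArrayOfStatusInfo.StatusInfo.Status
	Write-Host (Get-Date) $status
} while ($status -ne 'Completed')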

Installation Instructions

  • Create a new folder called DacIESvcPS in your modules directory, e.g. C:\Windows\System32\WindowsPowerShell\v1.0\Modules\DacIESvcPS
  • Download the latest version of the PSM1 file from https://github.com/stevenaskwith/DacIESvcPS into the new directory
  • Launch a PowerShell console and run
    • import-module DacIESvcPS
  • To confirm the module loaded correctly run
    • Get-Command -Module DacIESvcPS

You should get something like this:

[Screenshot: Get-Command output listing the module's start-DacExport, start-DacImport and get-DacJobStatus functions]

An example output of get-DacJobStatus -withInfo would look like this:

db3prod-dacsvc.azure.com
<?xml version="1.0" encoding="ibm850"?>
<ArrayOfStatusInfo xmlns="http://schemas.datacontract.org/2004/07/Microsoft.SqlServer.Management.Dac.ServiceTypes" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <StatusInfo>
    <BlobUri>http://myExportBlob.blob.core.windows.net/sqlexports/someDatabase.bacpac</BlobUri>
    <DatabaseName>someDatabase</DatabaseName>
    <ErrorMessage />
    <LastModifiedTime>2012-03-22T10:18:57.1864719Z</LastModifiedTime>
    <QueuedTime>2012-03-22T10:16:03.7488387Z</QueuedTime>
    <RequestId>2bbbf314-3ec5-4f7c-afbd-ba219a61954b</RequestId>
    <RequestType>Import</RequestType>
    <ServerName>eccaps1fj1.database.windows.net</ServerName>
    <Status>Completed</Status>
  </StatusInfo>
</ArrayOfStatusInfo>

I would love to hear back from anyone who uses this in the field.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK