This is a guest post by Chris Kirby, Director of Technology at Authenticom. He's a big fan of automating processes, beer, and games. You can follow him on Twitter, GitHub, and his blog.
If you're interested in being published in our blog, reach out to us at help@runscope.com.
Testing! Every developer's favorite topic :). For me, if I can save time through automation, then I'm interested. Automated testing for a developer typically starts with unit tests; even if you don't subscribe to TDD, you've written at least one just to see what all the fuss was about. Like me, you've probably seen that testing complex logic at build time has huge advantages for quality and for taking risks. However, even with the most comprehensive tests at 100% coverage, you've still got more work to do on your journey toward a bug-free existence.
Given that most modern applications rely on a wide variety of cloud platform services, testing can't stop with the fakes and mocks. Good integration testing is what gets you the rest of the way. Integration testing is nothing new, of course; it's just more complicated today than it was even a few years ago. A tester's job is not only to test your application's custom interface and data, but also to test its interaction/integration with the dozens of third-party services and SaaS providers.
Thankfully, we've agreed on a common language: where there is a service, there is a REST API. This post demonstrates how to work with Azure Storage, free of the SDK, in a test environment like Runscope or Postman.
Authorization
By far the worst part of working with this particular API is getting through the 403. The docs are comprehensive in terms of what you have to do, but they are very light on how to do it. With that said, there are two primary ways to accomplish this. You can build a custom Authorization header, or you can generate a Shared Access Signature (SAS) and pass it via query string. In the following I cover both approaches; however, I highly recommend using a SAS for simplicity.
Generating and using a Shared Key Authorization header
The short of it is that you piece together a custom signature string, sign it with the HMAC-SHA256 algorithm using your primary/secondary storage account key, and Base64-encode the result. If this sounds complicated, it is. Here is the full dump on SharedKey authorization from the Azure docs. The following is an example generation script and how you could go about using it in Runscope, my favorite tool for testing APIs.
```javascript
var storageAccount = "myStorageAccountName";
var accountKey = "12345678910-primaryStorageAccountKey";
// the date must be RFC 1123 format in GMT
var date = moment.utc().format("ddd, DD MMM YYYY HH:mm:ss") + " GMT";
// SharedKeyLite string to sign for the Table service: Date + "\n" + CanonicalizedResource
var data = date + "\n" + "/" + storageAccount + "/myTable";
// utf-8 encoding
var encodedData = unescape(encodeURIComponent(data));
// sign with your account key (which is itself Base64-encoded)
var hash = CryptoJS.HmacSHA256(encodedData, CryptoJS.enc.Base64.parse(accountKey));
var signature = hash.toString(CryptoJS.enc.Base64);
// build the auth header
var auth = "SharedKeyLite " + storageAccount + ":" + signature;
// show the full header
$("#output").html("Authorization: " + auth);
```

The pre-request script example in the Runscope test editor interface
Import the test directly into Runscope: https://gist.github.com/sirkirby/389a289f55e8160efcbeef99e1d33db4
Generating and using a SAS
Did I mention this was easier? There are a couple of common ways to generate one outside of using code and the SDK. The most obvious is using the Azure Portal. Just navigate to your storage account blade and look for the Shared access signature option on the left menu. The other option is to generate one using the awesome Azure Storage Explorer tool.
Once in and authenticated, tree down to the account or resource you want to access and use the context menu to generate the signature. If you want tight control over security, I would suggest using Storage Explorer, given that it has an interface for generating signatures on specific tables, containers, and queues. The Portal, on the other hand, only has an interface for the account-level signature (at the time of writing). Now that I have my SAS, here is what it looks like in Runscope:
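Using the SAS is just string handling: the token is a ready-made query string, so you append it to the resource URL. A small sketch, with a truncated placeholder token rather than a real signature:

```javascript
// Append a SAS token to a resource URL. The token already contains its own
// key=value pairs (sv, sig, se, sp, ...), so join with ? or & as appropriate.
function withSas(resourceUrl, sasToken) {
  const sep = resourceUrl.includes("?") ? "&" : "?";
  return resourceUrl + sep + sasToken.replace(/^\?/, "");
}

// Placeholder account, table, and signature values for illustration.
const url = withSas(
  "https://mystorageaccount.table.core.windows.net/myTable",
  "?sv=2015-04-05&tn=myTable&sig=PLACEHOLDER&se=2016-01-01T00%3A00%3A00Z&sp=r"
);
console.log(url);
```

In Runscope you'd typically store the token in a variable and reference it in the request's URL or query parameters, but the end result on the wire is the same concatenation.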

Request setup with variables and querystring in the Runscope test editor
Import the test directly into Runscope: https://gist.github.com/sirkirby/72cdbeac7f8273da955b3e3784ab7083
Wrap up
Now that you're getting a 200, you can move on to writing your assertions. By default, the OData response you'll get back is in the Atom XML format, which makes writing your JavaScript assertions more difficult. To get the result in JSON, be sure to add an Accept request header with the value application/json;odata=fullmetadata.
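Once the response comes back as JSON, assertions are straightforward. A hedged example, where `response` stands in for the body your test receives and the entity fields are illustrative:

```javascript
// Stand-in for the JSON body a Runscope test would receive after setting
// Accept: application/json;odata=fullmetadata. Entity shape is illustrative.
const response = {
  body: JSON.stringify({
    value: [{ PartitionKey: "users", RowKey: "1", Name: "Ada" }]
  })
};

const data = JSON.parse(response.body);
// The OData JSON format wraps table entities in a top-level "value" array.
if (!Array.isArray(data.value)) throw new Error("expected a value array");
if (data.value[0].PartitionKey !== "users") throw new Error("unexpected partition");
console.log("assertions passed for", data.value.length, "entity");
```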
Happy testing!