Gaurav Mantri's Personal Blog.

Azure Storage – Shared Access Signature Enhancements

Over the past few months, the Azure Storage Team has released two major upgrades, both of which bring some really interesting new features and improvements. Among them are changes to Shared Access Signature (SAS) functionality. SAS plays a really important role in Cloud Portam: we use it everywhere we can to facilitate direct communication between a user’s browser and their storage account.

In this blog post, I want to summarize the changes made to SAS in the last two versions, grouped by Storage Service REST API/.NET Storage Client Library version.

REST API Version 2015-02-21 / .Net Storage Client Library Version 5.X.X

Two major SAS-related changes were made in this release.

SAS for File Service

Starting with this version, you can create a SAS for the File Service, both for a File Service share and for the files inside a share.

SAS for File Service works much like SAS for Blob Service. In Blob Service, you can create a SAS with “Read”, “Write”, “Delete” and “List” permissions on a blob container. Similarly in File Service, you can create a SAS with “Read”, “Write”, “Delete” and “List” permissions on a share. In Blob Service, you can create a SAS with “Read”, “Write” and “Delete” permissions on a blob. Similarly in File Service you can create a SAS with “Read”, “Write” and “Delete” permissions on a file.

You can also define an access policy on a File Service share and use that access policy when creating a SAS on the share or on a file in that share.
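As a quick sketch of how this fits together (hedged: the share name "myshare" and policy name "read-only-policy" are made up, and this assumes the 5.x .NET Storage Client Library), defining an access policy on a share and then issuing a SAS that references it looks roughly like this:

```csharp
        static void FileShareSasWithAccessPolicyExample()
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var fileClient = account.CreateCloudFileClient();
            var share = fileClient.GetShareReference("myshare");//Hypothetical share name.
            //Define an access policy on the share. Get/SetPermissions call the service.
            var permissions = share.GetPermissions();
            permissions.SharedAccessPolicies.Add("read-only-policy", new SharedAccessFilePolicy()
            {
                Permissions = SharedAccessFilePermissions.Read | SharedAccessFilePermissions.List,
                SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddDays(7))
            });
            share.SetPermissions(permissions);
            //Create a SAS that references the stored policy by its identifier.
            //Token generation itself is a purely local operation.
            var sasToken = share.GetSharedAccessSignature(null, "read-only-policy");
            Console.WriteLine("Share SAS URL: " + share.Uri + sasToken);
        }
```

The nice thing about going through a stored access policy is that you can later modify or revoke the SAS simply by changing or removing the policy on the share.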

For some examples on how to create SAS on File Service resources, please see this blog post of mine: http://gauravmantri.com/2015/08/17/whats-new-in-azure-storage/.

“Canonicalized Resource” Creation

This was a breaking change introduced in this version, and it is only applicable if you’re creating a SAS token yourself without using a client library: you must now prepend the service name (blob, table, queue or file) to the canonicalized resource string used when computing the SAS signature.
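To illustrate the change (with a made-up account name and container name), here is what the canonicalized resource for a blob container looks like before and after:

```csharp
        static void CanonicalizedResourceExample()
        {
            //Hypothetical account and container names.
            string accountName = "myaccount";
            string containerName = "mycontainer";
            //Versions prior to 2015-02-21:
            string oldResource = string.Format("/{0}/{1}", accountName, containerName);
            //Version 2015-02-21 onwards: the service name ("blob" here) is prepended.
            string newResource = string.Format("/blob/{0}/{1}", accountName, containerName);
            Console.WriteLine(oldResource);//Prints: /myaccount/mycontainer
            Console.WriteLine(newResource);//Prints: /blob/myaccount/mycontainer
        }
```

If you use a client library, it handles this for you; the change only bites when you compute the string-to-sign yourself.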

REST API Version 2015-04-05 / .Net Storage Client Library Version 6.X.X

Three major SAS-related changes were made in this release.

IP Address/Range Restriction

When creating a SAS, you can now restrict its use to an IP address or an IP address range. When someone uses that SAS, the Storage Service will allow access to storage resources only if the request comes from the IP address (or range) specified in the SAS.

You can specify a single IP address, e.g. 192.168.0.1, or an IP address range, e.g. 192.168.0.0 – 192.168.0.255.

IMHO, this is a very useful feature when you want to prevent a SAS from falling into the wrong hands.

I wish the Storage Team had allowed multiple IP address ranges (e.g. 192.168.0.0 – 192.168.0.255, 10.10.1.0 – 10.10.1.255). Currently, I have to create a separate SAS URL for each IP address range I want to support.

Example 1: List Blobs SAS Restrict By IP Address

        static void ListBlobsSasRestrictByIpAddressExample(string ipAddress)
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var blobClient = account.CreateCloudBlobClient();
            var containerName = "a00";
            var container = blobClient.GetContainerReference(containerName);
            var ipAddressRange = new IPAddressOrRange(ipAddress);
            var sasToken = container.GetSharedAccessSignature(new SharedAccessBlobPolicy()
                {
                    Permissions = SharedAccessBlobPermissions.List,
                    SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
                }, null, null, ipAddressRange);
            Console.WriteLine(string.Format("Blob listing URL: {0}{1}&restype=container&comp=list", container.Uri, sasToken));
        }

Example 2: List Blobs SAS Restrict By IP Address range

        static void ListBlobsSasRestrictByIpAddressRangeExample(string ipAddressFrom, string ipAddressTo)
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var blobClient = account.CreateCloudBlobClient();
            var containerName = "a00";
            var container = blobClient.GetContainerReference(containerName);
            var ipAddressRange = new IPAddressOrRange(ipAddressFrom, ipAddressTo);
            var sasToken = container.GetSharedAccessSignature(new SharedAccessBlobPolicy()
            {
                Permissions = SharedAccessBlobPermissions.List,
                SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
            }, null, null, ipAddressRange);
            Console.WriteLine(string.Format("Blob listing URL: {0}{1}&restype=container&comp=list", container.Uri, sasToken));
        }

Protocol Restriction

When you create a SAS, you essentially get a SAS token that you append to the resource URL. This URL can then be served over either the HTTPS or the HTTP protocol. In previous versions you had no control over that; with the latest version, you do. With the “Protocol Restriction” feature, you can control whether the SAS URL is accessible only over HTTPS or over both HTTPS and HTTP.

Example 1: List Blobs SAS Restrict By Protocol

        static void ListBlobsSasRestrictByProtocolExample(bool allowBothHttpAndHttpsProtocols)
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var blobClient = account.CreateCloudBlobClient();
            var containerName = "a00";
            var container = blobClient.GetContainerReference(containerName);
            SharedAccessProtocol protocol = allowBothHttpAndHttpsProtocols ? SharedAccessProtocol.HttpsOrHttp : SharedAccessProtocol.HttpsOnly;
            var sasToken = container.GetSharedAccessSignature(new SharedAccessBlobPolicy()
            {
                Permissions = SharedAccessBlobPermissions.List,
                SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
            }, null, protocol, null);
            Console.WriteLine(string.Format("Blob listing URL: {0}{1}&restype=container&comp=list", container.Uri, sasToken));
        }

Account SAS

This, IMHO, is by far the most significant improvement made to SAS since its introduction. If you look at the operations supported by the Storage Service, you can put them into three categories:

  1. Service Operations
  2. Container Operations – Container is a generic term meaning Blob Containers, File Service Shares, Tables and Queues.
  3. Object Operations – Object is a generic term meaning Blobs, Files, Entities and Messages.

Previously with SAS, you would typically perform operations on objects and, to some extent, containers. For example, you could upload a blob (an object operation) or list blobs in a container (a container operation) using a SAS, but for other operations, like creating a container, you needed the storage account key.

Account SAS enables you to perform almost all operations (service, container and object) using SAS.

To create an Account SAS, you start by picking one or more services (Blob, File, Queue and Table), then specify the categories of operations (service, container and object) you want to enable via SAS, and then define the permissions (Read, Write, List, Delete, Add, Create, Update and Process) you want to grant. Apart from that, you can define start/expiry date/time, IP address and protocol restrictions for the SAS.
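Under the hood, the resulting Account SAS token is just a query string. As a sketch with placeholder values (and the computed signature elided), its parameters map to the choices above roughly like this:

```
sv=2015-04-05                    (signed storage service version)
ss=bf                            (signed services: (b)lob and (f)ile)
srt=sco                          (signed resource types: (s)ervice, (c)ontainer, (o)bject)
sp=rwdlacup                      (permissions: read, write, delete, list, add, create, update, process)
st=2015-12-01T00:00:00Z          (optional start time, UTC)
se=2015-12-31T23:59:59Z          (expiry time, UTC)
sip=192.168.0.0-192.168.0.255    (optional IP address/range restriction)
spr=https                        (optional protocol restriction)
sig=<signature>                  (HMAC-SHA256 signature computed over the other fields)
```

The client library examples below build exactly this kind of token for you.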

Important: At the time of writing this blog post, Account SAS is only supported for the Blob and File Services; it is not supported for the Queue and Table Services. While you can create an Account SAS for the Queue and Table Services using either the REST API or the supported client libraries, you will get an error from the storage service when you try to use that SAS (I wasted a good day on this :)).

Now let’s look at some examples of Account SAS.

Example 1: Account SAS for listing blob containers and file service shares

        static void AccountSasForListingBlobContainersAndFileServiceShares()
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            SharedAccessAccountServices servicesSupportedInSas = SharedAccessAccountServices.Blob | SharedAccessAccountServices.File;
            SharedAccessAccountResourceTypes resourceTypesSupportedInSas = SharedAccessAccountResourceTypes.Service;//Since we want to list top level containers only.
            SharedAccessAccountPermissions permissionsSupportedInSas = SharedAccessAccountPermissions.List;//Since we want to give list permissions only.
            var sasToken = account.GetSharedAccessSignature(new SharedAccessAccountPolicy()
                {
                    Services = servicesSupportedInSas,
                    ResourceTypes = resourceTypesSupportedInSas,
                    Permissions = permissionsSupportedInSas,
                    SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
                });
            Console.WriteLine(string.Format("Blob containers listing URL: {0}{1}&comp=list", account.BlobStorageUri.PrimaryUri, sasToken));
            Console.WriteLine(string.Format("File shares listing URL: {0}{1}&comp=list", account.FileStorageUri.PrimaryUri, sasToken));
        }

Example 2: Account SAS for creating a blob container

        static void AccountSasForCreatingBlobContainers()
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            SharedAccessAccountServices servicesSupportedInSas = SharedAccessAccountServices.Blob;
            SharedAccessAccountResourceTypes resourceTypesSupportedInSas = SharedAccessAccountResourceTypes.Container;//Since we want SAS for top level containers only.
            SharedAccessAccountPermissions permissionsSupportedInSas = SharedAccessAccountPermissions.Create;//Since we want to give create permission only.
            var sasToken = account.GetSharedAccessSignature(new SharedAccessAccountPolicy()
            {
                Services = servicesSupportedInSas,
                ResourceTypes = resourceTypesSupportedInSas,
                Permissions = permissionsSupportedInSas,
                SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
            });
            Console.WriteLine("Sas Token = " + sasToken);
        }

Example 3: Account SAS for deleting any file service share

        static void AccountSasForDeletingShares()
        {
            var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            SharedAccessAccountServices servicesSupportedInSas = SharedAccessAccountServices.File;
            SharedAccessAccountResourceTypes resourceTypesSupportedInSas = SharedAccessAccountResourceTypes.Container;//Since we want SAS for top level containers only.
            SharedAccessAccountPermissions permissionsSupportedInSas = SharedAccessAccountPermissions.Delete;//Since we want to give delete permission only.
            var sasToken = account.GetSharedAccessSignature(new SharedAccessAccountPolicy()
            {
                Services = servicesSupportedInSas,
                ResourceTypes = resourceTypesSupportedInSas,
                Permissions = permissionsSupportedInSas,
                SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddHours(1))
            });
            Console.WriteLine("Sas Token = " + sasToken);
        }

For more information about Account SAS, please visit REST API documentation page here: https://msdn.microsoft.com/en-US/library/azure/mt584140.aspx.

Service SAS

I want to take a moment here to write about “Service SAS”, as it confused me. Since it was mentioned alongside Account SAS, my initial impression was that you use a Service SAS to create a service-specific SAS (e.g. a SAS for the Blob, File, Queue or Table Service), while you use an Account SAS to create a SAS at the storage account level that can span multiple services.

Only after reading more about it did I learn that the old SAS is now referred to as a Service SAS :). So essentially, a Service SAS is the SAS you create for an individual blob container, blob, file service share, file, table or queue. It’s not new :). I thought I should mention it here.

Conclusion

That’s it for this post. As always, your feedback is welcome. If you find any issues with this post, please let me know and I will have it fixed ASAP.

As a side note, I do want to mention that Cloud Portam fully supports these features. You can read more about support for all these features in Cloud Portam here: http://blog.cloudportam.com/cloud-portam-updates-support-for-account-level-shared-access-signature-ip-address-and-protocol-restrictions-in-shared-access-signature-and-other-enhancements/.

Happy Coding!
