Gaurav Mantri's Personal Blog.

Windows Azure Storage and Cross-Origin Resource Sharing (CORS) – Let's Have Some Fun

Windows Azure Storage recently introduced a bunch of new changes. I wrote a blog post summarizing these changes, which you can read here: http://gauravmantri.com/2013/11/28/new-changes-to-windows-azure-storage-a-perfect-thanksgiving-gift/. One of the important changes is support for Cross-Origin Resource Sharing (CORS) for the Blob, Table, and Queue services.

In this blog post, we will focus on CORS for the blob service. We will talk about how you can enable CORS for the blob service and manage CORS rules, and then we will end the post with an example of a simple HTML-based file uploader which uploads a file directly from your computer to blob storage. I wrote a blog post on the same subject some time back, and this post reuses a lot of code from it. You are welcome to read that post as well: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/.

We will not dig into the details of how CORS rules are evaluated by the storage service, nor will we get into all the restrictions. The storage team has written excellent documentation on that, and I would encourage you to read it here: http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx. Also, from a code perspective, we will focus on the Storage Client Library and not the REST API.

Before We Begin

Before we begin, let’s understand some of the concepts:

  • CORS is supported for the blob, table, and queue services. However, you need to enable it separately for each service, because each service in a storage account has a separate endpoint, e.g. youraccount.blob.core.windows.net, youraccount.table.core.windows.net, and youraccount.queue.core.windows.net.
  • You can specify a maximum of 5 CORS rules for each service, i.e. your blob service can have at most 5 CORS rules. In other words, a total of 15 CORS rules can be specified for a storage account, but each service can have a maximum of 5.
  • Specifying a CORS rule for one service in a storage account (e.g. the blob service) does not mean that the rule applies to another service in the same storage account (e.g. the table service).
  • Just because you have enabled CORS on a particular service in a storage account does not mean you can make unauthenticated requests against that service. Your request still needs to be authenticated. A good example is uploading a blob: there you need to ensure that you're using a Shared Access Signature URL with write permission to upload files into blob storage.
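To make these concepts concrete, here is roughly what the exchange looks like when a browser makes a cross-origin Put Blob call: it first sends an OPTIONS "pre-flight" request, and the storage service answers based on the matching CORS rule. The account, container, and blob names below are made up for illustration:

```
OPTIONS /mycontainer/myblob.txt HTTP/1.1
Host: youraccount.blob.core.windows.net
Origin: http://localhost:61233
Access-Control-Request-Method: PUT
Access-Control-Request-Headers: x-ms-blob-type, x-ms-blob-content-type

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://localhost:61233
Access-Control-Allow-Methods: PUT
Access-Control-Allow-Headers: x-ms-blob-type, x-ms-blob-content-type
Access-Control-Max-Age: 3600
```

Only after a successful pre-flight does the browser send the actual (authenticated) PUT request.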

Elements of a CORS rule

Following are the elements of a CORS rule:

Allowed Origins

This defines the domains which can make a request directly to a storage service; in other words, this is the domain from which the request to your storage service originates. A few things about allowed origins:

  • You can include more than one allowed origin in a single CORS rule. For example, you could include both “http://www.mydomain1.com” and “http://www.mydomain2.com” as allowed origins in a single CORS rule.
  • The origin names specified in a CORS rule must match the origin making the request, and Windows Azure Storage Service does a case-sensitive match. For example, if you specified “http://www.mydomain1.com” as an allowed origin but the request is made from “http://www.MyDomain1.com”, the request will fail.
  • Wildcards are supported. To allow all domains, specify “*” as the allowed origin.
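Conceptually, the origin check behaves like the little sketch below. This is only an illustration of the matching semantics described above, not the service's actual implementation:

```javascript
// Sketch: matching a request's Origin header against a rule's allowed
// origins. "*" allows everything; otherwise the match is case-sensitive.
function originAllowed(allowedOrigins, requestOrigin) {
    return allowedOrigins.indexOf("*") !== -1 ||
           allowedOrigins.indexOf(requestOrigin) !== -1;
}

console.log(originAllowed(["http://www.mydomain1.com"], "http://www.mydomain1.com")); // true
console.log(originAllowed(["http://www.mydomain1.com"], "http://www.MyDomain1.com")); // false (case mismatch)
console.log(originAllowed(["*"], "http://www.mydomain2.com")); // true
```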

Allowed Methods

These define the HTTP methods (verbs) allowed in the request. A few things about allowed methods:

  • You can include one or more HTTP verbs in a single CORS rule. For example, it is perfectly legitimate to include PUT and MERGE in a single CORS rule.
  • Based on my understanding, wildcards are not supported here. You must specify each HTTP verb explicitly.

Allowed Headers

These define the HTTP headers which can be included in the cross-domain request. A few things about allowed headers:

  • You can include more than one header in a single CORS rule. As we will see in a little while, for uploading a blob we need to send 4 request headers: “x-ms-blob-type”, “x-ms-blob-content-type”, “content-length”, and “accept”, and it is perfectly legitimate to allow all of these in a single CORS rule.
  • Wildcards are supported. Taking the example above, I could set the allowed headers to “x-ms-*”, “content-length”, and “accept”. By setting “x-ms-*” as one of the allowed headers, we are essentially telling the storage service to accept all request headers which start with “x-ms-”.
  • If you send a request header which is not covered by the CORS rule, your request will be rejected.
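The header rules above can be sketched the same way. Again, this is an illustration of the semantics rather than the service's actual code; note that header names themselves are matched case-insensitively, as is usual for HTTP:

```javascript
// Sketch: evaluating an allowed-headers list that may contain a
// trailing-wildcard entry like "x-ms-*".
function headerAllowed(allowedHeaders, requestHeader) {
    var h = requestHeader.toLowerCase();
    return allowedHeaders.some(function (entry) {
        entry = entry.toLowerCase();
        if (entry.charAt(entry.length - 1) === "*") {
            return h.indexOf(entry.slice(0, -1)) === 0; // prefix match
        }
        return h === entry; // exact match
    });
}

var allowed = ["x-ms-*", "content-length", "accept"];
console.log(headerAllowed(allowed, "x-ms-blob-type")); // true
console.log(headerAllowed(allowed, "Content-Length")); // true
console.log(headerAllowed(allowed, "authorization")); // false
```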

Exposed Headers

These define the set of response headers which you want the storage service to send back to the browser as part of the CORS response. A few things about exposed headers:

  • This setting is optional. From what I understand, if you don’t specify anything, no response headers are exposed to the calling script.
  • Wildcards are supported. For example, if you set the value as “x-ms-*”, all response headers starting with “x-ms-” will be returned.

Max Age in Seconds

This defines the maximum number of seconds for which the browser will cache the pre-flight request. Since these “pre-flight” requests are billable, it is generally recommended that you set this value to a higher number so that the browser does not make pre-flight requests frequently.

Managing CORS through Storage Client Library

Now let’s look at some code showing how you can manage CORS through the storage client library. Our end goal is to upload a small file into blob storage from a web application running on the local computer. To do so, I created a simple ASP.Net MVC 4 application and noted the URL of that application when I ran it in debug mode. Based on what we need to do, here are the settings we want in the CORS rule:

  • Allowed Origin: http://localhost:61233
  • Allowed Methods: PUT
  • Allowed Headers: x-ms-* (covering x-ms-blob-type, x-ms-blob-content-type, etc.), content-type, and accept
  • Exposed Headers: none (i.e. leave it blank)
  • Max Age in Seconds: 3600 (i.e. 1 hour)
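For reference (this post sticks to the Storage Client Library, but it may help to see what it produces), a rule like this ends up in the service properties as an XML fragment along these lines:

```xml
<Cors>
  <CorsRule>
    <AllowedOrigins>http://localhost:61233</AllowedOrigins>
    <AllowedMethods>PUT</AllowedMethods>
    <MaxAgeInSeconds>3600</MaxAgeInSeconds>
    <ExposedHeaders></ExposedHeaders>
    <AllowedHeaders>x-ms-*,content-type,accept</AllowedHeaders>
  </CorsRule>
</Cors>
```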

Next, I created a simple console application and added a reference to Storage Client Library version 3.0.0.0 from NuGet. Please note that the storage emulator still doesn’t work with this version of the library, so you may want to think about this before upgrading your cloud project to the latest version. Since we are working from a console application against a live storage account, we should be fine.

Add CORS Rule

To add a CORS rule, all I need to do is create a new “CorsRule” object and set the appropriate properties, as shown in the code below:

        var corsRule = new CorsRule()
        {
            AllowedHeaders = new List<string> { "x-ms-*", "content-type", "accept" },
            AllowedMethods = CorsHttpMethods.Put,//Since we'll only be calling Put Blob, let's just allow PUT verb
            AllowedOrigins = new List<string> { "http://localhost:61233" },//This is the URL of our application.
            MaxAgeInSeconds = 1 * 60 * 60,//Let the browser cache it for an hour
        };

Next, we fetch the service properties for the blob service of my storage account, as shown in the code below:

        var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
        var client = storageAccount.CreateCloudBlobClient();
        var serviceProperties = client.GetServiceProperties();
        var corsSettings = serviceProperties.Cors;

Since it’s a demo we’re working with, we will just add the new CORS rule we created to the CORS settings we fetched from storage, as shown in the code below:

        corsSettings.CorsRules.Add(corsRule);

The next step is to save these settings back by calling the “SetServiceProperties()” method, as shown in the code below:

        client.SetServiceProperties(serviceProperties);

Here’s the complete code:

        static void AddCorsRuleStorageClientLibrary()
        {
            //Add a new rule.
            var corsRule = new CorsRule()
            {
                AllowedHeaders = new List<string> { "x-ms-*", "content-type", "accept" },
                AllowedMethods = CorsHttpMethods.Put,//Since we'll only be calling Put Blob, let's just allow PUT verb
                AllowedOrigins = new List<string> { "http://localhost:61233" },//This is the URL of our application.
                MaxAgeInSeconds = 1 * 60 * 60,//Let the browser cache it for an hour
            };

            //First get the existing service properties so the new rule is appended to the existing rules rather than replacing them.
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var client = storageAccount.CreateCloudBlobClient();
            var serviceProperties = client.GetServiceProperties();
            var corsSettings = serviceProperties.Cors;

            corsSettings.CorsRules.Add(corsRule);
            //Save the rule
            client.SetServiceProperties(serviceProperties);
        }

Get CORS Rules

To fetch all the CORS rules for a service, we just fetch the service properties of the specified service and iterate over the “CorsRules” collection of its “Cors” property, as shown in the code below:

        static void GetCorsRulesStorageClientLibrary()
        {
            var serviceProperties = blobClient.GetServiceProperties();
            var corsSettings = serviceProperties.Cors;
            foreach (var corsRule in corsSettings.CorsRules)
            {
                StringBuilder allowedOrigins = new StringBuilder();
                foreach (var allowedOrigin in corsRule.AllowedOrigins)
                {
                    allowedOrigins.AppendFormat("{0}, ", allowedOrigin);
                }
                StringBuilder allowedMethods = new StringBuilder();
                foreach (var type in Enum.GetValues(typeof(CorsHttpMethods)))
                {
                    if ((CorsHttpMethods)type != CorsHttpMethods.None)
                    {
                        if (corsRule.AllowedMethods.HasFlag((CorsHttpMethods)type))
                        {
                            allowedMethods.AppendFormat("{0}, ", (CorsHttpMethods)type);
                        }
                    }
                }
                StringBuilder allowedHeaders = new StringBuilder();
                foreach (var allowedHeader in corsRule.AllowedHeaders)
                {
                    allowedHeaders.AppendFormat("{0}, ", allowedHeader);
                }
                int maxAgeInSeconds = corsRule.MaxAgeInSeconds;
                StringBuilder exposedHeaders = new StringBuilder();
                foreach (var exposedHeader in corsRule.ExposedHeaders)
                {
                    exposedHeaders.AppendFormat("{0}, ", exposedHeader);
                }
                Console.WriteLine("Allowed Origins:  {0}", allowedOrigins);
                Console.WriteLine("Allowed Methods:  {0}", allowedMethods);
                Console.WriteLine("Allowed Headers:  {0}", allowedHeaders);
                Console.WriteLine("Max Age (Seconds): {0}", maxAgeInSeconds);
                Console.WriteLine("Exposed Headers:  {0}", exposedHeaders);
                Console.WriteLine("==============================================================================");
            }
        }

Remove CORS Rule

In order to remove a particular CORS rule, we first need to fetch the existing CORS rules for the service, which we do by fetching the service properties of the specified service and reading its “Cors” property, as shown in the code below:

        var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
        var client = storageAccount.CreateCloudBlobClient();
        var serviceProperties = client.GetServiceProperties();
        var corsSettings = serviceProperties.Cors;

Next we need to find the CORS rule we want to remove. Let’s say we want to remove the CORS rule for “http://localhost:61233”. We find it in the CORS rules, remove it from the CORS settings, and then call the “SetServiceProperties()” method to save the new CORS rules for the service, as shown in the code below:

        var corsRuleToBeRemoved = corsSettings.CorsRules.FirstOrDefault(a => a.AllowedOrigins.Contains("http://localhost:61233"));
        if (corsRuleToBeRemoved != null)
        {
            corsSettings.CorsRules.Remove(corsRuleToBeRemoved);
            client.SetServiceProperties(serviceProperties);
        }

Here’s the complete code:

        static void RemoveCorsRuleStorageClientLibrary()
        {
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var client = storageAccount.CreateCloudBlobClient();
            var serviceProperties = client.GetServiceProperties();
            var corsSettings = serviceProperties.Cors;
            var corsRuleToBeRemoved = corsSettings.CorsRules.FirstOrDefault(a => a.AllowedOrigins.Contains("http://localhost:61233"));
            if (corsRuleToBeRemoved != null)
            {
                corsSettings.CorsRules.Remove(corsRuleToBeRemoved);
                client.SetServiceProperties(serviceProperties);
            }
        }

Remove All CORS Rules

In order to remove all CORS rules for a service, you simply fetch the service properties and then empty the “CorsRules” collection, as shown in the code below:

        static void RemoveAllCorsRulesStorageClientLibrary()
        {
            var serviceProperties = blobClient.GetServiceProperties();
            var corsSettings = serviceProperties.Cors;
            corsSettings.CorsRules.Clear();
            blobClient.SetServiceProperties(serviceProperties);
        }

Update CORS Rule

In order to update a particular CORS rule, we first need to fetch the existing CORS rules for the service, which we do by fetching the service properties of the specified service and reading its “Cors” property, as shown in the code below:

        var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
        var client = storageAccount.CreateCloudBlobClient();
        var serviceProperties = client.GetServiceProperties();
        var corsSettings = serviceProperties.Cors;

Next we need to find the CORS rule we want to update. Let’s say we want to update the CORS rule for “http://localhost:61233” to allow another HTTP method. We find it in the CORS rules, update it, and then call the “SetServiceProperties()” method to save the new CORS rules for the service, as shown in the code below:

        var corsRuleToBeUpdated = corsSettings.CorsRules.FirstOrDefault(a => a.AllowedOrigins.Contains("http://localhost:61233"));
        if (corsRuleToBeUpdated != null)
        {
            corsRuleToBeUpdated.AllowedMethods = corsRuleToBeUpdated.AllowedMethods | CorsHttpMethods.Merge;
            client.SetServiceProperties(serviceProperties);
        }

Here’s the complete code:

        static void UpdateCorsSettingsStorageClientLibrary()
        {
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var client = storageAccount.CreateCloudBlobClient();
            var serviceProperties = client.GetServiceProperties();
            var corsSettings = serviceProperties.Cors;
            var corsRuleToBeUpdated = corsSettings.CorsRules.FirstOrDefault(a => a.AllowedOrigins.Contains("http://localhost:61233"));
            if (corsRuleToBeUpdated != null)
            {
                corsRuleToBeUpdated.AllowedMethods = corsRuleToBeUpdated.AllowedMethods | CorsHttpMethods.Merge;
                client.SetServiceProperties(serviceProperties);
            }
        }

Caution

If you notice, in the code above we always fetch the service properties first and then perform the operation. There’s a reason for this: when you call the “SetServiceProperties()” method, whatever you pass in that call is persisted as-is, replacing the existing settings. So let’s say I already have 2 CORS rules defined and I want to add a 3rd rule, but instead of getting the properties first I call code like this:

            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
            var client = storageAccount.CreateCloudBlobClient();
            var serviceProperties = new ServiceProperties();
            serviceProperties.Cors.CorsRules.Add(new CorsRule()
            {
                AllowedHeaders = new List<string> { "x-ms-*", "content-type", "accept" },
                AllowedMethods = CorsHttpMethods.Put,//Since we'll only be calling Put Blob, let's just allow PUT verb
                AllowedOrigins = new List<string> { "http://localhost:61233" },//This is the URL of our application.
                MaxAgeInSeconds = 1 * 60 * 60,//Let the browser cache it for an hour
            });
            client.SetServiceProperties(serviceProperties);

This would remove the existing 2 rules and keep only the rule I just added, leaving you with just one CORS rule.

Web Application

Having done all of this, we are now left with writing a web application which will upload the file directly to blob storage. Here’s the sample code. Most of it has been lifted from my earlier blog post: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/. There I was uploading a large file by splitting it into chunks, or in REST API terms, calling the Put Block and Put Block List operations. Here I’m uploading a small file without splitting it into chunks, i.e. calling the Put Blob operation. Before running this code, you need to get a SAS URL with Write permission on a container, and you need to run a browser which supports HTML5’s File API.
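One detail worth calling out in the script below is how it builds the Put Blob URI: the blob name is spliced into the container-level SAS URL just before the query string. Here is that step in isolation (the account, container, and SAS token values are made up for illustration):

```javascript
// Sketch: turning a container-level SAS URL into a blob-specific Put Blob URI
// by inserting the file name just before the query string.
var baseUrl = "https://youraccount.blob.core.windows.net/uploads?sv=2013-08-15&sig=abc";
var fileName = "photo.png";
var indexOfQueryStart = baseUrl.indexOf("?");
var submitUri = baseUrl.substring(0, indexOfQueryStart) +
                "/" + fileName +
                baseUrl.substring(indexOfQueryStart);
console.log(submitUri);
// https://youraccount.blob.core.windows.net/uploads/photo.png?sv=2013-08-15&sig=abc
```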

@{
    ViewBag.Title = "CORS Demo";
}
<script src="/Scripts/jquery-1.8.2.js"></script>
<script type="text/javascript">
    var reader = null;
    var selectedFile = null;
    $(document).ready(function () {
        reader = new FileReader();
        reader.onloadend = function (evt) {
            if (evt.target.readyState == FileReader.DONE) {
                var baseUrl = $("#sasUrl").val();
                var indexOfQueryStart = baseUrl.indexOf("?");
                var submitUri = baseUrl.substring(0, indexOfQueryStart) + '/' + selectedFile.name + baseUrl.substring(indexOfQueryStart);
                console.log(submitUri);
                var requestData = new Uint8Array(evt.target.result);
                $.ajax({
                    url: submitUri,
                    type: "PUT",
                    data: requestData,
                    processData: false,
                    beforeSend: function (xhr) {
                        xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
                        xhr.setRequestHeader('x-ms-blob-content-type', selectedFile.type);
                        xhr.setRequestHeader('x-ms-meta-uploadvia', 'CORS Demo');
                        // Note: browsers compute and set Content-Length on their
                        // own and refuse attempts to set it from script.
                    },
                    success: function (data, status) {
                        alert("File uploaded successfully");
                        console.log(data);
                        console.log(status);
                    },
                    error: function (xhr, desc, err) {
                        console.log(desc);
                        console.log(err);
                    }
                });
            }
        };
        $("#file").bind('change', function (e) {
            var files = e.target.files;
            selectedFile = files[0];
            $("#fileName").text(selectedFile.name);
            $("#fileSize").text(selectedFile.size);
            $("#fileType").text(selectedFile.type);
        });
        $("#buttonUploadFile").click(function (e) {
            if (selectedFile == null) {
                alert("Please select a file first.");
            }
            else {
                var fileContent = selectedFile.slice(0, selectedFile.size); // slice's end index is exclusive; size - 1 would drop the last byte
                reader.readAsArrayBuffer(fileContent);
            }
        });
    });

</script>

<h2>CORS Demo</h2>
<form>
        <div style="margin-left: 20px;">
            <h1>File Uploader</h1>
            <p>
                <strong>SAS URI</strong>:
                <br/>
                <span class="input-control text">
                    <input type="text" id="sasUrl" style="width: 50%"
                           value="<fill in the SAS URL here>"/>
                </span>
            </p>
            <p>
                <strong>File To Upload</strong>:
                <br/>
                <span class="input-control text">
                    <input type="file" id="file" name="file" style="width: 50%"/>
                </span>
            </p>
            <div id="output">
                <strong>File Properties:</strong>
                <br/>
                <p>
                    Name: <span id="fileName"></span>
                </p>
                <p>
                    File Size: <span id="fileSize"></span> bytes.
                </p>
                <p>
                    File Type: <span id="fileType"></span>
                </p>
                <p>
                    <input type="button" id="buttonUploadFile" value="Upload File"/>
                </p>
                <p>
                    <strong>Progress</strong>: <span id="fileUploadProgress">0.00 %</span>
                </p>
            </div>
        </div>
        <div>
        </div>
</form>

Summary

That’s it for this post. I hope you have found it useful. As always, if you find any issues with the code or anything else in the post please let me know and I will fix it ASAP. We will dig more into other storage changes in future posts.

Happy Coding!!!



Comments

  1. Hi, I’m getting these errors in JS console log:

    Refused to set unsafe header “Content-Length”
    Refused to set unsafe header “Access-Control-Request-Method”
    403 (CORS not enabled or no matching rule found for this request.)

    I used the code for block upload from your previous post. The strangest thing is that the file does get uploaded to Azure properly (I can view it on Azure website when I browse container files).

    Any idea why this is happening?

    Thanks for your posts btw, they helped a lot, and saved me a lot of time.

    • Ok I figured it out, the problem was in part of my code that I added. And now I have another question – how come there is no preflight request here? Isn’t there supposed to be one? Thank you.

      • Aah ok, to answer my question yet again – it’s not something that needs to be done explicitly, it’s handled automatically by the browser.

  2. Would this code work for videos too? I have a website that displays user videos, and I hope to tell my users to go to an asp.net page (not a console page, but a regular form), and have them upload videos from their PCs. Then I’d like to use Azure media services to encode the video so it can be watched, and to have it return a URL to the encoded version of that video so I can store the URL on my website.
    So: can I just paste the above code into the form part of an asp.net page, and then in my code-behind use a known SAS to prefill the SAS field, and then after the upload is done, get a URL of the uploaded file so that I can then apply “encoding” (into a playable form)?
    (I am very ignorant of Azure storage, I don’t know what a ‘blob’ or an ‘endpoint’ is, or whether media services is storing things in blobs in regular storage or in some special storage area).
    Thanks.

  3. Hi,

    Good post! However, I’m trying to do something that’s supposed to be much simpler. I’d just like to use CORS to query the below:

    https://wamsamsclus001rest-hs.cloudapp.net/api/Assets

    but there is nowhere an explanation on how to enable CORS on that kind of RESTFul URL. If I try to hit it, Chrome reports this:

    Access-Control-Allow-Credentials:true
    Access-Control-Allow-Headers:Content-Type,Authorization
    Access-Control-Allow-Methods:GET,POST,PUT,DELETE,MERGE
    Access-Control-Allow-Origin:http://urlofyourremotedomain
    Content-Length:0
    Date:Thu, 06 Mar 2014 16:12:38 GMT
    request-id:18e9cff9-b25e-4797-852c-e305cbadec16
    Server:Microsoft-IIS/7.5
    x-ms-request-id:18e9cff9-b25e-4797-852c-e305cbadec16
    X-Powered-By:ASP.NET

    although I didn’t configure anything in Azure, there are AllowMethods defined etc…but it’s failing because in the AllowedOrigin, the value is “http://urlofyourremotedomain”….where it should be my domain. Any clue on how to configure that one? It seems that it’s not related to either of the services (table,queue,blob)…

    Thanks

  4. Marc André says:

    Hi,

    thanks for the usefull Information! I tried your solution abd I’m able to upload small files. But if I try to upload bigger files > 50 MB then the connection breaks:
    ResendRequest() failed: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host. < An existing connection was forcibly closed by the remote host

    Do you have any ideas about this problem?

  5. Hello,

    I am using Google Postman and I tried generating the adhoc SAS with Cerebrata (Azure management Studio) and punch in the signature in the postman., the URL below. Also using the Postman i added headers like Accept, x-ms-version, MaxDataServiceVesrion as you recommended in your post http://www.contentmaster.com/azure/windows-azure-table-storage-json/

    No matter what i do , i don’t get to work this correctly. I always get 403 Server failed to Authenticate the request.
    I even changed the sv querystring to 2013-08-15, if i used the old one then i get an error 415 , that JSON is not supported.

    Please guide , how to get away with this issue. I am using Azure Table storage and want my results to be in Json,
    If i use the below url , then i get the correct result in XML but not in JSON.

    I even went to the route to create Authorization Header, but i Get the same error. Since in your example you said that you use the SAS scheme, i went with that.
    I am also pasting a screen shot

    below is the URL + SAS generated by Cerebrata.
    https://XXXXX.table.core.windows.net/Users()?$filter=(rating%20ge%201)&$select=PartitionKey,RowKey,name&tn=Users&sv=2013-08-15&st=2014-04-08T22%3A15%3A22Z&se=2014-04-08T23%3A15%3A22Z&sp=raud&sig=WqWVaWoGjMjhWlwLJ%2BnciYQTbrxOdKMS2p4TotTRMXA%3D

    Thanks

    • Rahul,

      From what I know, Azure Management Studio still does not support version “2013-08-15″. Did you manually change the sv value from “2012-02-12″ to “2013-08-15″. If that’s the case, then your SAS URL will not work and will throw 403 error because REST API version number is used in calculating Shared Access Signature’s “sig” portion. Another alternative would be to create SAS using .Net Storage Client library.

                  CloudTable table = cloudStorageAccount.CreateCloudTableClient().GetTableReference("Users");
                  var sasToken = table.GetSharedAccessSignature(new SharedAccessTablePolicy()
                  {
                      Permissions = SharedAccessTablePermissions.Add | SharedAccessTablePermissions.Delete | SharedAccessTablePermissions.Query | SharedAccessTablePermissions.Update,
                      SharedAccessStartTime = DateTime.UtcNow.AddHours(-1),
                      SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
                  }, null, null, null, null, null);
                  var sasUrl = string.Format("{0}{1}", table.Uri, sasToken);
      
      • Thanks for the quick reply,
        Yes, I did manually changed the sv from 2012 to 2013 , but i kept the sig as it is, from the azure management portal.
        Unfortunately, we don’t have the latest version of dot net running on the production VM’s and I was keen in using the AjaX, that’s the reason i thought of trying out various Rest API’s with the Postman.
        you are correct changing it “sv” manually to 2013 version errors out even for xml/Atom,
        Is there another way where we could generate the value?

        • is there a better way to create sharedkeylite and sharedkey.
          I am even getting a 413 over there as well

  6. I post my question also here, since it is closely related to CORS.

    I followed both your tutorial, the file upload one (http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/) and this one.
    CORS rule is the following:
    Allowed origins: *
    Allowed methods: Get, Put
    Allowed headers: *
    Exposed headers: x-ms-*
    Max age (seconds): 3600
    …and it should enable every origin.
    Anyway, the browser stops the request because it is a Cross Origin request. :(

    My SAS is generated by a Web Service and it is valid since locally it works perfectly.

    Any idea? Am I missing something?

    Thank you so much.

