Managing Azure Blob Storage with .NET

Azure Blob Storage, an integral component of Microsoft Azure’s services, stores a diverse range of unstructured data in the cloud. This data comprises text documents, images, multimedia files, and application logs.

Its standout benefits include reducing latency by storing data closer to access points and improving data retrieval times through data tiering and indexing. Azure Blob Storage lets you choose hot, cool, cold, or archive tiers to fine-tune your storage costs.

This tutorial will guide you in managing Azure Blob Storage effectively using the .NET software development kit (SDK). The SDK offers several advantages over calling the Azure Blob Storage REST API directly, including ease of use, since it abstracts away the intricacies of raw HTTP calls. Its uniform approach ensures consistency, making code maintenance easier. The .NET SDK also has built-in error handling, logging, and other functions, and its official and community support ensures you stay aligned with the latest features and best practices.

Working with Azure Blob Storage in .NET applications

In this article, you’ll learn how to leverage the .NET SDK to manage Azure Blob Storage efficiently.

Prerequisites

Before you begin, you’ll need to:

  • Download and install the .NET SDK, essential for developing .NET applications.
  • Install the Azure command-line interface (CLI) and update to the latest version. This CLI provides commands to manage Azure resources.
  • Sign up for a free Azure account if you don’t already have one.
  • Install your preferred integrated development environment (IDE). This tutorial uses Visual Studio (its Community edition is free to download).

Scenario

Consider a scenario where your organization generates a substantial volume of log files on a legacy server and needs to do the following:

  • Regularly move the server’s log files to a “hot” storage tier in Azure Blob Storage for immediate analysis (for this tutorial, you’ll source them from a local folder).
  • Download all files from Blob Storage to another system for analytics.
  • Move blobs to a cool storage tier after 10 days, as the cost of keeping them in hot storage becomes prohibitive over time.
  • Permanently delete blobs over 30 days old.

This process might look like the following:

Fig. 1: Flow of log files from a server to an uploader, blob container, manager, or processor for additional analytics

Provisioning Azure resources

To lay the foundation for managing Azure Blob Storage, you first need to allocate and configure your Azure resources. Here’s an overview of the resources you’ll be provisioning and their specific usage:

  • Storage Account: Serves as the foundation for storing blobs and is essential for the Function App operations
  • Blob Container: Acts as a specific storage location (akin to a folder) for organizing your blobs
  • Function App: A serverless compute service that lets you run event-driven code without managing infrastructure

Create a storage account

The Function App you’ll create later needs a storage account to operate. For simplicity, you’ll use the same account for your blob storage.

First, open the Azure portal and click Create a resource > Storage Account. Fill in the details for Subscription, Resource Group, and Storage Account name (note this name). Click Review + create, then Create to provision your storage account.
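If you’d rather script this step with the Azure CLI you installed in the prerequisites, the equivalent command looks roughly like the following sketch (the resource group, location, and SKU values are placeholders to adjust):

az storage account create \
  --name <your-storage-account> \
  --resource-group <your-resource-group> \
  --location eastus \
  --sku Standard_LRS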

Go to your storage account in the Azure portal. Click Access Keys in the left pane. Copy key 1 for later use.

Fig. 2: Details of the blob storage access keys
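You can also fetch the key with the Azure CLI instead of the portal; this sketch assumes you want the first key’s value:

az storage account keys list \
  --account-name <your-storage-account> \
  --resource-group <your-resource-group> \
  --query "[0].value" \
  --output tsv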

Create a blob container

Before uploading any blobs, create a container in that storage account to hold them. Think of containers as folders that organize your blobs.

Go to your Azure Storage account in the Azure Portal. Click Containers in the left pane. Then, click + Container. Next, name the container (for example, “logfiles”). Set the access level to Private so the container can’t be accessed without logging into the Azure Portal. Finally, click Create.
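If you prefer the CLI for this step too, the following sketch creates the same private container (--public-access off matches the Private access level):

az storage container create \
  --name logfiles \
  --account-name <your-storage-account> \
  --account-key <your-account-key> \
  --public-access off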

Set up a Function App

Before diving into Azure Blob Storage, set up a new Function App. First, open the Azure portal and click Create a resource > Web > Function App. Fill in the necessary details like Subscription, Resource Group, and Function App name (this tutorial uses “blob-sdk-example-function”). For the Runtime stack, select .NET. In the Storage tab, choose the newly created account. Finally, click Review + create to deploy your Function App.

Fig. 3: Options to create a Function App
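A rough CLI equivalent follows; the consumption plan location and Functions version here are assumptions, so adjust them to your region and runtime:

az functionapp create \
  --name blob-sdk-example-function \
  --resource-group <your-resource-group> \
  --storage-account <your-storage-account> \
  --consumption-plan-location eastus \
  --runtime dotnet \
  --functions-version 4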

Integrating the Azure Blob Storage SDK in .NET

With all your resources provisioned in Azure, you can create a new Function App project. For simplicity, this tutorial uses Visual Studio.

Set up your Function App for local development

You’ll create three Azure functions:

  • A function running on a schedule to upload any log files found locally to Blob Storage
  • Another scheduled function to modify blobs by changing their tier (or deleting older blobs completely)
  • An HTTP-triggered function that acts as the third-party service being called

Begin by creating the first function to upload files. Open Visual Studio and go to File > New > Project.

Fig. 4: Azure Functions Additional information screen

Search for and select Azure Functions. Name the project BlobSdkExampleFunction and click Create.

Next, choose the Timer trigger template. Set the Schedule to the CRON expression 0 */1 * * * * to run every minute for testing. Rename the Function and the file containing it to LogsUploader and UploadLogs.cs, respectively.
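Note that Azure Functions timer triggers use six-field NCRONTAB expressions of the form {second} {minute} {hour} {day} {month} {day-of-week}, not the five-field cron format. For example:

0 */1 * * * *   // every minute (used in this tutorial)
0 */5 * * * *   // every five minutes
0 0 2 * * *     // daily at 2:00 AM (UTC by default)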

Then, add the following log output to the file to test that it works:

public class LogsUploader
{
    [FunctionName("LogsUploader")]
    public void Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation("Uploading logs...");
    }
}

Now, build and run the project. Log messages should appear in the console logs every minute.

Add the Azure Blob Storage SDK to the .NET project

To interact with Azure Blob Storage, add its SDK to your project. Go to the NuGet package manager, search for “Azure.Storage.Blobs,” and install this package.
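Alternatively, run the following from a terminal in the project directory:

dotnet add package Azure.Storage.Blobs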

Uploading blobs to Azure Blob Storage

Azure Blob Storage supports multiple blob types. Block blobs are ideal for text and binary data, like files, documents, and media. Append blobs are useful for live logging data or situations requiring data appendage. Page blobs, which virtual machines (VMs) typically use, are best suited for frequent read and write operations.

This tutorial focuses on block blobs due to their prevalence in file storage. They also match the default behavior for SDK clients.
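For comparison only, here’s a minimal sketch of writing to an append blob; it assumes an existing BlobContainerClient named containerClient and a hypothetical blob name, and it isn’t part of this tutorial’s scenario:

// Requires: using Azure.Storage.Blobs.Specialized; and using System.Text;
AppendBlobClient appendBlob = containerClient.GetAppendBlobClient("app.log");
await appendBlob.CreateIfNotExistsAsync();

// Append a single log line to the end of the blob
using var line = new MemoryStream(Encoding.UTF8.GetBytes("12:00:00 INFO Service started\n"));
await appendBlob.AppendBlockAsync(line);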

The SDK Blob clients

The Azure Blob Storage SDK provides developers with specialized client interfaces, each designed to interact with specific components of the Blob Storage. These clients act as intermediaries, allowing you to integrate, manage, and manipulate your blob data seamlessly.

BlobServiceClient

BlobServiceClient is the primary entry point for developers when working with the Azure Blob Storage account. With this client, you can:

  • Enumerate all the containers present in the blob storage account.
  • Set and retrieve account-level properties and settings.
  • Get user delegation keys, essential for creating time-limited SAS tokens with user delegation.
  • Initiate account-wide operations such as fetching statistics or setting CORS rules.
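For instance, enumerating containers is a short loop. This sketch assumes a BlobServiceClient named blobServiceClient, constructed as shown later in “Initialize the clients”:

await foreach (BlobContainerItem container in blobServiceClient.GetBlobContainersAsync())
{
    Console.WriteLine($"Container: {container.Name}");
}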

BlobContainerClient

Think of BlobContainerClient as your toolbox for a specific container within your Azure Blob Storage. This client offers functionality to:

  • Create or delete a particular container.
  • Set or fetch properties and metadata associated with the container.
  • Acquire the container's access policies and lease details.
  • List the blobs within the container and perform batch operations.
  • Directly interact with blobs, allowing you to upload to or download from them and even manage or delete specific blobs within the container.
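As a quick illustration, the following sketch (again assuming an initialized blobServiceClient) creates a container if it’s missing and reads its properties:

// PublicAccessType lives in Azure.Storage.Blobs.Models
var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
await containerClient.CreateIfNotExistsAsync(PublicAccessType.None); // private container

BlobContainerProperties containerProperties = await containerClient.GetPropertiesAsync();
Console.WriteLine($"Last modified: {containerProperties.LastModified}");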

BlobClient

As the name suggests, BlobClient is for managing individual blobs. With the BlobClient, you can:

  • Upload or download blob data.
  • Fetch or set properties and metadata associated with a specific blob.
  • Create snapshots of blobs, enabling versioning and preserving states.
  • Manage blob leases, which control concurrent access to the blob data.
  • Implement blob-level security by generating SAS tokens specific to that blob.
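For example, here’s a hedged sketch of reading properties and stamping metadata on a single blob, assuming the containerClient from above and a hypothetical blob name:

var blobClient = containerClient.GetBlobClient("app-2024-01-01.log");
BlobProperties blobProperties = await blobClient.GetPropertiesAsync();
Console.WriteLine($"Tier: {blobProperties.AccessTier}, Created: {blobProperties.CreatedOn}");

// Metadata is a simple string dictionary attached to the blob
await blobClient.SetMetadataAsync(new Dictionary<string, string> { ["processed"] = "true" });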

Initialize the clients

First, update your LogsUploader function by adding the following code, including your storage account name and the access key you noted earlier:

[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    // Requires: using Azure.Storage; and using Azure.Storage.Blobs;
    const string accountName = "<Your storage account name>";
    const string accountKey = "<Your Account Key Here>";
    const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";

    var credentials = new StorageSharedKeyCredential(accountName, accountKey);
    var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);

    var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
    //…
}

Upload files

Place files in a folder on your computer’s local file system to serve as your fictional log files. Copy this folder’s path into a new variable in your LogsUploader function. For example:

var localFolderPath = @"C:\temp\logs";

When uploading a blob, you can set a blob name different from the file you’re uploading. For this example, keep the same name as the local file.
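If you later wanted date-partitioned blob names instead, a small variation would do it; the "/" in the name shows up as a virtual folder in the portal (this naming scheme is purely illustrative and isn’t used below):

var blobName = $"{DateTime.UtcNow:yyyy-MM-dd}/{Path.GetFileName(filePath)}";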

Next, update your function to iterate over all the files in your folder. Put it below the code you added for initializing the BlobServiceClient and BlobContainerClient. For each file, you will create a new BlobClient, upload the file to your container as a blob, and delete the file from the local disk.

[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    //…
    const string localFolderPath = @"<Your local log files path>";

    foreach (var filePath in Directory.GetFiles(localFolderPath))
    {
        var blobName = Path.GetFileName(filePath);
        var blobClient = containerClient.GetBlobClient(blobName);

        // Upload as a block blob; the second argument overwrites any existing blob with the same name
        await using (var uploadFileStream = new FileStream(filePath, FileMode.Open))
        {
            await blobClient.UploadAsync(uploadFileStream, true);
        }

        File.Delete(filePath);
        log.LogInformation($"Uploaded and deleted local file: {filePath}");
    }
}

Now, build and run your Function App. It will check for new files every minute, upload any it finds, and remove the local copies.

Inspect the container in the Azure portal to confirm the function worked as expected. Or, list the contents using the .NET SDK, which you’ll do next.

Retrieving blobs

Azure Blob Storage provides multiple methods to list and retrieve blobs. Here, you’ll focus on listing all blobs in a container and retrieving a specific blob by name.

List blobs in a container

To list all blobs in a container, use the GetBlobsAsync method from the BlobContainerClient class. Update your function to output all the blobs to the logging process after the upload:

[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    //…

    await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
    {
        log.LogInformation($"Blob name: {blobItem.Name}");
    }
}

When you run the app now, you should get a log output in the console, showing all files currently in your blob container.
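GetBlobsAsync also accepts an optional prefix parameter when you only need a subset. For example, assuming your log files shared a common name prefix such as "app-":

await foreach (BlobItem blobItem in containerClient.GetBlobsAsync(prefix: "app-"))
{
    log.LogInformation($"Blob name: {blobItem.Name}");
}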

Retrieve a specific blob

This tutorial’s fictional scenario requires triggering another function each time a new blob is uploaded, then downloading it for additional processing elsewhere. Use the GetBlobClient method from the BlobContainerClient class and specify the blob’s name.

You’ll need a new function to do this. Create a new file named LogsProcessor.cs in your project and paste the following code. Remember to add your storage account name and key:

public static class LogsProcessor
{
    [FunctionName("LogsProcessor")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        var data = JsonConvert.DeserializeObject<dynamic>(requestBody);
        string blobName = data?.BlobName;

        const string accountName = "<Your storage account name>";
        const string accountKey = "<Your Account Key Here>";
        const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";

        var credentials = new StorageSharedKeyCredential(accountName, accountKey);
        var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);
        var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
        var blobClient = containerClient.GetBlobClient(blobName);

        const string localFolderPath = @"<Your local downloaded log files path>";
        var downloadFilePath = Path.Combine(localFolderPath, blobName);
        BlobDownloadInfo download = await blobClient.DownloadAsync();

        // The await using declaration disposes (and closes) the stream automatically
        await using FileStream fs = File.OpenWrite(downloadFilePath);
        await download.Content.CopyToAsync(fs);

        log.LogInformation($"Downloaded blob to: {downloadFilePath}");
        return new OkObjectResult($"Downloaded blob to: {downloadFilePath}");
    }
}

In production, you’d likely ship these logs to some other analytics service. However, for demonstration purposes, you’ll download them locally here. Create a new folder to store downloaded logs and set the path for the variable localFolderPath.

Add code to the LogsUploader function to trigger this processor. In the Run function, call the processor at the end of the foreach loop, after each file is uploaded (blobName is only in scope inside the loop):

[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    //…
    foreach (var filePath in Directory.GetFiles(localFolderPath))
    {
        //…
        await TriggerProcessorFunction(blobName);
    }
}

Then, add the new method to the LogsUploader class:

private async Task TriggerProcessorFunction(string blobName)
{
    // In production, reuse a single static HttpClient instead of creating one per call
    var client = new HttpClient();
    var payload = new { BlobName = blobName };
    var content = new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json");

    // Check the port being used when running and adjust if necessary
    await client.PostAsync("http://localhost:7071/api/LogsProcessor", content);
}

Important: Confirm the port where your LogsProcessor runs.

Build and rerun your Function App project, and place a new file in your original folder. The LogsUploader function moves the file to your Blob Storage as before, then triggers LogsProcessor to download a copy.

Modifying and deleting blobs

Now, create another function scheduled to run every minute. The setup will be the same as earlier, and you’ll need a BlobContainerClient again to interact with your container.

For this tutorial, change the access tier of blobs older than 3 minutes to “cool” using the SetAccessTierAsync method on the BlobClient. Next, delete any blobs older than 6 minutes using the DeleteIfExistsAsync method. These compressed intervals stand in for the scenario’s 10-day and 30-day thresholds so you can watch the lifecycle play out within minutes.

Bring it all together with the following code:

public class LogsManager
{
    [FunctionName("LogsManager")]
    public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
    {
        const string accountName = "<Your storage account name>";
        const string accountKey = "<Your Account Key Here>";
        const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";

        var credentials = new StorageSharedKeyCredential(accountName, accountKey);
        var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);

        var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");

        var currentTime = DateTimeOffset.UtcNow;
        await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
        {
            BlobClient blobClient = containerClient.GetBlobClient(blobItem.Name);
            BlobProperties properties = await blobClient.GetPropertiesAsync();

            var ageInMinutes = (currentTime - properties.CreatedOn).TotalMinutes;

            switch (ageInMinutes)
            {
                case >= 6:
                    await blobClient.DeleteIfExistsAsync();
                    log.LogInformation($"Deleted blob: {blobItem.Name}");
                    break;
                case >= 3 when properties.AccessTier == AccessTier.Hot:
                    await blobClient.SetAccessTierAsync(AccessTier.Cool);
                    log.LogInformation($"Moved blob to cool storage: {blobItem.Name}");
                    break;
            }
        }
    }
}

By following these steps, you can effectively manage your blobs’ lifecycles, optimizing storage costs and improving data retrieval times. The screenshot below shows some log files and their access tiers:

Fig. 5: Log files in a blob

Now, build and run the application. As you pass files through, you can check the access tier updates in the Azure portal.

Implementing security with SAS

When working with Azure Blob Storage, ensuring secure access to your data is paramount. Shared access signatures (SAS) offer one of the most flexible ways to secure your blobs.

What is SAS?

A SAS Uniform Resource Identifier (URI) grants clients limited access to Azure Storage without giving out account keys. It’s great for temporary access or when third-party clients aren’t fully trustworthy.

Imagine that the LogsProcessor function represents a third-party service. It only needs brief access to copy each new blob, so you don’t want to give it full access to the storage account.

Generate the SAS token

One solution is to have your LogsUploader function generate a SAS token scoped to each blob and limited to one hour. You can then build a complete download URI for the blob, including this token, and send it to LogsProcessor.

First, add the following method to your LogsUploader class:

private static Uri GenerateBlobSasToken(BlobClient blobClient)
{
    // Requires: using Azure.Storage.Sas;
    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = blobClient.BlobContainerName,
        BlobName = blobClient.Name,
        Resource = "b",
        StartsOn = DateTimeOffset.UtcNow,
        ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Read);
    return blobClient.GenerateSasUri(sasBuilder);
}

The method accepts a BlobClient instance representing the SAS token’s target blob. The Resource is set to b, indicating that the SAS is for a single blob.
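If you instead needed to grant access to every blob in a container, you could set Resource to "c" and generate the token from the container client. Here is a hedged sketch, not needed for this scenario:

var containerSasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerClient.Name,
    Resource = "c", // "c" scopes the SAS to the whole container
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
containerSasBuilder.SetPermissions(BlobContainerSasPermissions.Read | BlobContainerSasPermissions.List);
Uri containerSasUri = containerClient.GenerateSasUri(containerSasBuilder);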

Since you’re now sending a complete URI to the processor function, modify your trigger method:


private async Task TriggerProcessorFunction(Uri blobUriWithSas)
{
    var client = new HttpClient();
    var payload = new { BlobUriWithSas = blobUriWithSas.ToString() };
    var content = new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json");

    // Check the port being used when running and adjust if necessary
    await client.PostAsync("http://localhost:7071/api/LogsProcessor", content);
}

Next, update the call at the end of your LogsUploader Run function’s foreach loop:

[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    //…
    foreach (var filePath in Directory.GetFiles(localFolderPath))
    {
        //…
        var blobUriWithSas = GenerateBlobSasToken(blobClient);
        await TriggerProcessorFunction(blobUriWithSas);
    }
}

Use the SAS token

The LogsProcessor no longer needs to authenticate with a key for the whole storage account. When triggered, it receives a URI that points only to the triggering blob and includes the SAS token granting access to that blob alone. So, update your LogsProcessor code to use this URI:

[FunctionName("LogsProcessor")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    ILogger log)
{
    var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var data = JsonConvert.DeserializeObject<dynamic>(requestBody);
    string blobUriWithSas = data?.BlobUriWithSas;

    const string localFolderPath = @"<Your local downloaded log files path>";

    // The SAS URI both locates the blob and authorizes access to it, so no account key is needed
    var blobClient = new BlobClient(new Uri(blobUriWithSas));

    var blobName = blobClient.Name;
    var downloadFilePath = Path.Combine(localFolderPath, blobName);
    BlobDownloadInfo download = await blobClient.DownloadAsync();

    await using FileStream fs = File.OpenWrite(downloadFilePath);
    await download.Content.CopyToAsync(fs);

    log.LogInformation($"Downloaded blob to: {downloadFilePath}");
    return new OkObjectResult($"Downloaded blob to: {downloadFilePath}");
}

You’re no longer sharing full storage account access with your fictional third-party processing service.

Conclusion

This tutorial demonstrated managing Azure Blob Storage using the .NET SDK. From integrating the SDK into your applications to uploading, retrieving, modifying, and securing your blobs, you explored a broad spectrum of operations essential for any developer working with Azure Blob Storage.

The synergy of .NET and Azure Blob Storage provides a robust platform for building scalable and secure cloud storage solutions. Understanding and effectively managing your cloud storage resources is key to building efficient and secure applications.

Keep experimenting with the Azure Blob Storage SDK and exploring its myriad features to optimize your storage management strategies. And to ensure you’re using your Azure Blob Storage resources effectively, try Site24x7’s Azure Monitoring tool. Its capabilities, including tracking ingress/egress volumes, checking blob capacity, and counting containers, help you improve your Azure Storage resource consumption, while its IT automation, reports, and alerts keep you in the know about your Azure Storage services.
