
ManagedCode.Storage


NuGet packages:

  • ManagedCode.Storage.Core (Core)
  • ManagedCode.Storage.FileSystem (FileSystem)
  • ManagedCode.Storage.Azure (Azure)
  • ManagedCode.Storage.Aws (AWS)
  • ManagedCode.Storage.Gcp (GCP)
  • ManagedCode.Storage.Sftp (SFTP)
  • ManagedCode.Storage.VirtualFileSystem (Virtual FS)
  • ManagedCode.Storage.Client (.NET Client)
  • ManagedCode.Storage.AspNetExtensions (AspNetExtensions)
  • ManagedCode.Storage.Server (ASP.NET Server)
  • ManagedCode.Storage.Client.SignalR (SignalR Client)

Storage


General concept

One of the key benefits of using a universal wrapper for cloud blob storages is that it provides a consistent, easy-to-use interface for working with different types of blob storage. This can make it much easier for developers to switch between different storage providers, or to use multiple providers in the same project.

A universal wrapper can also simplify the development process by providing a single set of methods for working with blob storage, rather than requiring developers to learn and use the different APIs provided by each storage provider. This can save time and reduce the complexity of the code, making it easier to write, maintain, and debug.

In addition, a universal wrapper can provide additional functionality that is not available through the individual storage providers, such as support for common patterns like asynchronous programming and error handling. This can make it easier to write high-quality, reliable code that is robust and resilient to errors.

Overall, using a universal wrapper for cloud blob storages can provide many benefits, including improved flexibility, simplicity, and reliability in your application. This library is a universal storage abstraction for working with multiple storage providers:

  • Azure
  • Google Cloud
  • Amazon
  • FileSystem

Motivation

Cloud storage is a popular and convenient way to store and access data in the cloud. However, different cloud storage providers often have their own unique APIs and interfaces for accessing and manipulating data. This can make it difficult to switch between different providers or to use multiple providers simultaneously.

Our library provides a universal interface for accessing and manipulating data in different cloud blob storage providers. This makes it easy to switch between providers or to use multiple providers at the same time, without having to learn and use multiple APIs.

Features

  • Provides a universal interface for accessing and manipulating data in different cloud blob storage providers.
  • Makes it easy to switch between providers or to use multiple providers simultaneously.
  • Supports common operations such as uploading, downloading, and deleting data, plus optional in-memory Virtual File System (VFS) storage for fast testing.
  • Provides first-class ASP.NET controller extensions and a SignalR hub/client pairing (two-step streaming handshake) for uploads, downloads, and chunk orchestration.
  • Ships keyed dependency-injection helpers so you can register multiple named providers and mirror assets across regions or vendors.
  • Exposes configurable server options for large-file thresholds, multipart parsing limits, and range streaming.

Virtual File System (VFS)

Need to hydrate storage dependencies without touching disk or the cloud? The ManagedCode.Storage.VirtualFileSystem package keeps everything in memory and makes it trivial to stand up repeatable tests or developer sandboxes:

// Program.cs / Startup.cs
builder.Services.AddVirtualFileSystemStorageAsDefault(options =>
{
    options.StorageName = "vfs";   // optional logical name
});

// Usage
public class MyService
{
    private readonly IStorage storage;

    public MyService(IStorage storage) => this.storage = storage;

    public Task UploadAsync(Stream stream, string path) => storage.UploadAsync(stream, new UploadOptions(path));
}

// In tests you can pre-populate the VFS
await storage.UploadAsync(new FileInfo("fixtures/avatar.png"), new UploadOptions("avatars/user-1.png"));

Because the VFS implements the same abstractions as every other provider, you can swap it for in-memory integration tests while hitting Azure, S3, etc. in production.
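
For instance, you can branch the registration on environment so tests get the in-memory provider while production talks to Azure. A minimal sketch, assuming the registration helpers shown in this README; the environment name and configuration key are illustrative:

// Program.cs (sketch)
if (builder.Environment.IsEnvironment("Testing"))
{
    // keep tests and sandboxes entirely in memory
    builder.Services.AddVirtualFileSystemStorageAsDefault(options => options.StorageName = "vfs-tests");
}
else
{
    // production resolves the same IStorage against Azure
    builder.Services.AddAzureStorageAsDefault(new AzureStorageOptions
    {
        Container = "assets",
        ConnectionString = builder.Configuration["Storage:Azure:ConnectionString"]!
    });
}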

Dependency Injection & Keyed Registrations

Every provider ships with default and provider-specific registrations, but you can also assign multiple named instances using .NET's keyed services. This makes it easy to route traffic to different containers/buckets (e.g. azure-primary vs. azure-dr) or to fan out a file to several backends:

using Amazon;
using Amazon.S3;
using ManagedCode.MimeTypes;
using Microsoft.Extensions.DependencyInjection;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

builder.Services
    .AddAzureStorage("azure-primary", options =>
    {
        options.ConnectionString = configuration["Storage:Azure:Primary:ConnectionString"]!;
        options.Container = "assets";
    })
    .AddAzureStorage("azure-dr", options =>
    {
        options.ConnectionString = configuration["Storage:Azure:Dr:ConnectionString"]!;
        options.Container = "assets-dr";
    })
    .AddAWSStorage("aws-backup", options =>
    {
        options.PublicKey = configuration["Storage:Aws:AccessKey"]!;
        options.SecretKey = configuration["Storage:Aws:SecretKey"]!;
        options.Bucket = "assets-backup";
        options.OriginalOptions = new AmazonS3Config
        {
            RegionEndpoint = RegionEndpoint.USEast1
        };
    });

public sealed class AssetReplicator
{
    private readonly IAzureStorage _primary;
    private readonly IAzureStorage _disasterRecovery;
    private readonly IAWSStorage _backup;

    public AssetReplicator(
        [FromKeyedServices("azure-primary")] IAzureStorage primary,
        [FromKeyedServices("azure-dr")] IAzureStorage secondary,
        [FromKeyedServices("aws-backup")] IAWSStorage backup)
    {
        _primary = primary;
        _disasterRecovery = secondary;
        _backup = backup;
    }

    public async Task MirrorAsync(Stream content, string fileName, CancellationToken cancellationToken = default)
    {
        await using var buffer = new MemoryStream();
        await content.CopyToAsync(buffer, cancellationToken);

        buffer.Position = 0;
        var uploadOptions = new UploadOptions(fileName, mimeType: MimeHelper.GetMimeType(fileName));

        await _primary.UploadAsync(buffer, uploadOptions, cancellationToken);

        buffer.Position = 0;
        await _disasterRecovery.UploadAsync(buffer, uploadOptions, cancellationToken);

        buffer.Position = 0;
        await _backup.UploadAsync(buffer, uploadOptions, cancellationToken);
    }
}

Keyed services can also be resolved via IServiceProvider.GetRequiredKeyedService<T>("key") when manual dispatching is required.
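
A minimal sketch of manual resolution, reusing the "azure-dr" key registered above; serviceProvider and stream are assumed to be in scope:

// Resolve a specific keyed registration directly from the container.
var drStorage = serviceProvider.GetRequiredKeyedService<IAzureStorage>("azure-dr");
await drStorage.UploadAsync(stream, new UploadOptions("reports/latest.json"));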

Want to double-check data fidelity after copying? Pair uploads with Crc32Helper:

var download = await _backup.DownloadAsync(fileName, cancellationToken);
download.IsSuccess.ShouldBeTrue();

await using var local = download.Value;
var crc = Crc32Helper.CalculateFileCrc(local.FilePath);
logger.LogInformation("Backup CRC for {File} is {Crc}", fileName, crc);

The test suite includes end-to-end scenarios that mirror payloads between Azure, AWS, the local file system, and virtual file systems; multi-gigabyte flows execute by default across every provider using 4 MB units per "GB" to keep runs fast while still exercising streaming paths.

ASP.NET Controllers & SignalR Streaming

The ManagedCode.Storage.Server package exposes ready-to-use controllers plus a SignalR hub that sit on top of any IStorage implementation. Pair it with the ManagedCode.Storage.Client.SignalR library to stream files from browsers, desktop or mobile apps:

// Program.cs / Startup.cs
builder.Services
    .AddStorageServer(options =>
    {
        options.InMemoryUploadThresholdBytes = 512 * 1024; // spill to disk after 512 KB
        options.MultipartBoundaryLengthLimit = 128;        // relax multipart parsing limit
    })
    .AddStorageSignalR();    // registers StorageHub options

app.MapControllers();
app.MapStorageHub();        // maps /hubs/storage by default

// Client usage
var client = new StorageSignalRClient(new StorageSignalRClientOptions
{
    HubUrl = new Uri("https://myapi/hubs/storage")
});

await client.ConnectAsync();
await client.UploadAsync(fileStream, new StorageUploadStreamDescriptor
{
    FileName = "video.mp4",
    ContentType = "video/mp4"
});

// Download back into a stream
await client.DownloadAsync("video.mp4", destinationStream);

Events such as TransferProgress and TransferCompleted fire automatically, enabling live progress UI or resumable workflows. Extending the default controller is a one-liner:

[Route("api/files")]
public sealed class FilesController : StorageControllerBase<IMyCustomStorage>
{
    public FilesController(IMyCustomStorage storage,
        ChunkUploadService chunks,
        StorageServerOptions options)
        : base(storage, chunks, options)
    {
    }
}

// Program.cs
builder.Services.AddStorageServer(opts =>
{
    opts.EnableRangeProcessing = true;
    opts.InMemoryUploadThresholdBytes = 1 * 1024 * 1024; // 1 MB
});
builder.Services.AddStorageSignalR();

app.MapControllers();
app.MapStorageHub();

Use the built-in controller extension methods to tailor behaviours (e.g. UploadFormFileAsync, DownloadAsStreamAsync) or override the base actions to add authorization filters, custom routing, or domain-specific validation.
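
For example, a derived controller can layer an authorization policy on top of the stock actions; this is a sketch and the policy name is hypothetical:

[Authorize(Policy = "CanManageFiles")]   // hypothetical policy name
[Route("api/secure-files")]
public sealed class SecureFilesController : StorageControllerBase<IStorage>
{
    public SecureFilesController(IStorage storage,
        ChunkUploadService chunks,
        StorageServerOptions options)
        : base(storage, chunks, options)
    {
    }
}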

SignalR uploads follow a two-phase handshake: the client calls BeginUploadStreamAsync to reserve an identifier, then streams payloads through UploadStreamContentAsync while consuming the server-generated status channel. The StorageSignalRClient handles this workflow automatically.

Connection modes

You can connect the storage interface in two modes: provider-specific and default. In default mode you are restricted to a single storage type.

Azure

Default mode connection:

// Startup.cs
services.AddAzureStorageAsDefault(new AzureStorageOptions
{
    Container = "{YOUR_CONTAINER_NAME}",
    ConnectionString = "{YOUR_CONNECTION_NAME}",
});

Using in default mode:

// MyService.cs
public class MyService
{
    private readonly IStorage _storage;

    public MyService(IStorage storage)
    {
        _storage = storage;
    }
}

Provider-specific mode connection:

// Startup.cs
services.AddAzureStorage(new AzureStorageOptions
{
    Container = "{YOUR_CONTAINER_NAME}",
    ConnectionString = "{YOUR_CONNECTION_NAME}",
});

Using in provider-specific mode

// MyService.cs
public class MyService
{
    private readonly IAzureStorage _azureStorage;

    public MyService(IAzureStorage azureStorage)
    {
        _azureStorage = azureStorage;
    }
}

Need multiple Azure accounts or containers? Call services.AddAzureStorage("azure-primary", ...) and decorate constructor parameters with [FromKeyedServices("azure-primary")].

Google Cloud

Default mode connection:

// Startup.cs
services.AddGCPStorageAsDefault(opt =>
{
    opt.GoogleCredential = GoogleCredential.FromFile("{PATH_TO_YOUR_CREDENTIALS_FILE}.json");

    opt.BucketOptions = new BucketOptions()
    {
        ProjectId = "{YOUR_API_PROJECT_ID}",
        Bucket = "{YOUR_BUCKET_NAME}",
    };
});

Using in default mode:

// MyService.cs
public class MyService
{
    private readonly IStorage _storage;
  
    public MyService(IStorage storage)
    {
        _storage = storage;
    }
}

Provider-specific mode connection:

// Startup.cs
services.AddGCPStorage(new GCPStorageOptions
{
    BucketOptions = new BucketOptions()
    {
        ProjectId = "{YOUR_API_PROJECT_ID}",
        Bucket = "{YOUR_BUCKET_NAME}",
    }
});

Using in provider-specific mode

// MyService.cs
public class MyService
{
    private readonly IGCPStorage _gcpStorage;

    public MyService(IGCPStorage gcpStorage)
    {
        _gcpStorage = gcpStorage;
    }
}

Need parallel GCP buckets? Register them with AddGCPStorage("gcp-archive", ...) and inject via [FromKeyedServices("gcp-archive")].

Amazon

Default mode connection:

// Startup.cs
// The AWS library overwrites property values, so create the configuration this way.
var awsConfig = new AmazonS3Config();
awsConfig.RegionEndpoint = RegionEndpoint.EUWest1;
awsConfig.ForcePathStyle = true;
awsConfig.UseHttp = true;
awsConfig.ServiceURL = "http://localhost:4566"; // default endpoint for a local S3 emulator; ServiceURL must be set last

services.AddAWSStorageAsDefault(opt =>
{
    opt.PublicKey = "{YOUR_PUBLIC_KEY}";
    opt.SecretKey = "{YOUR_SECRET_KEY}";
    opt.Bucket = "{YOUR_BUCKET_NAME}";
    opt.OriginalOptions = awsConfig;
});

Using in default mode:

// MyService.cs
public class MyService
{
    private readonly IStorage _storage;
  
    public MyService(IStorage storage)
    {
        _storage = storage;
    }
}

Provider-specific mode connection:

// Startup.cs
services.AddAWSStorage(new AWSStorageOptions
{
    PublicKey = "{YOUR_PUBLIC_KEY}",
    SecretKey = "{YOUR_SECRET_KEY}",
    Bucket = "{YOUR_BUCKET_NAME}",
    OriginalOptions = awsConfig
});

Using in provider-specific mode

// MyService.cs
public class MyService
{
    private readonly IAWSStorage _storage;
    public MyService(IAWSStorage storage)
    {
        _storage = storage;
    }
}

Need parallel S3 buckets? Register them with AddAWSStorage("aws-backup", ...) and inject via [FromKeyedServices("aws-backup")].

FileSystem

Default mode connection:

// Startup.cs
services.AddFileSystemStorageAsDefault(opt =>
{
    opt.BaseFolder = Path.Combine(Environment.CurrentDirectory, "{YOUR_BUCKET_NAME}");
});

Using in default mode:

// MyService.cs
public class MyService
{
    private readonly IStorage _storage;
  
    public MyService(IStorage storage)
    {
        _storage = storage;
    }
}

Provider-specific mode connection:

// Startup.cs
services.AddFileSystemStorage(new FileSystemStorageOptions
{
    BaseFolder = Path.Combine(Environment.CurrentDirectory, "{YOUR_BUCKET_NAME}"),
});

Using in provider-specific mode

// MyService.cs
public class MyService
{
    private readonly IFileSystemStorage _fileSystemStorage;
    public MyService(IFileSystemStorage fileSystemStorage)
    {
        _fileSystemStorage = fileSystemStorage;
    }
}

Mirror to multiple folders? Use AddFileSystemStorage("archive", options => options.BaseFolder = ...) and resolve them via [FromKeyedServices("archive")].

How to use

We assume that the code snippets below are placed in your service class with an injected IStorage:

public class MyService
{
    private readonly IStorage _storage;
    public MyService(IStorage storage)
    {
        _storage = storage;
    }
}

Upload

await _storage.UploadAsync(stream); // any readable Stream instance
await _storage.UploadAsync("some string content");
await _storage.UploadAsync(new FileInfo("D:\\my_report.txt"));

Delete

await _storage.DeleteAsync("my_report.txt");

Download

var localFile = await _storage.DownloadAsync("my_report.txt");
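
DownloadAsync returns a result wrapping a local file; a minimal consumption sketch, mirroring the CRC example above:

var download = await _storage.DownloadAsync("my_report.txt");
if (download.IsSuccess)
{
    await using var localFile = download.Value;
    var text = await File.ReadAllTextAsync(localFile.FilePath);
}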

Get metadata

await _storage.GetBlobMetadataAsync("my_report.txt");

Native client

If you need more flexibility, you can use the native client for any IStorage<T>:

_storage.StorageClient
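
For provider-specific SDK features, cast to the typed interface first. A sketch; the concrete client type depends on the provider, so it is left as var:

// Reach the underlying SDK client through the provider-specific interface.
if (_storage is IAzureStorage azureStorage)
{
    var nativeClient = azureStorage.StorageClient; // provider-native Azure SDK client
    // use SDK-specific features here
}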

Conclusion

In summary, the Storage library provides a universal interface for accessing and manipulating data in different cloud blob storage providers, plus ready-to-host ASP.NET controllers, SignalR streaming endpoints, keyed dependency injection, and a memory-backed VFS. It makes it easy to switch between providers or to use multiple providers simultaneously, without having to learn and use multiple APIs, while staying in full control of routing, thresholds, and mirroring. We hope you find it useful in your own projects!
