When using Azure Logic Apps to unzip a file in blob storage, the built-in ‘Extract archive to folder’ action has a size limit of 50 MB. We can get past this limitation with the help of an Azure Function, like so.
First, we need to create an Azure Function that accepts the following JSON body:
{ "connectionString": "CONNECTIONSTRING", "container": "container", "sourcePath": "path/file.zip", "destinationPath": "path/unzip/" }
In this JSON object, the storage account is obtained from data.connectionString, the container is read from data.container, and the two path fields determine the source and destination locations for the extract. Note that the entire path, including the container name, needs to be lowercase.
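For example, a filled-in payload might look like the following; the storage account name, container, and paths here are hypothetical placeholders, kept all lowercase:

{
  "connectionString": "DefaultEndpointsProtocol=https;AccountName=mystorageacct;AccountKey=...;EndpointSuffix=core.windows.net",
  "container": "archive",
  "sourcePath": "inbound/invoices.zip",
  "destinationPath": "inbound/unzip/"
}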
After we set up our input payload, we can add the following code to our Azure Function:
#r "Microsoft.WindowsAzure.Storage" #r "Newtonsoft.Json" using System.IO; using System.Threading.Tasks; using System.Net; using Microsoft.AspNetCore.Mvc; using Microsoft.Extensions.Primitives; using Microsoft.AspNetCore.Http; using Microsoft.Extensions.Logging; using Newtonsoft.Json; using System.Collections.Generic; using System.Text; using System.IO.Compression; using Microsoft.Azure; using Microsoft.WindowsAzure.Storage; using Microsoft.WindowsAzure.Storage.Blob; public static async Task<IActionResult> Run(HttpRequest req, ILogger log) { log.LogInformation("Starting decompress function."); string requestBody = await new StreamReader(req.Body).ReadToEndAsync(); dynamic data = JsonConvert.DeserializeObject(requestBody); string destinationPath = data.destinationPath; string connectionString = data.connectionString; string containerString = data.container; string sourcePath = data.sourcePath; // Retrieve storage account from connection string. CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString); // Create the blob client. CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient(); // Retrieve reference to a zip file container. CloudBlobContainer container = blobClient.GetContainerReference(containerString); // Retrieve reference to the blob - zip file which we wanted to extract CloudBlockBlob blockBlob = container.GetBlockBlobReference(sourcePath); //Retrieve reference to a container where you wanted to extract the zip file. CloudBlobContainer extractcontainer = blockBlob.ServiceClient.GetContainerReference(containerString); var containerCreated = await extractcontainer.CreateIfNotExistsAsync(); // Save blob(zip file) contents to Memory Stream. using (var zipBlobFileStream = new MemoryStream()) { await blockBlob.DownloadToStreamAsync(zipBlobFileStream); await zipBlobFileStream.FlushAsync(); zipBlobFileStream.Position = 0; //use ZipArchive from System.IO.Compression to extract all the files from zip file using (var zip = new ZipArchive(zipBlobFileStream)) { //Each entry here represents an individual file or a folder foreach (var entry in zip.Entries) { //creating an empty file (blobkBlob) for the actual file with the same name of file string exportPath = String.Concat(destinationPath, entry.FullName); var blob = extractcontainer.GetBlockBlobReference(exportPath); using (var stream = entry.Open()) { //check for file or folder and update the above blob reference with actual content from stream if (entry.Length > 0) await blob.UploadFromStreamAsync(stream); } } } } byte[] arr = null; return new FileContentResult(arr, "application/octet-stream"); }
Note: For the referenced packages to resolve correctly, we need to create a file called ‘function.proj’ that looks as follows and use the ‘Upload’ button to import it into our Azure Function App.
<Project Sdk="Microsoft.NET.Sdk"> <PropertyGroup> <TargetFramework>netstandard2.0</TargetFramework> </PropertyGroup> <ItemGroup> <PackageReference Include="WindowsAzure.Storage" Version="8.1.4" /> <PackageReference Include="System.IO.Compression" Version="4.3.0" /> <PackageReference Include="Microsoft.WindowsAzure.ConfigurationManager" Version="3.2.3" /> </ItemGroup> </Project>
That’s it – we can now test our Azure Function with the Test/Run feature right in the portal. If we are happy with the output in the referenced storage account, we can send the same inbound JSON payload from a Logic App call or from various other sources.
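As one possible sketch, an HTTP action in the Logic App's workflow definition could post the payload to the function; the function app URL, route, and function key below are placeholders, not values from this post:

{
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://myfunctionapp.azurewebsites.net/api/UnzipBlob?code=FUNCTION_KEY",
    "body": {
      "connectionString": "CONNECTIONSTRING",
      "container": "container",
      "sourcePath": "path/file.zip",
      "destinationPath": "path/unzip/"
    }
  }
}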
Keep reading about D365 tips and tricks here: https://markedcode.com/index.php/category/d365/