
Deploying to Azure Functions and Blob Storage from npm scripts

My latest web app project is a TypeScript SPA frontend hosted in Blob storage with a serverless Azure Functions JavaScript backend. For an optimal development experience I need automated deployment that is easy and reproducible. This will be used in my Git for Windows bash development environment and later on a CI/CD server (which might be Windows but, more likely, Linux). I expected this to be “easy-peasy” given Azure Functions’ relative maturity and the number of deployment options available. Wrong! I spent considerable time fiddling about to get something ‘just right’ for deploying the SPA and backend to Azure. The problem is the absence of portable CLI commands that sync a local directory to either the Azure Functions file space or Blob storage. Hopefully this post will save others the same hassle.

The simple portable solution uses:

  • The AzCopy command to deploy the SPA to Blob storage while setting the correct ContentType (sets MIME type when served over HTTP)
  • FTP ncftpput (part of the ncftp client) to recursively deploy the Function App directory structure, without using any repositories
  • The dotenv and cross-env npm CLI packages to provide cross-platform access to private environment variables

This is by no means a perfect solution and I explore some potential alternatives below. Note WAWSDeploy is an effective Windows-only solution for the Functions code.

Here are the relevant npm scripts I use:
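The original scripts did not survive, so here is a sketch reconstructed from the notes below; the script names, local paths (`dist`, `functions-dist`) and environment variable names are illustrative placeholders, not necessarily the originals:

```json
{
  "scripts": {
    "deploy": "npm run deploy:spa && npm run deploy:api",

    "deploy:spa": "dotenv -- cross-env-shell \"AzCopy /Source:dist /Dest:https://$STORAGE_ACCOUNT.blob.core.windows.net/$BLOB_CONTAINER /DestKey:$STORAGE_KEY /S /SetContentType /Y\"",

    "deploy:api": "dotenv -- cross-env-shell \"ncftpput -R -v -u $FTP_USER -p $FTP_PASSWORD $FTP_HOST /site/wwwroot functions-dist/*\""
  }
}
```

`/site/wwwroot` is the Function App root when connecting over FTP; `cross-env-shell` makes the `$VAR` references work on Windows as well as Linux.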

  • The Azure Functions FTP settings are now rather hidden, under Functions -> Platform Features -> Properties. I used the FTPS host name and the FTP/deployment user. You set a subscription-wide username and password in Functions -> Platform Features -> Deployment Credentials. Note that changing the password requires changing the username to something else and then back again to avoid an ‘in use’ error.
  • Cross-env lets you use a portable $ENVVAR form of environment variable reference in your npm scripts
  • Dotenv lets you provide all your deployment variables in a private .env file. Hopefully it’s obvious that you must never check this in to version control, so add *.env to your .gitignore. You can provide a different .env file per deployment scenario, or on a CI/CD server you can just set the env vars directly.
  • The extra \" around the ncftpput command stops dotenv interpreting the -p option

And sample output from npm run deploy:

Here’s an elided .env file
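The original file is not reproduced here, so this is a sketch; the variable names are illustrative and the values are, of course, elided:

```shell
# .env - never check this in; add *.env to .gitignore
FTP_HOST=waws-prod-xxx.ftp.azurewebsites.windows.net
FTP_USER=deployuser
FTP_PASSWORD=...
STORAGE_ACCOUNT=myspastorage
BLOB_CONTAINER=site
STORAGE_KEY=...
```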


  • A disadvantage of FTP is that it does not do a full sync and delete files that are no longer needed. It’s a shame rsync is not supported, though that can be a pain to configure. Indeed, it appears that there is no single cross-platform tool that can sync folders to/from the Functions file space. Perhaps that’s a job for another Azure Function.
  • AzCopy also does not delete existing files, though it does copy directory trees to a Blob container.
  • NcFTP is a mature, portable client and offers directory-tree copying, unlike basic clients. However, FTP copying is slow compared to the optimised git sync methods, especially as there are no incremental updates. A common technique is to transfer a single .zip archive and have the server unpack it; unfortunately, this doesn’t seem to be available for simple FTP deployments. Perhaps a new Azure Function would fix that. [Update: it can be done using the Kudu REST API, but again this does not delete unwanted files]
  • The OneDrive and Dropbox deployment options for Azure Functions sound useful, but they require a manual sync in the portal, making them useless for automated deployment.

Why not Git?

  • Azure Functions deployment options are heavily oriented towards using version control, git specifically, but that really is not ideal as you either have to commit all your deployable artifacts to your source repository or use a separate repository just for deployment. Having these in version control does mean you can roll back to any version, but I intend to use slots for one level of rollback and will use the source VC to rebuild other versions if ever needed. You may not be bothered about putting built artifacts in your main repository, but I see it as an anti-pattern, not to mention the unnecessary bloat: TypeScript produces JS and typings, and for JavaScript on Node.js you currently need to bundle all the dependencies to get over the slow cold-start times.
  • Another reason for using a separate deployment repository is that your main repository is extremely likely to contain many files that you do NOT want deployed, but which will be.
  • Azure Functions does not support SSH authentication and Git HTTPS authentication options mean you will be prompted for a password unless you use a credential manager.
  • While it would be possible to create/use a separate git repository for deployment, it cannot live in the project directory structure (you can’t nest git repos) and it seems like quite a bit of hassle to get right with scripting. Git submodules may be a solution.

In search of a better solution


For the backend’s Azure Functions deployment, the limitations of FTP are a problem. FTP is slow, is not incremental and will not sync directory trees by removing unwanted files. I’ve not found a simple cross-platform sync tool that does all this. If you know of one, please do tell in the comments!

Git does address all these issues with its fast, minimal on-wire protocols and working-directory syncing. That’s probably why the Azure Functions deployment options are so centred around git. Given this, I should either forget my aversion to having a separate deployment repository or investigate a workable solution, complete with the required git setup. This might use git submodules for a local repo dynamically created in a temporary folder.

The Azure Functions Core Tools do include a non-syncing publish action, but the tool is too dependent on context and has to be run in the correct folder, making it fiddly to use. It is also Windows-only.

A Windows-only solution is David Ebbo’s WAWSDeploy, which wraps msdeploy and does have a delete option. It works really well. You only need the .PublishSettings file you can download from the Azure Function App blade in the portal (Get Publish Profile) and an npm script like this (don’t forget to add *.PublishSettings to .gitignore):
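A sketch of such a script; the script name, folder and .PublishSettings filename are placeholders, and the /d (delete) and /v (verbose) switches are as described in WAWSDeploy’s README:

```json
{
  "scripts": {
    "deploy:api:win": "WAWSDeploy functions-dist myfunctionapp.PublishSettings /d /v"
  }
}
```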

Blob Storage

Again, for the SPA’s Azure Blob storage, I could find no CLI utility that will both sync and set the ContentType. So, until the Azure CLI tool supports both (“PRs welcome”), the answer is to use AzCopy with either an occasional manual cleanup or perhaps a cron job or an Azure Function.

While the Azure CLI might seem the obvious choice, it currently does not set the Blob storage ContentType automatically based on file extension. [Update: A fix has been committed now]

AzCopy does do this and is available for Windows and Linux. Strangely, though, the Windows version does not accept the `-` option switch character, which would be portable. The Azure CLI does not offer a directory sync option either, just a basic copy. In fact, there doesn’t appear to be any tool that does both and is cross-platform.
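For reference, a Windows AzCopy invocation looks something like this (account, container and local path are placeholders); `/SetContentType` with no value infers the MIME type from each file’s extension, `/S` recurses, and `/Y` suppresses confirmation prompts:

```shell
AzCopy /Source:dist /Dest:https://myaccount.blob.core.windows.net/site /DestKey:%STORAGE_KEY% /S /SetContentType /Y
```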

Possible solutions include a new feature in the Azure CLI, an update to AzCopy (not OSS), or a new utility to replace AzCopy.

Alternatively, the site storage could be changed to use a simple static web server Azure Function like Anthony Chu’s. Deployment then becomes identical to the backend, though you’d want a separate Function App as the deployment cadence will be distinct for each. There are obviously trade-offs compared to using Blob storage, such as performance. One obvious question is whether CDN caching will work.


