The simple portable solution uses:
- The AzCopy command to deploy the SPA to Blob storage while setting the correct ContentType (this sets the MIME type when served over HTTP)
- The ncftpput FTP client (part of ncftp) to recursively deploy the Function App directory structure, without using any repositories
- The npm dotenv and cross-env packages to provide cross-platform access to private environment variables (both are npm CLI packages)
This is by no means a perfect solution and I explore some potential alternatives below. Note WAWSDeploy is an effective Windows-only solution for the Functions code.
Here are the relevant npm scripts I use:
"deploy:spa": "dotenv cross-env-shell AzCopy /Source:build /Dest:$AZURE_SPA_CONTAINER_URL /DestKey:$AZURE_SPA_STORAGE_KEY /S /Y /SetContentType",
"deploy:func": "dotenv cross-env-shell \"ncftpput -R -u $AZURE_FTP_USER -p $AZURE_FTP_PWD $AZURE_FTP_HOST / functionApp/src/\"",
"deploy": "npm run deploy:spa && npm run deploy:func"
- The Azure Functions FTP settings are now rather hidden under Functions -> Platform Features -> Properties. I used the FTPS host name and the FTP/deployment user. You set a subscription-wide username and password under Functions -> Platform Features -> Deployment Credentials. Note, changing the password requires changing the username to something else and then back again to avoid an 'in use' error.
- Cross-env lets you use a portable $ENVVAR form of environment variable reference in your npm scripts
- Dotenv lets you provide all your deployment variables in a private .env file. Hopefully, it's obvious that you must never check this in to version control, so add it to .gitignore. You can provide a different .env file per deployment scenario, or on a CI/CD server you could just set the env vars directly.
- The extra quotes around the ncftpput command stop dotenv interpreting the command's own switches as its arguments
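On the per-scenario .env point above, one way to select a file per target is the dotenv-cli -e flag (assuming your installed dotenv-cli version supports it; the script names here are hypothetical):

```json
"deploy:staging": "dotenv -e .env.staging -- npm run deploy",
"deploy:prod": "dotenv -e .env.production -- npm run deploy"
```

Each scenario then keeps its own credentials file, all matching the same *.env pattern in .gitignore.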
And sample output from npm run deploy:
C:\projects\brian>npm run deploy
> firstname.lastname@example.org deploy C:\projects\brian
> npm run deploy:spa && npm run deploy:func
> email@example.com deploy:spa C:\projects\brian
> dotenv cross-env-shell AzCopy /Source:build /Dest:$AZURE_SPA_CONTAINER_URL /DestKey:$AZURE_SPA_STORAGE_KEY /S /Y /SetContentType
Finished 5 of total 5 file(s).
[2017/08/12 11:00:52] Transfer summary:
Total files transferred: 5
Transfer successfully: 5
Transfer skipped: 0
Transfer failed: 0
Elapsed time: 00.00:00:02
> firstname.lastname@example.org deploy:func C:\projects\brian
> dotenv cross-env-shell "ncftpput -R -u $AZURE_FTP_USER -p $AZURE_FTP_PWD $AZURE_FTP_HOST / functionApp/src/"
functionApp\src\host.json: 2.00 B 43.07 B/s
functionApp\src\local.settings.json: 1.14 kB 14.83 kB/s
functionApp\src\proxies.json: 721.00 B 9.96 kB/s
functionApp\src\HttpTriggerJS1\function.json: 257.00 B 4.73 kB/s
functionApp\src\HttpTriggerJS1\index.js: 510.00 B 12.45 kB/s
Here's an elided .env file:
AZURE_FTP_HOST=<HOST-DOMAIN> # without the protocol provided in the portal
- A disadvantage of FTP is that it does not do a full sync and delete files that are no longer needed. It's a shame rsync is not supported, though that can be a pain to configure. Indeed, there appears to be no single cross-platform tool that can sync folders to/from the Functions filespace. Perhaps that's a job for another Azure Function.
- AzCopy also does not delete existing files, though it does copy directory trees to a Blob container.
- NcFTP is a mature, portable client and offers directory tree copy, unlike basic clients. However, FTP copying is slow compared to the optimised Git sync methods, especially as there are no incremental updates. A common technique is to transfer a single .ZIP archive and have the server unpack it; unfortunately, this doesn't seem to be available for simple FTP deployments. Perhaps a new Azure Function would fix that. [Update: it can be done using the Kudu REST API, but again this does not delete unwanted files]
- The OneDrive and DropBox deployment options for Azure Functions sound useful, but they require a manual sync in the portal, making them useless for automated deployment.
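The Kudu route from the update note can be sketched as a portable script. This is only a sketch, assuming Kudu's zip upload API (PUT /api/zip/{path}), the same deployment credentials as FTP, and a hypothetical KUDU_HOST variable (e.g. myapp.scm.azurewebsites.net); as noted above, this unpacks without deleting anything:

```shell
# Sketch only: upload a zip of the Functions folder to Kudu's zip API.
# Assumes $AZURE_FTP_USER / $AZURE_FTP_PWD (the deployment credentials)
# and $KUDU_HOST (e.g. myapp.scm.azurewebsites.net) are set.
kudu_zip_push() {
  src=$1
  zip="$(mktemp -u).zip"
  # Build the archive with Python's stdlib so no zip CLI is required
  python3 -c 'import shutil,sys; shutil.make_archive(sys.argv[1][:-4], "zip", sys.argv[2])' "$zip" "$src"
  # Kudu unpacks the upload into the target path on the server
  curl -sf -X PUT -u "$AZURE_FTP_USER:$AZURE_FTP_PWD" \
       --data-binary @"$zip" "https://$KUDU_HOST/api/zip/site/wwwroot/"
  rm -f "$zip"
}
```

This avoids the per-file FTP round trips, at the cost of re-uploading everything each time.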
Why not Git?
- Another reason for using a separate deployment repository is that your main repository is extremely likely to contain many files that you do NOT want deployed, but which will be.
- Azure Functions does not support SSH authentication, and the Git HTTPS authentication options mean you will be prompted for a password unless you use a credential manager.
- While it would be possible to create/use a separate git repository for deployment, it cannot live in the project directory structure (can’t nest git repos) and seems like quite a bit of hassle to get right with scripting. Git submodules may be a solution.
In search of a better solution
For the back end's Azure Function deployment, the limitations of FTP are a problem. FTP is slow, is not incremental, and will not sync directory trees by removing unwanted files. I've not found a simple cross-platform sync tool that does all this. If you know of one, please do tell in the comments!
Git does address all these issues with its fast, minimal on-wire protocols and working directory syncing. That's probably why the Azure Functions deployment options are so centred around git. Given this, I should either forget my aversion to having a separate deployment repository or investigate a workable solution, complete with the required git setup. This might use git submodules for a local repo dynamically created in a temporary folder.
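As a sketch of that last idea (the remote URL here is a hypothetical AZURE_GIT_REMOTE variable holding the Function App's Local Git endpoint), the temporary-repo approach could look like:

```shell
# Sketch: deploy by pushing a throwaway git repo created in a temp folder.
# $AZURE_GIT_REMOTE is an assumed variable holding the app's Local Git URL.
deploy_via_git() {
  src=${1:-functionApp/src}
  tmp=$(mktemp -d)
  cp -R "$src"/. "$tmp"
  (
    cd "$tmp" &&
    git init -q &&
    git add -A &&
    git -c user.name=deploy -c user.email=deploy@localhost \
        commit -q -m "deploy $(date -u +%Y-%m-%dT%H:%M:%SZ)" &&
    # Force-push; the remote history is disposable here
    git push -q -f "$AZURE_GIT_REMOTE" HEAD:master
  )
  rm -rf "$tmp"
}
```

Because each push replaces the remote tree wholesale, this would also get the delete-unwanted-files behaviour that FTP lacks.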
The Azure Functions Core Tools do include a non-syncing publish action, but the tool is too dependent on context and has to be run in the correct folder, making it fiddly to use. It is also Windows-only.
A Windows-only solution is David Ebbo's WAWSDeploy, which wraps msdeploy and does have a delete option. It works really well. You only need to deploy the .PublishSettings file you can download from the Azure Function App blade in the portal (Get Publish Profile) and use an npm script like this (don't forget to add *.PublishSettings to .gitignore):
"deploy:func:win": "<...>\WAWSDeploy.exe functionApp\\build functionApp.PublishSettings /d"
Again, for the SPA's Azure Blob Storage, I could find no CLI utility that will both sync and set the ContentType. So, until the Azure CLI supports both ("PRs welcome"), the answer is to use AzCopy with either an occasional manual cleanup, or perhaps one scheduled with cron or an Azure Function.
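One shape that cleanup pass could take, as a sketch assuming the Azure CLI (az) with the account and key supplied via its usual AZURE_STORAGE_ACCOUNT/AZURE_STORAGE_KEY environment variables, and a local build folder mirroring the container:

```shell
# Sketch only: delete blobs that no longer exist in the local build folder.
# Assumes the az CLI, with credentials in AZURE_STORAGE_ACCOUNT/AZURE_STORAGE_KEY.
cleanup_blobs() {
  container=$1
  az storage blob list --container-name "$container" \
     --query '[].name' --output tsv |
  while read -r blob; do
    # Anything in the container but not in build/ is stale
    if [ ! -f "build/$blob" ]; then
      echo "deleting stale blob: $blob"
      az storage blob delete --container-name "$container" --name "$blob"
    fi
  done
}
```

Run after deploy:spa, this approximates the missing sync-with-delete, at the cost of one list call plus one delete call per stale blob.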
AzCopy does set the ContentType and is available for Windows and Linux. Strangely though, the Windows version does not accept the '-' option switch character, which would be portable. The Azure CLI does not offer a directory sync option either, just a basic copy. In fact, there doesn't appear to be any tool that does both and is cross-platform.
Possible solutions include a new feature in the Azure CLI, an update to AzCopy (not OSS), or a new utility to replace AzCopy.
Alternatively, the site storage could be changed to use a simple static web server Azure Function like Anthony Chu's. Deployment then becomes identical to the back end, though you'd want a separate Function App, as the deployment cadence will be distinct for each. There are obviously trade-offs compared to using Blob storage, such as performance. One obvious question is whether CDN caching will work.