Creating your package
When creating your package for deployment, there are a number of requirements to ensure that the Dashboard can properly release the package to your environment.
The package must not contain a `.git` directory or any environment files (e.g. `.env`).
The package should be created from within the containing directory, so that paths in the archive start with the project directory name. For example, if you have the following setup:
```
my-projects
├── project1
│   ├── file011.txt
│   └── file012.txt
└── project2
    ├── file021.txt
    └── file022.txt
```
and you want to package up `project1`, you should `cd` into `my-projects` and then run `tar -zcf /path/to/store/package.tgz project1`. This will ensure your package contains the expected layout that the Dashboard requires.
To ensure you have the correct layout you can run `tar -tf /path/to/store/package.tgz`; in this example you should see:

```
project1/
project1/file011.txt
project1/file012.txt
```
The following examples of output from `tar -tf` will not work and will either be rejected by the Dashboard on deployment creation or the deployment will fail:

```
./project1/
./project1/file011.txt
./project1/file012.txt
```

```
/path/to/my-projects/project1/
/path/to/my-projects/project1/file011.txt
/path/to/my-projects/project1/file012.txt
```
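If you want to catch these layouts before deploying, you can check the archive listing from the shell. This is a minimal sketch; `check_layout` is a helper name introduced here, not a Dashboard tool:

```shell
#!/bin/sh
# check_layout: verify that every entry in a package tarball is rooted
# at the project directory -- no leading "./" and no absolute paths.
check_layout() {
    if tar -tf "$1" | grep -qE '^(\./|/)'; then
        echo "bad layout: $1" >&2
        return 1
    fi
    echo "layout ok: $1"
}
```

For example, `check_layout /path/to/store/package.tgz` succeeds for the first listing above and fails for the other two.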
BSD Tar vs GNU Tar
If you are using OSX as your operating system you most likely have `bsdtar` installed by default (check with `tar --help`), which can sometimes cause issues when unpacking in a Linux environment, such as:

```
tar: Ignoring unknown extended header keyword `SCHILY.ino'
tar: Ignoring unknown extended header keyword `SCHILY.nlink'
tar: Ignoring unknown extended header keyword `SCHILY.dev'
```
If you see such errors in failed deployment logs, consider installing `gtar` locally to create your packages with. OSX users can do so with `brew install gnu-tar` and then run `gtar` in place of `tar`.
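A small shell sketch of that check, which falls back to Homebrew's `gtar` when the default `tar` is bsdtar (the `TAR` variable is just a local convention introduced here):

```shell
#!/bin/sh
# Use GNU tar wherever it lives: the default "tar" on most Linux
# systems, or Homebrew's "gtar" on OSX where the default is bsdtar.
if tar --version 2>/dev/null | grep -q 'GNU tar'; then
    TAR=tar
elif command -v gtar >/dev/null 2>&1; then
    TAR=gtar
else
    echo "GNU tar not found; on OSX run: brew install gnu-tar" >&2
    exit 1
fi

# Then package with whichever GNU tar was found, e.g.:
# "$TAR" -zcf /path/to/store/package.tgz project1
```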
Deploying your package
You can create a package deployment from the deployment screen by opening the “advanced options” dropdown and choosing “deploy a tar.gz package”. You can also create package deployments using the API.
An example workflow
The power of deploying pre-packaged codebases is that the build script is entirely in your hands, which enables you to customise how you deploy. How you build your packages is up to you, but here's one example of what you could do.
Our configuration for the following example can be found here.
Using CircleCI to build and test our code, we can set up automatic deployments of our develop branch to UAT and automatically queue deployments of our master branch to Production using the Dashboard API.
The process looks like this:
- First we build our codebase after pushing a commit to our chosen repository. In this example we are just running a `composer vendor-expose`. This is the stage where you could compile your front-end files or other dependencies.
- Then we test that built codebase to make sure it is functioning as we need. We are using PHPUnit to run unit tests and a `dev/build` to catch anything else that could be site breaking or cause our deployment to fail.
- If the testing step is successful then we tar compress the codebase and store it as an artifact in CircleCI. Artifacts are accessible to whoever has access to the repo CircleCI is building from, but can also be accessed with a token in the GET parameters.
- Once we have packaged our code we call the Dashboard API to queue the deployment for prod or run the deployment for UAT.
We have created two CircleCI environment variables:
- `CIRCLE_CD_TOKEN` - Your API token for CircleCI, used to access build artifact locations and posted to the Dashboard to allow it to download your artifact.
- `DASH_TOKEN` - Your API token for the Dashboard, used to call the API and create the deployments.
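Putting the two tokens together, the final step of the workflow might look like the sketch below. The artifact URL, the Dashboard endpoint, and the payload field name are illustrative placeholders only; take the real endpoint and parameters from the Dashboard API documentation:

```shell
#!/bin/sh
# Queue a deployment by handing the Dashboard the artifact URL.
# Everything marked "placeholder" below is invented for illustration.
: "${CIRCLE_CD_TOKEN:=circle-token-placeholder}"
: "${DASH_TOKEN:=dash-token-placeholder}"

# CircleCI artifacts can be fetched with the token as a GET parameter.
ARTIFACT_URL="https://circleci.example.com/artifacts/package.tgz?circle-token=${CIRCLE_CD_TOKEN}"  # placeholder

# Print the request instead of sending it, since the endpoint below is
# a placeholder; drop the echo once you have the real URL and payload.
echo curl -s -X POST \
    -H "Authorization: Bearer ${DASH_TOKEN}" \
    -d "package_url=${ARTIFACT_URL}" \
    "https://dashboard.example.com/api/deploys"
```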
Feel free to build on this configuration for your own builds but remember to update:
- The CircleCI location on lines 98 and 117
- The email address and stack code on lines 100 and 119
Do I have to use CircleCI?
No, the only requirements for your package are those listed in the “Creating your package” section on this page, and that a `dev/build` can run to completion on it. Outside of that you can create your package however you like.
Does it still use a .platform.yml?
Yes, if you are deploying a package to your base stack the `.platform.yml` file is still read.
What happens to my packages after they are deployed?
Packages are stored on the Dashboard server and can be reused multiple times without the Dashboard needing to download them again. There is no guarantee of how long a package will remain on the Dashboard server, so do not use it as your primary storage.
Do my package names have to be unique?
No, each package you upload can be named the same thing. Packages are identified internally with a unique ID, so the URL a package comes from can be the same every time without fear of overwriting old packages.