DocOps at ATIX – the deploy stage: Automatically deploying to an internal HTTP server via our documentation GitLab pipeline

This short blog series explains how we automatically test, build, and deploy documentation, mostly using vim, git, Sphinx, Docker, and GitLab. This "DocOps at ATIX" blog article describes how we deploy our built documentation to an internal HTTP server via our documentation GitLab pipeline.

The first and second stages test and build the documentation.
To make our GitLab pipeline even more convenient, we also deploy the build artifacts to an HTTP server.
In our case, we extract the HTML version, the website, and the style guide, and copy the PDF version to its designated location: a subdirectory named after its corresponding git branch.
GitLab provides the branch name in URL-safe form via the predefined environment variable ${CI_COMMIT_REF_SLUG}: the name is lowercased and all characters other than 0-9 and a-z are replaced with hyphens, so a branch called feature/New-Docs becomes feature-new-docs.

To do so, we have a third stage called deploy.
All deployment jobs share common tasks, which are described in the hidden .deploy job.
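The three stages can be declared at the top of the .gitlab-ci.yml file. This is a minimal sketch; the stage names follow the job prefixes used in this series, everything else about the file layout is assumed:

```yaml
# Sketch of the stage declaration in .gitlab-ci.yml;
# the stage names come from the article, the placement is assumed.
stages:
  - test
  - build
  - deploy
```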

The deployment jobs use an Alpine-based Docker image.
Before running the actual script, we prepare SSH access by writing the SSH_KNOWN_HOSTS variable into ~/.ssh/known_hosts and adding a prepared SSH private key.
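The hidden .deploy job itself is not shown here; a minimal sketch of it could look as follows, assuming an alpine image and an ssh-agent-based key setup — the package list and the exact before_script steps are assumptions based on the description above:

```yaml
# Hypothetical sketch of the hidden .deploy job; image, packages,
# and before_script steps are assumptions.
.deploy:
  stage: deploy
  image: alpine:latest
  before_script:
    - apk add --no-cache openssh-client ansible        # tools the deploy scripts need
    - mkdir -p ~/.ssh && chmod 700 ~/.ssh
    - echo "${SSH_KNOWN_HOSTS}" > ~/.ssh/known_hosts   # trust the target host
    - eval "$(ssh-agent -s)"
    - echo "${SSH_PRIVATE_KEY}" | tr -d '\r' | ssh-add -   # load the deploy key
```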

The following job deploys the website to our actual web server running docs.orcharhino.com.
This job only runs when started manually in our GitLab pipeline, as we mostly push documentation online in sync with new orcharhino releases.
The only exceptions are bug fixes.

---
deploy:docs.orcharhino.com:
  extends: .deploy
  dependencies:
    - build:website
  environment:
    name: production
    url: https://docs.orcharhino.com/
  variables:
    SSH_KNOWN_HOSTS: "${PROD_SSH_KNOWN_HOSTS}"
    SSH_PRIVATE_KEY: "${PROD_SSH_PRIVATE_KEY}"
    TARGET_HOST: "${PROD_HOST_IP}"
  script:
    - set -u
    - ansible-playbook ci_resources/prod_deployment.yaml -i "${TARGET_HOST},"
  only:
    refs:
      - docs_orcharhino_com
  when: manual
...

The ${...} variables refer to GitLab CI/CD variables that are simply renamed here.
This job only runs when started manually.
It runs an Ansible playbook to deploy the documentation to our production server for docs.orcharhino.com; the trailing comma after ${TARGET_HOST} tells ansible-playbook to treat the argument as an inline list of hosts rather than as a path to an inventory file.
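The playbook itself is not part of this article; as an illustration, a prod_deployment.yaml along these lines would copy the website artifact into the web root — the paths, module choice, and task name below are assumptions:

```yaml
# Hypothetical sketch of ci_resources/prod_deployment.yaml; the web root
# path and privilege escalation are assumptions.
- hosts: all
  become: true
  tasks:
    - name: Copy the built website into the web root
      ansible.builtin.copy:
        src: website/                       # build artifact from build:website
        dest: /var/www/docs.orcharhino.com/
```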

---
.test_deploy:
  extends: .deploy
  variables:
    SSH_KNOWN_HOSTS: "${DEV_SSH_KNOWN_HOSTS}"
    SSH_PRIVATE_KEY: "${DEV_SSH_PRIVATE_KEY}"
    TARGET_HOST: "dev-docs.example.com"

deploy:test_deployment:
  extends: .test_deploy
  dependencies:
    - build:website
    - build:html
    - build:pdf
    - build:style_guide
  environment:
    name: dev ${CI_COMMIT_REF_NAME}
    url: "http://dev-docs.example.com/branches/${CI_COMMIT_REF_SLUG}"
    on_stop: deploy:stop_test_deployment
  script:
    - set -u
    - ansible-playbook ci_resources/test_deployment.yaml
        --inventory "${TARGET_HOST},"
        --extra-vars "branch_slug=${CI_COMMIT_REF_SLUG}"
...

The test_deployment job extends the hidden .test_deploy job, which holds internal variables defined in GitLab.
It depends on all four build stage jobs, as it only makes sense to deploy the documentation once it has been built successfully.
It also runs an Ansible playbook, but with a different inventory containing the internal HTTP server, and passes an extra variable called branch_slug to create a folder named after the feature branch, ${CI_COMMIT_REF_SLUG}.
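A test_deployment.yaml matching this description might look roughly as follows; the web root path and task names are assumptions — only the branch_slug variable comes from the job definition above:

```yaml
# Hypothetical sketch of ci_resources/test_deployment.yaml;
# branch_slug is supplied via --extra-vars, the paths are assumed.
- hosts: all
  become: true
  tasks:
    - name: Create a directory named after the branch
      ansible.builtin.file:
        path: "/var/www/branches/{{ branch_slug }}"
        state: directory
        mode: "0755"

    - name: Copy the HTML build artifact into the branch directory
      ansible.builtin.copy:
        src: html/                          # build artifact from build:html
        dest: "/var/www/branches/{{ branch_slug }}/"
```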

Overall, this three-part mini-series allows you to rebuild our GitLab pipeline to automatically test, build, and deploy Sphinx-based documentation.
It helps us focus on writing documentation rather than on testing, building, and deploying it.
Using this GitLab pipeline makes asking for reviews much more convenient and saves us a lot of time overall.