Last month I talked about the need for Docker-aware configuration management (CM) tools to effectively build and test containers in a CI/CD pipeline. The goal is to not install any extra tooling inside the Docker container that gets published for production use: no sshd, and no CM tooling.

This technical post documents the major steps I took to start using ansible-container for this purpose. The driving metric: CM tooling added roughly 50% on top of my base image, and leaving it out is the Docker Way™ of making tighter containers.

  1. Install ansible-container

    This tool is quite new, so consider getting it straight from its GitHub repository. As of this publishing, pip installs version 0.2 and GitHub has 0.3 (I recommend installing from GitHub because the project is under active development, but the pip version is sufficient to get started).
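    Either route is a one-line install (a sketch; the GitHub URL below is the project's upstream repository):

    pip install ansible-container
    pip install git+https://github.com/ansible/ansible-container.git   # development version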

    Troubleshooting: On CentOS 7, I hit an error during the install and had to upgrade setuptools first
    pip install -U setuptools

  2. Create a GitHub repository

    An Ansible standard is to have every role in its own repository, so if you don’t already have one for your role, make a new one. I named mine kafka (so the GitHub repo URL ends up being jmalacho/kafka). Ansible-galaxy has a nice tie-in to GitHub, so that is the default place for it.

  3. Initialize your git repo

    Clone the [almost] blank repo and use Ansible-galaxy to initialize it

    ansible-galaxy init $repo --force

    If you are unfamiliar with Ansible-galaxy, know that you have to update the platforms section of meta/main.yml if you want Ansible-galaxy to be able to import the role. You don’t need Ansible-galaxy to test local roles, but since this is the only section parsed on import, I went ahead and filled it in.
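    For reference, the relevant piece of meta/main.yml looks something like this (a minimal sketch with placeholder values for my CentOS 7 based role):

    galaxy_info:
      author: jmalacho
      description: Kafka role built with ansible-container   # placeholder description
      license: MIT
      min_ansible_version: 2.1
      platforms:
        - name: EL
          versions:
            - 7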

  4. Initialize the Ansible Project

    Use Ansible-container from inside the repo’s directory to [further] initialize the project. This creates an ansible/ directory with some .yml files in it.

    cd $repo ; ansible-container init

    ansible/main.yml is the equivalent of the playbook that will get executed inside the container environment. Since we are testing roles, this file will be simple.
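    Mine boils down to applying the role under test to every service (a sketch; kafka is my role name):

    - hosts: all
      roles:
        - role: kafka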

    ansible/container.yml is the Docker Compose-like file that defines the Docker environment the build runs in. Take a second to look at the documentation. When building a container, the base image (centos:7 for me) is the most important thing. The oddest thing about this file is that the command will not be your final Docker startup command (because you haven’t built the container yet, so that command doesn’t exist).
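    For illustration, a stripped-down container.yml along these lines worked for me (a sketch assuming the compose-style v1 layout; the service name and placeholder command are mine):

    version: "1"
    services:
      kafka:
        image: centos:7            # the base image is the important part
        command: [sleep, infinity] # placeholder; the real startup command is set in step 6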

  5. Build the container ( the magic step! )

    ansible-container build --roles-path ~/path/ -- -v

    Ansible-container is, by default, better at testing already-published roles than developing new ones, so we have to pass the extra --roles-path argument when the role under test is still in development.

    User Tip: more verbose output is often useful. To pass the verbose flag through to the playbook run, add it to the second set of options (everything after the --).

  6. Re-wrap the resulting container

    docker build -t jmalacho/kafka .

    Awkwardly, the Docker container we built doesn’t have all the right metadata yet, but you can use a simple Dockerfile to redefine the command/entrypoint, and the docker build line to rename (tag) the image with the name you actually want pushed to Docker Hub. This build doesn’t really have to rebuild the container (there are no RUN lines), so it is fast.
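    My re-wrap Dockerfile is only a few lines (a sketch; the FROM image name follows ansible-container's project-service naming for my build, and the entrypoint/command are placeholders for whatever your service actually runs):

    # Source image is whatever ansible-container produced; adjust to match your build
    FROM kafka-kafka:latest
    # Placeholders: set the entrypoint/command your service really needs
    ENTRYPOINT ["/opt/kafka/bin/kafka-server-start.sh"]
    CMD ["/opt/kafka/config/server.properties"]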

  7. Publish the final container

    docker login && docker push

    The only challenging part here is getting credentials into your pipeline securely. There are some options in the docker login documentation, and on Jenkins I usually start with the Credentials Binding plugin.
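    As a sketch, the push stage ends up looking something like this (the variable names are placeholders for whatever your credentials mechanism injects):

    # DOCKER_USER / DOCKER_PASS are placeholder names injected by the pipeline, never hard-coded
    docker login -u "$DOCKER_USER" -p "$DOCKER_PASS"
    docker push jmalacho/kafka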
    We’re done!

X) Iterate

Now that we are set up, we can rebuild our Docker container automatically: on every commit, the pipeline builds the container the same way, every time.

What’s Next?

Testing and automation go hand in hand. In my next Docker post, I’ll talk about ways to kick off integration tests that provide a level of validation beyond “did it build at all?”.

 
