- rename local.yml to docker-compose.yml
- rename production.yml to docker-compose.prod.yml
- update the docs
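The rename itself is mechanical; a minimal sketch of the two file moves this commit performs (run here against a scratch directory with empty stand-in files, so nothing real is touched):

```shell
# Stand-in for the repository root; the touch calls fake the two old files.
workdir=$(mktemp -d)
cd "$workdir"
touch local.yml production.yml

# The two renames this commit performs. After this, plain `docker-compose`
# picks up docker-compose.yml by default, so local commands no longer need -f.
mv local.yml docker-compose.yml
mv production.yml docker-compose.prod.yml

ls -1
```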
Uyen Minh Do 2018-10-11 09:38:11 +07:00
parent 1bef404355
commit 423a066da2
9 changed files with 40 additions and 44 deletions


@@ -14,7 +14,7 @@ Prerequisites
 Understanding the Docker Compose Setup
 --------------------------------------
-Before you begin, check out the ``production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
+Before you begin, check out the ``docker-compose.prod.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
 * ``django``: your application running behind ``Gunicorn``;
 * ``postgres``: PostgreSQL database with the application's relational data;
@@ -81,42 +81,42 @@ Building & Running Production Stack
 You will need to build the stack first. To do that, run::
-    docker-compose -f production.yml build
+    docker-compose -f docker-compose.prod.yml build
 Once this is ready, you can run it with::
-    docker-compose -f production.yml up
+    docker-compose -f docker-compose.prod.yml up
 To run the stack and detach the containers, run::
-    docker-compose -f production.yml up -d
+    docker-compose -f docker-compose.prod.yml up -d
 To run a migration, open up a second terminal and run::
-    docker-compose -f production.yml run --rm django python manage.py migrate
+    docker-compose -f docker-compose.prod.yml run --rm django python manage.py migrate
 To create a superuser, run::
-    docker-compose -f production.yml run --rm django python manage.py createsuperuser
+    docker-compose -f docker-compose.prod.yml run --rm django python manage.py createsuperuser
 If you need a shell, run::
-    docker-compose -f production.yml run --rm django python manage.py shell
+    docker-compose -f docker-compose.prod.yml run --rm django python manage.py shell
 To check the logs out, run::
-    docker-compose -f production.yml logs
+    docker-compose -f docker-compose.prod.yml logs
 If you want to scale your application, run::
-    docker-compose -f production.yml scale django=4
-    docker-compose -f production.yml scale celeryworker=2
+    docker-compose -f docker-compose.prod.yml scale django=4
+    docker-compose -f docker-compose.prod.yml scale celeryworker=2
 .. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``caddy``.
 To see how your containers are doing run::
-    docker-compose -f production.yml ps
+    docker-compose -f docker-compose.prod.yml ps
 Example: Supervisor
@@ -124,12 +124,12 @@ Example: Supervisor
 Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to
 survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. All
-it needs to do is to run ``docker-compose -f production.yml up`` in your projects root directory.
+it needs to do is to run ``docker-compose -f docker-compose.prod.yml up`` in your project's root directory.
 If you are using ``supervisor``, you can use this file as a starting point::
     [program:{{cookiecutter.project_slug}}]
-    command=docker-compose -f production.yml up
+    command=docker-compose -f docker-compose.prod.yml up
     directory=/path/to/{{cookiecutter.project_slug}}
     redirect_stderr=true
     autostart=true
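The supervisor snippet above can be dropped into supervisor's include directory and reloaded; a sketch (the project name and directory path are placeholders, and ``autorestart`` is added here so a crashed stack comes back up):

```shell
# Write the program file locally; on a real host this would live in
# /etc/supervisor/conf.d/ (the path varies by distro).
cat > my_project.conf <<'EOF'
[program:my_project]
command=docker-compose -f docker-compose.prod.yml up
directory=/path/to/my_project
redirect_stderr=true
autostart=true
autorestart=true
EOF

# On the server, supervisor then picks it up with:
#   supervisorctl reread && supervisorctl update
grep '^command=' my_project.conf
```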


@@ -33,9 +33,9 @@ Build the Stack
 This can take a while, especially the first time you run this particular command on your development system::
-    $ docker-compose -f local.yml build
+    $ docker-compose build
-Generally, if you want to emulate production environment use ``production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
+Generally, if you want to emulate the production environment, use ``docker-compose.prod.yml`` instead. This is true for any other action you might need to perform: whenever a switch is required, just do it!
 Run the Stack
@@ -45,11 +45,7 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak
 Open a terminal at the project root and run the following for local development::
-    $ docker-compose -f local.yml up
-You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this::
-    $ export COMPOSE_FILE=local.yml
     $ docker-compose up
 And then run::
@@ -63,10 +59,10 @@ To run in a detached (background) mode, just::
 Execute Management Commands
 ---------------------------
-As with any shell command that we wish to run in our container, this is done using the ``docker-compose -f local.yml run --rm`` command: ::
+As with any shell command that we wish to run in our container, this is done using the ``docker-compose run --rm`` command: ::
-    $ docker-compose -f local.yml run --rm django python manage.py migrate
-    $ docker-compose -f local.yml run --rm django python manage.py createsuperuser
+    $ docker-compose run --rm django python manage.py migrate
+    $ docker-compose run --rm django python manage.py createsuperuser
 Here, ``django`` is the target service we are executing the commands against.
@@ -82,7 +78,7 @@ When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost',
 Configuring the Environment
 ---------------------------
-This is the excerpt from your project's ``local.yml``: ::
+This is an excerpt from your project's ``docker-compose.yml``: ::
     # ...
@@ -151,7 +147,7 @@ If you are using the following within your code to debug: ::
 Then you may need to run the following for it to work as desired: ::
-    $ docker-compose -f local.yml run --rm --service-ports django
+    $ docker-compose run --rm --service-ports django
 django-debug-toolbar
@@ -184,6 +180,6 @@ Prerequisites:
 * ``use_docker`` was set to ``y`` on project initialization;
 * ``use_celery`` was set to ``y`` on project initialization.
-By default, it's enabled both in local and production environments (``local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
+By default, it's enabled both in local and production environments (``docker-compose.yml`` and ``docker-compose.prod.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
 .. _`Flower`: https://github.com/mher/flower
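Since ``flower`` reads its credentials from those env files, the two variables need values before the stack comes up. One way to generate random ones (a sketch: the variable names match the docs above, but the generation method is just an example):

```shell
# Generate throwaway credentials; append them to .envs/.production/.django
# (or .envs/.local/.django) by hand or with a redirect.
CELERY_FLOWER_USER=$(LC_ALL=C tr -dc 'a-z0-9' < /dev/urandom | head -c 16)
CELERY_FLOWER_PASSWORD=$(LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 32)
echo "CELERY_FLOWER_USER=${CELERY_FLOWER_USER}"
echo "CELERY_FLOWER_PASSWORD=${CELERY_FLOWER_PASSWORD}"
```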


@@ -1,14 +1,14 @@
 PostgreSQL Backups with Docker
 ==============================
-.. note:: For brevity it is assumed that you will be running the below commands against local environment, however, this is by no means mandatory so feel free to switch to ``production.yml`` when needed.
+.. note:: For brevity it is assumed that you will be running the below commands against the local environment; however, this is by no means mandatory, so feel free to switch to ``docker-compose.prod.yml`` when needed.
 Prerequisites
 -------------
 #. the project was generated with ``use_docker`` set to ``y``;
-#. the stack is up and running: ``docker-compose -f local.yml up -d postgres``.
+#. the stack is up and running: ``docker-compose up -d postgres``.
 Creating a Backup
@@ -16,7 +16,7 @@ Creating a Backup
 To create a backup, run::
-    $ docker-compose -f local.yml exec postgres backup
+    $ docker-compose exec postgres backup
 Assuming your project's database is named ``my_project``, here is what you will see: ::
@@ -31,7 +31,7 @@ Viewing the Existing Backups
 To list existing backups, ::
-    $ docker-compose -f local.yml exec postgres backups
+    $ docker-compose exec postgres backups
 These are the sample contents of ``/backups``: ::
@@ -63,7 +63,7 @@ Restoring from the Existing Backup
 To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), ::
-    $ docker-compose -f local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
+    $ docker-compose exec postgres restore backup_2018_03_13T09_05_07.sql.gz
 You will see something like ::


@@ -59,7 +59,7 @@ def remove_pycharm_files():
 def remove_docker_files():
     shutil.rmtree("compose")
-    file_names = ["local.yml", "production.yml", ".dockerignore"]
+    file_names = ["docker-compose.yml", "docker-compose.prod.yml", ".dockerignore"]
     for file_name in file_names:
         os.remove(file_name)


@@ -15,13 +15,13 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y
 cd my_awesome_project
 # run the project's type checks
-docker-compose -f local.yml run django mypy my_awesome_project
+docker-compose run django mypy my_awesome_project
 # run the project's tests
-docker-compose -f local.yml run django pytest
+docker-compose run django pytest
 # return non-zero status code if there are migrations that have not been created
-docker-compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; }
+docker-compose run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; }
 # Test support for translations
-docker-compose -f local.yml run django python manage.py makemessages
+docker-compose run django python manage.py makemessages


@@ -29,7 +29,7 @@ The Docker compose tool (previously known as `fig`_) makes linking these contain
     webserver/
         Dockerfile
         ...
-    production.yml
+    docker-compose.prod.yml
 Each component of your application would get its own `Dockerfile`_. The rest of this example assumes you are using the `base postgres image`_ for your database. Your database settings in `config/base.py` might then look something like:
@@ -48,7 +48,7 @@ Each component of your application would get its own `Dockerfile`_. The rest of
     }
 }
-The `Docker compose documentation`_ explains in detail what you can accomplish in the `production.yml` file, but an example configuration might look like this:
+The `Docker compose documentation`_ explains in detail what you can accomplish in the `docker-compose.prod.yml` file, but an example configuration might look like this:
 .. _Docker compose documentation: https://docs.docker.com/compose/#compose-documentation
@@ -107,9 +107,9 @@ We'll ignore the webserver for now (you'll want to comment that part out while w
     # uncomment the line below to use container as a non-root user
     USER python:python
-Running `sudo docker-compose -f production.yml build` will follow the instructions in your `production.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
+Running `sudo docker-compose -f docker-compose.prod.yml build` will follow the instructions in your `docker-compose.prod.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode, but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
-Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose -f production.yml up` and browse to `localhost:8000` to see your application running.
+Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose -f docker-compose.prod.yml up` and browse to `localhost:8000` to see your application running.
 Deployment
 ^^^^^^^^^^
@@ -155,7 +155,7 @@ That Dockerfile assumes you have an Nginx conf file named `site.conf` in the sam
     }
 }
-Running `sudo docker-compose -f production.yml build webserver` will build your server container. Running `sudo docker-compose -f production.yml up` will now expose your application directly on `localhost` (no need to specify the port number).
+Running `sudo docker-compose -f docker-compose.prod.yml build webserver` will build your server container. Running `sudo docker-compose -f docker-compose.prod.yml up` will now expose your application directly on `localhost` (no need to specify the port number).
 Building and running your app on EC2
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -166,9 +166,9 @@ All you now need to do to run your app in production is:
 * Install your preferred source control solution, Docker and Docker compose on the new instance.
-* Pull in your code from source control. The root directory should be the one with your `production.yml` file in it.
+* Pull in your code from source control. The root directory should be the one with your `docker-compose.prod.yml` file in it.
-* Run `sudo docker-compose -f production.yml build` and `sudo docker-compose -f production.yml up`.
+* Run `sudo docker-compose -f docker-compose.prod.yml build` and `sudo docker-compose -f docker-compose.prod.yml up`.
 * Assign an `Elastic IP address`_ to your new machine.


@@ -21,7 +21,7 @@ Next, you have to add new remote python interpreter, based on already tested dep
 .. image:: images/3.png
-Switch to *Docker Compose* and select `local.yml` file from directory of your project, next set *Service name* to `django`
+Switch to *Docker Compose*, select the `docker-compose.yml` file from your project directory, and set *Service name* to `django`
 .. image:: images/4.png