Renamed production.yml to docker-compose.production.yml

Matthew Foster Walsh 2024-04-15 13:10:31 -04:00
parent 0338748969
commit 1a8617a548
7 changed files with 26 additions and 22 deletions

View File

@@ -14,7 +14,7 @@ Prerequisites
Understanding the Docker Compose Setup
--------------------------------------
-Before you begin, check out the ``production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
+Before you begin, check out the ``docker-compose.production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
* ``django``: your application running behind ``Gunicorn``;
* ``postgres``: PostgreSQL database with the application's relational data;
@@ -107,7 +107,7 @@ To solve this, you can either:
2. create a ``.env`` file in the root of the project with just the variables you need. You'll also need to define them in ``.envs/.production/.django`` (hence duplicating them).
3. set these variables when running the build command::
-DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f production.yml build``.
+DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f docker-compose.production.yml build
None of these options are ideal; we're open to suggestions on how to improve this. If you think you have one, please open an issue or a pull request.
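For instance, option 2 might look like the following (a minimal sketch; ``DJANGO_AWS_S3_CUSTOM_DOMAIN`` stands in for whichever variable your build actually needs)::

    # .env -- picked up by Docker Compose for variable substitution
    DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com

    # .envs/.production/.django -- duplicated so the running containers see it too
    DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com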
@@ -122,42 +122,42 @@ Building & Running Production Stack
You will need to build the stack first. To do that, run::
-docker compose -f production.yml build
+docker compose -f docker-compose.production.yml build
Once this is ready, you can run it with::
-docker compose -f production.yml up
+docker compose -f docker-compose.production.yml up
To run the stack and detach the containers, run::
-docker compose -f production.yml up -d
+docker compose -f docker-compose.production.yml up -d
To run a migration, open up a second terminal and run::
-docker compose -f production.yml run --rm django python manage.py migrate
+docker compose -f docker-compose.production.yml run --rm django python manage.py migrate
To create a superuser, run::
-docker compose -f production.yml run --rm django python manage.py createsuperuser
+docker compose -f docker-compose.production.yml run --rm django python manage.py createsuperuser
If you need a shell, run::
-docker compose -f production.yml run --rm django python manage.py shell
+docker compose -f docker-compose.production.yml run --rm django python manage.py shell
To check the logs out, run::
-docker compose -f production.yml logs
+docker compose -f docker-compose.production.yml logs
If you want to scale your application, run::
-docker compose -f production.yml up --scale django=4
-docker compose -f production.yml up --scale celeryworker=2
+docker compose -f docker-compose.production.yml up --scale django=4
+docker compose -f docker-compose.production.yml up --scale celeryworker=2
.. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``traefik``; each of these must run as a single instance (scaling them would duplicate scheduled tasks or cause port and volume conflicts).
To see how your containers are doing, run::
-docker compose -f production.yml ps
+docker compose -f docker-compose.production.yml ps
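To stop the stack and remove the containers (standard Docker Compose behaviour, nothing specific to this setup), run::

    docker compose -f docker-compose.production.yml down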
Example: Supervisor
@@ -165,12 +165,12 @@ Example: Supervisor
Once you are ready with your initial setup, you want to make sure that your application is run by a process manager so it
survives reboots and restarts automatically in case of an error. You can use the process manager you are most familiar with. All
-it needs to do is to run ``docker compose -f production.yml up`` in your projects root directory.
+it needs to do is to run ``docker compose -f docker-compose.production.yml up`` in your project's root directory.
If you are using ``supervisor``, you can use this file as a starting point::
[program:{{cookiecutter.project_slug}}]
-command=docker compose -f production.yml up
+command=docker compose -f docker-compose.production.yml up
directory=/path/to/{{cookiecutter.project_slug}}
redirect_stderr=true
autostart=true
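Once the file is in place under supervisor's ``conf.d`` directory, loading and starting the program looks roughly like this (a sketch; exact paths and steps may vary by distribution)::

    supervisorctl reread
    supervisorctl update
    supervisorctl start {{cookiecutter.project_slug}}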

View File

@@ -34,7 +34,7 @@ This can take a while, especially the first time you run this particular command
$ docker compose -f docker-compose.local.yml build
-Generally, if you want to emulate production environment use ``production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
+Generally, if you want to emulate the production environment, use ``docker-compose.production.yml`` instead. The same is true for any other action you might need to perform: whenever a switch is required, just do it!
Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then::
@@ -231,7 +231,7 @@ Prerequisites:
* ``use_docker`` was set to ``y`` on project initialization;
* ``use_celery`` was set to ``y`` on project initialization.
-By default, it's enabled both in local and production environments (``docker-compose.local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
+By default, it's enabled in both local and production environments (the ``docker-compose.local.yml`` and ``docker-compose.production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to authenticate with the credentials set via the ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables in ``.envs/.local/.django`` and ``.envs/.production/.django``, respectively. Check out ``localhost:5555`` and see for yourself.
.. _`Flower`: https://github.com/mher/flower
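As an illustration, the production credentials might be set like this (hypothetical placeholder values; the variable names come from the paragraph above)::

    # .envs/.production/.django
    CELERY_FLOWER_USER=flower-admin
    CELERY_FLOWER_PASSWORD=use-a-long-random-string-here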

View File

@@ -1,7 +1,7 @@
PostgreSQL Backups with Docker
==============================
-.. note:: For brevity it is assumed that you will be running the below commands against local environment, however, this is by no means mandatory so feel free to switch to ``production.yml`` when needed.
+.. note:: For brevity, it is assumed that you will be running the commands below against the local environment; this is by no means mandatory, however, so feel free to switch to ``docker-compose.production.yml`` when needed.
Prerequisites
@@ -95,8 +95,8 @@ Backup to Amazon S3
To upload your backups to Amazon S3, you can use the aws cli container. It provides an ``upload`` command that recursively uploads the postgres ``/backups`` directory, and a ``download`` command that fetches a specific backup. The default S3 environment variables are used. ::
-$ docker compose -f production.yml run --rm awscli upload
-$ docker compose -f production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz
+$ docker compose -f docker-compose.production.yml run --rm awscli upload
+$ docker compose -f docker-compose.production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz
Remove Backup
----------------------------------

View File

@@ -78,7 +78,11 @@ def remove_docker_files():
shutil.rmtree(".devcontainer")
shutil.rmtree("compose")
file_names = ["docker-compose.local.yml", "production.yml", ".dockerignore"]
file_names = [
"docker-compose.local.yml",
"docker-compose.production.yml",
".dockerignore",
]
for file_name in file_names:
os.remove(file_name)
if "{{ cookiecutter.editor }}" == "PyCharm":

View File

@@ -3,7 +3,7 @@
### Download a file from your Amazon S3 bucket to the postgres /backups folder
###
### Usage:
-### $ docker compose -f production.yml run --rm awscli <1>
+### $ docker compose -f docker-compose.production.yml run --rm awscli <1>
set -o errexit
set -o pipefail

View File

@@ -3,7 +3,7 @@
### Upload the /backups folder to Amazon S3
###
### Usage:
-### $ docker compose -f production.yml run --rm awscli upload
+### $ docker compose -f docker-compose.production.yml run --rm awscli upload
set -o errexit
set -o pipefail