diff --git a/docs/deployment-with-docker.rst b/docs/deployment-with-docker.rst index 3d2f9f81..ebc42a52 100644 --- a/docs/deployment-with-docker.rst +++ b/docs/deployment-with-docker.rst @@ -14,7 +14,7 @@ Prerequisites Understanding the Docker Compose Setup -------------------------------------- -Before you begin, check out the ``production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services: +Before you begin, check out the ``docker-compose.production.yml`` file in the root of this project. Take note of how it provides configuration for the following services: * ``django``: your application running behind ``Gunicorn``; * ``postgres``: PostgreSQL database with the application's relational data; @@ -107,7 +107,7 @@ To solve this, you can either: 2. create a ``.env`` file in the root of the project with just variables you need. You'll need to also define them in ``.envs/.production/.django`` (hence duplicating them). 3. set these variables when running the build command:: - DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f production.yml build``. + DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f docker-compose.production.yml build None of these options are ideal, we're open to suggestions on how to improve this. If you think you have one, please open an issue or a pull request. @@ -122,42 +122,42 @@ Building & Running Production Stack You will need to build the stack first. 
To do that, run:: - docker compose -f production.yml build + docker compose -f docker-compose.production.yml build Once this is ready, you can run it with:: - docker compose -f production.yml up + docker compose -f docker-compose.production.yml up To run the stack and detach the containers, run:: - docker compose -f production.yml up -d + docker compose -f docker-compose.production.yml up -d To run a migration, open up a second terminal and run:: - docker compose -f production.yml run --rm django python manage.py migrate + docker compose -f docker-compose.production.yml run --rm django python manage.py migrate To create a superuser, run:: - docker compose -f production.yml run --rm django python manage.py createsuperuser + docker compose -f docker-compose.production.yml run --rm django python manage.py createsuperuser If you need a shell, run:: - docker compose -f production.yml run --rm django python manage.py shell + docker compose -f docker-compose.production.yml run --rm django python manage.py shell To check the logs out, run:: - docker compose -f production.yml logs + docker compose -f docker-compose.production.yml logs If you want to scale your application, run:: - docker compose -f production.yml up --scale django=4 - docker compose -f production.yml up --scale celeryworker=2 + docker compose -f docker-compose.production.yml up --scale django=4 + docker compose -f docker-compose.production.yml up --scale celeryworker=2 .. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``traefik``. To see how your containers are doing run:: - docker compose -f production.yml ps + docker compose -f docker-compose.production.yml ps Example: Supervisor @@ -165,12 +165,12 @@ Example: Supervisor Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. 
All -it needs to do is to run ``docker compose -f production.yml up`` in your projects root directory. +it needs to do is to run ``docker compose -f docker-compose.production.yml up`` in your project's root directory. If you are using ``supervisor``, you can use this file as a starting point:: [program:{{cookiecutter.project_slug}}] - command=docker compose -f production.yml up + command=docker compose -f docker-compose.production.yml up directory=/path/to/{{cookiecutter.project_slug}} redirect_stderr=true autostart=true diff --git a/docs/developing-locally-docker.rst b/docs/developing-locally-docker.rst index 01970e46..83de99bb 100644 --- a/docs/developing-locally-docker.rst +++ b/docs/developing-locally-docker.rst @@ -32,9 +32,9 @@ Build the Stack This can take a while, especially the first time you run this particular command on your development system:: - $ docker compose -f local.yml build + $ docker compose -f docker-compose.local.yml build -Generally, if you want to emulate production environment use ``production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it! +Generally, if you want to emulate the production environment, use ``docker-compose.production.yml`` instead. The same applies to any other action you might need to perform: whenever a switch is required, just do it! Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then:: @@ -51,11 +51,11 @@ This brings up both Django and PostgreSQL. 
The first time it is run it might tak Open a terminal at the project root and run the following for local development:: - $ docker compose -f local.yml up + $ docker compose -f docker-compose.local.yml up -You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this:: +You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this:: - $ export COMPOSE_FILE=local.yml + $ export COMPOSE_FILE=docker-compose.local.yml And then run:: @@ -67,21 +67,21 @@ To run in a detached (background) mode, just:: These commands don't run the docs service. In order to run docs service you can run:: - $ docker compose -f docs.yml up + $ docker compose -f docker-compose.docs.yml up To run the docs with local services just use:: - $ docker compose -f local.yml -f docs.yml up + $ docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise. Execute Management Commands --------------------------- -As with any shell command that we wish to run in our container, this is done using the ``docker compose -f local.yml run --rm`` command: :: +As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: :: - $ docker compose -f local.yml run --rm django python manage.py migrate - $ docker compose -f local.yml run --rm django python manage.py createsuperuser + $ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate + $ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser Here, ``django`` is the target service we are executing the commands against. Also, please note that the ``docker exec`` does not work for running management commands. 
@@ -97,7 +97,7 @@ When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost', Configuring the Environment --------------------------- -This is the excerpt from your project's ``local.yml``: :: +This is the excerpt from your project's ``docker-compose.local.yml``: :: # ... @@ -163,8 +163,8 @@ You have to modify the relevant requirement file: base, local or production by a To get this change picked up, you'll need to rebuild the image(s) and restart the running container: :: - docker compose -f local.yml build - docker compose -f local.yml up + docker compose -f docker-compose.local.yml build + docker compose -f docker-compose.local.yml up Debugging ~~~~~~~~~ @@ -178,7 +178,7 @@ If you are using the following within your code to debug: :: Then you may need to run the following for it to work as desired: :: - $ docker compose -f local.yml run --rm --service-ports django + $ docker compose -f docker-compose.local.yml run --rm --service-ports django django-debug-toolbar @@ -231,7 +231,7 @@ Prerequisites: * ``use_docker`` was set to ``y`` on project initialization; * ``use_celery`` was set to ``y`` on project initialization. -By default, it's enabled both in local and production environments (``local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself. +By default, it's enabled both in local and production environments (``docker-compose.local.yml`` and ``docker-compose.production.yml`` Docker Compose configs, respectively) through a ``flower`` service. 
For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself. .. _`Flower`: https://github.com/mher/flower @@ -279,7 +279,7 @@ certs Take the certificates that you generated and place them in a folder called ``certs`` in the project's root folder. Assuming that you registered your local hostname as ``my-dev-env.local``, the certificates you will put in the folder should have the names ``my-dev-env.local.crt`` and ``my-dev-env.local.key``. -local.yml -~~~~~~~~~ +docker-compose.local.yml +~~~~~~~~~~~~~~~~~~~~~~~~ #. Add the ``nginx-proxy`` service. :: @@ -323,7 +323,7 @@ You should allow the new hostname. :: Rebuild your ``docker`` application. :: - $ docker compose -f local.yml up -d --build + $ docker compose -f docker-compose.local.yml up -d --build Go to your browser and type in your URL bar ``https://my-dev-env.local`` @@ -343,9 +343,9 @@ Webpack If you are using Webpack: -1. On the ``nginx-proxy`` service in ``local.yml``, change ``depends_on`` to ``node`` instead of ``django``. +1. On the ``nginx-proxy`` service in ``docker-compose.local.yml``, change ``depends_on`` to ``node`` instead of ``django``. -2. On the ``node`` service in ``local.yml``, add the following environment configuration: +2. On the ``node`` service in ``docker-compose.local.yml``, add the following environment configuration: :: diff --git a/docs/docker-postgres-backups.rst b/docs/docker-postgres-backups.rst index 302e4c4b..d214ee4e 100644 --- a/docs/docker-postgres-backups.rst +++ b/docs/docker-postgres-backups.rst @@ -1,14 +1,14 @@ PostgreSQL Backups with Docker ============================== -.. 
note:: For brevity it is assumed that you will be running the below commands against local environment, however, this is by no means mandatory so feel free to switch to ``production.yml`` when needed. +.. note:: For brevity, it is assumed that you will be running the commands below against the local environment; however, this is by no means mandatory, so feel free to switch to ``docker-compose.production.yml`` when needed. Prerequisites ------------- #. the project was generated with ``use_docker`` set to ``y``; -#. the stack is up and running: ``docker compose -f local.yml up -d postgres``. +#. the stack is up and running: ``docker compose -f docker-compose.local.yml up -d postgres``. Creating a Backup @@ -16,7 +16,7 @@ Creating a Backup To create a backup, run:: - $ docker compose -f local.yml exec postgres backup + $ docker compose -f docker-compose.local.yml exec postgres backup Assuming your project's database is named ``my_project`` here is what you will see: :: @@ -31,7 +31,7 @@ Viewing the Existing Backups To list existing backups, :: - $ docker compose -f local.yml exec postgres backups + $ docker compose -f docker-compose.local.yml exec postgres backups These are the sample contents of ``/backups``: :: @@ -55,9 +55,9 @@ With a single backup file copied to ``.`` that would be :: $ docker cp 9c5c3f055843:/backups/backup_2018_03_13T09_05_07.sql.gz . -You can also get the container ID using ``docker compose -f local.yml ps -q postgres`` so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command :: +You can also get the container ID using ``docker compose -f docker-compose.local.yml ps -q postgres`` so if you want to automate your backups, you don't have to check the container ID manually every time. 
Here is the full command :: - $ docker cp $(docker compose -f local.yml ps -q postgres):/backups ./backups + $ docker cp $(docker compose -f docker-compose.local.yml ps -q postgres):/backups ./backups .. _`command`: https://docs.docker.com/engine/reference/commandline/cp/ @@ -66,7 +66,7 @@ Restoring from the Existing Backup To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), :: - $ docker compose -f local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz + $ docker compose -f docker-compose.local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz You will see something like :: @@ -95,15 +95,15 @@ Backup to Amazon S3 For uploading your backups to Amazon S3 you can use the aws cli container. There is an upload command for uploading the postgres /backups directory recursively and there is a download command for downloading a specific backup. The default S3 environment variables are used. :: - $ docker compose -f production.yml run --rm awscli upload - $ docker compose -f production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz + $ docker compose -f docker-compose.production.yml run --rm awscli upload + $ docker compose -f docker-compose.production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz Remove Backup ---------------------------------- To remove backup you can use the ``rmbackup`` command. This will remove the backup from the ``/backups`` directory. :: - $ docker compose -f local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz + $ docker compose -f docker-compose.local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz Upgrading PostgreSQL @@ -111,17 +111,17 @@ Upgrading PostgreSQL Upgrading PostgreSQL in your project requires a series of carefully executed steps. Start by halting all containers, excluding the postgres container. Following this, create a backup and proceed to remove the outdated data volume. 
:: - $ docker compose -f local.yml down - $ docker compose -f local.yml up -d postgres - $ docker compose -f local.yml run --rm postgres backup - $ docker compose -f local.yml down + $ docker compose -f docker-compose.local.yml down + $ docker compose -f docker-compose.local.yml up -d postgres + $ docker compose -f docker-compose.local.yml run --rm postgres backup + $ docker compose -f docker-compose.local.yml down $ docker volume rm my_project_postgres_data .. note:: Neglecting to remove the old data volume may lead to issues, such as the new postgres container failing to start with errors like ``FATAL: database files are incompatible with server``, and ``could not translate host name "postgres" to address: Name or service not known``. To complete the upgrade, update the PostgreSQL version in the corresponding Dockerfile (e.g. ``compose/production/postgres/Dockerfile``) and build a new version of PostgreSQL. :: - $ docker compose -f local.yml build postgres - $ docker compose -f local.yml up -d postgres - $ docker compose -f local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz - $ docker compose -f local.yml up -d + $ docker compose -f docker-compose.local.yml build postgres + $ docker compose -f docker-compose.local.yml up -d postgres + $ docker compose -f docker-compose.local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz + $ docker compose -f docker-compose.local.yml up -d diff --git a/docs/document.rst b/docs/document.rst index f93f2b60..61cb692d 100644 --- a/docs/document.rst +++ b/docs/document.rst @@ -11,7 +11,7 @@ After you have set up to `develop locally`_, run the following command from the If you set up your project to `develop locally with docker`_, run the following command: :: - $ docker compose -f docs.yml up + $ docker compose -f docker-compose.docs.yml up Navigate to port 9000 on your host to see the documentation. This will be opened automatically at `localhost`_ for local, non-docker development. 
diff --git a/docs/testing.rst b/docs/testing.rst index d403a30e..58a05770 100644 --- a/docs/testing.rst +++ b/docs/testing.rst @@ -19,7 +19,7 @@ You will get a readout of the `users` app that has already been set up with test If you set up your project to `develop locally with docker`_, run the following command: :: - $ docker compose -f local.yml run --rm django pytest + $ docker compose -f docker-compose.local.yml run --rm django pytest Targeting particular apps for testing in ``docker`` follows a similar pattern as previously shown above. @@ -36,8 +36,8 @@ Once the tests are complete, in order to see the code coverage, run the followin If you're running the project locally with Docker, use these commands instead: :: - $ docker compose -f local.yml run --rm django coverage run -m pytest - $ docker compose -f local.yml run --rm django coverage report + $ docker compose -f docker-compose.local.yml run --rm django coverage run -m pytest + $ docker compose -f docker-compose.local.yml run --rm django coverage report .. note:: diff --git a/docs/troubleshooting.rst b/docs/troubleshooting.rst index 80bab2e2..847f0a70 100644 --- a/docs/troubleshooting.rst +++ b/docs/troubleshooting.rst @@ -30,7 +30,7 @@ If you recreate the project multiple times with the same name, Docker would pres To fix this, you can either: -- Clear your project-related Docker cache with ``docker compose -f local.yml down --volumes --rmi all``. +- Clear your project-related Docker cache with ``docker compose -f docker-compose.local.yml down --volumes --rmi all``. - Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_). - Use the `prune`_ command to clear system-wide (use with care!). 
diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py index 1ddab063..9e9af5f2 100644 --- a/hooks/post_gen_project.py +++ b/hooks/post_gen_project.py @@ -78,7 +78,11 @@ def remove_docker_files(): shutil.rmtree(".devcontainer") shutil.rmtree("compose") - file_names = ["local.yml", "production.yml", ".dockerignore"] + file_names = [ + "docker-compose.local.yml", + "docker-compose.production.yml", + ".dockerignore", + ] for file_name in file_names: os.remove(file_name) if "{{ cookiecutter.editor }}" == "PyCharm": diff --git a/tests/test_cookiecutter_generation.py b/tests/test_cookiecutter_generation.py index 6b581b03..141ae4cc 100755 --- a/tests/test_cookiecutter_generation.py +++ b/tests/test_cookiecutter_generation.py @@ -247,7 +247,13 @@ def test_djlint_lint_passes(cookies, context_override): # TODO: remove T002 when fixed https://github.com/Riverside-Healthcare/djLint/issues/687 ignored_rules = "H006,H030,H031,T002" try: - sh.djlint("--lint", "--ignore", f"{autofixable_rules},{ignored_rules}", ".", _cwd=str(result.project_path)) + sh.djlint( + "--lint", + "--ignore", + f"{autofixable_rules},{ignored_rules}", + ".", + _cwd=str(result.project_path), + ) except sh.ErrorReturnCode as e: pytest.fail(e.stdout.decode()) @@ -268,7 +274,7 @@ def test_djlint_check_passes(cookies, context_override): ["use_docker", "expected_test_script"], [ ("n", "pytest"), - ("y", "docker compose -f local.yml run django pytest"), + ("y", "docker compose -f docker-compose.local.yml run django pytest"), ], ) def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_script): @@ -293,7 +299,7 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip ["use_docker", "expected_test_script"], [ ("n", "pytest"), - ("y", "docker compose -f local.yml run django pytest"), + ("y", "docker compose -f docker-compose.local.yml run django pytest"), ], ) def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expected_test_script): @@ 
-320,7 +326,7 @@ def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expec ["use_docker", "expected_test_script"], [ ("n", "pytest"), - ("y", "docker compose -f local.yml run django pytest"), + ("y", "docker compose -f docker-compose.local.yml run django pytest"), ], ) def test_github_invokes_linter_and_pytest(cookies, context, use_docker, expected_test_script): diff --git a/tests/test_docker.sh b/tests/test_docker.sh index 96bf8662..473eede0 100755 --- a/tests/test_docker.sh +++ b/tests/test_docker.sh @@ -15,22 +15,22 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@" cd my_awesome_project # make sure all images build -docker compose -f local.yml build +docker compose -f docker-compose.local.yml build # run the project's type checks -docker compose -f local.yml run django mypy my_awesome_project +docker compose -f docker-compose.local.yml run django mypy my_awesome_project # run the project's tests -docker compose -f local.yml run django pytest +docker compose -f docker-compose.local.yml run django pytest # return non-zero status code if there are migrations that have not been created -docker compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; } +docker compose -f docker-compose.local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; } # Test support for translations -docker compose -f local.yml run django python manage.py makemessages --all +docker compose -f docker-compose.local.yml run django python manage.py makemessages --all # Make sure the check doesn't raise any warnings -docker compose -f local.yml run \ +docker compose -f docker-compose.local.yml run \ -e 
DJANGO_SECRET_KEY="$(openssl rand -base64 64)" \ -e REDIS_URL=redis://redis:6379/0 \ -e CELERY_BROKER_URL=redis://redis:6379/0 \ @@ -43,10 +43,10 @@ docker compose -f local.yml run \ django python manage.py check --settings=config.settings.production --deploy --database default --fail-level WARNING # Generate the HTML for the documentation -docker compose -f docs.yml run docs make html +docker compose -f docker-compose.docs.yml run docs make html # Run npm build script if package.json is present if [ -f "package.json" ] then - docker compose -f local.yml run node npm run build + docker compose -f docker-compose.local.yml run node npm run build fi diff --git a/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json b/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json index e16d06a2..5604b8a8 100644 --- a/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json +++ b/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json @@ -2,7 +2,7 @@ { "name": "{{cookiecutter.project_slug}}_dev", "dockerComposeFile": [ - "../local.yml" + "../docker-compose.local.yml" ], "init": true, "mounts": [ diff --git a/{{cookiecutter.project_slug}}/.drone.yml b/{{cookiecutter.project_slug}}/.drone.yml index 829ead2c..d6c13e62 100644 --- a/{{cookiecutter.project_slug}}/.drone.yml +++ b/{{cookiecutter.project_slug}}/.drone.yml @@ -31,11 +31,11 @@ steps: environment: DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB commands: - - docker-compose -f local.yml build - - docker-compose -f docs.yml build - - docker-compose -f local.yml run --rm django python manage.py migrate - - docker-compose -f local.yml up -d - - docker-compose -f local.yml run django pytest + - docker-compose -f docker-compose.local.yml build + - docker-compose -f docker-compose.docs.yml build + - docker-compose -f docker-compose.local.yml run --rm django python manage.py migrate + - docker-compose -f docker-compose.local.yml up -d + - docker-compose -f 
docker-compose.local.yml run django pytest {%- else %} image: python:3.12 commands: diff --git a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml index 4e6ec1cd..5cb9ead4 100644 --- a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml +++ b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml @@ -69,19 +69,19 @@ jobs: {%- if cookiecutter.use_docker == 'y' %} - name: Build the Stack - run: docker compose -f local.yml build django + run: docker compose -f docker-compose.local.yml build django - name: Build the docs - run: docker compose -f docs.yml build docs + run: docker compose -f docker-compose.docs.yml build docs - name: Run DB Migrations - run: docker compose -f local.yml run --rm django python manage.py migrate + run: docker compose -f docker-compose.local.yml run --rm django python manage.py migrate - name: Run Django Tests - run: docker compose -f local.yml run django pytest + run: docker compose -f docker-compose.local.yml run django pytest - name: Tear down the Stack - run: docker compose -f local.yml down + run: docker compose -f docker-compose.local.yml down {%- else %} - name: Set up Python diff --git a/{{cookiecutter.project_slug}}/.gitlab-ci.yml b/{{cookiecutter.project_slug}}/.gitlab-ci.yml index 75c03e87..41eea0db 100644 --- a/{{cookiecutter.project_slug}}/.gitlab-ci.yml +++ b/{{cookiecutter.project_slug}}/.gitlab-ci.yml @@ -33,13 +33,13 @@ pytest: services: - docker:dind before_script: - - docker compose -f local.yml build - - docker compose -f docs.yml build + - docker compose -f docker-compose.local.yml build + - docker compose -f docker-compose.docs.yml build # Ensure celerybeat does not crash due to non-existent tables - - docker compose -f local.yml run --rm django python manage.py migrate - - docker compose -f local.yml up -d + - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate + - docker compose -f docker-compose.local.yml up -d script: - 
- docker compose -f local.yml run django pytest + - docker compose -f docker-compose.local.yml run django pytest {%- else %} image: python:3.12 tags: diff --git a/{{cookiecutter.project_slug}}/.travis.yml b/{{cookiecutter.project_slug}}/.travis.yml index abf12f42..97f9f60a 100644 --- a/{{cookiecutter.project_slug}}/.travis.yml +++ b/{{cookiecutter.project_slug}}/.travis.yml @@ -19,15 +19,15 @@ jobs: before_script: - docker compose -v - docker -v - - docker compose -f local.yml build - - docker compose -f docs.yml build + - docker compose -f docker-compose.local.yml build + - docker compose -f docker-compose.docs.yml build # Ensure celerybeat does not crash due to non-existent tables - - docker compose -f local.yml run --rm django python manage.py migrate - - docker compose -f local.yml up -d + - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate + - docker compose -f docker-compose.local.yml up -d script: - - docker compose -f local.yml run django pytest + - docker compose -f docker-compose.local.yml run django pytest after_failure: - - docker compose -f local.yml logs + - docker compose -f docker-compose.local.yml logs {%- else %} before_install: - sudo apt-get update -qq diff --git a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download index 9561d917..12871a77 100644 --- a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download +++ b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download @@ -3,7 +3,7 @@ ### Download a file from your Amazon S3 bucket to the postgres /backups folder ### ### Usage: -### $ docker compose -f production.yml run --rm awscli <1> +### $ docker compose -f docker-compose.production.yml run --rm awscli <1> set -o errexit set -o pipefail diff --git a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload 
b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload index 73c1b9be..2f577824 100644 --- a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload +++ b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload @@ -3,7 +3,7 @@ ### Upload the /backups folder to Amazon S3 ### ### Usage: -### $ docker compose -f production.yml run --rm awscli upload +### $ docker compose -f docker-compose.production.yml run --rm awscli upload set -o errexit set -o pipefail diff --git a/{{cookiecutter.project_slug}}/docs.yml b/{{cookiecutter.project_slug}}/docker-compose.docs.yml similarity index 100% rename from {{cookiecutter.project_slug}}/docs.yml rename to {{cookiecutter.project_slug}}/docker-compose.docs.yml diff --git a/{{cookiecutter.project_slug}}/local.yml b/{{cookiecutter.project_slug}}/docker-compose.local.yml similarity index 100% rename from {{cookiecutter.project_slug}}/local.yml rename to {{cookiecutter.project_slug}}/docker-compose.local.yml diff --git a/{{cookiecutter.project_slug}}/production.yml b/{{cookiecutter.project_slug}}/docker-compose.production.yml similarity index 100% rename from {{cookiecutter.project_slug}}/production.yml rename to {{cookiecutter.project_slug}}/docker-compose.production.yml diff --git a/{{cookiecutter.project_slug}}/docs/howto.rst b/{{cookiecutter.project_slug}}/docs/howto.rst index 2d734ceb..944c2b73 100644 --- a/{{cookiecutter.project_slug}}/docs/howto.rst +++ b/{{cookiecutter.project_slug}}/docs/howto.rst @@ -15,7 +15,7 @@ from inside the `{{cookiecutter.project_slug}}/docs` directory. 
{% else %} To build and serve docs, use the commands:: - docker compose -f local.yml up docs + docker compose -f docker-compose.local.yml up docs {% endif %} diff --git a/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst b/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst index d8e76916..148854c6 100644 --- a/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst +++ b/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst @@ -21,7 +21,7 @@ Next, you have to add new remote python interpreter, based on already tested dep .. image:: images/3.png -Switch to *Docker Compose* and select `local.yml` file from directory of your project, next set *Service name* to `django` +Switch to *Docker Compose* and select the `docker-compose.local.yml` file from your project directory, then set *Service name* to `django` .. image:: images/4.png diff --git a/{{cookiecutter.project_slug}}/locale/README.md b/{{cookiecutter.project_slug}}/locale/README.md index a514ad10..8971441a 100644 --- a/{{cookiecutter.project_slug}}/locale/README.md +++ b/{{cookiecutter.project_slug}}/locale/README.md @@ -3,7 +3,7 @@ Start by configuring the `LANGUAGES` settings in `base.py`, by uncommenting languages you are willing to support. Then, translations strings will be placed in this folder when running: ```bash -{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py makemessages -all --no-location +{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py makemessages --all --no-location ``` This should generate `django.po` (stands for Portable Object) files under each locale `/LC_MESSAGES/django.po`. 
Each translatable string in the codebase is collected with its `msgid` and need to be translated as `msgstr`, for example: @@ -16,7 +16,7 @@ msgstr "utilisateurs" Once all translations are done, they need to be compiled into `.mo` files (stands for Machine Object), which are the actual binary files used by the application: ```bash -{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py compilemessages +{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py compilemessages ``` Note that the `.po` files are NOT used by the application directly, so if the `.mo` files are out of dates, the content won't appear as translated even if the `.po` files are up-to-date.
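Projects generated before this rename will still carry the old file names and references to them; this patch only updates the template itself. The following is a minimal sketch, not part of the patch, of how such a downstream project could be migrated. It assumes GNU sed, and runs against a throw-away scratch directory with stand-in files; in a real project you would run the loop from the repository root (and against every file that references the compose configs, not just a Makefile):

```shell
# Demo in a scratch directory; the Makefile content is a stand-in for
# any file in a generated project that references the old compose names.
set -eu
demo=/tmp/compose-rename-demo
rm -rf "$demo" && mkdir -p "$demo" && cd "$demo"

# Stand-ins for a project generated before the rename.
touch local.yml production.yml docs.yml
printf 'docker compose -f local.yml up\n' > Makefile

for name in local production docs; do
    # Rename the compose file itself...
    mv "$name.yml" "docker-compose.$name.yml"
    # ...and update references. Anchoring the pattern on "-f " avoids
    # re-matching names that already carry the docker-compose. prefix.
    sed -i "s/-f $name\.yml/-f docker-compose.$name.yml/g" Makefile
done

cat Makefile   # docker compose -f docker-compose.local.yml up
```

Alternatively, setting ``COMPOSE_FILE=docker-compose.local.yml`` in the environment (as the developing-locally docs already suggest) avoids repeating ``-f`` entirely.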