diff --git a/.github/contributors.json b/.github/contributors.json
index 40a7d26d6..b74dc4d15 100644
--- a/.github/contributors.json
+++ b/.github/contributors.json
@@ -1563,5 +1563,10 @@
"name": "quroom",
"github_login": "quroom",
"twitter_username": ""
+ },
+ {
+ "name": "Marios Frixou",
+ "github_login": "frixou89",
+ "twitter_username": ""
}
]
\ No newline at end of file
diff --git a/.github/workflows/update-changelog.yml b/.github/workflows/update-changelog.yml
index 305a608df..635d26f23 100644
--- a/.github/workflows/update-changelog.yml
+++ b/.github/workflows/update-changelog.yml
@@ -8,7 +8,7 @@ on:
workflow_dispatch:
jobs:
- release:
+ update:
# Disables this workflow from running in a repository that is not part of the indicated organization/user
if: github.repository_owner == 'cookiecutter'
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2fda65d01..b2856f307 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -3,6 +3,35 @@ All enhancements and patches to Cookiecutter Django will be documented in this f
+## 2024.05.11
+
+
+### Updated
+
+- Update pre-commit to 3.7.1 ([#5066](https://github.com/cookiecutter/cookiecutter-django/pull/5066))
+
+- Auto-update pre-commit hooks ([#5067](https://github.com/cookiecutter/cookiecutter-django/pull/5067))
+
+## 2024.05.10
+
+
+### Updated
+
+- Update psycopg to 3.1.19 ([#5064](https://github.com/cookiecutter/cookiecutter-django/pull/5064))
+
+- Update django-upgrade to 1.17.0 ([#5065](https://github.com/cookiecutter/cookiecutter-django/pull/5065))
+
+- Auto-update pre-commit hooks ([#5062](https://github.com/cookiecutter/cookiecutter-django/pull/5062))
+
+- Update ruff to 0.4.4 ([#5061](https://github.com/cookiecutter/cookiecutter-django/pull/5061))
+
+## 2024.05.07
+
+
+### Updated
+
+- Update django to 4.2.13 ([#5058](https://github.com/cookiecutter/cookiecutter-django/pull/5058))
+
## 2024.05.06
diff --git a/CONTRIBUTORS.md b/CONTRIBUTORS.md
index bf4896dc8..595e9924a 100644
--- a/CONTRIBUTORS.md
+++ b/CONTRIBUTORS.md
@@ -1405,6 +1405,13 @@ Listed in alphabetical order.
Martin Blech |
diff --git a/docs/deployment-with-docker.rst b/docs/deployment-with-docker.rst
index 3d2f9f813..ebc42a52d 100644
--- a/docs/deployment-with-docker.rst
+++ b/docs/deployment-with-docker.rst
@@ -14,7 +14,7 @@ Prerequisites
Understanding the Docker Compose Setup
--------------------------------------
-Before you begin, check out the ``production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
+Before you begin, check out the ``docker-compose.production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
* ``django``: your application running behind ``Gunicorn``;
* ``postgres``: PostgreSQL database with the application's relational data;
@@ -107,7 +107,7 @@ To solve this, you can either:
2. create a ``.env`` file in the root of the project with just variables you need. You'll need to also define them in ``.envs/.production/.django`` (hence duplicating them).
3. set these variables when running the build command::
- DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f production.yml build``.
+ DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f docker-compose.production.yml build
None of these options are ideal, we're open to suggestions on how to improve this. If you think you have one, please open an issue or a pull request.
@@ -122,42 +122,42 @@ Building & Running Production Stack
You will need to build the stack first. To do that, run::
- docker compose -f production.yml build
+ docker compose -f docker-compose.production.yml build
Once this is ready, you can run it with::
- docker compose -f production.yml up
+ docker compose -f docker-compose.production.yml up
To run the stack and detach the containers, run::
- docker compose -f production.yml up -d
+ docker compose -f docker-compose.production.yml up -d
To run a migration, open up a second terminal and run::
- docker compose -f production.yml run --rm django python manage.py migrate
+ docker compose -f docker-compose.production.yml run --rm django python manage.py migrate
To create a superuser, run::
- docker compose -f production.yml run --rm django python manage.py createsuperuser
+ docker compose -f docker-compose.production.yml run --rm django python manage.py createsuperuser
If you need a shell, run::
- docker compose -f production.yml run --rm django python manage.py shell
+ docker compose -f docker-compose.production.yml run --rm django python manage.py shell
To check the logs out, run::
- docker compose -f production.yml logs
+ docker compose -f docker-compose.production.yml logs
If you want to scale your application, run::
- docker compose -f production.yml up --scale django=4
- docker compose -f production.yml up --scale celeryworker=2
+ docker compose -f docker-compose.production.yml up --scale django=4
+ docker compose -f docker-compose.production.yml up --scale celeryworker=2
.. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``traefik``.
To see how your containers are doing run::
- docker compose -f production.yml ps
+ docker compose -f docker-compose.production.yml ps
Example: Supervisor
@@ -165,12 +165,12 @@ Example: Supervisor
Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to
survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. All
-it needs to do is to run ``docker compose -f production.yml up`` in your projects root directory.
+it needs to do is to run ``docker compose -f docker-compose.production.yml up`` in your project's root directory.
If you are using ``supervisor``, you can use this file as a starting point::
[program:{{cookiecutter.project_slug}}]
- command=docker compose -f production.yml up
+ command=docker compose -f docker-compose.production.yml up
directory=/path/to/{{cookiecutter.project_slug}}
redirect_stderr=true
autostart=true
diff --git a/docs/developing-locally-docker.rst b/docs/developing-locally-docker.rst
index 01970e469..83de99bb9 100644
--- a/docs/developing-locally-docker.rst
+++ b/docs/developing-locally-docker.rst
@@ -32,9 +32,9 @@ Build the Stack
This can take a while, especially the first time you run this particular command on your development system::
- $ docker compose -f local.yml build
+ $ docker compose -f docker-compose.local.yml build
-Generally, if you want to emulate production environment use ``production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
+Generally, if you want to emulate the production environment, use ``docker-compose.production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then::
@@ -51,11 +51,11 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak
Open a terminal at the project root and run the following for local development::
- $ docker compose -f local.yml up
+ $ docker compose -f docker-compose.local.yml up
-You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this::
+You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this::
- $ export COMPOSE_FILE=local.yml
+ $ export COMPOSE_FILE=docker-compose.local.yml
And then run::
@@ -67,21 +67,21 @@ To run in a detached (background) mode, just::
These commands don't run the docs service. In order to run docs service you can run::
- $ docker compose -f docs.yml up
+ $ docker compose -f docker-compose.docs.yml up
To run the docs with local services just use::
- $ docker compose -f local.yml -f docs.yml up
+ $ docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up
The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise.
Execute Management Commands
---------------------------
-As with any shell command that we wish to run in our container, this is done using the ``docker compose -f local.yml run --rm`` command: ::
+As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: ::
- $ docker compose -f local.yml run --rm django python manage.py migrate
- $ docker compose -f local.yml run --rm django python manage.py createsuperuser
+ $ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
+ $ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser
Here, ``django`` is the target service we are executing the commands against.
Also, please note that the ``docker exec`` does not work for running management commands.
@@ -97,7 +97,7 @@ When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost',
Configuring the Environment
---------------------------
-This is the excerpt from your project's ``local.yml``: ::
+This is the excerpt from your project's ``docker-compose.local.yml``: ::
# ...
@@ -163,8 +163,8 @@ You have to modify the relevant requirement file: base, local or production by a
To get this change picked up, you'll need to rebuild the image(s) and restart the running container: ::
- docker compose -f local.yml build
- docker compose -f local.yml up
+ docker compose -f docker-compose.local.yml build
+ docker compose -f docker-compose.local.yml up
Debugging
~~~~~~~~~
@@ -178,7 +178,7 @@ If you are using the following within your code to debug: ::
Then you may need to run the following for it to work as desired: ::
- $ docker compose -f local.yml run --rm --service-ports django
+ $ docker compose -f docker-compose.local.yml run --rm --service-ports django
django-debug-toolbar
@@ -231,7 +231,7 @@ Prerequisites:
* ``use_docker`` was set to ``y`` on project initialization;
* ``use_celery`` was set to ``y`` on project initialization.
-By default, it's enabled both in local and production environments (``local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
+By default, it's enabled both in local and production environments (``docker-compose.local.yml`` and ``docker-compose.production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
.. _`Flower`: https://github.com/mher/flower
@@ -279,7 +279,7 @@ certs
Take the certificates that you generated and place them in a folder called ``certs`` in the project's root folder. Assuming that you registered your local hostname as ``my-dev-env.local``, the certificates you will put in the folder should have the names ``my-dev-env.local.crt`` and ``my-dev-env.local.key``.
-local.yml
-~~~~~~~~~
+docker-compose.local.yml
+~~~~~~~~~~~~~~~~~~~~~~~~
#. Add the ``nginx-proxy`` service. ::
@@ -323,7 +323,7 @@ You should allow the new hostname. ::
Rebuild your ``docker`` application. ::
- $ docker compose -f local.yml up -d --build
+ $ docker compose -f docker-compose.local.yml up -d --build
Go to your browser and type in your URL bar ``https://my-dev-env.local``
@@ -343,9 +343,9 @@ Webpack
If you are using Webpack:
-1. On the ``nginx-proxy`` service in ``local.yml``, change ``depends_on`` to ``node`` instead of ``django``.
+1. On the ``nginx-proxy`` service in ``docker-compose.local.yml``, change ``depends_on`` to ``node`` instead of ``django``.
-2. On the ``node`` service in ``local.yml``, add the following environment configuration:
+2. On the ``node`` service in ``docker-compose.local.yml``, add the following environment configuration:
::
diff --git a/docs/docker-postgres-backups.rst b/docs/docker-postgres-backups.rst
index 302e4c4b8..d214ee4e8 100644
--- a/docs/docker-postgres-backups.rst
+++ b/docs/docker-postgres-backups.rst
@@ -1,14 +1,14 @@
PostgreSQL Backups with Docker
==============================
-.. note:: For brevity it is assumed that you will be running the below commands against local environment, however, this is by no means mandatory so feel free to switch to ``production.yml`` when needed.
+.. note:: For brevity, it is assumed that you will be running the commands below against the local environment; however, this is by no means mandatory, so feel free to switch to ``docker-compose.production.yml`` when needed.
Prerequisites
-------------
#. the project was generated with ``use_docker`` set to ``y``;
-#. the stack is up and running: ``docker compose -f local.yml up -d postgres``.
+#. the stack is up and running: ``docker compose -f docker-compose.local.yml up -d postgres``.
Creating a Backup
@@ -16,7 +16,7 @@ Creating a Backup
To create a backup, run::
- $ docker compose -f local.yml exec postgres backup
+ $ docker compose -f docker-compose.local.yml exec postgres backup
Assuming your project's database is named ``my_project`` here is what you will see: ::
@@ -31,7 +31,7 @@ Viewing the Existing Backups
To list existing backups, ::
- $ docker compose -f local.yml exec postgres backups
+ $ docker compose -f docker-compose.local.yml exec postgres backups
These are the sample contents of ``/backups``: ::
@@ -55,9 +55,9 @@ With a single backup file copied to ``.`` that would be ::
$ docker cp 9c5c3f055843:/backups/backup_2018_03_13T09_05_07.sql.gz .
-You can also get the container ID using ``docker compose -f local.yml ps -q postgres`` so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command ::
+You can also get the container ID using ``docker compose -f docker-compose.local.yml ps -q postgres``, so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command ::
- $ docker cp $(docker compose -f local.yml ps -q postgres):/backups ./backups
+ $ docker cp $(docker compose -f docker-compose.local.yml ps -q postgres):/backups ./backups
.. _`command`: https://docs.docker.com/engine/reference/commandline/cp/
@@ -66,7 +66,7 @@ Restoring from the Existing Backup
To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), ::
- $ docker compose -f local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
+ $ docker compose -f docker-compose.local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
You will see something like ::
@@ -95,15 +95,15 @@ Backup to Amazon S3
For uploading your backups to Amazon S3 you can use the aws cli container. There is an upload command for uploading the postgres /backups directory recursively and there is a download command for downloading a specific backup. The default S3 environment variables are used. ::
- $ docker compose -f production.yml run --rm awscli upload
- $ docker compose -f production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz
+ $ docker compose -f docker-compose.production.yml run --rm awscli upload
+ $ docker compose -f docker-compose.production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz
Remove Backup
----------------------------------
To remove backup you can use the ``rmbackup`` command. This will remove the backup from the ``/backups`` directory. ::
- $ docker compose -f local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz
+ $ docker compose -f docker-compose.local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz
Upgrading PostgreSQL
@@ -111,17 +111,17 @@ Upgrading PostgreSQL
Upgrading PostgreSQL in your project requires a series of carefully executed steps. Start by halting all containers, excluding the postgres container. Following this, create a backup and proceed to remove the outdated data volume. ::
- $ docker compose -f local.yml down
- $ docker compose -f local.yml up -d postgres
- $ docker compose -f local.yml run --rm postgres backup
- $ docker compose -f local.yml down
+ $ docker compose -f docker-compose.local.yml down
+ $ docker compose -f docker-compose.local.yml up -d postgres
+ $ docker compose -f docker-compose.local.yml run --rm postgres backup
+ $ docker compose -f docker-compose.local.yml down
$ docker volume rm my_project_postgres_data
.. note:: Neglecting to remove the old data volume may lead to issues, such as the new postgres container failing to start with errors like ``FATAL: database files are incompatible with server``, and ``could not translate host name "postgres" to address: Name or service not known``.
To complete the upgrade, update the PostgreSQL version in the corresponding Dockerfile (e.g. ``compose/production/postgres/Dockerfile``) and build a new version of PostgreSQL. ::
- $ docker compose -f local.yml build postgres
- $ docker compose -f local.yml up -d postgres
- $ docker compose -f local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz
- $ docker compose -f local.yml up -d
+ $ docker compose -f docker-compose.local.yml build postgres
+ $ docker compose -f docker-compose.local.yml up -d postgres
+ $ docker compose -f docker-compose.local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz
+ $ docker compose -f docker-compose.local.yml up -d
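The note above about automating backups can be made concrete with a scheduled job built only from the commands documented in this file. A minimal sketch follows; the project path and schedule are placeholders, ``exec -T`` is used because cron has no TTY, and none of this is part of the patch itself:

```bash
# Hypothetical crontab entry: create a backup at 03:00 and copy the
# /backups directory out of the postgres container into the project tree.
0 3 * * * cd /path/to/my_project && docker compose -f docker-compose.local.yml exec -T postgres backup && docker cp "$(docker compose -f docker-compose.local.yml ps -q postgres)":/backups ./backups
```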
diff --git a/docs/document.rst b/docs/document.rst
index f93f2b60e..61cb692d3 100644
--- a/docs/document.rst
+++ b/docs/document.rst
@@ -11,7 +11,7 @@ After you have set up to `develop locally`_, run the following command from the
If you set up your project to `develop locally with docker`_, run the following command: ::
- $ docker compose -f docs.yml up
+ $ docker compose -f docker-compose.docs.yml up
Navigate to port 9000 on your host to see the documentation. This will be opened automatically at `localhost`_ for local, non-docker development.
diff --git a/docs/testing.rst b/docs/testing.rst
index d403a30eb..58a05770a 100644
--- a/docs/testing.rst
+++ b/docs/testing.rst
@@ -19,7 +19,7 @@ You will get a readout of the `users` app that has already been set up with test
If you set up your project to `develop locally with docker`_, run the following command: ::
- $ docker compose -f local.yml run --rm django pytest
+ $ docker compose -f docker-compose.local.yml run --rm django pytest
Targeting particular apps for testing in ``docker`` follows a similar pattern as previously shown above.
@@ -36,8 +36,8 @@ Once the tests are complete, in order to see the code coverage, run the followin
If you're running the project locally with Docker, use these commands instead: ::
- $ docker compose -f local.yml run --rm django coverage run -m pytest
- $ docker compose -f local.yml run --rm django coverage report
+ $ docker compose -f docker-compose.local.yml run --rm django coverage run -m pytest
+ $ docker compose -f docker-compose.local.yml run --rm django coverage report
.. note::
diff --git a/docs/troubleshooting.rst b/docs/troubleshooting.rst
index 80bab2e29..847f0a701 100644
--- a/docs/troubleshooting.rst
+++ b/docs/troubleshooting.rst
@@ -30,7 +30,7 @@ If you recreate the project multiple times with the same name, Docker would pres
To fix this, you can either:
-- Clear your project-related Docker cache with ``docker compose -f local.yml down --volumes --rmi all``.
+- Clear your project-related Docker cache with ``docker compose -f docker-compose.local.yml down --volumes --rmi all``.
- Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_).
- Use the `prune`_ command to clear system-wide (use with care!).
diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py
index 1ddab0636..9e9af5f2d 100644
--- a/hooks/post_gen_project.py
+++ b/hooks/post_gen_project.py
@@ -78,7 +78,11 @@ def remove_docker_files():
shutil.rmtree(".devcontainer")
shutil.rmtree("compose")
- file_names = ["local.yml", "production.yml", ".dockerignore"]
+ file_names = [
+ "docker-compose.local.yml",
+ "docker-compose.production.yml",
+ ".dockerignore",
+ ]
for file_name in file_names:
os.remove(file_name)
if "{{ cookiecutter.editor }}" == "PyCharm":
diff --git a/requirements.txt b/requirements.txt
index ee1a39967..47d06222f 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -4,10 +4,10 @@ binaryornot==0.4.4
# Code quality
# ------------------------------------------------------------------------------
-ruff==0.4.3
-django-upgrade==1.16.0
+ruff==0.4.4
+django-upgrade==1.17.0
djlint==1.34.1
-pre-commit==3.7.0
+pre-commit==3.7.1
# Testing
# ------------------------------------------------------------------------------
diff --git a/setup.py b/setup.py
index 77cac6782..6fac0c577 100644
--- a/setup.py
+++ b/setup.py
@@ -5,7 +5,7 @@ except ImportError:
from distutils.core import setup
# We use calendar versioning
-version = "2024.05.06"
+version = "2024.05.11"
with open("README.md") as readme_file:
long_description = readme_file.read()
diff --git a/tests/test_cookiecutter_generation.py b/tests/test_cookiecutter_generation.py
index 6b581b03a..141ae4cc1 100755
--- a/tests/test_cookiecutter_generation.py
+++ b/tests/test_cookiecutter_generation.py
@@ -247,7 +247,13 @@ def test_djlint_lint_passes(cookies, context_override):
# TODO: remove T002 when fixed https://github.com/Riverside-Healthcare/djLint/issues/687
ignored_rules = "H006,H030,H031,T002"
try:
- sh.djlint("--lint", "--ignore", f"{autofixable_rules},{ignored_rules}", ".", _cwd=str(result.project_path))
+ sh.djlint(
+ "--lint",
+ "--ignore",
+ f"{autofixable_rules},{ignored_rules}",
+ ".",
+ _cwd=str(result.project_path),
+ )
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@@ -268,7 +274,7 @@ def test_djlint_check_passes(cookies, context_override):
["use_docker", "expected_test_script"],
[
("n", "pytest"),
- ("y", "docker compose -f local.yml run django pytest"),
+ ("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_script):
@@ -293,7 +299,7 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
["use_docker", "expected_test_script"],
[
("n", "pytest"),
- ("y", "docker compose -f local.yml run django pytest"),
+ ("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expected_test_script):
@@ -320,7 +326,7 @@ def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expec
["use_docker", "expected_test_script"],
[
("n", "pytest"),
- ("y", "docker compose -f local.yml run django pytest"),
+ ("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_github_invokes_linter_and_pytest(cookies, context, use_docker, expected_test_script):
diff --git a/tests/test_docker.sh b/tests/test_docker.sh
index 96bf8662d..473eede04 100755
--- a/tests/test_docker.sh
+++ b/tests/test_docker.sh
@@ -15,22 +15,22 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@"
cd my_awesome_project
# make sure all images build
-docker compose -f local.yml build
+docker compose -f docker-compose.local.yml build
# run the project's type checks
-docker compose -f local.yml run django mypy my_awesome_project
+docker compose -f docker-compose.local.yml run django mypy my_awesome_project
# run the project's tests
-docker compose -f local.yml run django pytest
+docker compose -f docker-compose.local.yml run django pytest
# return non-zero status code if there are migrations that have not been created
-docker compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; }
+docker compose -f docker-compose.local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but the migrations listed above have not been created and are not saved in version control"; exit 1; }
# Test support for translations
-docker compose -f local.yml run django python manage.py makemessages --all
+docker compose -f docker-compose.local.yml run django python manage.py makemessages --all
# Make sure the check doesn't raise any warnings
-docker compose -f local.yml run \
+docker compose -f docker-compose.local.yml run \
-e DJANGO_SECRET_KEY="$(openssl rand -base64 64)" \
-e REDIS_URL=redis://redis:6379/0 \
-e CELERY_BROKER_URL=redis://redis:6379/0 \
@@ -43,10 +43,10 @@ docker compose -f local.yml run \
django python manage.py check --settings=config.settings.production --deploy --database default --fail-level WARNING
# Generate the HTML for the documentation
-docker compose -f docs.yml run docs make html
+docker compose -f docker-compose.docs.yml run docs make html
# Run npm build script if package.json is present
if [ -f "package.json" ]
then
- docker compose -f local.yml run node npm run build
+ docker compose -f docker-compose.local.yml run node npm run build
fi
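Because this patch renames the compose files across templates, docs and CI configs, a repository-wide check for stragglers is cheap insurance. A minimal sketch, assuming GNU grep is available; the pattern and fallback message are illustrative and not part of the patch:

```bash
# Hypothetical check: report any remaining "-f local.yml" / "-f production.yml" / "-f docs.yml" references.
grep -rnE --exclude-dir=.git -e '-f (local|production|docs)\.yml' . || echo "no stale compose-file references"
```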
diff --git a/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json b/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json
index e16d06a20..5604b8a85 100644
--- a/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json
+++ b/{{cookiecutter.project_slug}}/.devcontainer/devcontainer.json
@@ -2,7 +2,7 @@
{
"name": "{{cookiecutter.project_slug}}_dev",
"dockerComposeFile": [
- "../local.yml"
+ "../docker-compose.local.yml"
],
"init": true,
"mounts": [
diff --git a/{{cookiecutter.project_slug}}/.drone.yml b/{{cookiecutter.project_slug}}/.drone.yml
index 829ead2ca..d6c13e62b 100644
--- a/{{cookiecutter.project_slug}}/.drone.yml
+++ b/{{cookiecutter.project_slug}}/.drone.yml
@@ -31,11 +31,11 @@ steps:
environment:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
commands:
- - docker-compose -f local.yml build
- - docker-compose -f docs.yml build
- - docker-compose -f local.yml run --rm django python manage.py migrate
- - docker-compose -f local.yml up -d
- - docker-compose -f local.yml run django pytest
+ - docker-compose -f docker-compose.local.yml build
+ - docker-compose -f docker-compose.docs.yml build
+ - docker-compose -f docker-compose.local.yml run --rm django python manage.py migrate
+ - docker-compose -f docker-compose.local.yml up -d
+ - docker-compose -f docker-compose.local.yml run django pytest
{%- else %}
image: python:3.12
commands:
diff --git a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
index 4e6ec1cd2..5cb9ead4f 100644
--- a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
+++ b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
@@ -69,19 +69,19 @@ jobs:
{%- if cookiecutter.use_docker == 'y' %}
- name: Build the Stack
- run: docker compose -f local.yml build django
+ run: docker compose -f docker-compose.local.yml build django
- name: Build the docs
- run: docker compose -f docs.yml build docs
+ run: docker compose -f docker-compose.docs.yml build docs
- name: Run DB Migrations
- run: docker compose -f local.yml run --rm django python manage.py migrate
+ run: docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- name: Run Django Tests
- run: docker compose -f local.yml run django pytest
+ run: docker compose -f docker-compose.local.yml run django pytest
- name: Tear down the Stack
- run: docker compose -f local.yml down
+ run: docker compose -f docker-compose.local.yml down
{%- else %}
- name: Set up Python
diff --git a/{{cookiecutter.project_slug}}/.gitlab-ci.yml b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
index 75c03e876..41eea0db4 100644
--- a/{{cookiecutter.project_slug}}/.gitlab-ci.yml
+++ b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
@@ -33,13 +33,13 @@ pytest:
services:
- docker:dind
before_script:
- - docker compose -f local.yml build
- - docker compose -f docs.yml build
+ - docker compose -f docker-compose.local.yml build
+ - docker compose -f docker-compose.docs.yml build
# Ensure celerybeat does not crash due to non-existent tables
- - docker compose -f local.yml run --rm django python manage.py migrate
- - docker compose -f local.yml up -d
+ - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
+ - docker compose -f docker-compose.local.yml up -d
script:
- - docker compose -f local.yml run django pytest
+ - docker compose -f docker-compose.local.yml run django pytest
{%- else %}
image: python:3.12
tags:
diff --git a/{{cookiecutter.project_slug}}/.pre-commit-config.yaml b/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
index 71fee958b..461dcf920 100644
--- a/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
+++ b/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
@@ -28,14 +28,14 @@ repos:
exclude: '{{cookiecutter.project_slug}}/templates/'
- repo: https://github.com/adamchainz/django-upgrade
- rev: '1.16.0'
+ rev: '1.17.0'
hooks:
- id: django-upgrade
args: ['--target-version', '4.2']
# Run the Ruff linter.
- repo: https://github.com/astral-sh/ruff-pre-commit
- rev: v0.4.3
+ rev: v0.4.4
hooks:
# Linter
- id: ruff
diff --git a/{{cookiecutter.project_slug}}/.travis.yml b/{{cookiecutter.project_slug}}/.travis.yml
index abf12f42e..97f9f60a2 100644
--- a/{{cookiecutter.project_slug}}/.travis.yml
+++ b/{{cookiecutter.project_slug}}/.travis.yml
@@ -19,15 +19,15 @@ jobs:
before_script:
- docker compose -v
- docker -v
- - docker compose -f local.yml build
- - docker compose -f docs.yml build
+ - docker compose -f docker-compose.local.yml build
+ - docker compose -f docker-compose.docs.yml build
# Ensure celerybeat does not crash due to non-existent tables
- - docker compose -f local.yml run --rm django python manage.py migrate
- - docker compose -f local.yml up -d
+ - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
+ - docker compose -f docker-compose.local.yml up -d
script:
- - docker compose -f local.yml run django pytest
+ - docker compose -f docker-compose.local.yml run django pytest
after_failure:
- - docker compose -f local.yml logs
+ - docker compose -f docker-compose.local.yml logs
{%- else %}
before_install:
- sudo apt-get update -qq
diff --git a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download
index 9561d917a..12871a773 100644
--- a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download
+++ b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/download
@@ -3,7 +3,7 @@
### Download a file from your Amazon S3 bucket to the postgres /backups folder
###
### Usage:
-### $ docker compose -f production.yml run --rm awscli <1>
+### $ docker compose -f docker-compose.production.yml run --rm awscli <1>
set -o errexit
set -o pipefail
diff --git a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload
index 73c1b9bec..2f577824e 100644
--- a/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload
+++ b/{{cookiecutter.project_slug}}/compose/production/aws/maintenance/upload
@@ -3,7 +3,7 @@
### Upload the /backups folder to Amazon S3
###
### Usage:
-### $ docker compose -f production.yml run --rm awscli upload
+### $ docker compose -f docker-compose.production.yml run --rm awscli upload
set -o errexit
set -o pipefail
diff --git a/{{cookiecutter.project_slug}}/docs.yml b/{{cookiecutter.project_slug}}/docker-compose.docs.yml
similarity index 97%
rename from {{cookiecutter.project_slug}}/docs.yml
rename to {{cookiecutter.project_slug}}/docker-compose.docs.yml
index d9e50b928..215b6c3b7 100644
--- a/{{cookiecutter.project_slug}}/docs.yml
+++ b/{{cookiecutter.project_slug}}/docker-compose.docs.yml
@@ -1,5 +1,3 @@
-version: '3'
-
services:
docs:
image: {{ cookiecutter.project_slug }}_local_docs
diff --git a/{{cookiecutter.project_slug}}/local.yml b/{{cookiecutter.project_slug}}/docker-compose.local.yml
similarity index 92%
rename from {{cookiecutter.project_slug}}/local.yml
rename to {{cookiecutter.project_slug}}/docker-compose.local.yml
index cd2c7fad0..eced08ee8 100644
--- a/{{cookiecutter.project_slug}}/local.yml
+++ b/{{cookiecutter.project_slug}}/docker-compose.local.yml
@@ -1,8 +1,7 @@
-version: '3'
-
volumes:
{{ cookiecutter.project_slug }}_local_postgres_data: {}
{{ cookiecutter.project_slug }}_local_postgres_data_backups: {}
+ {% if cookiecutter.use_celery == 'y' %}{{ cookiecutter.project_slug }}_local_redis_data: {}{% endif %}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
@@ -54,6 +53,10 @@ services:
redis:
image: docker.io/redis:6
container_name: {{ cookiecutter.project_slug }}_local_redis
+ {% if cookiecutter.use_celery == 'y' %}
+ volumes:
+ - {{ cookiecutter.project_slug }}_local_redis_data:/data
+ {% endif %}
celeryworker:
<<: *django
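The hunk above gives the local Redis container a named data volume when Celery is enabled. A quick way to confirm the data actually survives a container recreation, assuming a project generated with ``use_celery=y`` (the key name is arbitrary, and ``redis-cli save`` forces an RDB snapshot to ``/data`` before the container is removed):

```bash
# Hypothetical smoke test for the new persistent redis volume.
docker compose -f docker-compose.local.yml up -d redis
docker compose -f docker-compose.local.yml exec redis redis-cli set smoke_test_key 1
docker compose -f docker-compose.local.yml exec redis redis-cli save    # flush the RDB snapshot to /data
docker compose -f docker-compose.local.yml rm -sf redis                 # remove the container, keep the volume
docker compose -f docker-compose.local.yml up -d redis
docker compose -f docker-compose.local.yml exec redis redis-cli get smoke_test_key   # expect "1"
```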
diff --git a/{{cookiecutter.project_slug}}/production.yml b/{{cookiecutter.project_slug}}/docker-compose.production.yml
similarity index 94%
rename from {{cookiecutter.project_slug}}/production.yml
rename to {{cookiecutter.project_slug}}/docker-compose.production.yml
index 29052cbac..d0d06338d 100644
--- a/{{cookiecutter.project_slug}}/production.yml
+++ b/{{cookiecutter.project_slug}}/docker-compose.production.yml
@@ -1,5 +1,3 @@
-version: '3'
-
volumes:
production_postgres_data: {}
production_postgres_data_backups: {}
@@ -7,6 +5,10 @@ volumes:
{%- if cookiecutter.cloud_provider == 'None' %}
production_django_media: {}
{%- endif %}
+ {% if cookiecutter.use_celery == 'y' %}
+ production_redis_data: {}
+ {% endif %}
+
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
@@ -68,6 +70,12 @@ services:
redis:
image: docker.io/redis:6
+ {% if cookiecutter.use_celery == 'y' %}
+ volumes:
+ - production_redis_data:/data
+ {% endif %}
+
+
{%- if cookiecutter.use_celery == 'y' %}
celeryworker:
diff --git a/{{cookiecutter.project_slug}}/docs/howto.rst b/{{cookiecutter.project_slug}}/docs/howto.rst
index 2d734ceb2..944c2b731 100644
--- a/{{cookiecutter.project_slug}}/docs/howto.rst
+++ b/{{cookiecutter.project_slug}}/docs/howto.rst
@@ -15,7 +15,7 @@ from inside the `{{cookiecutter.project_slug}}/docs` directory.
{% else %}
To build and serve docs, use the commands::
- docker compose -f local.yml up docs
+ docker compose -f docker-compose.local.yml up docs
{% endif %}
diff --git a/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst b/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst
index d8e769167..148854c64 100644
--- a/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst
+++ b/{{cookiecutter.project_slug}}/docs/pycharm/configuration.rst
@@ -21,7 +21,7 @@ Next, you have to add new remote python interpreter, based on already tested dep
.. image:: images/3.png
-Switch to *Docker Compose* and select `local.yml` file from directory of your project, next set *Service name* to `django`
+Switch to *Docker Compose* and select `docker-compose.local.yml` file from directory of your project, next set *Service name* to `django`
.. image:: images/4.png
diff --git a/{{cookiecutter.project_slug}}/locale/README.md b/{{cookiecutter.project_slug}}/locale/README.md
index a514ad10c..8971441a0 100644
--- a/{{cookiecutter.project_slug}}/locale/README.md
+++ b/{{cookiecutter.project_slug}}/locale/README.md
@@ -3,7 +3,7 @@
Start by configuring the `LANGUAGES` settings in `base.py`, by uncommenting languages you are willing to support. Then, translations strings will be placed in this folder when running:
```bash
-{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py makemessages -all --no-location
+{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py makemessages --all --no-location
```
This should generate `django.po` (stands for Portable Object) files under each locale `/LC_MESSAGES/django.po`. Each translatable string in the codebase is collected with its `msgid` and need to be translated as `msgstr`, for example:
@@ -16,7 +16,7 @@ msgstr "utilisateurs"
Once all translations are done, they need to be compiled into `.mo` files (stands for Machine Object), which are the actual binary files used by the application:
```bash
-{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py compilemessages
+{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py compilemessages
```
Note that the `.po` files are NOT used by the application directly, so if the `.mo` files are out of dates, the content won't appear as translated even if the `.po` files are up-to-date.
diff --git a/{{cookiecutter.project_slug}}/requirements/base.txt b/{{cookiecutter.project_slug}}/requirements/base.txt
index a50afc603..1b10a113c 100644
--- a/{{cookiecutter.project_slug}}/requirements/base.txt
+++ b/{{cookiecutter.project_slug}}/requirements/base.txt
@@ -28,7 +28,7 @@ uvicorn[standard]==0.29.0 # https://github.com/encode/uvicorn
# Django
# ------------------------------------------------------------------------------
-django==4.2.12 # pyup: < 5.0 # https://www.djangoproject.com/
+django==4.2.13 # pyup: < 5.0 # https://www.djangoproject.com/
django-environ==0.11.2 # https://github.com/joke2k/django-environ
django-model-utils==4.5.1 # https://github.com/jazzband/django-model-utils
django-allauth[mfa]==0.62.1 # https://github.com/pennersr/django-allauth
diff --git a/{{cookiecutter.project_slug}}/requirements/local.txt b/{{cookiecutter.project_slug}}/requirements/local.txt
index 85af20bea..23253a9da 100644
--- a/{{cookiecutter.project_slug}}/requirements/local.txt
+++ b/{{cookiecutter.project_slug}}/requirements/local.txt
@@ -1,11 +1,11 @@
-r production.txt
-Werkzeug[watchdog]==3.0.2 # https://github.com/pallets/werkzeug
+Werkzeug[watchdog]==3.0.3 # https://github.com/pallets/werkzeug
ipdb==0.13.13 # https://github.com/gotcha/ipdb
{%- if cookiecutter.use_docker == 'y' %}
-psycopg[c]==3.1.18 # https://github.com/psycopg/psycopg
+psycopg[c]==3.1.19 # https://github.com/psycopg/psycopg
{%- else %}
-psycopg[binary]==3.1.18 # https://github.com/psycopg/psycopg
+psycopg[binary]==3.1.19 # https://github.com/psycopg/psycopg
{%- endif %}
{%- if cookiecutter.use_async == 'y' or cookiecutter.use_celery == 'y' %}
watchfiles==0.21.0 # https://github.com/samuelcolvin/watchfiles
@@ -28,10 +28,10 @@ sphinx-autobuild==2024.4.16 # https://github.com/GaretJax/sphinx-autobuild
# Code quality
# ------------------------------------------------------------------------------
-ruff==0.4.3 # https://github.com/astral-sh/ruff
+ruff==0.4.4 # https://github.com/astral-sh/ruff
coverage==7.5.1 # https://github.com/nedbat/coveragepy
djlint==1.34.1 # https://github.com/Riverside-Healthcare/djLint
-pre-commit==3.7.0 # https://github.com/pre-commit/pre-commit
+pre-commit==3.7.1 # https://github.com/pre-commit/pre-commit
# Django
# ------------------------------------------------------------------------------
diff --git a/{{cookiecutter.project_slug}}/requirements/production.txt b/{{cookiecutter.project_slug}}/requirements/production.txt
index 02f9120ba..65f82e57a 100644
--- a/{{cookiecutter.project_slug}}/requirements/production.txt
+++ b/{{cookiecutter.project_slug}}/requirements/production.txt
@@ -3,7 +3,7 @@
-r base.txt
gunicorn==22.0.0 # https://github.com/benoitc/gunicorn
-psycopg[c]==3.1.18 # https://github.com/psycopg/psycopg
+psycopg[c]==3.1.19 # https://github.com/psycopg/psycopg
{%- if cookiecutter.use_whitenoise == 'n' %}
Collectfast==2.2.0 # https://github.com/antonagestam/collectfast
{%- endif %}
|