Renamed local.yml to docker-compose.local.yml

Matthew Foster Walsh 2024-04-15 13:09:20 -04:00
parent 7d032d7303
commit 0338748969
16 changed files with 145 additions and 89 deletions

View File

@@ -32,7 +32,7 @@ Build the Stack
This can take a while, especially the first time you run this particular command on your development system::
$ docker compose -f local.yml build
$ docker compose -f docker-compose.local.yml build
Generally, if you want to emulate the production environment, use ``production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
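For instance, the equivalent commands against the production stack (assuming the ``production.yml`` name used by this template) would be: ::
$ docker compose -f production.yml build
$ docker compose -f production.yml up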
@@ -51,11 +51,11 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak
Open a terminal at the project root and run the following for local development::
$ docker compose -f local.yml up
$ docker compose -f docker-compose.local.yml up
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this::
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this::
$ export COMPOSE_FILE=local.yml
$ export COMPOSE_FILE=docker-compose.local.yml
And then run::
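Since Docker Compose reads the ``COMPOSE_FILE`` variable, the follow-up command no longer needs the ``-f`` flag; presumably just: ::
$ docker compose up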
@@ -71,17 +71,17 @@ These commands don't run the docs service. In order to run docs service you can
To run the docs with local services just use::
$ docker compose -f local.yml -f docs.yml up
$ docker compose -f docker-compose.local.yml -f docs.yml up
The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as the frontend pipeline, and at http://localhost:8000 otherwise.
Execute Management Commands
---------------------------
As with any shell command that we wish to run in our container, this is done using the ``docker compose -f local.yml run --rm`` command: ::
As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: ::
$ docker compose -f local.yml run --rm django python manage.py migrate
$ docker compose -f local.yml run --rm django python manage.py createsuperuser
$ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
$ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser
Here, ``django`` is the target service we are executing the commands against.
Also, please note that ``docker exec`` does not work for running management commands.
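Any other management command follows the same ``run --rm`` pattern; as an illustrative example (not taken from the original excerpt): ::
$ docker compose -f docker-compose.local.yml run --rm django python manage.py shell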
@@ -97,7 +97,7 @@ When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost',
Configuring the Environment
---------------------------
This is the excerpt from your project's ``local.yml``: ::
This is the excerpt from your project's ``docker-compose.local.yml``: ::
# ...
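As a rough sketch of what that excerpt typically contains (the service layout and env-file paths follow the template's conventions and may differ in your generated file), the ``django`` service points at the local env files: ::
django:
  # ...
  env_file:
    - ./.envs/.local/.django
    - ./.envs/.local/.postgres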
@@ -163,8 +163,8 @@ You have to modify the relevant requirement file: base, local or production by a
To get this change picked up, you'll need to rebuild the image(s) and restart the running container: ::
docker compose -f local.yml build
docker compose -f local.yml up
docker compose -f docker-compose.local.yml build
docker compose -f docker-compose.local.yml up
Debugging
~~~~~~~~~
@@ -178,7 +178,7 @@ If you are using the following within your code to debug: ::
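# Illustrative breakpoint only -- any interactive debugger call applies here;
# ipdb is assumed to be available in the local requirements.
import ipdb

ipdb.set_trace()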
Then you may need to run the following for it to work as desired: ::
$ docker compose -f local.yml run --rm --service-ports django
$ docker compose -f docker-compose.local.yml run --rm --service-ports django
django-debug-toolbar
@@ -231,7 +231,7 @@ Prerequisites:
* ``use_docker`` was set to ``y`` on project initialization;
* ``use_celery`` was set to ``y`` on project initialization.
By default, it's enabled both in local and production environments (``local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
By default, it's enabled both in local and production environments (``docker-compose.local.yml`` and ``production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.
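An illustrative pair of entries in ``.envs/.local/.django`` (the values are placeholders you choose yourself) would be: ::
CELERY_FLOWER_USER=flower_admin
CELERY_FLOWER_PASSWORD=change-me-to-a-long-random-string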
.. _`Flower`: https://github.com/mher/flower
@@ -279,7 +279,7 @@ certs
Take the certificates that you generated and place them in a folder called ``certs`` in the project's root folder. Assuming that you registered your local hostname as ``my-dev-env.local``, the certificates you will put in the folder should have the names ``my-dev-env.local.crt`` and ``my-dev-env.local.key``.
local.yml
docker-compose.local.yml
~~~~~~~~~
#. Add the ``nginx-proxy`` service. ::
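A sketch of roughly what that service block looks like (the image tag and option details are illustrative and may differ from the documented snippet): ::
nginx-proxy:
  image: jwilder/nginx-proxy:alpine
  container_name: nginx_proxy
  ports:
    - "80:80"
    - "443:443"
  volumes:
    - /var/run/docker.sock:/tmp/docker.sock:ro
    - ./certs:/etc/nginx/certs
  restart: always
  depends_on:
    - django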
@@ -323,7 +323,7 @@ You should allow the new hostname. ::
Rebuild your ``docker`` application. ::
$ docker compose -f local.yml up -d --build
$ docker compose -f docker-compose.local.yml up -d --build
Go to your browser and type in your URL bar ``https://my-dev-env.local``
@@ -343,9 +343,9 @@ Webpack
If you are using Webpack:
1. On the ``nginx-proxy`` service in ``local.yml``, change ``depends_on`` to ``node`` instead of ``django``.
1. On the ``nginx-proxy`` service in ``docker-compose.local.yml``, change ``depends_on`` to ``node`` instead of ``django``.
2. On the ``node`` service in ``local.yml``, add the following environment configuration:
2. On the ``node`` service in ``docker-compose.local.yml``, add the following environment configuration:
::

View File

@@ -8,7 +8,7 @@ Prerequisites
-------------
#. the project was generated with ``use_docker`` set to ``y``;
#. the stack is up and running: ``docker compose -f local.yml up -d postgres``.
#. the stack is up and running: ``docker compose -f docker-compose.local.yml up -d postgres``.
Creating a Backup
@@ -16,7 +16,7 @@ Creating a Backup
To create a backup, run::
$ docker compose -f local.yml exec postgres backup
$ docker compose -f docker-compose.local.yml exec postgres backup
Assuming your project's database is named ``my_project``, here is what you will see: ::
@@ -31,7 +31,7 @@ Viewing the Existing Backups
To list existing backups, ::
$ docker compose -f local.yml exec postgres backups
$ docker compose -f docker-compose.local.yml exec postgres backups
These are the sample contents of ``/backups``: ::
@@ -55,9 +55,9 @@ With a single backup file copied to ``.`` that would be ::
$ docker cp 9c5c3f055843:/backups/backup_2018_03_13T09_05_07.sql.gz .
You can also get the container ID using ``docker compose -f local.yml ps -q postgres`` so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command ::
You can also get the container ID using ``docker compose -f docker-compose.local.yml ps -q postgres`` so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command ::
$ docker cp $(docker compose -f local.yml ps -q postgres):/backups ./backups
$ docker cp $(docker compose -f docker-compose.local.yml ps -q postgres):/backups ./backups
.. _`command`: https://docs.docker.com/engine/reference/commandline/cp/
@@ -66,7 +66,7 @@ Restoring from the Existing Backup
To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), ::
$ docker compose -f local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
$ docker compose -f docker-compose.local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
You will see something like ::
@@ -103,7 +103,7 @@ Remove Backup
To remove a backup, you can use the ``rmbackup`` command. This will remove the backup from the ``/backups`` directory. ::
$ docker compose -f local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz
$ docker compose -f docker-compose.local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz
Upgrading PostgreSQL
@@ -111,17 +111,17 @@ Upgrading PostgreSQL
Upgrading PostgreSQL in your project requires a series of carefully executed steps. Start by halting all containers, excluding the postgres container. Following this, create a backup and proceed to remove the outdated data volume. ::
$ docker compose -f local.yml down
$ docker compose -f local.yml up -d postgres
$ docker compose -f local.yml run --rm postgres backup
$ docker compose -f local.yml down
$ docker compose -f docker-compose.local.yml down
$ docker compose -f docker-compose.local.yml up -d postgres
$ docker compose -f docker-compose.local.yml run --rm postgres backup
$ docker compose -f docker-compose.local.yml down
$ docker volume rm my_project_postgres_data
.. note:: Neglecting to remove the old data volume may lead to issues, such as the new postgres container failing to start with errors like ``FATAL: database files are incompatible with server``, and ``could not translate host name "postgres" to address: Name or service not known``.
To complete the upgrade, update the PostgreSQL version in the corresponding Dockerfile (e.g. ``compose/production/postgres/Dockerfile``) and build a new version of PostgreSQL. ::
$ docker compose -f local.yml build postgres
$ docker compose -f local.yml up -d postgres
$ docker compose -f local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz
$ docker compose -f local.yml up -d
$ docker compose -f docker-compose.local.yml build postgres
$ docker compose -f docker-compose.local.yml up -d postgres
$ docker compose -f docker-compose.local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz
$ docker compose -f docker-compose.local.yml up -d
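The Dockerfile edit mentioned above is usually a one-line base-image version bump; the version number below is illustrative only: ::
FROM postgres:16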

View File

@@ -19,7 +19,7 @@ You will get a readout of the `users` app that has already been set up with test
If you set up your project to `develop locally with docker`_, run the following command: ::
$ docker compose -f local.yml run --rm django pytest
$ docker compose -f docker-compose.local.yml run --rm django pytest
Targeting particular apps for testing in ``docker`` follows the same pattern as shown above.
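For example, to target a single app inside the container (the path is illustrative; substitute your project slug and app): ::
$ docker compose -f docker-compose.local.yml run --rm django pytest my_project/users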
@@ -36,8 +36,8 @@ Once the tests are complete, in order to see the code coverage, run the followin
If you're running the project locally with Docker, use these commands instead: ::
$ docker compose -f local.yml run --rm django coverage run -m pytest
$ docker compose -f local.yml run --rm django coverage report
$ docker compose -f docker-compose.local.yml run --rm django coverage run -m pytest
$ docker compose -f docker-compose.local.yml run --rm django coverage report
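If an HTML report is preferred, standard ``coverage.py`` usage works through the same service: ::
$ docker compose -f docker-compose.local.yml run --rm django coverage html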
.. note::

View File

@@ -30,7 +30,7 @@ If you recreate the project multiple times with the same name, Docker would pres
To fix this, you can either:
- Clear your project-related Docker cache with ``docker compose -f local.yml down --volumes --rmi all``.
- Clear your project-related Docker cache with ``docker compose -f docker-compose.local.yml down --volumes --rmi all``.
- Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_).
- Use the `prune`_ command to clear system-wide (use with care!).
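The volume sub-commands and ``prune`` mentioned above would be used roughly like this (the volume name is illustrative, and ``prune`` cleans system-wide, so use it with care): ::
$ docker volume ls
$ docker volume rm my_project_postgres_data
$ docker system prune --volumes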

View File

@@ -78,7 +78,7 @@ def remove_docker_files():
shutil.rmtree(".devcontainer")
shutil.rmtree("compose")
file_names = ["local.yml", "production.yml", ".dockerignore"]
file_names = ["docker-compose.local.yml", "production.yml", ".dockerignore"]
for file_name in file_names:
os.remove(file_name)
if "{{ cookiecutter.editor }}" == "PyCharm":
@@ -94,7 +94,10 @@ def remove_utility_files():
def remove_heroku_files():
file_names = ["Procfile", "runtime.txt", "requirements.txt"]
for file_name in file_names:
if file_name == "requirements.txt" and "{{ cookiecutter.ci_tool }}".lower() == "travis":
if (
file_name == "requirements.txt"
and "{{ cookiecutter.ci_tool }}".lower() == "travis"
):
# don't remove the file if we are using travisci but not using heroku
continue
os.remove(file_name)
@@ -194,7 +197,9 @@ def handle_js_runner(choice, use_docker, use_async):
]
if not use_docker:
dev_django_cmd = (
"uvicorn config.asgi:application --reload" if use_async else "python manage.py runserver_plus"
"uvicorn config.asgi:application --reload"
if use_async
else "python manage.py runserver_plus"
)
scripts.update(
{
@@ -231,7 +236,9 @@ def remove_celery_files():
file_names = [
os.path.join("config", "celery_app.py"),
os.path.join("{{ cookiecutter.project_slug }}", "users", "tasks.py"),
os.path.join("{{ cookiecutter.project_slug }}", "users", "tests", "test_tasks.py"),
os.path.join(
"{{ cookiecutter.project_slug }}", "users", "tests", "test_tasks.py"
),
]
for file_name in file_names:
os.remove(file_name)
@@ -262,7 +269,9 @@ def remove_dotdrone_file():
os.remove(".drone.yml")
def generate_random_string(length, using_digits=False, using_ascii_letters=False, using_punctuation=False):
def generate_random_string(
length, using_digits=False, using_ascii_letters=False, using_punctuation=False
):
"""
Example:
opting out for 50 symbol-long, [a-z][A-Z][0-9] string
@@ -356,7 +365,9 @@ def set_postgres_password(file_path, value=None):
def set_celery_flower_user(file_path, value):
celery_flower_user = set_flag(file_path, "!!!SET CELERY_FLOWER_USER!!!", value=value)
celery_flower_user = set_flag(
file_path, "!!!SET CELERY_FLOWER_USER!!!", value=value
)
return celery_flower_user
@@ -388,14 +399,22 @@ def set_flags_in_envs(postgres_user, celery_flower_user, debug=False):
set_django_admin_url(production_django_envs_path)
set_postgres_user(local_postgres_envs_path, value=postgres_user)
set_postgres_password(local_postgres_envs_path, value=DEBUG_VALUE if debug else None)
set_postgres_password(
local_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_postgres_user(production_postgres_envs_path, value=postgres_user)
set_postgres_password(production_postgres_envs_path, value=DEBUG_VALUE if debug else None)
set_postgres_password(
production_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(local_django_envs_path, value=celery_flower_user)
set_celery_flower_password(local_django_envs_path, value=DEBUG_VALUE if debug else None)
set_celery_flower_password(
local_django_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(production_django_envs_path, value=celery_flower_user)
set_celery_flower_password(production_django_envs_path, value=DEBUG_VALUE if debug else None)
set_celery_flower_password(
production_django_envs_path, value=DEBUG_VALUE if debug else None
)
def set_flags_in_settings_files():
@@ -425,9 +444,21 @@ def remove_aws_dockerfile():
def remove_drf_starter_files():
os.remove(os.path.join("config", "api_router.py"))
shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "users", "api"))
os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_drf_urls.py"))
os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_drf_views.py"))
os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_swagger.py"))
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}", "users", "tests", "test_drf_urls.py"
)
)
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}", "users", "tests", "test_drf_views.py"
)
)
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}", "users", "tests", "test_swagger.py"
)
)
def main():
@@ -456,13 +487,19 @@ def main():
else:
remove_docker_files()
if "{{ cookiecutter.use_docker }}".lower() == "y" and "{{ cookiecutter.cloud_provider}}" != "AWS":
if (
"{{ cookiecutter.use_docker }}".lower() == "y"
and "{{ cookiecutter.cloud_provider}}" != "AWS"
):
remove_aws_dockerfile()
if "{{ cookiecutter.use_heroku }}".lower() == "n":
remove_heroku_files()
if "{{ cookiecutter.use_docker }}".lower() == "n" and "{{ cookiecutter.use_heroku }}".lower() == "n":
if (
"{{ cookiecutter.use_docker }}".lower() == "n"
and "{{ cookiecutter.use_heroku }}".lower() == "n"
):
if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y":
print(
INFO + ".env(s) are only utilized when Docker Compose and/or "
@@ -491,7 +528,10 @@ def main():
use_async=("{{ cookiecutter.use_async }}".lower() == "y"),
)
if "{{ cookiecutter.cloud_provider }}" == "None" and "{{ cookiecutter.use_docker }}".lower() == "n":
if (
"{{ cookiecutter.cloud_provider }}" == "None"
and "{{ cookiecutter.use_docker }}".lower() == "n"
):
print(
WARNING + "You chose to not use any cloud providers nor Docker, "
"media files won't be served in production." + TERMINATOR

View File

@@ -148,7 +148,11 @@ def _fixture_id(ctx):
def build_files_list(base_dir):
"""Build a list containing absolute paths to the generated files."""
return [os.path.join(dirpath, file_path) for dirpath, subdirs, files in os.walk(base_dir) for file_path in files]
return [
os.path.join(dirpath, file_path)
for dirpath, subdirs, files in os.walk(base_dir)
for file_path in files
]
def check_paths(paths):
@@ -225,7 +229,9 @@ def test_django_upgrade_passes(cookies, context_override):
python_files = [
file_path.removeprefix(f"{result.project_path}/")
for file_path in glob.glob(str(result.project_path / "**" / "*.py"), recursive=True)
for file_path in glob.glob(
str(result.project_path / "**" / "*.py"), recursive=True
)
]
try:
sh.django_upgrade(
@@ -247,7 +253,13 @@ def test_djlint_lint_passes(cookies, context_override):
# TODO: remove T002 when fixed https://github.com/Riverside-Healthcare/djLint/issues/687
ignored_rules = "H006,H030,H031,T002"
try:
sh.djlint("--lint", "--ignore", f"{autofixable_rules},{ignored_rules}", ".", _cwd=str(result.project_path))
sh.djlint(
"--lint",
"--ignore",
f"{autofixable_rules},{ignored_rules}",
".",
_cwd=str(result.project_path),
)
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@@ -268,7 +280,7 @@ def test_djlint_check_passes(cookies, context_override):
["use_docker", "expected_test_script"],
[
("n", "pytest"),
("y", "docker compose -f local.yml run django pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_script):
@@ -293,10 +305,12 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
["use_docker", "expected_test_script"],
[
("n", "pytest"),
("y", "docker compose -f local.yml run django pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expected_test_script):
def test_gitlab_invokes_precommit_and_pytest(
cookies, context, use_docker, expected_test_script
):
context.update({"ci_tool": "Gitlab", "use_docker": use_docker})
result = cookies.bake(extra_context=context)
@@ -320,10 +334,12 @@ def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expec
["use_docker", "expected_test_script"],
[
("n", "pytest"),
("y", "docker compose -f local.yml run django pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"),
],
)
def test_github_invokes_linter_and_pytest(cookies, context, use_docker, expected_test_script):
def test_github_invokes_linter_and_pytest(
cookies, context, use_docker, expected_test_script
):
context.update({"ci_tool": "Github", "use_docker": use_docker})
result = cookies.bake(extra_context=context)

View File

@@ -15,22 +15,22 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@"
cd my_awesome_project
# make sure all images build
docker compose -f local.yml build
docker compose -f docker-compose.local.yml build
# run the project's type checks
docker compose -f local.yml run django mypy my_awesome_project
docker compose -f docker-compose.local.yml run django mypy my_awesome_project
# run the project's tests
docker compose -f local.yml run django pytest
docker compose -f docker-compose.local.yml run django pytest
# return non-zero status code if there are migrations that have not been created
docker compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; }
docker compose -f docker-compose.local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; }
# Test support for translations
docker compose -f local.yml run django python manage.py makemessages --all
docker compose -f docker-compose.local.yml run django python manage.py makemessages --all
# Make sure the check doesn't raise any warnings
docker compose -f local.yml run \
docker compose -f docker-compose.local.yml run \
-e DJANGO_SECRET_KEY="$(openssl rand -base64 64)" \
-e REDIS_URL=redis://redis:6379/0 \
-e CELERY_BROKER_URL=redis://redis:6379/0 \
@@ -48,5 +48,5 @@ docker compose -f docs.yml run docs make html
# Run npm build script if package.json is present
if [ -f "package.json" ]
then
docker compose -f local.yml run node npm run build
docker compose -f docker-compose.local.yml run node npm run build
fi

View File

@@ -2,7 +2,7 @@
{
"name": "{{cookiecutter.project_slug}}_dev",
"dockerComposeFile": [
"../local.yml"
"../docker-compose.local.yml"
],
"init": true,
"mounts": [

View File

@@ -31,11 +31,11 @@ steps:
environment:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
commands:
- docker-compose -f local.yml build
- docker-compose -f docker-compose.local.yml build
- docker-compose -f docs.yml build
- docker-compose -f local.yml run --rm django python manage.py migrate
- docker-compose -f local.yml up -d
- docker-compose -f local.yml run django pytest
- docker-compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker-compose -f docker-compose.local.yml up -d
- docker-compose -f docker-compose.local.yml run django pytest
{%- else %}
image: python:3.12
commands:

View File

@@ -69,19 +69,19 @@ jobs:
{%- if cookiecutter.use_docker == 'y' %}
- name: Build the Stack
run: docker compose -f local.yml build django
run: docker compose -f docker-compose.local.yml build django
- name: Build the docs
run: docker compose -f docs.yml build docs
- name: Run DB Migrations
run: docker compose -f local.yml run --rm django python manage.py migrate
run: docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- name: Run Django Tests
run: docker compose -f local.yml run django pytest
run: docker compose -f docker-compose.local.yml run django pytest
- name: Tear down the Stack
run: docker compose -f local.yml down
run: docker compose -f docker-compose.local.yml down
{%- else %}
- name: Set up Python

View File

@@ -33,13 +33,13 @@ pytest:
services:
- docker:dind
before_script:
- docker compose -f local.yml build
- docker compose -f docker-compose.local.yml build
- docker compose -f docs.yml build
# Ensure celerybeat does not crash due to non-existent tables
- docker compose -f local.yml run --rm django python manage.py migrate
- docker compose -f local.yml up -d
- docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker compose -f docker-compose.local.yml up -d
script:
- docker compose -f local.yml run django pytest
- docker compose -f docker-compose.local.yml run django pytest
{%- else %}
image: python:3.12
tags:

View File

@@ -19,15 +19,15 @@ jobs:
before_script:
- docker compose -v
- docker -v
- docker compose -f local.yml build
- docker compose -f docker-compose.local.yml build
- docker compose -f docs.yml build
# Ensure celerybeat does not crash due to non-existent tables
- docker compose -f local.yml run --rm django python manage.py migrate
- docker compose -f local.yml up -d
- docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker compose -f docker-compose.local.yml up -d
script:
- docker compose -f local.yml run django pytest
- docker compose -f docker-compose.local.yml run django pytest
after_failure:
- docker compose -f local.yml logs
- docker compose -f docker-compose.local.yml logs
{%- else %}
before_install:
- sudo apt-get update -qq

View File

@@ -15,7 +15,7 @@ from inside the `{{cookiecutter.project_slug}}/docs` directory.
{% else %}
To build and serve docs, use the commands::
docker compose -f local.yml up docs
docker compose -f docker-compose.local.yml up docs
{% endif %}

View File

@@ -21,7 +21,7 @@ Next, you have to add new remote python interpreter, based on already tested dep
.. image:: images/3.png
Switch to *Docker Compose* and select `local.yml` file from directory of your project, next set *Service name* to `django`
Switch to *Docker Compose* and select `docker-compose.local.yml` file from directory of your project, next set *Service name* to `django`
.. image:: images/4.png

View File

@@ -3,7 +3,7 @@
Start by configuring the `LANGUAGES` setting in `base.py` by uncommenting the languages you are willing to support. Then, translation strings will be placed in this folder when running:
```bash
{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py makemessages -all --no-location
{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py makemessages -all --no-location
```
This should generate `django.po` (stands for Portable Object) files under each locale `<locale name>/LC_MESSAGES/django.po`. Each translatable string in the codebase is collected with its `msgid` and needs to be translated as `msgstr`, for example:
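A sketch of such an entry (the English source string here is an assumption):

```
msgid "users"
msgstr "utilisateurs"
```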
@@ -16,7 +16,7 @@ msgstr "utilisateurs"
Once all translations are done, they need to be compiled into `.mo` files (stands for Machine Object), which are the actual binary files used by the application:
```bash
{% if cookiecutter.use_docker == 'y' %}docker compose -f local.yml run --rm django {% endif %}python manage.py compilemessages
{% if cookiecutter.use_docker == 'y' %}docker compose -f docker-compose.local.yml run --rm django {% endif %}python manage.py compilemessages
```
Note that the `.po` files are NOT used by the application directly, so if the `.mo` files are out of date, the content won't appear as translated even if the `.po` files are up-to-date.