Adopt uv as package manager

This commit is contained in:
jelmert 2025-09-03 08:28:11 +02:00
parent f069dea086
commit 7c8444eb2c
26 changed files with 429 additions and 306 deletions

View File

@ -84,11 +84,11 @@ and then editing the results to include your name, email, and various configurat
First, get Cookiecutter. Trust me, it's awesome: First, get Cookiecutter. Trust me, it's awesome:
pip install "cookiecutter>=1.7.0" uv tool install "cookiecutter>=1.7.0"
Now run it against this repo: Now run it against this repo:
cookiecutter https://github.com/cookiecutter/cookiecutter-django uvx cookiecutter https://github.com/cookiecutter/cookiecutter-django
You'll be prompted for some values. Provide them, then a Django project will be created for you. You'll be prompted for some values. Provide them, then a Django project will be created for you.

View File

@ -32,14 +32,32 @@ Build the Stack
This can take a while, especially the first time you run this particular command on your development system:: This can take a while, especially the first time you run this particular command on your development system::
$ docker compose -f docker-compose.local.yml build docker compose -f docker-compose.local.yml build
Generally, if you want to emulate production environment use ``docker-compose.production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it! Generally, if you want to emulate production environment use ``docker-compose.production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!
After we have created our initial image we nee to generate a lockfile for our dependencies.
Docker cannot write to the host system during builds, so we have to run the command to generate the lockfile in the container.
This is important for reproducible builds and to ensure that the dependencies are installed correctly in the container.
Updating the lockfile manually is normally not necessary when you add packages through `uv add <package_name>`.
docker compose -f docker-compose.local.yml run --rm django uv lock
This is done by running the following command: ::
docker compose -f docker-compose.local.yml run --rm django uv lock
To be sure we are on the right track we need to build our image again: ::
docker compose -f docker-compose.local.yml build
Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then:: Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then::
$ git init git init
$ pre-commit install pre-commit install
Failing to do so will result with a bunch of CI and Linter errors that can be avoided with pre-commit. Failing to do so will result with a bunch of CI and Linter errors that can be avoided with pre-commit.
@ -50,27 +68,27 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak
Open a terminal at the project root and run the following for local development:: Open a terminal at the project root and run the following for local development::
$ docker compose -f docker-compose.local.yml up docker compose -f docker-compose.local.yml up
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this:: You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this::
$ export COMPOSE_FILE=docker-compose.local.yml export COMPOSE_FILE=docker-compose.local.yml
And then run:: And then run::
$ docker compose up docker compose up
To run in a detached (background) mode, just:: To run in a detached (background) mode, just::
$ docker compose up -d docker compose up -d
These commands don't run the docs service. In order to run docs service you can run:: These commands don't run the docs service. In order to run docs service you can run::
$ docker compose -f docker-compose.docs.yml up docker compose -f docker-compose.docs.yml up
To run the docs with local services just use:: To run the docs with local services just use::
$ docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up
The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise. The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise.
@ -79,8 +97,8 @@ Execute Management Commands
As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: :: As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: ::
$ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
$ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser
Here, ``django`` is the target service we are executing the commands against. Here, ``django`` is the target service we are executing the commands against.
Also, please note that the ``docker exec`` does not work for running management commands. Also, please note that the ``docker exec`` does not work for running management commands.
@ -136,7 +154,7 @@ The three envs we are presented with here are ``POSTGRES_DB``, ``POSTGRES_USER``
One final touch: should you ever need to merge ``.envs/.production/*`` in a single ``.env`` run the ``merge_production_dotenvs_in_dotenv.py``: :: One final touch: should you ever need to merge ``.envs/.production/*`` in a single ``.env`` run the ``merge_production_dotenvs_in_dotenv.py``: ::
$ python merge_production_dotenvs_in_dotenv.py python merge_production_dotenvs_in_dotenv.py
The ``.env`` file will then be created, with all your production envs residing beside each other. The ``.env`` file will then be created, with all your production envs residing beside each other.
@ -149,15 +167,15 @@ Activate a Docker Machine
This tells our computer that all future commands are specifically for the dev1 machine. Using the ``eval`` command we can switch machines as needed.:: This tells our computer that all future commands are specifically for the dev1 machine. Using the ``eval`` command we can switch machines as needed.::
$ eval "$(docker-machine env dev1)" eval "$(docker-machine env dev1)"
Add 3rd party python packages Add 3rd party python packages
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To install a new 3rd party python package, you cannot use ``pip install <package_name>``, that would only add the package to the container. The container is ephemeral, so that new library won't be persisted if you run another container. Instead, you should modify the Docker image: To install a new 3rd party python package, you cannot use ``uv add <package_name>``, that would only add the package to the container. The container is ephemeral, so that new library won't be persisted if you run another container. Instead, you should modify the Docker image:
You have to modify the relevant requirement file: base, local or production by adding: :: You have to modify pyproject.toml and either add it to project.dependencies or to tool.uv.dev-dependencies by adding: ::
<package_name>==<package_version> "<package_name>==<package_version>"
To get this change picked up, you'll need to rebuild the image(s) and restart the running container: :: To get this change picked up, you'll need to rebuild the image(s) and restart the running container: ::
@ -176,7 +194,7 @@ If you are using the following within your code to debug: ::
Then you may need to run the following for it to work as desired: :: Then you may need to run the following for it to work as desired: ::
$ docker compose -f docker-compose.local.yml run --rm --service-ports django docker compose -f docker-compose.local.yml run --rm --service-ports django
django-debug-toolbar django-debug-toolbar
@ -190,8 +208,8 @@ docker
The ``container_name`` from the yml file can be used to check on containers with docker commands, for example: :: The ``container_name`` from the yml file can be used to check on containers with docker commands, for example: ::
$ docker logs <project_slug>_local_celeryworker docker logs <project_slug>_local_celeryworker
$ docker top <project_slug>_local_celeryworker docker top <project_slug>_local_celeryworker
Notice that the ``container_name`` is generated dynamically using your project slug as a prefix Notice that the ``container_name`` is generated dynamically using your project slug as a prefix
@ -331,7 +349,7 @@ Assuming that you registered your local hostname as ``my-dev-env.local``, the ce
Rebuild your ``docker`` application. :: Rebuild your ``docker`` application. ::
$ docker compose -f docker-compose.local.yml up -d --build docker compose -f docker-compose.local.yml up -d --build
Go to your browser and type in your URL bar ``https://my-dev-env.local``. Go to your browser and type in your URL bar ``https://my-dev-env.local``.

View File

@ -1,7 +1,7 @@
Getting Up and Running Locally Getting Up and Running Locally
============================== ==============================
.. index:: pip, virtualenv, PostgreSQL .. index:: PostgreSQL
Setting Up Development Environment Setting Up Development Environment
@ -9,29 +9,19 @@ Setting Up Development Environment
Make sure to have the following on your host: Make sure to have the following on your host:
* Python 3.12 * uv https://docs.astral.sh/uv/getting-started/installation/
* PostgreSQL_. * PostgreSQL_.
* Redis_, if using Celery * Redis_, if using Celery
* Cookiecutter_ * Cookiecutter_
First things first.
#. Create a virtualenv: ::
$ python3.12 -m venv <virtual env path>
#. Activate the virtualenv you have just created: ::
$ source <virtual env path>/bin/activate
#. .. include:: generate-project-block.rst #. .. include:: generate-project-block.rst
#. Install development requirements: :: #. Install development requirements: ::
$ cd <what you have entered as the project_slug at setup stage> cd <what you have entered as the project_slug at setup stage>
$ pip install -r requirements/local.txt uv sync
$ git init # A git repo is required for pre-commit to install git init # A git repo is required for pre-commit to install
$ pre-commit install uv run pre-commit install
.. note:: .. note::
@ -40,7 +30,7 @@ First things first.
#. Create a new PostgreSQL database using createdb_: :: #. Create a new PostgreSQL database using createdb_: ::
$ createdb --username=postgres <project_slug> createdb --username=postgres <project_slug>
``project_slug`` is what you have entered as the project_slug at the setup stage. ``project_slug`` is what you have entered as the project_slug at the setup stage.
@ -54,7 +44,7 @@ First things first.
#. Set the environment variables for your database(s): :: #. Set the environment variables for your database(s): ::
$ export DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb> export DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb>
.. note:: .. note::
@ -71,15 +61,15 @@ First things first.
#. Apply migrations: :: #. Apply migrations: ::
$ python manage.py migrate uv run python manage.py migrate
#. If you're running synchronously, see the application being served through Django development server: :: #. If you're running synchronously, see the application being served through Django development server: ::
$ python manage.py runserver 0.0.0.0:8000 uv run python manage.py runserver 0.0.0.0:8000
or if you're running asynchronously: :: or if you're running asynchronously: ::
$ uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html' uv run uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'
If you've opted for Webpack or Gulp as frontend pipeline, please see the :ref:`dedicated section <bare-metal-webpack-gulp>` below. If you've opted for Webpack or Gulp as frontend pipeline, please see the :ref:`dedicated section <bare-metal-webpack-gulp>` below.
@ -136,11 +126,11 @@ Following this structured approach, here's how to add a new app:
#. **Create the app** using Django's ``startapp`` command, replacing ``<name-of-the-app>`` with your desired app name: :: #. **Create the app** using Django's ``startapp`` command, replacing ``<name-of-the-app>`` with your desired app name: ::
$ python manage.py startapp <name-of-the-app> uv run python manage.py startapp <name-of-the-app>
#. **Move the app** to the Django Project Root, maintaining the project's two-tier structure: :: #. **Move the app** to the Django Project Root, maintaining the project's two-tier structure: ::
$ mv <name-of-the-app> <django_project_root>/ mv <name-of-the-app> <django_project_root>/
#. **Edit the app's apps.py** change ``name = '<name-of-the-app>'`` to ``name = '<django_project_root>.<name-of-the-app>'``. #. **Edit the app's apps.py** change ``name = '<name-of-the-app>'`` to ``name = '<django_project_root>.<name-of-the-app>'``.
@ -166,7 +156,7 @@ For instance, one of the packages we depend upon, ``django-allauth`` sends verif
#. Make it executable: :: #. Make it executable: ::
$ chmod +x mailpit chmod +x mailpit
#. Spin up another terminal window and start it there: :: #. Spin up another terminal window and start it there: ::
@ -199,18 +189,18 @@ If the project is configured to use Celery as a task scheduler then, by default,
Next, make sure `redis-server` is installed (per the `Getting started with Redis`_ guide) and run the server in one terminal:: Next, make sure `redis-server` is installed (per the `Getting started with Redis`_ guide) and run the server in one terminal::
$ redis-server redis-server
Start the Celery worker by running the following command in another terminal:: Start the Celery worker by running the following command in another terminal::
$ celery -A config.celery_app worker --loglevel=info uv run celery -A config.celery_app worker --loglevel=info
That Celery worker should be running whenever your app is running, typically as a background process, That Celery worker should be running whenever your app is running, typically as a background process,
so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_. so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.
The project comes with a simple task for manual testing purposes, inside `<project_slug>/users/tasks.py`. To queue that task locally, start the Django shell, import the task, and call `delay()` on it:: The project comes with a simple task for manual testing purposes, inside `<project_slug>/users/tasks.py`. To queue that task locally, start the Django shell, import the task, and call `delay()` on it::
$ python manage.py shell uv run python manage.py shell
>> from <project_slug>.users.tasks import get_users_count >> from <project_slug>.users.tasks import get_users_count
>> get_users_count.delay() >> get_users_count.delay()
@ -231,11 +221,11 @@ If you've opted for Gulp or Webpack as front-end pipeline, the project comes con
#. Make sure that `Node.js`_ v18 is installed on your machine. #. Make sure that `Node.js`_ v18 is installed on your machine.
#. In the project root, install the JS dependencies with:: #. In the project root, install the JS dependencies with::
$ npm install npm install
#. Now - with your virtualenv activated - start the application by running:: #. Now - with your virtualenv activated - start the application by running::
$ npm run dev npm run dev
This will start 2 processes in parallel: the static assets build loop on one side, and the Django server on the other. This will start 2 processes in parallel: the static assets build loop on one side, and the Django server on the other.

View File

@ -3,7 +3,7 @@
# You can set these variables from the command line. # You can set these variables from the command line.
SPHINXOPTS = SPHINXOPTS =
SPHINXBUILD = sphinx-build SPHINXBUILD = uv run sphinx-build
SOURCEDIR = . SOURCEDIR = .
BUILDDIR = _build BUILDDIR = _build

View File

@ -1,8 +1,11 @@
# ruff: noqa: PLR0133 # ruff: noqa: PLR0133
import json import json
import os
import random import random
import shutil import shutil
import string import string
import subprocess
import sys
from pathlib import Path from pathlib import Path
try: try:
@ -77,7 +80,7 @@ def remove_utility_files():
def remove_heroku_files(): def remove_heroku_files():
file_names = ["Procfile", "requirements.txt"] file_names = ["Procfile"]
for file_name in file_names: for file_name in file_names:
if file_name == "requirements.txt" and "{{ cookiecutter.ci_tool }}".lower() == "travis": if file_name == "requirements.txt" and "{{ cookiecutter.ci_tool }}".lower() == "travis":
# Don't remove the file if we are using Travis CI but not using Heroku # Don't remove the file if we are using Travis CI but not using Heroku
@ -195,20 +198,24 @@ def handle_js_runner(choice, use_docker, use_async):
def remove_prettier_pre_commit(): def remove_prettier_pre_commit():
pre_commit_yaml = Path(".pre-commit-config.yaml") remove_repo_from_pre_commit_config("mirrors-prettier")
content = pre_commit_yaml.read_text().splitlines()
def remove_repo_from_pre_commit_config(repo_to_remove: str):
pre_commit_config = Path(".pre-commit-config.yaml")
content = pre_commit_config.read_text().splitlines(keepends=True)
removing = False removing = False
new_lines = [] new_lines = []
for line in content: for line in content:
if removing and "- repo:" in line: if removing and "- repo:" in line:
removing = False removing = False
if "mirrors-prettier" in line: if repo_to_remove in line:
removing = True removing = True
if not removing: if not removing:
new_lines.append(line) new_lines.append(line)
pre_commit_yaml.write_text("\n".join(new_lines)) pre_commit_config.write_text("\n".join(new_lines))
def remove_celery_files(): def remove_celery_files():
@ -499,8 +506,52 @@ def main(): # noqa: C901, PLR0912, PLR0915
if "{{ cookiecutter.use_async }}".lower() == "n": if "{{ cookiecutter.use_async }}".lower() == "n":
remove_async_files() remove_async_files()
setup_dependencies()
print(SUCCESS + "Project initialized, keep up the good work!" + TERMINATOR) print(SUCCESS + "Project initialized, keep up the good work!" + TERMINATOR)
def setup_dependencies():
print("Installing python dependencies using uv...")
if "{{ cookiecutter.use_docker }}".lower() == "y":
# Build the Docker service using Docker Compose
try:
subprocess.run(["docker", "compose", "-f", "docker-compose.local.yml", "build", "django"], check=True) # noqa: S607
except subprocess.CalledProcessError as e:
print(f"Error building Docker service: {e}", file=sys.stderr)
sys.exit(1)
# Use Docker to run the uv command
uv_cmd = ["docker", "compose", "-f", "docker-compose.local.yml", "run", "--rm", "django", "uv"]
else:
# Use uv command directly
uv_cmd = ["uv"]
# Install production dependencies
try:
subprocess.run([*uv_cmd, "add", "--no-sync", "-r", "requirements/production.txt"], check=True) # noqa: S603
except subprocess.CalledProcessError as e:
print(f"Error installing production dependencies: {e}", file=sys.stderr)
sys.exit(1)
# Install local (development) dependencies
try:
subprocess.run([*uv_cmd, "add", "--no-sync", "--dev", "-r", "requirements/local.txt"], check=True) # noqa: S603
except subprocess.CalledProcessError as e:
print(f"Error installing local dependencies: {e}", file=sys.stderr)
sys.exit(1)
# Remove the requirements directory
if os.path.exists("requirements"): # noqa: PTH110
try:
shutil.rmtree("requirements")
except Exception as e: # noqa: BLE001
print(f"Error removing 'requirements' folder: {e}", file=sys.stderr)
sys.exit(1)
print("Setup complete!")
if __name__ == "__main__": if __name__ == "__main__":
main() main()

View File

@ -18,13 +18,13 @@ cd my_awesome_project
sudo utility/install_os_dependencies.sh install sudo utility/install_os_dependencies.sh install
# Install Python deps # Install Python deps
pip install -r requirements/local.txt uv sync
# run the project's tests # run the project's tests
pytest uv run pytest
# Make sure the check doesn't raise any warnings # Make sure the check doesn't raise any warnings
python manage.py check --fail-level WARNING uv run python manage.py check --fail-level WARNING
# Run npm build script if package.json is present # Run npm build script if package.json is present
if [ -f "package.json" ] if [ -f "package.json" ]
@ -34,4 +34,4 @@ then
fi fi
# Generate the HTML for the documentation # Generate the HTML for the documentation
cd docs && make html cd docs && uv run make html

View File

@ -6,6 +6,7 @@ from collections.abc import Iterable
from pathlib import Path from pathlib import Path
import pytest import pytest
import tomllib
try: try:
import sh import sh
@ -150,7 +151,14 @@ def _fixture_id(ctx):
def build_files_list(base_path: Path): def build_files_list(base_path: Path):
"""Build a list containing absolute paths to the generated files.""" """Build a list containing absolute paths to the generated files."""
return [dirpath / file_path for dirpath, subdirs, files in base_path.walk() for file_path in files] excluded_dirs = {".venv", "__pycache__"}
f = []
for dirpath, subdirs, files in base_path.walk():
subdirs[:] = [d for d in subdirs if d not in excluded_dirs]
f.extend(dirpath / file_path for file_path in files)
return f
def check_paths(paths: Iterable[Path]): def check_paths(paths: Iterable[Path]):
@ -263,7 +271,7 @@ def test_djlint_check_passes(cookies, context_override):
@pytest.mark.parametrize( @pytest.mark.parametrize(
("use_docker", "expected_test_script"), ("use_docker", "expected_test_script"),
[ [
("n", "pytest"), ("n", "uv run pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
@ -288,7 +296,7 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
@pytest.mark.parametrize( @pytest.mark.parametrize(
("use_docker", "expected_test_script"), ("use_docker", "expected_test_script"),
[ [
("n", "pytest"), ("n", "uv run pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
@ -305,7 +313,7 @@ def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expec
try: try:
gitlab_config = yaml.safe_load(gitlab_yml) gitlab_config = yaml.safe_load(gitlab_yml)
assert gitlab_config["precommit"]["script"] == [ assert gitlab_config["precommit"]["script"] == [
"pre-commit run --show-diff-on-failure --color=always --all-files", "uv run pre-commit run --show-diff-on-failure --color=always --all-files",
] ]
assert gitlab_config["pytest"]["script"] == [expected_test_script] assert gitlab_config["pytest"]["script"] == [expected_test_script]
except yaml.YAMLError as e: except yaml.YAMLError as e:
@ -315,7 +323,7 @@ def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expec
@pytest.mark.parametrize( @pytest.mark.parametrize(
("use_docker", "expected_test_script"), ("use_docker", "expected_test_script"),
[ [
("n", "pytest"), ("n", "uv run pytest"),
("y", "docker compose -f docker-compose.local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
@ -402,3 +410,39 @@ def test_trim_domain_email(cookies, context):
base_settings = result.project_path / "config" / "settings" / "base.py" base_settings = result.project_path / "config" / "settings" / "base.py"
assert '"me@example.com"' in base_settings.read_text() assert '"me@example.com"' in base_settings.read_text()
def test_pyproject_toml(cookies, context):
author_name = "Project Author"
author_email = "me@example.com"
context.update(
{
"description": "DESCRIPTION",
"domain_name": "example.com",
"email": author_email,
"author_name": author_name,
},
)
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
pyproject_toml = result.project_path / "pyproject.toml"
data = tomllib.loads(pyproject_toml.read_text())
assert data
assert data["project"]["authors"][0]["email"] == author_email
assert data["project"]["authors"][0]["name"] == author_name
assert data["project"]["name"] == context["project_slug"]
def test_pre_commit_without_heroku(cookies, context):
context.update({"use_heroku": "n"})
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
pre_commit_config = result.project_path / ".pre-commit-config.yaml"
data = pre_commit_config.read_text()
assert "uv-pre-commit" not in data

View File

@ -17,6 +17,10 @@ cd my_awesome_project
# make sure all images build # make sure all images build
docker compose -f docker-compose.local.yml build docker compose -f docker-compose.local.yml build
docker compose -f docker-compose.local.yml run django uv lock
docker compose -f docker-compose.local.yml build
# run the project's type checks # run the project's type checks
docker compose -f docker-compose.local.yml run --rm django mypy my_awesome_project docker compose -f docker-compose.local.yml run --rm django mypy my_awesome_project
@ -44,6 +48,22 @@ docker compose -f docker-compose.local.yml run --rm \
# Generate the HTML for the documentation # Generate the HTML for the documentation
docker compose -f docker-compose.docs.yml run --rm docs make html docker compose -f docker-compose.docs.yml run --rm docs make html
docker build -f ./compose/production/django/Dockerfile -t django-prod .
docker run --rm \
--env-file .envs/.local/.django \
--env-file .envs/.local/.postgres \
--network my_awesome_project_default \
-e DJANGO_SECRET_KEY="$(openssl rand -base64 64)" \
-e REDIS_URL=redis://redis:6379/0 \
-e DJANGO_AWS_ACCESS_KEY_ID=x \
-e DJANGO_AWS_SECRET_ACCESS_KEY=x \
-e DJANGO_AWS_STORAGE_BUCKET_NAME=x \
-e DJANGO_ADMIN_URL=x \
-e MAILGUN_API_KEY=x \
-e MAILGUN_DOMAIN=x \
django-prod python manage.py check --settings=config.settings.production --deploy --database default --fail-level WARNING
# Run npm build script if package.json is present # Run npm build script if package.json is present
if [ -f "package.json" ] if [ -f "package.json" ]
then then

98
uv.lock
View File

@ -12,16 +12,16 @@ wheels = [
[[package]] [[package]]
name = "anyio" name = "anyio"
version = "4.7.0" version = "4.9.0"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
dependencies = [ dependencies = [
{ name = "idna" }, { name = "idna" },
{ name = "sniffio" }, { name = "sniffio" },
{ name = "typing-extensions" }, { name = "typing-extensions" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/f6/40/318e58f669b1a9e00f5c4453910682e2d9dd594334539c7b7817dabb765f/anyio-4.7.0.tar.gz", hash = "sha256:2f834749c602966b7d456a7567cafcb309f96482b5081d14ac93ccd457f9dd48", size = 177076 } sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/7a/4daaf3b6c08ad7ceffea4634ec206faeff697526421c20f07628c7372156/anyio-4.7.0-py3-none-any.whl", hash = "sha256:ea60c3723ab42ba6fff7e8ccb0488c898ec538ff4df1f1d5e642c3601d07e352", size = 93052 }, { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916 },
] ]
[[package]] [[package]]
@ -39,11 +39,11 @@ wheels = [
[[package]] [[package]]
name = "babel" name = "babel"
version = "2.16.0" version = "2.17.0"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2a/74/f1bc80f23eeba13393b7222b11d95ca3af2c1e28edca18af487137eefed9/babel-2.16.0.tar.gz", hash = "sha256:d1f3554ca26605fe173f3de0c65f750f5a42f924499bf134de6423582298e316", size = 9348104 } sdist = { url = "https://files.pythonhosted.org/packages/7d/6b/d52e42361e1aa00709585ecc30b3f9684b3ab62530771402248b1b1d6240/babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d", size = 9951852 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/ed/20/bc79bc575ba2e2a7f70e8a1155618bb1301eaa5132a8271373a6903f73f8/babel-2.16.0-py3-none-any.whl", hash = "sha256:368b5b98b37c06b7daf6696391c3240c938b37767d4584413e8438c5c435fa8b", size = 9587599 }, { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537 },
] ]
[[package]] [[package]]
@ -526,7 +526,7 @@ wheels = [
[[package]] [[package]]
name = "myst-parser" name = "myst-parser"
version = "4.0.0" version = "4.0.1"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
dependencies = [ dependencies = [
{ name = "docutils" }, { name = "docutils" },
@ -536,9 +536,9 @@ dependencies = [
{ name = "pyyaml" }, { name = "pyyaml" },
{ name = "sphinx" }, { name = "sphinx" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/85/55/6d1741a1780e5e65038b74bce6689da15f620261c490c3511eb4c12bac4b/myst_parser-4.0.0.tar.gz", hash = "sha256:851c9dfb44e36e56d15d05e72f02b80da21a9e0d07cba96baf5e2d476bb91531", size = 93858 } sdist = { url = "https://files.pythonhosted.org/packages/66/a5/9626ba4f73555b3735ad86247a8077d4603aa8628537687c839ab08bfe44/myst_parser-4.0.1.tar.gz", hash = "sha256:5cfea715e4f3574138aecbf7d54132296bfd72bb614d31168f48c477a830a7c4", size = 93985 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/ca/b4/b036f8fdb667587bb37df29dc6644681dd78b7a2a6321a34684b79412b28/myst_parser-4.0.0-py3-none-any.whl", hash = "sha256:b9317997552424448c6096c2558872fdb6f81d3ecb3a40ce84a7518798f3f28d", size = 84563 }, { url = "https://files.pythonhosted.org/packages/5f/df/76d0321c3797b54b60fef9ec3bd6f4cfd124b9e422182156a1dd418722cf/myst_parser-4.0.1-py3-none-any.whl", hash = "sha256:9134e88959ec3b5780aedf8a99680ea242869d012e8821db3126d427edc9c95d", size = 84579 },
] ]
[[package]] [[package]]
@ -828,6 +828,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/d9/c2a126eeae791e90ea099d05cb0515feea3688474b978343f3cdcfe04523/rich-13.8.0-py3-none-any.whl", hash = "sha256:2e85306a063b9492dffc86278197a60cbece75bcb766022f3436f567cae11bdc", size = 241597 }, { url = "https://files.pythonhosted.org/packages/c7/d9/c2a126eeae791e90ea099d05cb0515feea3688474b978343f3cdcfe04523/rich-13.8.0-py3-none-any.whl", hash = "sha256:2e85306a063b9492dffc86278197a60cbece75bcb766022f3436f567cae11bdc", size = 241597 },
] ]
[[package]]
name = "roman-numerals-py"
version = "3.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/30/76/48fd56d17c5bdbdf65609abbc67288728a98ed4c02919428d4f52d23b24b/roman_numerals_py-3.1.0.tar.gz", hash = "sha256:be4bf804f083a4ce001b5eb7e3c0862479d10f94c936f6c4e5f250aa5ff5bd2d", size = 9017 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/53/97/d2cbbaa10c9b826af0e10fdf836e1bf344d9f0abb873ebc34d1f49642d3f/roman_numerals_py-3.1.0-py3-none-any.whl", hash = "sha256:9da2ad2fb670bcf24e81070ceb3be72f6c11c440d73bd579fbeca1e9f330954c", size = 7742 },
]
[[package]] [[package]]
name = "ruff" name = "ruff"
version = "0.12.11" version = "0.12.11"
@ -901,7 +910,7 @@ wheels = [
[[package]] [[package]]
name = "sphinx" name = "sphinx"
version = "8.1.3" version = "8.2.3"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
dependencies = [ dependencies = [
{ name = "alabaster" }, { name = "alabaster" },
@ -913,6 +922,7 @@ dependencies = [
{ name = "packaging" }, { name = "packaging" },
{ name = "pygments" }, { name = "pygments" },
{ name = "requests" }, { name = "requests" },
{ name = "roman-numerals-py" },
{ name = "snowballstemmer" }, { name = "snowballstemmer" },
{ name = "sphinxcontrib-applehelp" }, { name = "sphinxcontrib-applehelp" },
{ name = "sphinxcontrib-devhelp" }, { name = "sphinxcontrib-devhelp" },
@ -921,9 +931,9 @@ dependencies = [
{ name = "sphinxcontrib-qthelp" }, { name = "sphinxcontrib-qthelp" },
{ name = "sphinxcontrib-serializinghtml" }, { name = "sphinxcontrib-serializinghtml" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/be0b61178fe2cdcb67e2a92fc9ebb488e3c51c4f74a36a7824c0adf23425/sphinx-8.1.3.tar.gz", hash = "sha256:43c1911eecb0d3e161ad78611bc905d1ad0e523e4ddc202a58a821773dc4c927", size = 8184611 } sdist = { url = "https://files.pythonhosted.org/packages/38/ad/4360e50ed56cb483667b8e6dadf2d3fda62359593faabbe749a27c4eaca6/sphinx-8.2.3.tar.gz", hash = "sha256:398ad29dee7f63a75888314e9424d40f52ce5a6a87ae88e7071e80af296ec348", size = 8321876 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/26/60/1ddff83a56d33aaf6f10ec8ce84b4c007d9368b21008876fceda7e7381ef/sphinx-8.1.3-py3-none-any.whl", hash = "sha256:09719015511837b76bf6e03e42eb7595ac8c2e41eeb9c29c5b755c6b677992a2", size = 3487125 }, { url = "https://files.pythonhosted.org/packages/31/53/136e9eca6e0b9dc0e1962e2c908fbea2e5ac000c2a2fbd9a35797958c48b/sphinx-8.2.3-py3-none-any.whl", hash = "sha256:4405915165f13521d875a8c29c8970800a0141c14cc5416a38feca4ea5d9b9c3", size = 3589741 },
] ]
[[package]] [[package]]
@ -1025,14 +1035,14 @@ wheels = [
[[package]] [[package]]
name = "starlette" name = "starlette"
version = "0.45.1" version = "0.46.1"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
dependencies = [ dependencies = [
{ name = "anyio" }, { name = "anyio" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/c1/be/b398217eb35b356d2d9bb84ec67071ea2842e02950fcf38b33df9d5b24ba/starlette-0.45.1.tar.gz", hash = "sha256:a8ae1fa3b1ab7ca83a4abd77871921a13fb5aeaf4874436fb96c29dfcd4ecfa3", size = 2573953 } sdist = { url = "https://files.pythonhosted.org/packages/04/1b/52b27f2e13ceedc79a908e29eac426a63465a1a01248e5f24aa36a62aeb3/starlette-0.46.1.tar.gz", hash = "sha256:3c88d58ee4bd1bb807c0d1acb381838afc7752f9ddaec81bbe4383611d833230", size = 2580102 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/6b/2c/a50484b035ee0e13ebb7a42391e391befbfc1b6a9ad5503e83badd182ada/starlette-0.45.1-py3-none-any.whl", hash = "sha256:5656c0524f586e9148d9a3c1dd5257fb42a99892fb0dc6877dd76ef4d184aac3", size = 71488 }, { url = "https://files.pythonhosted.org/packages/a0/4b/528ccf7a982216885a1ff4908e886b8fb5f19862d1962f56a3fce2435a70/starlette-0.46.1-py3-none-any.whl", hash = "sha256:77c74ed9d2720138b25875133f3a2dae6d854af2ec37dceb56aef370c1d8a227", size = 71995 },
] ]
[[package]] [[package]]
@ -1179,46 +1189,46 @@ wheels = [
[[package]] [[package]]
name = "watchfiles" name = "watchfiles"
version = "1.0.3" version = "1.0.5"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
dependencies = [ dependencies = [
{ name = "anyio" }, { name = "anyio" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/3c/7e/4569184ea04b501840771b8fcecee19b2233a8b72c196061263c0ef23c0b/watchfiles-1.0.3.tar.gz", hash = "sha256:f3ff7da165c99a5412fe5dd2304dd2dbaaaa5da718aad942dcb3a178eaa70c56", size = 38185 } sdist = { url = "https://files.pythonhosted.org/packages/03/e2/8ed598c42057de7aa5d97c472254af4906ff0a59a66699d426fc9ef795d7/watchfiles-1.0.5.tar.gz", hash = "sha256:b7529b5dcc114679d43827d8c35a07c493ad6f083633d573d81c660abc5979e9", size = 94537 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/bf/a9/c8b5ab33444306e1a324cb2b51644f8458dd459e30c3841f925012893e6a/watchfiles-1.0.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:93436ed550e429da007fbafb723e0769f25bae178fbb287a94cb4ccdf42d3af3", size = 391395 }, { url = "https://files.pythonhosted.org/packages/2a/8c/4f0b9bdb75a1bfbd9c78fad7d8854369283f74fe7cf03eb16be77054536d/watchfiles-1.0.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:b5eb568c2aa6018e26da9e6c86f3ec3fd958cee7f0311b35c2630fa4217d17f2", size = 401511 },
{ url = "https://files.pythonhosted.org/packages/ad/d3/403af5f07359863c03951796ddab265ee8cce1a6147510203d0bf43950e7/watchfiles-1.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c18f3502ad0737813c7dad70e3e1cc966cc147fbaeef47a09463bbffe70b0a00", size = 381432 }, { url = "https://files.pythonhosted.org/packages/dc/4e/7e15825def77f8bd359b6d3f379f0c9dac4eb09dd4ddd58fd7d14127179c/watchfiles-1.0.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0a04059f4923ce4e856b4b4e5e783a70f49d9663d22a4c3b3298165996d1377f", size = 392715 },
{ url = "https://files.pythonhosted.org/packages/f6/5f/921f2f2beabaf24b1ad81ac22bb69df8dd5771fdb68d6f34a5912a420941/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6a5bc3ca468bb58a2ef50441f953e1f77b9a61bd1b8c347c8223403dc9b4ac9a", size = 441448 }, { url = "https://files.pythonhosted.org/packages/58/65/b72fb817518728e08de5840d5d38571466c1b4a3f724d190cec909ee6f3f/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e380c89983ce6e6fe2dd1e1921b9952fb4e6da882931abd1824c092ed495dec", size = 454138 },
{ url = "https://files.pythonhosted.org/packages/63/d7/67d0d750b246f248ccdb400a85a253e93e419ea5b6cbe968fa48b97a5f30/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0d1ec043f02ca04bf21b1b32cab155ce90c651aaf5540db8eb8ad7f7e645cba8", size = 446852 }, { url = "https://files.pythonhosted.org/packages/3e/a4/86833fd2ea2e50ae28989f5950b5c3f91022d67092bfec08f8300d8b347b/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fe43139b2c0fdc4a14d4f8d5b5d967f7a2777fd3d38ecf5b1ec669b0d7e43c21", size = 458592 },
{ url = "https://files.pythonhosted.org/packages/53/7c/d7cd94c7d0905f1e2f1c2232ea9bc39b1a48affd007e09c547ead96edb8f/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f58d3bfafecf3d81c15d99fc0ecf4319e80ac712c77cf0ce2661c8cf8bf84066", size = 471662 }, { url = "https://files.pythonhosted.org/packages/38/7e/42cb8df8be9a37e50dd3a818816501cf7a20d635d76d6bd65aae3dbbff68/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ee0822ce1b8a14fe5a066f93edd20aada932acfe348bede8aa2149f1a4489512", size = 487532 },
{ url = "https://files.pythonhosted.org/packages/26/81/738f8e66f7525753996b8aa292f78dcec1ef77887d62e6cdfb04cc2f352f/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1df924ba82ae9e77340101c28d56cbaff2c991bd6fe8444a545d24075abb0a87", size = 493765 }, { url = "https://files.pythonhosted.org/packages/fc/fd/13d26721c85d7f3df6169d8b495fcac8ab0dc8f0945ebea8845de4681dab/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0dbcb1c2d8f2ab6e0a81c6699b236932bd264d4cef1ac475858d16c403de74d", size = 522865 },
{ url = "https://files.pythonhosted.org/packages/d2/50/78e21f5da24ab39114e9b24f7b0945ea1c6fc7bc9ae86cd87f8eaeb47325/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:632a52dcaee44792d0965c17bdfe5dc0edad5b86d6a29e53d6ad4bf92dc0ff49", size = 490558 }, { url = "https://files.pythonhosted.org/packages/a1/0d/7f9ae243c04e96c5455d111e21b09087d0eeaf9a1369e13a01c7d3d82478/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a2014a2b18ad3ca53b1f6c23f8cd94a18ce930c1837bd891262c182640eb40a6", size = 499887 },
{ url = "https://files.pythonhosted.org/packages/a8/93/1873fea6354b2858eae8970991d64e9a449d87726d596490d46bf00af8ed/watchfiles-1.0.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bf4b459d94a0387617a1b499f314aa04d8a64b7a0747d15d425b8c8b151da0", size = 442808 }, { url = "https://files.pythonhosted.org/packages/8e/0f/a257766998e26aca4b3acf2ae97dff04b57071e991a510857d3799247c67/watchfiles-1.0.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10f6ae86d5cb647bf58f9f655fcf577f713915a5d69057a0371bc257e2553234", size = 454498 },
{ url = "https://files.pythonhosted.org/packages/4f/b4/2fc4c92fb28b029f66d04a4d430fe929284e9ff717b04bb7a3bb8a7a5605/watchfiles-1.0.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ca94c85911601b097d53caeeec30201736ad69a93f30d15672b967558df02885", size = 615287 }, { url = "https://files.pythonhosted.org/packages/81/79/8bf142575a03e0af9c3d5f8bcae911ee6683ae93a625d349d4ecf4c8f7df/watchfiles-1.0.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:1a7bac2bde1d661fb31f4d4e8e539e178774b76db3c2c17c4bb3e960a5de07a2", size = 630663 },
{ url = "https://files.pythonhosted.org/packages/1e/d4/93da24db39257e440240d338b617c5153ad11d361c34108f5c0e1e0743eb/watchfiles-1.0.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:65ab1fb635476f6170b07e8e21db0424de94877e4b76b7feabfe11f9a5fc12b5", size = 612812 }, { url = "https://files.pythonhosted.org/packages/f1/80/abe2e79f610e45c63a70d271caea90c49bbf93eb00fa947fa9b803a1d51f/watchfiles-1.0.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ab626da2fc1ac277bbf752446470b367f84b50295264d2d313e28dc4405d663", size = 625410 },
{ url = "https://files.pythonhosted.org/packages/c6/67/9fd3661c2dc0309abd6021876653d91e8b64fb279529e2cadaa3520ef3e3/watchfiles-1.0.3-cp312-cp312-win32.whl", hash = "sha256:49bc1bc26abf4f32e132652f4b3bfeec77d8f8f62f57652703ef127e85a3e38d", size = 271642 }, { url = "https://files.pythonhosted.org/packages/91/6f/bc7fbecb84a41a9069c2c6eb6319f7f7df113adf113e358c57fc1aff7ff5/watchfiles-1.0.5-cp312-cp312-win32.whl", hash = "sha256:9f4571a783914feda92018ef3901dab8caf5b029325b5fe4558c074582815249", size = 277965 },
{ url = "https://files.pythonhosted.org/packages/ae/aa/8c887edb78cd67f5d4d6a35c3aeb46d748643ebf962163130fb1871e2ee0/watchfiles-1.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:48681c86f2cb08348631fed788a116c89c787fdf1e6381c5febafd782f6c3b44", size = 285505 }, { url = "https://files.pythonhosted.org/packages/99/a5/bf1c297ea6649ec59e935ab311f63d8af5faa8f0b86993e3282b984263e3/watchfiles-1.0.5-cp312-cp312-win_amd64.whl", hash = "sha256:360a398c3a19672cf93527f7e8d8b60d8275119c5d900f2e184d32483117a705", size = 291693 },
{ url = "https://files.pythonhosted.org/packages/7b/31/d212fa6390f0e73a91913ada0b925b294a78d67794795371208baf73f0b5/watchfiles-1.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:9e080cf917b35b20c889225a13f290f2716748362f6071b859b60b8847a6aa43", size = 277263 }, { url = "https://files.pythonhosted.org/packages/7f/7b/fd01087cc21db5c47e5beae507b87965db341cce8a86f9eb12bf5219d4e0/watchfiles-1.0.5-cp312-cp312-win_arm64.whl", hash = "sha256:1a2902ede862969077b97523987c38db28abbe09fb19866e711485d9fbf0d417", size = 283287 },
] ]
[[package]] [[package]]
name = "websockets" name = "websockets"
version = "14.1" version = "15.0.1"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f4/1b/380b883ce05bb5f45a905b61790319a28958a9ab1e4b6b95ff5464b60ca1/websockets-14.1.tar.gz", hash = "sha256:398b10c77d471c0aab20a845e7a60076b6390bfdaac7a6d2edb0d2c59d75e8d8", size = 162840 } sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/55/64/55698544ce29e877c9188f1aee9093712411a8fc9732cca14985e49a8e9c/websockets-14.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ed907449fe5e021933e46a3e65d651f641975a768d0649fee59f10c2985529ed", size = 161957 }, { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437 },
{ url = "https://files.pythonhosted.org/packages/a2/b1/b088f67c2b365f2c86c7b48edb8848ac27e508caf910a9d9d831b2f343cb/websockets-14.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:87e31011b5c14a33b29f17eb48932e63e1dcd3fa31d72209848652310d3d1f0d", size = 159620 }, { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096 },
{ url = "https://files.pythonhosted.org/packages/c1/89/2a09db1bbb40ba967a1b8225b07b7df89fea44f06de9365f17f684d0f7e6/websockets-14.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:bc6ccf7d54c02ae47a48ddf9414c54d48af9c01076a2e1023e3b486b6e72c707", size = 159852 }, { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332 },
{ url = "https://files.pythonhosted.org/packages/ca/c1/f983138cd56e7d3079f1966e81f77ce6643f230cd309f73aa156bb181749/websockets-14.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9777564c0a72a1d457f0848977a1cbe15cfa75fa2f67ce267441e465717dcf1a", size = 169675 }, { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152 },
{ url = "https://files.pythonhosted.org/packages/c1/c8/84191455d8660e2a0bdb33878d4ee5dfa4a2cedbcdc88bbd097303b65bfa/websockets-14.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a655bde548ca98f55b43711b0ceefd2a88a71af6350b0c168aa77562104f3f45", size = 168619 }, { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096 },
{ url = "https://files.pythonhosted.org/packages/8d/a7/62e551fdcd7d44ea74a006dc193aba370505278ad76efd938664531ce9d6/websockets-14.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a3dfff83ca578cada2d19e665e9c8368e1598d4e787422a460ec70e531dbdd58", size = 169042 }, { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523 },
{ url = "https://files.pythonhosted.org/packages/ad/ed/1532786f55922c1e9c4d329608e36a15fdab186def3ca9eb10d7465bc1cc/websockets-14.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6a6c9bcf7cdc0fd41cc7b7944447982e8acfd9f0d560ea6d6845428ed0562058", size = 169345 }, { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790 },
{ url = "https://files.pythonhosted.org/packages/ea/fb/160f66960d495df3de63d9bcff78e1b42545b2a123cc611950ffe6468016/websockets-14.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4b6caec8576e760f2c7dd878ba817653144d5f369200b6ddf9771d64385b84d4", size = 168725 }, { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165 },
{ url = "https://files.pythonhosted.org/packages/cf/53/1bf0c06618b5ac35f1d7906444b9958f8485682ab0ea40dee7b17a32da1e/websockets-14.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:eb6d38971c800ff02e4a6afd791bbe3b923a9a57ca9aeab7314c21c84bf9ff05", size = 168712 }, { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160 },
{ url = "https://files.pythonhosted.org/packages/e5/22/5ec2f39fff75f44aa626f86fa7f20594524a447d9c3be94d8482cd5572ef/websockets-14.1-cp312-cp312-win32.whl", hash = "sha256:1d045cbe1358d76b24d5e20e7b1878efe578d9897a25c24e6006eef788c0fdf0", size = 162838 }, { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395 },
{ url = "https://files.pythonhosted.org/packages/74/27/28f07df09f2983178db7bf6c9cccc847205d2b92ced986cd79565d68af4f/websockets-14.1-cp312-cp312-win_amd64.whl", hash = "sha256:90f4c7a069c733d95c308380aae314f2cb45bd8a904fb03eb36d1a4983a4993f", size = 163277 }, { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841 },
{ url = "https://files.pythonhosted.org/packages/b0/0b/c7e5d11020242984d9d37990310520ed663b942333b83a033c2f20191113/websockets-14.1-py3-none-any.whl", hash = "sha256:4d4fc827a20abe6d544a119896f6b78ee13fe81cbfef416f3f2ddf09a03f0e2e", size = 156277 }, { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743 },
] ]
[[package]] [[package]]

View File

@ -13,7 +13,7 @@ environment:
steps: steps:
- name: lint - name: lint
pull: if-not-exists pull: if-not-exists
image: python:3.12 image: ghcr.io/astral-sh/uv:python3.12
environment: environment:
PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit
volumes: volumes:
@ -21,8 +21,8 @@ steps:
path: ${PRE_COMMIT_HOME} path: ${PRE_COMMIT_HOME}
commands: commands:
- export PRE_COMMIT_HOME=$CI_PROJECT_DIR/.cache/pre-commit - export PRE_COMMIT_HOME=$CI_PROJECT_DIR/.cache/pre-commit
- pip install -q pre-commit - uv pip install -q pre-commit pre-commit-uv
- pre-commit run --show-diff-on-failure --color=always --all-files - uv run pre-commit run --show-diff-on-failure --color=always --all-files
- name: test - name: test
pull: if-not-exists pull: if-not-exists
@ -37,10 +37,10 @@ steps:
- docker-compose -f docker-compose.local.yml up -d - docker-compose -f docker-compose.local.yml up -d
- docker-compose -f docker-compose.local.yml run django pytest - docker-compose -f docker-compose.local.yml run django pytest
{%- else %} {%- else %}
image: python:3.12 image: ghcr.io/astral-sh/uv:python3.12
commands: commands:
- pip install -r requirements/local.txt - uv sync --frozen
- pytest - uv run pytest
{%- endif%} {%- endif%}
volumes: volumes:

View File

@ -107,26 +107,25 @@ jobs:
run: docker compose -f docker-compose.local.yml down run: docker compose -f docker-compose.local.yml down
{%- else %} {%- else %}
- name: Install uv
uses: astral-sh/setup-uv@v5
with:
enable-cache: "true"
- name: Set up Python - name: Set up Python
uses: actions/setup-python@v5 uses: actions/setup-python@v5
with: with:
python-version-file: '.python-version' python-version-file: ".python-version"
cache: pip
cache-dependency-path: |
requirements/base.txt
requirements/local.txt
- name: Install Dependencies - name: Install dependencies
run: | run: uv sync
python -m pip install --upgrade pip
pip install -r requirements/local.txt
- name: Check DB Migrations - name: Check DB Migrations
run: python manage.py makemigrations --check run: uv run python manage.py makemigrations --check
- name: Run DB Migrations - name: Run DB Migrations
run: python manage.py migrate run: uv run python manage.py migrate
- name: Test with pytest - name: Test with pytest
run: pytest run: uv run pytest
{%- endif %} {%- endif %}

View File

@ -13,16 +13,16 @@ variables:
precommit: precommit:
stage: lint stage: lint
image: python:3.12 image: ghcr.io/astral-sh/uv:python3.12
variables: variables:
PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit
cache: cache:
paths: paths:
- ${PRE_COMMIT_HOME} - ${PRE_COMMIT_HOME}
before_script: before_script:
- pip install -q pre-commit - uv pip install -q pre-commit pre-commit-uv
script: script:
- pre-commit run --show-diff-on-failure --color=always --all-files - uv run pre-commit run --show-diff-on-failure --color=always --all-files
pytest: pytest:
stage: test stage: test
@ -39,13 +39,13 @@ pytest:
script: script:
- docker compose -f docker-compose.local.yml run django pytest - docker compose -f docker-compose.local.yml run django pytest
{%- else %} {%- else %}
image: python:3.12 image: ghcr.io/astral-sh/uv:python3.12
services: services:
- postgres:{{ cookiecutter.postgresql_version }} - postgres:{{ cookiecutter.postgresql_version }}
variables: variables:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
before_script: before_script:
- pip install -r requirements/local.txt - uv sync --frozen
script: script:
- pytest - uv run pytest
{%- endif %} {%- endif %}

View File

@ -40,7 +40,8 @@ jobs:
python: python:
- "3.12" - "3.12"
install: install:
- pip install -r requirements/local.txt - pip install uv
- uv sync
script: script:
- pytest - uv run pytest
{%- endif %} {%- endif %}

View File

@ -22,7 +22,7 @@ Moved to [settings](https://cookiecutter-django.readthedocs.io/en/latest/1-getti
- To create a **superuser account**, use this command: - To create a **superuser account**, use this command:
$ python manage.py createsuperuser uv run python manage.py createsuperuser
For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users. For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
@ -30,19 +30,19 @@ For convenience, you can keep your normal user logged in on Chrome and your supe
Running type checks with mypy: Running type checks with mypy:
$ mypy {{cookiecutter.project_slug}} uv run mypy {{cookiecutter.project_slug}}
### Test coverage ### Test coverage
To run the tests, check your test coverage, and generate an HTML coverage report: To run the tests, check your test coverage, and generate an HTML coverage report:
$ coverage run -m pytest uv run coverage run -m pytest
$ coverage html uv run coverage html
$ open htmlcov/index.html uv run open htmlcov/index.html
#### Running tests with pytest #### Running tests with pytest
$ pytest uv run pytest
### Live reloading and Sass CSS compilation ### Live reloading and Sass CSS compilation
@ -58,7 +58,7 @@ To run a celery worker:
```bash ```bash
cd {{cookiecutter.project_slug}} cd {{cookiecutter.project_slug}}
celery -A config.celery_app worker -l info uv run celery -A config.celery_app worker -l info
``` ```
Please note: For Celery's import magic to work, it is important _where_ the celery commands are run. If you are in the same folder with _manage.py_, you should be right. Please note: For Celery's import magic to work, it is important _where_ the celery commands are run. If you are in the same folder with _manage.py_, you should be right.
@ -67,14 +67,14 @@ To run [periodic tasks](https://docs.celeryq.dev/en/stable/userguide/periodic-ta
```bash ```bash
cd {{cookiecutter.project_slug}} cd {{cookiecutter.project_slug}}
celery -A config.celery_app beat uv run celery -A config.celery_app beat
``` ```
or you can embed the beat service inside a worker with the `-B` option (not recommended for production use): or you can embed the beat service inside a worker with the `-B` option (not recommended for production use):
```bash ```bash
cd {{cookiecutter.project_slug}} cd {{cookiecutter.project_slug}}
celery -A config.celery_app worker -B -l info uv run celery -A config.celery_app worker -B -l info
``` ```
{%- endif %} {%- endif %}
@@ -100,7 +100,7 @@ In development, it is often nice to be able to see emails that are being sent fr
3. Make it executable: 3. Make it executable:
$ chmod +x mailpit chmod +x mailpit
4. Spin up another terminal window and start it there: 4. Spin up another terminal window and start it there:

View File

@@ -1,39 +1,38 @@
# define an alias for the specific python version used in this file. # define an alias for the specific python version used in this file.
FROM docker.io/python:3.12.11-slim-bookworm AS python FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS python
# Python build stage # Python build stage
FROM python AS python-build-stage FROM python AS python-build-stage
ARG BUILD_ENVIRONMENT=local ARG APP_HOME=/app
WORKDIR ${APP_HOME}
# we need to move the virtualenv outside of the $APP_HOME directory because it will be overridden by the docker compose mount
ENV UV_COMPILE_BYTECODE=1 UV_LINK_MODE=copy UV_PYTHON_DOWNLOADS=0
# Install apt packages # Install apt packages
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages # dependencies for building Python packages
build-essential \ build-essential \
# psycopg dependencies # psycopg dependencies
libpq-dev libpq-dev \
gettext \
wait-for-it
# Requirements are installed here to ensure they will be cached. # Requirements are installed here to ensure they will be cached.
COPY ./requirements . RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
--mount=type=bind,source=uv.lock,target=uv.lock,rw \
uv sync --no-install-project
# Create Python Dependency and Sub-Dependency Wheels. COPY . ${APP_HOME}
RUN pip wheel --wheel-dir /usr/src/app/wheels \
-r ${BUILD_ENVIRONMENT}.txt
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
--mount=type=bind,source=uv.lock,target=uv.lock,rw \
uv sync
# Python 'run' stage
FROM python AS python-run-stage
ARG BUILD_ENVIRONMENT=local
ARG APP_HOME=/app
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1
ENV BUILD_ENV=${BUILD_ENVIRONMENT}
WORKDIR ${APP_HOME}
{% if cookiecutter.use_docker == "y" %}
# devcontainer dependencies and utils # devcontainer dependencies and utils
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
sudo git bash-completion nano ssh sudo git bash-completion nano ssh
@@ -43,26 +42,9 @@ RUN groupadd --gid 1000 dev-user \
&& useradd --uid 1000 --gid dev-user --shell /bin/bash --create-home dev-user \ && useradd --uid 1000 --gid dev-user --shell /bin/bash --create-home dev-user \
&& echo dev-user ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/dev-user \ && echo dev-user ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/dev-user \
&& chmod 0440 /etc/sudoers.d/dev-user && chmod 0440 /etc/sudoers.d/dev-user
{% endif %}
# Install required system dependencies ENV PATH="${APP_HOME}/.venv/bin:$PATH"
RUN apt-get update && apt-get install --no-install-recommends -y \ ENV PYTHONPATH="${APP_HOME}/.venv/lib/python3.12/site-packages:$PYTHONPATH"
# psycopg dependencies
libpq-dev \
wait-for-it \
# Translations dependencies
gettext \
# cleaning up unused files
&& apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
&& rm -rf /var/lib/apt/lists/*
# All absolute dir copies ignore workdir instruction. All relative dir copies are wrt to the workdir instruction
# copy python dependency wheels from python-build-stage
COPY --from=python-build-stage /usr/src/app/wheels /wheels/
# use wheels to install python dependencies
RUN pip install --no-cache-dir --no-index --find-links=/wheels/ /wheels/* \
&& rm -rf /wheels/
COPY ./compose/production/django/entrypoint /entrypoint COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint RUN sed -i 's/\r$//g' /entrypoint
@@ -86,7 +68,4 @@ RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower RUN chmod +x /start-flower
{% endif %} {% endif %}
# copy application code to WORKDIR
COPY . ${APP_HOME}
ENTRYPOINT ["/entrypoint"] ENTRYPOINT ["/entrypoint"]
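The two `uv sync` invocations follow uv's layer-caching pattern for Docker: the first, with `--no-install-project`, installs only third-party dependencies from the bind-mounted `pyproject.toml` and `uv.lock`, so that layer is reused until the lockfile changes; the second, after `COPY . ${APP_HOME}`, installs the project itself. In practice, a rebuild after touching only application code should skip dependency installation entirely:

```bash
docker compose -f docker-compose.local.yml build   # dependency layer is a cache hit unless uv.lock changed
```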

View File

@@ -1,11 +1,13 @@
# define an alias for the specific python version used in this file. # define an alias for the specific python version used in this file.
FROM docker.io/python:3.12.11-slim-bookworm AS python FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS python
# Python build stage # Python build stage
FROM python AS python-build-stage FROM python AS python-build-stage
ENV PYTHONDONTWRITEBYTECODE=1 ARG APP_HOME=/app
WORKDIR ${APP_HOME}
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages # dependencies for building Python packages
@@ -17,12 +19,15 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
&& rm -rf /var/lib/apt/lists/* && rm -rf /var/lib/apt/lists/*
# Requirements are installed here to ensure they will be cached. # Requirements are installed here to ensure they will be cached.
COPY ./requirements /requirements RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --no-install-project
# create python dependency wheels COPY . ${APP_HOME}
RUN pip wheel --no-cache-dir --wheel-dir /usr/src/app/wheels \
-r /requirements/local.txt -r /requirements/production.txt \ RUN --mount=type=cache,target=/root/.cache/uv \
&& rm -rf /requirements uv sync
# Python 'run' stage # Python 'run' stage
@@ -49,14 +54,12 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
&& rm -rf /var/lib/apt/lists/* && rm -rf /var/lib/apt/lists/*
# copy python dependency wheels from python-build-stage # copy python dependency wheels from python-build-stage
COPY --from=python-build-stage /usr/src/app/wheels /wheels COPY --from=python-build-stage --chown=app:app /app /app
# use wheels to install python dependencies
RUN pip install --no-cache /wheels/* \
&& rm -rf /wheels
COPY ./compose/local/docs/start /start-docs COPY ./compose/local/docs/start /start-docs
RUN sed -i 's/\r$//g' /start-docs RUN sed -i 's/\r$//g' /start-docs
RUN chmod +x /start-docs RUN chmod +x /start-docs
ENV PATH="/app/.venv/bin:$PATH"
WORKDIR /docs WORKDIR /docs

View File

@@ -22,15 +22,15 @@ ENV DJANGO_AZURE_ACCOUNT_NAME=${DJANGO_AZURE_ACCOUNT_NAME}
{%- endif %} {%- endif %}
{%- endif %} {%- endif %}
RUN npm run build RUN npm run build
{%- endif %} {%- endif %}
# define an alias for the specific python version used in this file. # define an alias for the specific python version used in this file.
FROM docker.io/python:3.12.11-slim-bookworm AS python FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS python-build-stage
# Python build stage ENV UV_COMPILE_BYTECODE=1 UV_LINK_MODE=copy UV_PYTHON_DOWNLOADS=0
FROM python AS python-build-stage
ARG BUILD_ENVIRONMENT=production ARG APP_HOME=/app
WORKDIR ${APP_HOME}
# Install apt packages # Install apt packages
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
@@ -41,23 +41,26 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
# Requirements are installed here to ensure they will be cached. # Requirements are installed here to ensure they will be cached.
COPY ./requirements . RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
# Create Python Dependency and Sub-Dependency Wheels. --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
RUN pip wheel --wheel-dir /usr/src/app/wheels \ uv sync --frozen --no-install-project --no-dev
-r ${BUILD_ENVIRONMENT}.txt {%- if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
COPY --from=client-builder ${APP_HOME} ${APP_HOME}
{% else %}
COPY . ${APP_HOME}
{%- endif %}
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --frozen --no-dev
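`--no-dev` additionally excludes the `dev` group from `[dependency-groups]`, so development tooling never reaches the production image. A sketch of the distinction, runnable wherever the project is checked out:

```bash
uv sync --frozen --no-dev   # runtime dependencies only, exactly as locked
uv sync --frozen            # also installs the dev group (uv's default)
```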
# Python 'run' stage # Python 'run' stage
FROM python AS python-run-stage FROM python:3.12-slim-bookworm AS python-run-stage
ARG BUILD_ENVIRONMENT=production
ARG APP_HOME=/app ARG APP_HOME=/app
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1
ENV BUILD_ENV=${BUILD_ENVIRONMENT}
WORKDIR ${APP_HOME} WORKDIR ${APP_HOME}
RUN addgroup --system django \ RUN addgroup --system django \
@@ -76,14 +79,6 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
&& apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \ && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
&& rm -rf /var/lib/apt/lists/* && rm -rf /var/lib/apt/lists/*
# All absolute dir copies ignore workdir instruction. All relative dir copies are wrt to the workdir instruction
# copy python dependency wheels from python-build-stage
COPY --from=python-build-stage /usr/src/app/wheels /wheels/
# use wheels to install python dependencies
RUN pip install --no-cache-dir --no-index --find-links=/wheels/ /wheels/* \
&& rm -rf /wheels/
COPY --chown=django:django ./compose/production/django/entrypoint /entrypoint COPY --chown=django:django ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint RUN sed -i 's/\r$//g' /entrypoint
@@ -111,13 +106,8 @@ RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower RUN chmod +x /start-flower
{%- endif %} {%- endif %}
# Copy the application from the builder
# copy application code to WORKDIR COPY --from=python-build-stage --chown=django:django ${APP_HOME} ${APP_HOME}
{%- if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
COPY --from=client-builder --chown=django:django ${APP_HOME} ${APP_HOME}
{% else %}
COPY --chown=django:django . ${APP_HOME}
{%- endif %}
{%- if cookiecutter.cloud_provider == 'None' %} {%- if cookiecutter.cloud_provider == 'None' %}
# explicitly create the media folder before changing ownership below # explicitly create the media folder before changing ownership below
@@ -125,7 +115,10 @@ RUN mkdir -p ${APP_HOME}/{{ cookiecutter.project_slug }}/media
{%- endif %} {%- endif %}
# make django owner of the WORKDIR directory as well. # make django owner of the WORKDIR directory as well.
RUN chown -R django:django ${APP_HOME} RUN chown django:django ${APP_HOME}
# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"
USER django USER django

View File

@@ -28,7 +28,7 @@ if compress_enabled; then
fi fi
{%- endif %} {%- endif %}
{%- if cookiecutter.use_async == 'y' %} {%- if cookiecutter.use_async == 'y' %}
exec /usr/local/bin/gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn_worker.UvicornWorker exec gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn_worker.UvicornWorker
{%- else %} {%- else %}
exec /usr/local/bin/gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app exec gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app
{%- endif %} {%- endif %}

View File

@@ -8,6 +8,7 @@ services:
env_file: env_file:
- ./.envs/.local/.django - ./.envs/.local/.django
volumes: volumes:
- /app/.venv
- ./docs:/docs:z - ./docs:/docs:z
- ./config:/app/config:z - ./config:/app/config:z
- ./{{ cookiecutter.project_slug }}:/app/{{ cookiecutter.project_slug }}:z - ./{{ cookiecutter.project_slug }}:/app/{{ cookiecutter.project_slug }}:z

View File

@@ -19,6 +19,7 @@ services:
- mailpit - mailpit
{%- endif %} {%- endif %}
volumes: volumes:
- /app/.venv
- .:/app:z - .:/app:z
env_file: env_file:
- ./.envs/.local/.django - ./.envs/.local/.django
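The bare `- /app/.venv` entry declares an anonymous volume. Volumes take precedence over the enclosing bind mount, so the virtualenv baked into the image survives even though `.:/app` shadows everything else under `/app`. One way to verify, assuming the stack has been built (a sketch):

```bash
docker compose -f docker-compose.local.yml run --rm django ls /app/.venv/bin
```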

View File

@@ -9,7 +9,7 @@ Documentation can be written as rst files in `{{cookiecutter.project_slug}}/docs
{% if cookiecutter.use_docker == 'n' %} {% if cookiecutter.use_docker == 'n' %}
To build and serve docs, use the command:: To build and serve docs, use the command::
make livehtml uv run make livehtml
from inside the `{{cookiecutter.project_slug}}/docs` directory. from inside the `{{cookiecutter.project_slug}}/docs` directory.
{% else %} {% else %}
@@ -35,7 +35,7 @@ For an in-use example, see the `page source <_sources/users.rst.txt>`_ for :ref:
To compile all docstrings automatically into documentation source files, use the command: To compile all docstrings automatically into documentation source files, use the command:
:: ::
make apidocs uv run make apidocs
{% if cookiecutter.use_docker == 'y' %} {% if cookiecutter.use_docker == 'y' %}
This can be done in the docker container: This can be done in the docker container:

View File

@@ -26,9 +26,9 @@ warn_redundant_casts = true
warn_unused_configs = true warn_unused_configs = true
plugins = [ plugins = [
"mypy_django_plugin.main", "mypy_django_plugin.main",
{%- if cookiecutter.use_drf == "y" %} {%- if cookiecutter.use_drf == "y" %}
"mypy_drf_plugin.main", "mypy_drf_plugin.main",
{%- endif %} {%- endif %}
] ]
[[tool.mypy.overrides]] [[tool.mypy.overrides]]
@@ -68,69 +68,69 @@ extend-exclude = [
[tool.ruff.lint] [tool.ruff.lint]
select = [ select = [
"F", "F",
"E", "E",
"W", "W",
"C90", "C90",
"I", "I",
"N", "N",
"UP", "UP",
"YTT", "YTT",
# "ANN", # flake8-annotations: we should support this in the future but 100+ errors atm # "ANN", # flake8-annotations: we should support this in the future but 100+ errors atm
"ASYNC", "ASYNC",
"S", "S",
"BLE", "BLE",
"FBT", "FBT",
"B", "B",
"A", "A",
"COM", "COM",
"C4", "C4",
"DTZ", "DTZ",
"T10", "T10",
"DJ", "DJ",
"EM", "EM",
"EXE", "EXE",
"FA", "FA",
'ISC', 'ISC',
"ICN", "ICN",
"G", "G",
'INP', 'INP',
'PIE', 'PIE',
"T20", "T20",
'PYI', 'PYI',
'PT', 'PT',
"Q", "Q",
"RSE", "RSE",
"RET", "RET",
"SLF", "SLF",
"SLOT", "SLOT",
"SIM", "SIM",
"TID", "TID",
"TC", "TC",
"INT", "INT",
# "ARG", # Unused function argument # "ARG", # Unused function argument
"PTH", "PTH",
"ERA", "ERA",
"PD", "PD",
"PGH", "PGH",
"PL", "PL",
"TRY", "TRY",
"FLY", "FLY",
# "NPY", # "NPY",
# "AIR", # "AIR",
"PERF", "PERF",
# "FURB", # "FURB",
# "LOG", # "LOG",
"RUF", "RUF",
] ]
ignore = [ ignore = [
"S101", # Use of assert detected https://docs.astral.sh/ruff/rules/assert/ "S101", # Use of assert detected https://docs.astral.sh/ruff/rules/assert/
"RUF012", # Mutable class attributes should be annotated with `typing.ClassVar` "RUF012", # Mutable class attributes should be annotated with `typing.ClassVar`
"SIM102", # sometimes it's better to nest "SIM102", # sometimes it's better to nest
"UP038", # Checks for uses of isinstance/issubclass that take a tuple "UP038", # Checks for uses of isinstance/issubclass that take a tuple
# of types for comparison. # of types for comparison.
# Deactivated because it can make the code slow: # Deactivated because it can make the code slow:
# https://github.com/astral-sh/ruff/issues/7871 # https://github.com/astral-sh/ruff/issues/7871
] ]
# The fixes in extend-unsafe-fixes will require # The fixes in extend-unsafe-fixes will require
# providing the `--unsafe-fixes` flag when fixing. # providing the `--unsafe-fixes` flag when fixing.
@@ -140,3 +140,19 @@ extend-unsafe-fixes = [
[tool.ruff.lint.isort] [tool.ruff.lint.isort]
force-single-line = true force-single-line = true
[dependency-groups]
dev = []
[project]
name = "{{ cookiecutter.project_slug }}"
version = "{{ cookiecutter.version }}"
description = "{{ cookiecutter.description }}"
readme = "README.md"
license = { text = "{{ cookiecutter.open_source_license }}" }
authors = [
{ name = "{{ cookiecutter.author_name }}", email = "{{ cookiecutter.email }}" },
]
requires-python = "==3.12.*"
dependencies = []
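With this `[project]` table in place, dependencies live in `pyproject.toml` and `uv.lock` rather than in `requirements/*.txt`. Adding or removing a package goes through uv, which updates both files; for example (package names purely illustrative):

```bash
uv add django-extensions      # appends to [project].dependencies and re-locks
uv remove django-extensions   # the reverse
uv add --dev ruff             # lands in the dev dependency group instead
```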

View File

@@ -1,3 +0,0 @@
# This file is expected by Heroku.
-r requirements/production.txt

View File

@@ -1,5 +1,3 @@
-r production.txt
Werkzeug[watchdog]==3.1.3 # https://github.com/pallets/werkzeug Werkzeug[watchdog]==3.1.3 # https://github.com/pallets/werkzeug
ipdb==0.13.13 # https://github.com/gotcha/ipdb ipdb==0.13.13 # https://github.com/gotcha/ipdb
{%- if cookiecutter.use_docker == 'y' %} {%- if cookiecutter.use_docker == 'y' %}

View File

@@ -33,7 +33,7 @@ if [ -z "$VIRTUAL_ENV" ]; then
echo >&2 -e "\n" echo >&2 -e "\n"
exit 1; exit 1;
else else
pip install -r $PROJECT_DIR/requirements/local.txt uv sync --frozen
{%- if cookiecutter.use_heroku == "y" -%} {%- if cookiecutter.use_heroku == "y" -%}
pip install -r $PROJECT_DIR/requirements.txt pip install -r $PROJECT_DIR/requirements.txt
{%- endif %} {%- endif %}

View File

@@ -0,0 +1,2 @@
version = 1
requires-python = "==3.12.*"
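The lockfile is generated, never hand-edited; it is refreshed explicitly when needed. A sketch of the common operations:

```bash
uv lock                             # (re)generate uv.lock from pyproject.toml
uv lock --upgrade                   # upgrade everything within the declared constraints
uv lock --upgrade-package django    # upgrade a single package (name illustrative)
```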