Merge branch 'refs/heads/master' into all-oss-licenses

This commit is contained in:
Bruno Alla 2024-05-13 19:23:16 +01:00
commit d6dea41482
214 changed files with 7669 additions and 2104 deletions

.flake8
@@ -0,0 +1,4 @@
[flake8]
exclude = docs
max-line-length = 119
extend-ignore = E203
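
With this file in place, a plain `flake8` invocation from the repository root picks the settings up automatically; a minimal check, assuming flake8 is installed:

```bash
# flake8 reads the [flake8] section of .flake8 in the working directory,
# so docs/ is skipped and lines up to 119 characters are accepted.
pip install flake8
flake8 .
```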

@@ -22,8 +22,8 @@ accept and merge pull requests.
{%- endfor %}
</table>

-*Audrey is also the creator of Cookiecutter. Audrey and Daniel are on
-the Cookiecutter core team.*
+_Audrey is also the creator of Cookiecutter. Audrey and Daniel are on
+the Cookiecutter core team._

## Other Contributors
@@ -51,6 +51,6 @@ Listed in alphabetical order.
The following haven't provided code directly, but have provided
guidance and advice.

- Jannis Leidel
- Nate Aune
- Barry Morrison

.github/FUNDING.yml
@@ -2,11 +2,4 @@
github: [pydanny, browniebroke]
patreon: feldroy
-open_collective: # Replace with a single Open Collective username
+open_collective: cookiecutter-django
-ko_fi: # Replace with a single Ko-fi username
-tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
-community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
-liberapay: # Replace with a single Liberapay username
-issuehunt: # Replace with a single IssueHunt username
-otechie: # Replace with a single Otechie username
-custom: ["https://www.patreon.com/browniebroke"]

@@ -12,41 +12,47 @@ labels: bug
<!-- To assist you best, please include commands that you've run, options you've selected and any relevant logs -->

-* Host system configuration:
-  * Version of cookiecutter CLI (get it with `cookiecutter --version`):
-  * OS name and version:
+- Host system configuration:
+  - Version of cookiecutter CLI (get it with `cookiecutter --version`):
+  - OS name and version:

On Linux, run

```bash
lsb_release -a 2> /dev/null || cat /etc/redhat-release 2> /dev/null || cat /etc/*-release 2> /dev/null || cat /etc/issue 2> /dev/null
```

On MacOs, run

```bash
sw_vers
```

On Windows, via CMD, run

```
systeminfo | findstr /B /C:"OS Name" /C:"OS Version"
```

```bash
# Insert here the OS name and version
```

-* Python version, run `python3 -V`:
-* Docker version (if using Docker), run `docker --version`:
-* docker-compose version (if using Docker), run `docker-compose --version`:
-* ...
-* Options selected and/or [replay file](https://cookiecutter.readthedocs.io/en/latest/advanced/replay.html):
-On Linux and MacOS: `cat ${HOME}/.cookiecutter_replay/cookiecutter-django.json`
+- Python version, run `python3 -V`:
+- Docker version (if using Docker), run `docker --version`:
+- docker compose version (if using Docker), run `docker compose --version`:
+- ...
+- Options selected and/or [replay file](https://cookiecutter.readthedocs.io/en/latest/advanced/replay.html):
+On Linux and macOS: `cat ${HOME}/.cookiecutter_replay/cookiecutter-django.json`

(Please, take care to remove sensitive information)

```json
-# Insert here the replay file content
```

<summary>
Logs:
<details>

@@ -5,8 +5,8 @@ about: Ask Core Team members to help you out
Provided your question goes beyond [regular support](https://github.com/cookiecutter/cookiecutter-django/issues/new?template=question.md), and/or the task at hand is of timely/high priority nature use the below information to reach out for contributors directly.

-* Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
-* Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.
-* Bruno Alla, Core Developer ([GitHub](https://github.com/sponsors/browniebroke)).
+- Bruno Alla, Core Developer ([GitHub](https://github.com/sponsors/browniebroke)).
+- Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
+- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.

@@ -1,6 +1,5 @@
<!-- Thank you for helping us out: your efforts mean a great deal to the project and the community as a whole! -->

## Description

<!-- What's it you're proposing? -->

@@ -1,8 +1,11 @@
{%- for change_type, pulls in grouped_pulls.items() %}
{%- if pulls %}
### {{ change_type }}
{%- for pull_request in pulls %}
- {{ pull_request.title }} ([#{{ pull_request.number }}]({{ pull_request.html_url }}))
{%- endfor -%}
{% endif -%}
{% endfor -%}

@@ -53,6 +53,12 @@
"twitter_username": "sfdye",
"is_core": true
},
{
"name": "Jelmer Draaijer",
"github_login": "foarsitter",
"twitter_username": "",
"is_core": true
},
{
"name": "18",
"github_login": "dezoito",
@@ -553,11 +559,6 @@
"github_login": "jvanbrug",
"twitter_username": ""
},
-{
-"name": "Jelmer Draaijer",
-"github_login": "foarsitter",
-"twitter_username": ""
-},
{
"name": "Jerome Caisip",
"github_login": "jeromecaisip",
@@ -1114,7 +1115,7 @@
"twitter_username": "Qoyyuum"
},
{
-"name": "mfosterw",
+"name": "Matthew Foster Walsh",
"github_login": "mfosterw",
"twitter_username": ""
},
@@ -1302,5 +1303,270 @@
"name": "Abe Hanoka",
"github_login": "abe-101",
"twitter_username": "abe__101"
},
{
"name": "Adin Hodovic",
"github_login": "adinhodovic",
"twitter_username": ""
},
{
"name": "Leifur Halldor Asgeirsson",
"github_login": "leifurhauks",
"twitter_username": ""
},
{
"name": "David",
"github_login": "buckldav",
"twitter_username": ""
},
{
"name": "rguptar",
"github_login": "rguptar",
"twitter_username": ""
},
{
"name": "Omer-5",
"github_login": "Omer-5",
"twitter_username": ""
},
{
"name": "TAKAHASHI Shuuji",
"github_login": "shuuji3",
"twitter_username": ""
},
{
"name": "Thomas Booij",
"github_login": "ThomasBooij95",
"twitter_username": ""
},
{
"name": "Pamela Fox",
"github_login": "pamelafox",
"twitter_username": "pamelafox"
},
{
"name": "Robin",
"github_login": "Kaffeetasse",
"twitter_username": ""
},
{
"name": "Patrick Tran",
"github_login": "theptrk",
"twitter_username": ""
},
{
"name": "tildebox",
"github_login": "tildebox",
"twitter_username": ""
},
{
"name": "duffn",
"github_login": "duffn",
"twitter_username": ""
},
{
"name": "Delphine LEMIRE",
"github_login": "DelphineLemire",
"twitter_username": ""
},
{
"name": "Hoai-Thu Vuong",
"github_login": "thuvh",
"twitter_username": ""
},
{
"name": "Arkadiusz Michał Ryś",
"github_login": "arrys",
"twitter_username": ""
},
{
"name": "mpsantos",
"github_login": "mpsantos",
"twitter_username": ""
},
{
"name": "Morten Kaae",
"github_login": "MortenKaae",
"twitter_username": ""
},
{
"name": "Birtibu",
"github_login": "Birtibu",
"twitter_username": ""
},
{
"name": "Matheus Jardim Bernardes",
"github_login": "matheusjardimb",
"twitter_username": ""
},
{
"name": "masavini",
"github_login": "masavini",
"twitter_username": ""
},
{
"name": "Joseph Hanna",
"github_login": "sanchimenea",
"twitter_username": ""
},
{
"name": "tmajerech",
"github_login": "tmajerech",
"twitter_username": ""
},
{
"name": "villancikos",
"github_login": "villancikos",
"twitter_username": ""
},
{
"name": "Imran Rahman",
"github_login": "infraredCoding",
"twitter_username": ""
},
{
"name": "hleroy",
"github_login": "hleroy",
"twitter_username": ""
},
{
"name": "Shayan Karimi",
"github_login": "shywn-mrk",
"twitter_username": "shywn_mrk"
},
{
"name": "Sadra Yahyapour",
"github_login": "lnxpy",
"twitter_username": "lnxpylnxpy"
},
{
"name": "Tharushan",
"github_login": "Tharushan",
"twitter_username": ""
},
{
"name": "Fateme Fouladkar",
"github_login": "FatemeFouladkar",
"twitter_username": ""
},
{
"name": "zhaoruibing",
"github_login": "zhaoruibing",
"twitter_username": ""
},
{
"name": "MinWoo Sung",
"github_login": "SungMinWoo",
"twitter_username": ""
},
{
"name": "itisnotyourenv",
"github_login": "itisnotyourenv",
"twitter_username": ""
},
{
"name": "Vageeshan Mankala",
"github_login": "vagi8",
"twitter_username": ""
},
{
"name": "Jakub Boukal",
"github_login": "SukiCZ",
"twitter_username": ""
},
{
"name": "Christian Jauvin",
"github_login": "cjauvin",
"twitter_username": ""
},
{
"name": "Plurific",
"github_login": "paulschwenn",
"twitter_username": ""
},
{
"name": "GitBib",
"github_login": "GitBib",
"twitter_username": ""
},
{
"name": "Freddy",
"github_login": "Hraesvelg",
"twitter_username": ""
},
{
"name": "aiden",
"github_login": "anyidea",
"twitter_username": ""
},
{
"name": "Michael V. Battista",
"github_login": "mvbattista",
"twitter_username": "mvbattista"
},
{
"name": "Nix Siow",
"github_login": "nixsiow",
"twitter_username": "nixsiow"
},
{
"name": "Jens Kaeske",
"github_login": "jkaeske",
"twitter_username": ""
},
{
"name": "henningbra",
"github_login": "henningbra",
"twitter_username": ""
},
{
"name": "Paul Wulff",
"github_login": "mtmpaulwulff",
"twitter_username": ""
},
{
"name": "Mounir",
"github_login": "mounirmesselmeni",
"twitter_username": ""
},
{
"name": "JAEGYUN JUNG",
"github_login": "TGoddessana",
"twitter_username": ""
},
{
"name": "Simeon Emanuilov",
"github_login": "s-emanuilov",
"twitter_username": "s_emanuilov"
},
{
"name": "Patrick Zhang",
"github_login": "PatDuJour",
"twitter_username": ""
},
{
"name": "GvS",
"github_login": "GvS666",
"twitter_username": ""
},
{
"name": "David Păcioianu",
"github_login": "DavidPacioianu",
"twitter_username": ""
},
{
"name": "farwill",
"github_login": "farwill",
"twitter_username": ""
},
{
"name": "quroom",
"github_login": "quroom",
"twitter_username": ""
},
{
"name": "Marios Frixou",
"github_login": "frixou89",
"twitter_username": ""
}
]

@@ -3,13 +3,29 @@
version: 2
updates:
+  # Update Python deps for the template (not the generated project)
+  - package-ecosystem: "pip"
+    directory: "/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "project infrastructure"
+  # Update Python deps for the documentation
+  - package-ecosystem: "pip"
+    directory: "docs/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "project infrastructure"
  # Update GitHub actions in workflows
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "daily"
    labels:
-      - "update"
+      - "project infrastructure"

  # Update npm packages
  - package-ecosystem: "npm"
@@ -18,3 +34,71 @@ updates:
      interval: "daily"
    labels:
      - "update"
+  # Enable version updates for Docker
+  # We need to specify each Dockerfile in a separate entry because Dependabot doesn't
+  # support wildcards or recursively checking subdirectories. Check this issue for updates:
+  # https://github.com/dependabot/dependabot-core/issues/2178
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/local/django/"
+    schedule:
+      interval: "daily"
+    ignore:
+      - dependency-name: "*"
+        update-types:
+          - "version-update:semver-major"
+          - "version-update:semver-minor"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/local/docs/"
+    schedule:
+      interval: "daily"
+    ignore:
+      - dependency-name: "*"
+        update-types:
+          - "version-update:semver-major"
+          - "version-update:semver-minor"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/local/node/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/production/aws/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/production/django/"
+    schedule:
+      interval: "daily"
+    ignore:
+      - dependency-name: "*"
+        update-types:
+          - "version-update:semver-major"
+          - "version-update:semver-minor"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/production/postgres/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "update"
+  - package-ecosystem: "docker"
+    directory: "{{cookiecutter.project_slug}}/compose/production/traefik/"
+    schedule:
+      interval: "daily"
+    labels:
+      - "update"

@@ -2,6 +2,7 @@ name: CI
on:
  push:
+    branches: ["master", "main"]
  pull_request:

concurrency:
@@ -9,17 +10,6 @@ concurrency:
  cancel-in-progress: true

jobs:
-  lint:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
-        with:
-          python-version: "3.10"
-          cache: pip
-      - name: Run pre-commit
-        uses: pre-commit/action@v3.0.0
  tests:
    strategy:
      fail-fast: false
@@ -29,18 +19,18 @@
        - windows-latest
        - macOS-latest
-    name: "Run tests"
+    name: "pytest ${{ matrix.os }}"
    runs-on: ${{ matrix.os }}
    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
          cache: pip
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
-        run: pytest tests
+        run: pytest -n auto tests

  docker:
    strategy:
@@ -48,21 +38,25 @@
      matrix:
        script:
          - name: Basic
-            args: ""
+            args: "ci_tool=Gitlab"
-          - name: Extended
-            args: "use_celery=y use_drf=y frontend_pipeline=Gulp"
+          - name: Celery & DRF
+            args: "use_celery=y use_drf=y"
+          - name: Gulp
+            args: "frontend_pipeline=Gulp"
+          - name: Webpack
+            args: "frontend_pipeline=Webpack"
-    name: "${{ matrix.script.name }} Docker"
+    name: "Docker ${{ matrix.script.name }}"
    runs-on: ubuntu-latest
    env:
      DOCKER_BUILDKIT: 1
      COMPOSE_DOCKER_CLI_BUILD: 1
    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
          cache: pip
      - name: Install dependencies
        run: pip install -r requirements.txt
@@ -74,12 +68,16 @@
      fail-fast: false
      matrix:
        script:
-          - name: With Celery
+          - name: Celery
            args: "use_celery=y frontend_pipeline='Django Compressor'"
-          - name: With Gulp
-            args: "frontend_pipeline='Gulp'"
+          - name: Gulp
+            args: "frontend_pipeline=Gulp"
+          - name: Webpack
+            args: "frontend_pipeline=Webpack use_heroku=y"
+          - name: Email Username
+            args: "username_type=email ci_tool=Github project_name='Something superduper long - the great amazing project' project_slug=my_awesome_project"
-    name: "${{ matrix.script.name }} Bare metal"
+    name: "Bare metal ${{ matrix.script.name }}"
    runs-on: ubuntu-latest
    services:
      redis:
@@ -99,10 +97,10 @@
      DATABASE_URL: "postgres://postgres:postgres@localhost:5432/postgres"
    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
          cache: pip
          cache-dependency-path: |
            requirements.txt
@@ -110,8 +108,8 @@
            {{cookiecutter.project_slug}}/requirements/local.txt
      - name: Install dependencies
        run: pip install -r requirements.txt
-      - uses: actions/setup-node@v3
+      - uses: actions/setup-node@v4
        with:
-          node-version: "16"
+          node-version: "20"
      - name: Bare Metal ${{ matrix.script.name }}
        run: sh tests/test_bare.sh ${{ matrix.script.args }}
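
Any matrix entry above can be reproduced locally by passing the same `args` to the test script. For example, the Celery bare-metal entry — a sketch assuming the template requirements are installed and Postgres/Redis are reachable as in the workflow's services:

```bash
pip install -r requirements.txt
export DATABASE_URL="postgres://postgres:postgres@localhost:5432/postgres"
sh tests/test_bare.sh use_celery=y frontend_pipeline='Django Compressor'
```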

@@ -16,10 +16,10 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip

@@ -23,18 +23,25 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-      - uses: tiangolo/issue-manager@0.4.0
+      - uses: tiangolo/issue-manager@0.5.0
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          config: >
            {
              "answered": {
+                "delay": 864000,
                "message": "Assuming the question was answered, this will be automatically closed now."
              },
              "solved": {
+                "delay": 864000,
                "message": "Assuming the original issue was solved, it will be automatically closed now."
              },
              "waiting": {
+                "delay": 864000,
                "message": "Automatically closing after waiting for additional info. To re-open, please provide the additional information requested."
+              },
+              "wontfix": {
+                "delay": 864000,
+                "message": "As discussed, we won't be implementing this. Automatically closing."
              }
            }
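
The `delay` values are expressed in seconds; a quick sanity check shows each one amounts to ten days:

```bash
# 864000 s / 86400 s per day = 10 days
python3 -c 'print(864000 / 86400)'  # prints 10.0
```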

@@ -16,15 +16,15 @@ jobs:
    # Disables this workflow from running in a repository that is not part of the indicated organization/user
    if: github.repository_owner == 'cookiecutter'
    permissions:
      contents: write # for peter-evans/create-pull-request to create branch
      pull-requests: write # for peter-evans/create-pull-request to create a PR
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
      - name: Install pre-commit
        run: pip install pre-commit
@@ -37,7 +37,7 @@ jobs:
        run: pre-commit autoupdate
      - name: Create Pull Request
-        uses: peter-evans/create-pull-request@v4
+        uses: peter-evans/create-pull-request@v6
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          branch: update/pre-commit-autoupdate

@@ -8,18 +8,18 @@ on:
  workflow_dispatch:

jobs:
-  release:
+  update:
    # Disables this workflow from running in a repository that is not part of the indicated organization/user
    if: github.repository_owner == 'cookiecutter'
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
      - name: Set up Python
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip

@@ -13,25 +13,27 @@ jobs:
    # Disables this workflow from running in a repository that is not part of the indicated organization/user
    if: github.repository_owner == 'cookiecutter'
    permissions:
      contents: write # for stefanzweifel/git-auto-commit-action to push code in repo
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
      - name: Set up Python
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
        with:
-          python-version: "3.10"
+          python-version: "3.12"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Update list
        run: python scripts/update_contributors.py
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Commit changes
-        uses: stefanzweifel/git-auto-commit-action@v4.15.3
+        uses: stefanzweifel/git-auto-commit-action@v5.0.1
        with:
          commit_message: Update Contributors
          file_pattern: CONTRIBUTORS.md .github/contributors.json
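
Judging from the workflow above, the same refresh can be run by hand, with a token supplied via the environment — a sketch, where the token value is a placeholder:

```bash
# Regenerates CONTRIBUTORS.md and .github/contributors.json locally.
pip install -r requirements.txt
export GITHUB_TOKEN="ghp_..."  # placeholder: a token able to read public repo data
python scripts/update_contributors.py
```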

@@ -1,32 +1,49 @@
-exclude: "{{cookiecutter.project_slug}}"
+exclude: "{{cookiecutter.project_slug}}|.github/contributors.json|CHANGELOG.md|CONTRIBUTORS.md"
default_stages: [commit]
+default_language_version:
+  python: python3.12

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.3.0
+    rev: v4.6.0
    hooks:
      - id: trailing-whitespace
+      - id: end-of-file-fixer
+      - id: check-json
+      - id: check-toml
+      - id: check-xml
      - id: check-yaml
+      - id: debug-statements
+      - id: check-builtin-literals
+      - id: check-case-conflict
+      - id: detect-private-key

+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: "v4.0.0-alpha.8"
+    hooks:
+      - id: prettier
+        args: ["--tab-width", "2"]

  - repo: https://github.com/asottile/pyupgrade
-    rev: v3.2.0
+    rev: v3.15.2
    hooks:
      - id: pyupgrade
-        args: [--py310-plus]
+        args: [--py312-plus]
        exclude: hooks/

  - repo: https://github.com/psf/black
-    rev: 22.10.0
+    rev: 24.4.2
    hooks:
      - id: black

  - repo: https://github.com/PyCQA/isort
-    rev: 5.10.1
+    rev: 5.13.2
    hooks:
      - id: isort

  - repo: https://github.com/PyCQA/flake8
-    rev: 5.0.4
+    rev: 7.0.0
    hooks:
      - id: flake8
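
To exercise this hook suite locally before pushing (assuming a working Python environment):

```bash
pip install pre-commit
pre-commit install          # run the hooks on every commit from now on
pre-commit run --all-files  # or check the whole tree once
```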

@@ -14,8 +14,6 @@ pin: True
label_prs: update
requirements:
-  - "requirements.txt"
-  - "docs/requirements.txt"
  - "{{cookiecutter.project_slug}}/requirements/base.txt"
  - "{{cookiecutter.project_slug}}/requirements/local.txt"
  - "{{cookiecutter.project_slug}}/requirements/production.txt"

@@ -4,12 +4,17 @@
# Required
version: 2

+# Set the version of Python and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.12"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

-# Version of Python and requirements required to build the docs
+# Declare the Python requirements required to build your docs
python:
-  version: "3.8"
  install:
    - requirements: docs/requirements.txt

File diff suppressed because it is too large

@@ -1,3 +1,3 @@
## Code of Conduct

-Everyone who interacts in the Cookiecutter project's codebase, issue trackers, chat rooms, and mailing lists is expected to follow the [PyPA Code of Conduct](https://www.pypa.io/en/latest/code-of-conduct/).
+Everyone who interacts in the Cookiecutter project's codebase, issue trackers, chat rooms, and mailing lists is expected to follow the [PSF Code of Conduct](https://www.python.org/psf/conduct/)

@@ -2,41 +2,81 @@
Always happy to get issues identified and pull requests!

-## Getting your pull request merged in
+## General considerations

-1. Keep it small. The smaller the pull request, the more likely we are to accept.
-2. Pull requests that fix a current issue get priority for review.
+1. Keep it small. The smaller the change, the more likely we are to accept.
+2. Changes that fix a current issue get priority for review.
+3. Check out the [GitHub guide][submit-a-pr] if you've never created a pull request before.
+
+## Getting started
+
+1. Fork the repo
+2. Clone your fork
+3. Create a branch for your changes
+
+This last step is very important, don't start developing from master, it'll cause pain if you need to send another change later.

## Testing

+You'll need to run the tests using Python 3.12. We recommend using [tox](https://tox.readthedocs.io/en/latest/) to run the tests. It will automatically create a fresh virtual environment and install our test dependencies, such as [pytest-cookies](https://pypi.python.org/pypi/pytest-cookies/) and [flake8](https://pypi.python.org/pypi/flake8/).
+
+We'll also run the tests on GitHub actions when you send your pull request, but it's a good idea to run them locally before you send it.

### Installation

-Please install [tox](https://tox.readthedocs.io/en/latest/), which is a generic virtualenv management and test command line tool.
-[tox](https://tox.readthedocs.io/en/latest/) is available for download from [PyPI](https://pypi.python.org/pypi) via [pip](https://pypi.python.org/pypi/pip/):
-$ pip install tox
-It will automatically create a fresh virtual environment and install our test dependencies, such as [pytest-cookies](https://pypi.python.org/pypi/pytest-cookies/) and [flake8](https://pypi.python.org/pypi/flake8/).
+First, make sure that your version of Python is 3.12:
+
+```bash
+$ python --version
+Python 3.12.2
+```
+
+Any version that starts with 3.12 will do. If you need to install it, you can get it from [python.org](https://www.python.org/downloads/).
+
+Then install `tox`, if not already installed:
+
+```bash
+$ python -m pip install tox
+```

-### Run the Tests
-Tox uses pytest under the hood, hence it supports the same syntax for selecting tests.
-For further information please consult the [pytest usage docs](https://pytest.org/latest/usage.html#specifying-tests-selecting-tests).
-To run all tests using various versions of python in virtualenvs defined in tox.ini, just run tox:
-$ tox
-It is possible to test with a specific version of python. To do this, the command is:
-$ tox -e py310
-This will run pytest with the python3.10 interpreter, for example.
-To run a particular test with tox against your current Python version:
-$ tox -e py -- -k test_default_configuration
+### Run the template's test suite
+
+To run the tests of the template using the current Python version:
+
+```bash
+$ tox -e py
+```
+
+This uses `pytest` under the hood, and you can pass options to it after a `--`. So to run a particular test:
+
+```bash
+$ tox -e py -- -k test_default_configuration
+```
+
+For further information, please consult the [pytest usage docs](https://pytest.org/en/latest/how-to/usage.html#specifying-which-tests-to-run).
+
+### Run the generated project tests
+
+The template tests check that the generated project is fully rendered and that it passes `flake8`. We also have some test scripts which generate a specific project combination, install the dependencies, run the tests of the generated project, install FE dependencies and generate the docs. They will install the template dependencies, so make sure you create and activate a virtual environment first.
+
+```bash
+$ python -m venv venv
+$ source venv/bin/activate
+```
+
+These tests are slower and can be run with or without Docker:
+
+- Without Docker: `tests/test_bare.sh` (for bare metal)
+- With Docker: `tests/test_docker.sh`
+
+All arguments to these scripts will be passed to the `cookiecutter` CLI, letting you set options, for example:
+
+```bash
+$ tests/test_bare.sh use_celery=y
+```
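
The Docker variant accepts the same pass-through options; for instance, assuming Docker is installed and running:

```bash
$ tests/test_docker.sh use_celery=y use_drf=y
```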
+## Submitting a pull request
+
+Once you're happy with your changes and they look ok locally, push and send [a pull request][submit-a-pr] to the main repo, which will trigger the tests on GitHub actions. If they fail, try to fix them. A maintainer should take a look at your change and give you feedback or merge it.
+
+[submit-a-pr]: https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request

@@ -74,10 +74,17 @@ accept and merge pull requests.
</td>
<td>sfdye</td>
</tr>
<tr>
<td>Jelmer Draaijer</td>
<td>
<a href="https://github.com/foarsitter">foarsitter</a>
</td>
<td></td>
</tr>
</table>

-*Audrey is also the creator of Cookiecutter. Audrey and Daniel are on
-the Cookiecutter core team.*
+_Audrey is also the creator of Cookiecutter. Audrey and Daniel are on
+the Cookiecutter core team._

## Other Contributors
@@ -166,6 +173,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Adin Hodovic</td>
<td>
<a href="https://github.com/adinhodovic">adinhodovic</a>
</td>
<td></td>
</tr>
<tr>
<td>Agam Dua</td>
<td>
@@ -180,6 +194,13 @@ Listed in alphabetical order.
</td>
<td>scaramagus</td>
</tr>
<tr>
<td>aiden</td>
<td>
<a href="https://github.com/anyidea">anyidea</a>
</td>
<td></td>
</tr>
<tr>
<td>Alberto Sanchez</td>
<td>
@@ -285,6 +306,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Arkadiusz Michał Ryś</td>
<td>
<a href="https://github.com/arrys">arrys</a>
</td>
<td></td>
</tr>
<tr>
<td>Arnav Choudhury</td>
<td>
@@ -355,6 +383,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Birtibu</td>
<td>
<a href="https://github.com/Birtibu">Birtibu</a>
</td>
<td></td>
</tr>
<tr>
<td>Bo Lopker</td>
<td>
@@ -481,6 +516,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Christian Jauvin</td>
<td>
<a href="https://github.com/cjauvin">cjauvin</a>
</td>
<td></td>
</tr>
<tr>
<td>Christopher Clarke</td>
<td>
@@ -600,6 +642,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>David</td>
<td>
<a href="https://github.com/buckldav">buckldav</a>
</td>
<td></td>
</tr>
<tr>
<td>David Díaz</td>
<td>
@@ -607,6 +656,13 @@ Listed in alphabetical order.
</td>
<td>DavidDiazPinto</td>
</tr>
<tr>
<td>David Păcioianu</td>
<td>
<a href="https://github.com/DavidPacioianu">DavidPacioianu</a>
</td>
<td></td>
</tr>
<tr>
<td>Davit Tovmasyan</td>
<td>
@@ -628,6 +684,13 @@ Listed in alphabetical order.
</td>
<td>jangeador</td>
</tr>
<tr>
<td>Delphine LEMIRE</td>
<td>
<a href="https://github.com/DelphineLemire">DelphineLemire</a>
</td>
<td></td>
</tr>
<tr>
<td>Demetris Stavrou</td>
<td>
@@ -691,6 +754,13 @@ Listed in alphabetical order.
</td>
<td>dudanogueira</td>
</tr>
<tr>
<td>duffn</td>
<td>
<a href="https://github.com/duffn">duffn</a>
</td>
<td></td>
</tr>
<tr>
<td>Dónal Adams</td>
<td>
@@ -747,6 +817,20 @@ Listed in alphabetical order.
</td>
<td>fabaff</td>
</tr>
<tr>
<td>farwill</td>
<td>
<a href="https://github.com/farwill">farwill</a>
</td>
<td></td>
</tr>
<tr>
<td>Fateme Fouladkar</td>
<td>
<a href="https://github.com/FatemeFouladkar">FatemeFouladkar</a>
</td>
<td></td>
</tr>
<tr>
<td>Felipe Arruda</td>
<td>
@@ -768,6 +852,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Freddy</td>
<td>
<a href="https://github.com/Hraesvelg">Hraesvelg</a>
</td>
<td></td>
</tr>
<tr>
<td>Fuzzwah</td>
<td>
@@ -810,6 +901,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>GitBib</td>
<td>
<a href="https://github.com/GitBib">GitBib</a>
</td>
<td></td>
</tr>
<tr>
<td>Glenn Wiskur</td>
<td>
@@ -831,6 +929,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>GvS</td>
<td>
<a href="https://github.com/GvS666">GvS666</a>
</td>
<td></td>
</tr>
<tr>
<td>Hamish Durkin</td>
<td>
@@ -880,6 +985,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>henningbra</td>
<td>
<a href="https://github.com/henningbra">henningbra</a>
</td>
<td></td>
</tr>
<tr>
<td>Henrique G. G. Pereira</td>
<td>
@@ -887,6 +999,20 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>hleroy</td>
<td>
<a href="https://github.com/hleroy">hleroy</a>
</td>
<td></td>
</tr>
<tr>
<td>Hoai-Thu Vuong</td>
<td>
<a href="https://github.com/thuvh">thuvh</a>
</td>
<td></td>
</tr>
<tr>
<td>Howie Zhao</td>
<td>
@@ -901,6 +1027,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Imran Rahman</td>
<td>
<a href="https://github.com/infraredCoding">infraredCoding</a>
</td>
<td></td>
</tr>
<tr>
<td>innicoder</td>
<td>
@@ -922,6 +1055,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>itisnotyourenv</td>
<td>
<a href="https://github.com/itisnotyourenv">itisnotyourenv</a>
</td>
<td></td>
</tr>
<tr>
<td>Ivan Khomutov</td>
<td>
@@ -929,6 +1069,20 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>JAEGYUN JUNG</td>
<td>
<a href="https://github.com/TGoddessana">TGoddessana</a>
</td>
<td></td>
</tr>
<tr>
<td>Jakub Boukal</td>
<td>
<a href="https://github.com/SukiCZ">SukiCZ</a>
</td>
<td></td>
</tr>
<tr>
<td>Jakub Musko</td>
<td>
@@ -958,9 +1112,9 @@ Listed in alphabetical order.
<td></td>
</tr>
<tr>
-<td>Jelmer Draaijer</td>
+<td>Jens Kaeske</td>
<td>
-<a href="https://github.com/foarsitter">foarsitter</a>
+<a href="https://github.com/jkaeske">jkaeske</a>
</td>
<td></td>
</tr>
@@ -1020,6 +1174,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Joseph Hanna</td>
<td>
<a href="https://github.com/sanchimenea">sanchimenea</a>
</td>
<td></td>
</tr>
<tr>
<td>jugglinmike</td>
<td>
@@ -1153,6 +1314,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Leifur Halldor Asgeirsson</td>
<td>
<a href="https://github.com/leifurhauks">leifurhauks</a>
</td>
<td></td>
</tr>
<tr>
<td>Leo won</td>
<td>
@@ -1237,6 +1405,13 @@ Listed in alphabetical order.
</td>
<td>marciomazza</td>
</tr>
<tr>
<td>Marios Frixou</td>
<td>
<a href="https://github.com/frixou89">frixou89</a>
</td>
<td></td>
</tr>
<tr>
<td>Martin Blech</td>
<td>
@@ -1251,6 +1426,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>masavini</td>
<td>
<a href="https://github.com/masavini">masavini</a>
</td>
<td></td>
</tr>
<tr>
<td>Mateusz Ostaszewski</td>
<td>
@@ -1258,6 +1440,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Matheus Jardim Bernardes</td>
<td>
<a href="https://github.com/matheusjardimb">matheusjardimb</a>
</td>
<td></td>
</tr>
<tr>
<td>Mathijs Hoogland</td>
<td>
@@ -1300,6 +1489,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Matthew Foster Walsh</td>
<td>
<a href="https://github.com/mfosterw">mfosterw</a>
</td>
<td></td>
</tr>
<tr>
<td>Matthew Sisley</td>
<td>
@@ -1335,13 +1531,6 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
-<tr>
-<td>mfosterw</td>
-<td>
-<a href="https://github.com/mfosterw">mfosterw</a>
-</td>
-<td></td>
-</tr>
<tr>
<td>Michael Gecht</td>
<td>
@@ -1356,6 +1545,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Michael V. Battista</td>
<td>
<a href="https://github.com/mvbattista">mvbattista</a>
</td>
<td>mvbattista</td>
</tr>
<tr>
<td>Mike97M</td>
<td>
@@ -1370,6 +1566,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>MinWoo Sung</td>
<td>
<a href="https://github.com/SungMinWoo">SungMinWoo</a>
</td>
<td></td>
</tr>
<tr>
<td>monosans</td>
<td>
@@ -1377,6 +1580,20 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Morten Kaae</td>
<td>
<a href="https://github.com/MortenKaae">MortenKaae</a>
</td>
<td></td>
</tr>
<tr>
<td>Mounir</td>
<td>
<a href="https://github.com/mounirmesselmeni">mounirmesselmeni</a>
</td>
<td></td>
</tr>
<tr>
<td>mozillazg</td>
<td>
@@ -1391,6 +1608,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>mpsantos</td>
<td>
<a href="https://github.com/mpsantos">mpsantos</a>
</td>
<td></td>
</tr>
<tr>
<td>Naveen</td>
<td>
@@ -1412,6 +1636,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Nix Siow</td>
<td>
<a href="https://github.com/nixsiow">nixsiow</a>
</td>
<td>nixsiow</td>
</tr>
<tr>
<td>Noah H</td>
<td>
@@ -1426,6 +1657,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Omer-5</td>
<td>
<a href="https://github.com/Omer-5">Omer-5</a>
</td>
<td></td>
</tr>
<tr>
<td>Pablo</td>
<td>
@@ -1433,6 +1671,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Pamela Fox</td>
<td>
<a href="https://github.com/pamelafox">pamelafox</a>
</td>
<td>pamelafox</td>
</tr>
<tr>
<td>Parbhat Puri</td>
<td>
@@ -1440,6 +1685,27 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Patrick Tran</td>
<td>
<a href="https://github.com/theptrk">theptrk</a>
</td>
<td></td>
</tr>
<tr>
<td>Patrick Zhang</td>
<td>
<a href="https://github.com/PatDuJour">PatDuJour</a>
</td>
<td></td>
</tr>
<tr>
<td>Paul Wulff</td>
<td>
<a href="https://github.com/mtmpaulwulff">mtmpaulwulff</a>
</td>
<td></td>
</tr>
<tr>
<td>Pawan Chaurasia</td>
<td>
@@ -1489,6 +1755,20 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Plurific</td>
<td>
<a href="https://github.com/paulschwenn">paulschwenn</a>
</td>
<td></td>
</tr>
<tr>
<td>quroom</td>
<td>
<a href="https://github.com/quroom">quroom</a>
</td>
<td></td>
</tr>
<tr>
<td>Raony Guimarães Corrêa</td>
<td>
@@ -1524,6 +1804,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>rguptar</td>
<td>
<a href="https://github.com/rguptar">rguptar</a>
</td>
<td></td>
</tr>
<tr>
<td>Richard Hajdu</td>
<td>
@@ -1531,6 +1818,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Robin</td>
<td>
<a href="https://github.com/Kaffeetasse">Kaffeetasse</a>
</td>
<td></td>
</tr>
<tr>
<td>Roman Afanaskin</td>
<td>
@@ -1559,6 +1853,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Sadra Yahyapour</td>
<td>
<a href="https://github.com/lnxpy">lnxpy</a>
</td>
<td>lnxpylnxpy</td>
</tr>
<tr>
<td>Sam Collins</td>
<td>
@@ -1580,6 +1881,20 @@ Listed in alphabetical order.
</td>
<td>sebastianreyese</td>
</tr>
<tr>
<td>Shayan Karimi</td>
<td>
<a href="https://github.com/shywn-mrk">shywn-mrk</a>
</td>
<td>shywn_mrk</td>
</tr>
<tr>
<td>Simeon Emanuilov</td>
<td>
<a href="https://github.com/s-emanuilov">s-emanuilov</a>
</td>
<td>s_emanuilov</td>
</tr>
<tr>
<td>Simon Rey</td>
<td>
@@ -1636,6 +1951,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>TAKAHASHI Shuuji</td>
<td>
<a href="https://github.com/shuuji3">shuuji3</a>
</td>
<td></td>
</tr>
<tr>
<td>Tames McTigue</td>
<td>
@@ -1657,6 +1979,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>Tharushan</td>
<td>
<a href="https://github.com/Tharushan">Tharushan</a>
</td>
<td></td>
</tr>
<tr>
<td>Thibault J.</td>
<td>
@@ -1664,6 +1993,13 @@ Listed in alphabetical order.
</td>
<td>thibault</td>
</tr>
<tr>
<td>Thomas Booij</td>
<td>
<a href="https://github.com/ThomasBooij95">ThomasBooij95</a>
</td>
<td></td>
</tr>
<tr>
<td>Théo Segonds</td>
<td>
@@ -1671,6 +2007,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>tildebox</td>
<td>
<a href="https://github.com/tildebox">tildebox</a>
</td>
<td></td>
</tr>
<tr>
<td>Tim Claessens</td>
<td>
@@ -1692,6 +2035,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>tmajerech</td>
<td>
<a href="https://github.com/tmajerech">tmajerech</a>
</td>
<td></td>
</tr>
<tr>
<td>Tom Atkins</td>
<td>
@@ -1734,6 +2084,13 @@ Listed in alphabetical order.
</td>
<td>egregors</td>
</tr>
<tr>
<td>Vageeshan Mankala</td>
<td>
<a href="https://github.com/vagi8">vagi8</a>
</td>
<td></td>
</tr>
<tr>
<td>vascop</td>
<td>
@@ -1755,6 +2112,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>villancikos</td>
<td>
<a href="https://github.com/villancikos">villancikos</a>
</td>
<td></td>
</tr>
<tr>
<td>Vitaly Babiy</td>
<td>
@@ -1839,6 +2203,13 @@ Listed in alphabetical order.
</td>
<td></td>
</tr>
<tr>
<td>zhaoruibing</td>
<td>
<a href="https://github.com/zhaoruibing">zhaoruibing</a>
</td>
<td></td>
</tr>
</table>

### Special Thanks

@@ -1846,6 +2217,6 @@ Listed in alphabetical order.
The following haven't provided code directly, but have provided
guidance and advice.

- Jannis Leidel
- Nate Aune
- Barry Morrison

README.md

@@ -1,67 +1,71 @@
# Cookiecutter Django

-[![Build Status](https://img.shields.io/github/workflow/status/cookiecutter/cookiecutter-django/CI/master)](https://github.com/cookiecutter/cookiecutter-django/actions?query=workflow%3ACI)
+[![Build Status](https://img.shields.io/github/actions/workflow/status/cookiecutter/cookiecutter-django/ci.yml?branch=master)](https://github.com/cookiecutter/cookiecutter-django/actions/workflows/ci.yml?query=branch%3Amaster)
[![Documentation Status](https://readthedocs.org/projects/cookiecutter-django/badge/?version=latest)](https://cookiecutter-django.readthedocs.io/en/latest/?badge=latest)
-[![Updates](https://pyup.io/repos/github/cookiecutter/cookiecutter-django/shield.svg)](https://pyup.io/repos/github/cookiecutter/cookiecutter-django/)
+[![pre-commit.ci status](https://results.pre-commit.ci/badge/github/cookiecutter/cookiecutter-django/master.svg)](https://results.pre-commit.ci/latest/github/cookiecutter/cookiecutter-django/master)
-[![Join our Discord](https://img.shields.io/badge/Discord-cookiecutter-5865F2?style=flat&logo=discord&logoColor=white)](https://discord.gg/uFXweDQc5a)
-[![Code Helpers Badge](https://www.codetriage.com/cookiecutter/cookiecutter-django/badges/users.svg)](https://www.codetriage.com/cookiecutter/cookiecutter-django)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
+[![Updates](https://pyup.io/repos/github/cookiecutter/cookiecutter-django/shield.svg)](https://pyup.io/repos/github/cookiecutter/cookiecutter-django/)
+[![Join our Discord](https://img.shields.io/badge/Discord-cookiecutter-5865F2?style=flat&logo=discord&logoColor=white)](https://discord.gg/rAWFUP47d2)
+[![Code Helpers Badge](https://www.codetriage.com/cookiecutter/cookiecutter-django/badges/users.svg)](https://www.codetriage.com/cookiecutter/cookiecutter-django)

Powered by [Cookiecutter](https://github.com/cookiecutter/cookiecutter), Cookiecutter Django is a framework for jumpstarting
production-ready Django projects quickly.

- Documentation: <https://cookiecutter-django.readthedocs.io/en/latest/>
- See [Troubleshooting](https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html) for common errors and obstacles
- If you have problems with Cookiecutter Django, please open [issues](https://github.com/cookiecutter/cookiecutter-django/issues/new) don't send
  emails to the maintainers.

## Features

-- For Django 4.0
-- Works with Python 3.10
+- For Django 4.2
+- Works with Python 3.12
- Renders Django projects with 100% starting test coverage
- Twitter [Bootstrap](https://github.com/twbs/bootstrap) v5
-- [12-Factor](http://12factor.net/) based settings via [django-environ](https://github.com/joke2k/django-environ)
+- [12-Factor](https://12factor.net) based settings via [django-environ](https://github.com/joke2k/django-environ)
- Secure by default. We believe in SSL.
- Optimized development and production settings
- Registration via [django-allauth](https://github.com/pennersr/django-allauth)
- Comes with custom user model ready to go
- Optional basic ASGI setup for Websockets
-- Optional custom static build using Gulp and livereload
+- Optional custom static build using Gulp or Webpack
- Send emails via [Anymail](https://github.com/anymail/django-anymail) (using [Mailgun](http://www.mailgun.com/) by default or Amazon SES if AWS is selected cloud provider, but switchable)
-- Media storage using Amazon S3 or Google Cloud Storage
+- Media storage using Amazon S3, Google Cloud Storage, Azure Storage or nginx
- Docker support using [docker-compose](https://github.com/docker/compose) for development and production (using [Traefik](https://traefik.io/) with [LetsEncrypt](https://letsencrypt.org/) support)
- [Procfile](https://devcenter.heroku.com/articles/procfile) for deploying to Heroku
- Instructions for deploying to [PythonAnywhere](https://www.pythonanywhere.com/)
- Run tests with unittest or pytest
- Customizable PostgreSQL version
- Default integration with [pre-commit](https://github.com/pre-commit/pre-commit) for identifying simple issues before submission to code review

## Optional Integrations

-*These features can be enabled during initial project setup.*
+_These features can be enabled during initial project setup._

-- Serve static files from Amazon S3, Google Cloud Storage or [Whitenoise](https://whitenoise.readthedocs.io/)
+- Serve static files from Amazon S3, Google Cloud Storage, Azure Storage or [Whitenoise](https://whitenoise.readthedocs.io/)
- Configuration for [Celery](https://docs.celeryq.dev) and [Flower](https://github.com/mher/flower) (the latter in Docker setup only)
-- Integration with [MailHog](https://github.com/mailhog/MailHog) for local email testing
+- Integration with [Mailpit](https://github.com/axllent/mailpit/) for local email testing
- Integration with [Sentry](https://sentry.io/welcome/) for error logging

## Constraints

- Only maintained 3rd party libraries are used.
-- Uses PostgreSQL everywhere: 10.19 - 14.1 ([MySQL fork](https://github.com/mabdullahadeel/cookiecutter-django-mysql) also available).
+- Uses PostgreSQL everywhere: 12 - 16 ([MySQL fork](https://github.com/mabdullahadeel/cookiecutter-django-mysql) also available).
- Environment variables for configuration (This won't work with Apache/mod_wsgi).

## Support this Project!

-This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
+This project is an open source project run by volunteers. You can sponsor us via [OpenCollective](https://opencollective.com/cookiecutter-django) or individually via GitHub Sponsors:

- Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
-- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.
+- Fabio C. Barrionuevo, Core Developer ([GitHub](https://github.com/luzfcb)): expertise in Python/Django, hands-on DevOps and frontend experience.
+- Bruno Alla, Core Developer ([GitHub](https://github.com/browniebroke)): expertise in Python/Django and DevOps.
+- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.

Projects that provide financial support to the maintainers:

-------------------------------------------------------------------------
+---

<p align="center">
<a href="https://www.feldroy.com/products//two-scoops-of-django-3-x"><img src="https://cdn.shopify.com/s/files/1/0304/6901/products/Two-Scoops-of-Django-3-Alpha-Cover_540x_26507b15-e489-470b-8a97-02773dd498d1_1080x.jpg"></a>

@@ -116,16 +120,24 @@ Answer the prompts with your own desired [options](http://cookiecutter-django.re
    4 - Apache Software License 2.0
    5 - Not open source
    Choose from 1, 2, 3, 4, 5 [1]: 1
+    Select username_type:
+    1 - username
+    2 - email
+    Choose from 1, 2 [1]: 1
    timezone [UTC]: America/Los_Angeles
    windows [n]: n
-    use_pycharm [n]: y
+    Select an editor to use. The choices are:
+    1 - None
+    2 - PyCharm
+    3 - VS Code
+    Choose from 1, 2, 3 [1]: 1
    use_docker [n]: n
    Select postgresql_version:
-    1 - 14
-    2 - 13
-    3 - 12
-    4 - 11
-    5 - 10
+    1 - 16
+    2 - 15
+    3 - 14
+    4 - 13
+    5 - 12
    Choose from 1, 2, 3, 4, 5 [1]: 1
    Select cloud_provider:
    1 - AWS
@@ -149,9 +161,10 @@ Answer the prompts with your own desired [options](http://cookiecutter-django.re
    1 - None
    2 - Django Compressor
    3 - Gulp
+    4 - Webpack
    Choose from 1, 2, 3, 4 [1]: 1
    use_celery [n]: y
-    use_mailhog [n]: n
+    use_mailpit [n]: n
    use_sentry [n]: y
    use_whitenoise [n]: n
    use_heroku [n]: y
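
The transcript above comes from an interactive run; generating a project looks like this, assuming Python and pip are available:

```bash
# Install Cookiecutter, then render the template; the prompts shown
# above appear one by one.
pip install "cookiecutter>=1.7.0"
cookiecutter https://github.com/cookiecutter/cookiecutter-django
```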
@@ -181,14 +194,16 @@ Now take a look at your repo. Don't forget to carefully look at the generated RE
For local development, see the following:

- [Developing locally](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html)
- [Developing locally using docker](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html)

## Community

- Have questions? **Before you ask questions anywhere else**, please post your question on [Stack Overflow](http://stackoverflow.com/questions/tagged/cookiecutter-django) under the _cookiecutter-django_ tag. We check there periodically for questions.
- If you think you found a bug or want to request a feature, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues).
- For anything else, you can chat with us on [Discord](https://discord.gg/uFXweDQc5a).

<img src="https://opencollective.com/cookiecutter-django/contributors.svg?width=890&button=false" alt="Contributors">

## For Readers of Two Scoops of Django

@@ -196,13 +211,14 @@ You may notice that some elements of this project do not exactly match what we d
## For PyUp Users

If you are using [PyUp](https://pyup.io) to keep your dependencies updated and secure, use the code _cookiecutter_ during checkout to get 15% off every month.

## "Your Stuff"

Scattered throughout the Python and HTML of this project are places marked with "your stuff". This is where third-party libraries are to be integrated with your project.

## For MySQL users

To get full MySQL support in addition to the default PostgreSQL, you can use this fork of cookiecutter-django:
https://github.com/mabdullahadeel/cookiecutter-django-mysql

@@ -212,18 +228,18 @@ Need a stable release? You can find them at <https://github.com/cookiecutter/coo
## Not Exactly What You Want?

This is what I want. _It might not be what you want._ Don't worry, you have options:

### Fork This

If you have differences in your preferred setup, I encourage you to fork this to create your own version.
Once you have your fork working, let me know and I'll add it to a '_Similar Cookiecutter Templates_' list here.
It's up to you whether to rename your fork.

If you do rename your fork, I encourage you to submit it to the following places:

- [cookiecutter](https://github.com/cookiecutter/cookiecutter) so it gets listed in the README as a template.
- The cookiecutter [grid](https://www.djangopackages.com/grids/g/cookiecutters/) on Django Packages.

### Submit a Pull Request

@@ -232,16 +248,18 @@ experience better.
## Articles

- [How to Make Your Own Django Cookiecutter Template!](https://medium.com/@FatemeFouladkar/how-to-make-your-own-django-cookiecutter-template-a753d4cbb8c2) - Aug. 10, 2023
- [Cookiecutter Django With Amazon RDS](https://haseeburrehman.com/posts/cookiecutter-django-with-amazon-rds/) - Apr. 2, 2021
- [Complete Walkthrough: Blue/Green Deployment to AWS ECS using GitHub actions](https://github.com/Andrew-Chen-Wang/cookiecutter-django-ecs-github) - June 10, 2020
- [Using cookiecutter-django with Google Cloud Storage](https://ahhda.github.io/cloud/gce/django/2019/03/12/using-django-cookiecutter-cloud-storage.html) - Mar. 12, 2019
- [cookiecutter-django with Nginx, Route 53 and ELB](https://msaizar.com/blog/cookiecutter-django-nginx-route-53-and-elb/) - Feb. 12, 2018
- [cookiecutter-django and Amazon RDS](https://msaizar.com/blog/cookiecutter-django-and-amazon-rds/) - Feb. 7, 2018
- [Using Cookiecutter to Jumpstart a Django Project on Windows with PyCharm](https://joshuahunter.com/posts/using-cookiecutter-to-jumpstart-a-django-project-on-windows-with-pycharm/) - May 19, 2017
- [Exploring with Cookiecutter](http://www.snowboardingcoder.com/django/2016/12/03/exploring-with-cookiecutter/) - Dec. 3, 2016
- [Introduction to Cookiecutter-Django](http://krzysztofzuraw.com/blog/2016/django-cookiecutter.html) - Feb. 19, 2016
- [Django and GitLab - Running Continuous Integration and tests with your FREE account](http://dezoito.github.io/2016/05/11/django-gitlab-continuous-integration-phantomjs.html) - May 11, 2016
- [Development and Deployment of Cookiecutter-Django on Fedora](https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-on-fedora/) - Jan. 18, 2016
- [Development and Deployment of Cookiecutter-Django via Docker](https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-via-docker/) - Dec. 29, 2015
- [How to create a Django Application using Cookiecutter and Django 1.8](https://www.swapps.io/blog/how-to-create-a-django-application-using-cookiecutter-and-django-1-8/) - Sept. 12, 2015

Have a blog or online publication? Write about your cookiecutter-django tips and tricks, then send us a pull request with the link.
@@ -4,7 +4,7 @@
    "description": "Behold My Awesome Project!",
    "author_name": "Daniel Roy Greenfeld",
    "domain_name": "example.com",
    "email": "{{ cookiecutter.author_name.lower() | trim() | replace(' ', '-') }}@{{ cookiecutter.domain_name.lower() | trim() }}",
    "version": "0.1.0",
    "open_source_license": [
        "Not open source",
@@ -54,22 +54,13 @@
        "Vim License",
        "zlib License"
    ],
    "username_type": ["username", "email"],
    "timezone": "UTC",
    "windows": "n",
    "editor": ["None", "PyCharm", "VS Code"],
    "use_docker": "n",
    "postgresql_version": ["16", "15", "14", "13", "12"],
    "cloud_provider": ["AWS", "GCP", "Azure", "None"],
    "mail_service": [
        "Mailgun",
        "Amazon SES",
@@ -83,22 +74,13 @@
    ],
    "use_async": "n",
    "use_drf": "n",
    "frontend_pipeline": ["None", "Django Compressor", "Gulp", "Webpack"],
    "use_celery": "n",
    "use_mailpit": "n",
    "use_sentry": "n",
    "use_whitenoise": "n",
    "use_heroku": "n",
    "ci_tool": ["None", "Travis", "Gitlab", "Github", "Drone"],
    "keep_local_envs_in_vcs": "y",
    "debug": "n"
}
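The templated `email` default above is dense, so here is a quick check of how Jinja evaluates it. A minimal sketch, assuming plain `jinja2` and flattening away the `cookiecutter.` namespace that cookiecutter injects (the context values are just the defaults shown above):

```python
# Sketch: rendering the templated "email" default with plain jinja2.
from jinja2 import Template

template = Template(
    "{{ author_name.lower() | trim() | replace(' ', '-') }}"
    "@{{ domain_name.lower() | trim() }}"
)
email = template.render(author_name="Daniel Roy Greenfeld", domain_name="example.com")
print(email)  # daniel-roy-greenfeld@example.com
```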
@@ -23,13 +23,13 @@ now = datetime.now()
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ["myst_parser"]

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

# The suffix of source filenames.
source_suffix = [".rst", ".md"]

# The encoding of source files.
# source_encoding = 'utf-8-sig'
@@ -239,8 +239,7 @@ texinfo_documents = [
        "Cookiecutter Django documentation",
        "Daniel Roy Greenfeld",
        "Cookiecutter Django",
        "A Cookiecutter template for creating production-ready " "Django projects quickly.",
        "Miscellaneous",
    )
]
docs/contributing.md Normal file
@@ -0,0 +1,3 @@
```{include} ../CONTRIBUTING.md
```
@@ -12,13 +12,13 @@ Run these commands to deploy the project to Heroku:
    heroku create --buildpack heroku/python

    heroku addons:create heroku-postgresql:mini

    # On Windows use double quotes for the time zone, e.g.
    # heroku pg:backups schedule --at "02:00 America/Los_Angeles" DATABASE_URL
    heroku pg:backups schedule --at '02:00 America/Los_Angeles' DATABASE_URL
    heroku pg:promote DATABASE_URL

    heroku addons:create heroku-redis:mini

    # Assuming you chose Mailgun as mail service (see below for others)
    heroku addons:create mailgun:starter
@@ -46,7 +46,7 @@ Run these commands to deploy the project to Heroku:

    # Assign with AWS_STORAGE_BUCKET_NAME
    heroku config:set DJANGO_AWS_STORAGE_BUCKET_NAME=

    git push heroku main

    heroku run python manage.py createsuperuser
@@ -109,10 +109,10 @@ Or add the DSN for your account, if you already have one:

.. _Sentry add-on: https://elements.heroku.com/addons/sentry

Gulp or Webpack
+++++++++++++++

If you've opted for Gulp or Webpack as frontend pipeline, you'll most likely need to set up
your app to use `multiple buildpacks`_: one for Python & one for Node.js:

.. code-block:: bash

@@ -121,8 +121,8 @@ your app to use `multiple buildpacks`_: one for Python & one for Node.js:

At time of writing, this should do the trick: during deployment,
Heroku should run ``npm install`` and then ``npm build``,
which runs the SASS compilation & JS bundling.

If things don't work, please refer to the Heroku docs.

.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
@@ -37,6 +37,7 @@ Make sure your project is fully committed and pushed up to Bitbucket or Github o

    mkvirtualenv --python=/usr/bin/python3.10 my-project-name
    pip install -r requirements/production.txt  # may take a few minutes

.. note:: We're creating the virtualenv using Python 3.10 (``--python=/usr/bin/python3.10``), although Cookiecutter Django generates a project for Python 3.12. This is because, at time of writing, PythonAnywhere only supports Python 3.10. It shouldn't be a problem, but if it is, you may try changing the Python version to 3.12 and see if it works. If it does, please let us know, or even better, submit a pull request to update this section.
Setting environment variables in the console
--------------------------------------------

@@ -1,7 +1,7 @@
Deployment with Docker
======================

.. index:: deployment, docker, docker compose, compose

Prerequisites
@@ -14,7 +14,7 @@ Prerequisites

Understanding the Docker Compose Setup
--------------------------------------

Before you begin, check out the ``docker-compose.production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:

* ``django``: your application running behind ``Gunicorn``;
* ``postgres``: PostgreSQL database with the application's relational data;

@@ -84,6 +84,32 @@ You can read more about this feature and how to configure it, at `Automatic HTTP

.. _Automatic HTTPS: https://docs.traefik.io/https/acme/
.. _webpack-whitenoise-limitation:
Webpack without Whitenoise limitation
-------------------------------------
If you opt for Webpack without Whitenoise, Webpack needs to know the static URL at build time, when running ``docker compose build`` (see ``webpack/prod.config.js``). Depending on your setup, this URL may come from the following environment variables:
- ``AWS_STORAGE_BUCKET_NAME``
- ``DJANGO_AWS_S3_CUSTOM_DOMAIN``
- ``DJANGO_GCP_STORAGE_BUCKET_NAME``
- ``DJANGO_AZURE_CONTAINER_NAME``
The Django settings read these values at runtime via the ``.envs/.production/.django`` file, but Docker does not read this file at build time; it only looks for a ``.env`` file in the root of the project. Failing to pass the values correctly will result in a page without CSS styles or JavaScript.

To solve this, you can either:

1. merge all the env files into ``.env`` by running::

    merge_production_dotenvs_in_dotenv.py

2. create a ``.env`` file in the root of the project with just the variables you need. You'll need to also define them in ``.envs/.production/.django`` (hence duplicating them).
3. set these variables when running the build command::

    DJANGO_AWS_S3_CUSTOM_DOMAIN=example.com docker compose -f docker-compose.production.yml build

None of these options are ideal; we're open to suggestions on how to improve this. If you think you have one, please open an issue or a pull request.
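For reference, option 1's ``merge_production_dotenvs_in_dotenv.py`` boils down to concatenating the production env files into ``.env``. A simplified sketch (the real script ships with the generated project and may differ in details):

.. code-block:: python

    # Simplified sketch of merging .envs/.production/* into a single .env file.
    from pathlib import Path

    PRODUCTION_DOTENVS_DIR = Path(".envs") / ".production"
    PRODUCTION_DOTENV_FILES = [
        PRODUCTION_DOTENVS_DIR / ".django",
        PRODUCTION_DOTENVS_DIR / ".postgres",
    ]
    DOTENV_FILE = Path(".env")


    def merge(output_file: Path, files_to_merge: list[Path]) -> None:
        # Concatenate every env file so Docker can read the result at build time.
        merged_content = "".join(f.read_text() + "\n" for f in files_to_merge)
        output_file.write_text(merged_content)


    if __name__ == "__main__":
        merge(DOTENV_FILE, PRODUCTION_DOTENV_FILES)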
(Optional) Postgres Data Volume Modifications
---------------------------------------------

@@ -96,42 +122,42 @@ Building & Running Production Stack

You will need to build the stack first. To do that, run::

    docker compose -f docker-compose.production.yml build

Once this is ready, you can run it with::

    docker compose -f docker-compose.production.yml up

To run the stack and detach the containers, run::

    docker compose -f docker-compose.production.yml up -d

To run a migration, open up a second terminal and run::

    docker compose -f docker-compose.production.yml run --rm django python manage.py migrate

To create a superuser, run::

    docker compose -f docker-compose.production.yml run --rm django python manage.py createsuperuser

If you need a shell, run::

    docker compose -f docker-compose.production.yml run --rm django python manage.py shell

To check the logs out, run::

    docker compose -f docker-compose.production.yml logs

If you want to scale your application, run::

    docker compose -f docker-compose.production.yml up --scale django=4
    docker compose -f docker-compose.production.yml up --scale celeryworker=2

.. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``traefik``.

To see how your containers are doing run::

    docker compose -f docker-compose.production.yml ps
Example: Supervisor
@@ -139,12 +165,12 @@ Example: Supervisor

Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to
survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. All
it needs to do is to run ``docker compose -f docker-compose.production.yml up`` in your project's root directory.

If you are using ``supervisor``, you can use this file as a starting point::

    [program:{{cookiecutter.project_slug}}]
    command=docker compose -f docker-compose.production.yml up
    directory=/path/to/{{cookiecutter.project_slug}}
    redirect_stderr=true
    autostart=true

@@ -161,3 +187,7 @@ For status check, run::

    supervisorctl status
Media files without cloud provider
----------------------------------
If you chose no cloud provider and Docker, the media files will be served by an nginx service, from a ``production_django_media`` volume. Make sure to keep this around to avoid losing any media files.
@@ -3,9 +3,6 @@ Getting Up and Running Locally With Docker

.. index:: Docker

.. note::

    If you're new to Docker, please be aware that some resources are cached system-wide
@@ -19,19 +16,25 @@ Prerequisites

* Docker; if you don't have it yet, follow the `installation instructions`_;
* Docker Compose; refer to the official documentation for the `installation guide`_.
* Pre-commit; refer to the official documentation for `pre-commit`_.
* Cookiecutter; refer to the official GitHub repository of `Cookiecutter`_

.. _`installation instructions`: https://docs.docker.com/install/#supported-platforms
.. _`installation guide`: https://docs.docker.com/compose/install/
.. _`pre-commit`: https://pre-commit.com/#install
.. _`Cookiecutter`: https://github.com/cookiecutter/cookiecutter
Before Getting Started
----------------------
.. include:: generate-project-block.rst
Build the Stack
---------------

This can take a while, especially the first time you run this particular command on your development system::

    $ docker compose -f docker-compose.local.yml build

Generally, if you want to emulate the production environment, use ``docker-compose.production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!

Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then::
@@ -48,31 +51,40 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak

Open a terminal at the project root and run the following for local development::

    $ docker compose -f docker-compose.local.yml up

You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this::

    $ export COMPOSE_FILE=docker-compose.local.yml

And then run::

    $ docker compose up

To run in a detached (background) mode, just::

    $ docker compose up -d

These commands don't run the docs service. In order to run the docs service you can run::

    $ docker compose -f docker-compose.docs.yml up

To run the docs with local services, just use::

    $ docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up

The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise.
Execute Management Commands
---------------------------

As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: ::

    $ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
    $ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser

Here, ``django`` is the target service we are executing the commands against.
Also, please note that ``docker exec`` does not work for running management commands.
(Optionally) Designate your Docker Development Server IP
--------------------------------------------------------

@@ -85,7 +97,7 @@ When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost',

Configuring the Environment
---------------------------

This is the excerpt from your project's ``docker-compose.local.yml``: ::

    # ...
@@ -141,6 +153,19 @@ This tells our computer that all future commands are specifically for the dev1 m

    $ eval "$(docker-machine env dev1)"
Add 3rd party python packages
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To install a new 3rd party python package, you cannot use ``pip install <package_name>``, as that would only add the package to the container. The container is ephemeral, so that new library won't be persisted if you run another container. Instead, you should modify the Docker image:
You have to modify the relevant requirements file (base, local or production) by adding: ::

    <package_name>==<package_version>

To get this change picked up, you'll need to rebuild the image(s) and restart the running container: ::

    docker compose -f docker-compose.local.yml build
    docker compose -f docker-compose.local.yml up
Debugging
~~~~~~~~~

@@ -153,7 +178,7 @@ If you are using the following within your code to debug: ::

Then you may need to run the following for it to work as desired: ::

    $ docker compose -f docker-compose.local.yml run --rm --service-ports django

django-debug-toolbar
@@ -173,16 +198,16 @@ The ``container_name`` from the yml file can be used to check on containers with

Notice that the ``container_name`` is generated dynamically using your project slug as a prefix
Mailpit
~~~~~~~

When developing locally you can go with Mailpit_ for email testing provided ``use_mailpit`` was set to ``y`` on setup. To proceed,

#. make sure ``<project_slug>_local_mailpit`` container is up and running;
#. open up ``http://127.0.0.1:8025``.

.. _Mailpit: https://github.com/axllent/mailpit/

.. _`CeleryTasks`:

@@ -206,10 +231,20 @@ Prerequisites:

* ``use_docker`` was set to ``y`` on project initialization;
* ``use_celery`` was set to ``y`` on project initialization.

By default, it's enabled both in local and production environments (``docker-compose.local.yml`` and ``docker-compose.production.yml`` Docker Compose configs, respectively) through a ``flower`` service. For added security, ``flower`` requires its clients to provide authentication credentials specified as the corresponding environments' ``.envs/.local/.django`` and ``.envs/.production/.django`` ``CELERY_FLOWER_USER`` and ``CELERY_FLOWER_PASSWORD`` environment variables. Check out ``localhost:5555`` and see for yourself.

.. _`Flower`: https://github.com/mher/flower
Using Webpack or Gulp
~~~~~~~~~~~~~~~~~~~~~
If you've opted for Gulp or Webpack as front-end pipeline, the project comes configured with `Sass`_ compilation and `live reloading`_. As you change your Sass/JS source files, the task runner will automatically rebuild the corresponding CSS and JS assets and reload them in your browser without refreshing the page.
The stack comes with a dedicated node service to build the static assets, watch for changes and proxy requests to the Django app with live reloading scripts injected in the response. For everything to work smoothly, you need to access the application at the port served by the node service, which is http://localhost:3000 by default.
.. _Sass: https://sass-lang.com/
.. _live reloading: https://browsersync.io
Developing locally with HTTPS
-----------------------------

@@ -244,7 +279,7 @@ certs

Take the certificates that you generated and place them in a folder called ``certs`` in the project's root folder. Assuming that you registered your local hostname as ``my-dev-env.local``, the certificates you will put in the folder should have the names ``my-dev-env.local.crt`` and ``my-dev-env.local.key``.

docker-compose.local.yml
~~~~~~~~~~~~~~~~~~~~~~~~

#. Add the ``nginx-proxy`` service. ::
@@ -288,7 +323,7 @@ You should allow the new hostname. ::

Rebuild your ``docker`` application. ::

    $ docker compose -f docker-compose.local.yml up -d --build

Go to your browser and type in your URL bar ``https://my-dev-env.local``

@@ -302,3 +337,26 @@ See `https with nginx`_ for more information on this configuration.

Add ``certs/*`` to the ``.gitignore`` file. This allows the folder to be included in the repo but its contents to be ignored.

*This configuration is for local development environments only. Do not use this for production since you might expose your local* ``rootCA-key.pem``.
Webpack
~~~~~~~
If you are using Webpack:
1. On the ``nginx-proxy`` service in ``docker-compose.local.yml``, change ``depends_on`` to ``node`` instead of ``django``.
2. On the ``node`` service in ``docker-compose.local.yml``, add the following environment configuration:
::

    environment:
        - VIRTUAL_HOST=my-dev-env.local
        - VIRTUAL_PORT=3000
3. Add the following configuration to the ``devServer`` section of ``webpack/dev.config.js``:
::

    client: {
        webSocketURL: 'auto://0.0.0.0:0/ws', // note the `:0` after `0.0.0.0`
    },
@@ -9,7 +9,7 @@ Setting Up Development Environment

Make sure to have the following on your host:

* Python 3.12
* PostgreSQL_.
* Redis_, if using Celery
* Cookiecutter_
@@ -18,15 +18,14 @@ First things first.

#. Create a virtualenv: ::

    $ python3.12 -m venv <virtual env path>

#. Activate the virtualenv you have just created: ::

    $ source <virtual env path>/bin/activate

#.
   .. include:: generate-project-block.rst

#. Install development requirements: ::
@@ -43,6 +42,7 @@ First things first.

#. Create a new PostgreSQL database using createdb_: ::

    $ createdb --username=postgres <project_slug>

   ``project_slug`` is what you have entered as the project_slug at the setup stage.

.. note::
@@ -80,10 +80,12 @@ First things first.

    $ python manage.py runserver 0.0.0.0:8000

or if you're running asynchronously: ::

    $ uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'

If you've opted for Webpack or Gulp as frontend pipeline, please see the :ref:`dedicated section <bare-metal-webpack-gulp>` below.

.. _PostgreSQL: https://www.postgresql.org/download/
.. _Redis: https://redis.io/download
.. _CookieCutter: https://github.com/cookiecutter/cookiecutter
@@ -94,42 +96,95 @@ or if you're running asynchronously: ::

.. _direnv: https://direnv.net/
Creating Your First Django App
-------------------------------
After setting up your environment, you're ready to add your first app. This project uses the setup from "Two Scoops of Django" with a two-tier layout:
- **Top Level Repository Root** has config files, documentation, `manage.py`, and more.
- **Second Level Django Project Root** is where your Django apps live.
- **Second Level Configuration Root** holds settings and URL configurations.
The project layout looks something like this: ::
    <repository_root>/
    ├── config/
    │   ├── settings/
    │   │   ├── __init__.py
    │   │   ├── base.py
    │   │   ├── local.py
    │   │   └── production.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── <django_project_root>/
    │   ├── <name_of_the_app>/
    │   │   ├── migrations/
    │   │   ├── admin.py
    │   │   ├── apps.py
    │   │   ├── models.py
    │   │   ├── tests.py
    │   │   └── views.py
    │   ├── __init__.py
    │   └── ...
    ├── requirements/
    │   ├── base.txt
    │   ├── local.txt
    │   └── production.txt
    ├── manage.py
    ├── README.md
    └── ...
Following this structured approach, here's how to add a new app:
#. **Create the app** using Django's ``startapp`` command, replacing ``<name-of-the-app>`` with your desired app name: ::

    $ python manage.py startapp <name-of-the-app>

#. **Move the app** to the Django Project Root, maintaining the project's two-tier structure: ::

    $ mv <name-of-the-app> <django_project_root>/

#. **Edit the app's apps.py**: change ``name = '<name-of-the-app>'`` to ``name = '<django_project_root>.<name-of-the-app>'`` (see the sketch after this list).

#. **Register the new app** by adding it to the ``LOCAL_APPS`` list in ``config/settings/base.py``, integrating it as an official component of your project.
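For concreteness, here is what step 3 amounts to for a hypothetical app named ``blog`` in a project whose Django project root is ``my_awesome_project`` (both names are placeholders):

.. code-block:: python

    # my_awesome_project/blog/apps.py (hypothetical names)
    from django.apps import AppConfig


    class BlogConfig(AppConfig):
        default_auto_field = "django.db.models.BigAutoField"
        # startapp generated name = "blog"; the dotted path below reflects
        # the app's new location under the Django project root.
        name = "my_awesome_project.blog"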
Setup Email Backend
-------------------

Mailpit
~~~~~~~

.. note:: In order for the project to support Mailpit_ it must have been bootstrapped with ``use_mailpit`` set to ``y``.

Mailpit is used to receive emails during development; it is written in Go and has no external dependencies.

For instance, one of the packages we depend upon, ``django-allauth``, sends verification emails to new users signing up as well as to the existing ones who have not yet verified themselves.
#. `Download the latest Mailpit release`_ for your OS.

#. Copy the binary file to the project root.

#. Make it executable: ::

    $ chmod +x mailpit

#. Spin up another terminal window and start it there: ::

    ./mailpit

#. Check out `<http://127.0.0.1:8025/>`_ to see how it goes.

Now you have your own mail server running locally, ready to receive whatever you send it.

.. _`Download the latest Mailpit release`: https://github.com/axllent/mailpit

Console
~~~~~~~

.. note:: If you have generated your project with ``use_mailpit`` set to ``n`` this will be the default setup.

Alternatively, deliver emails over console via ``EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'``.
@@ -141,23 +196,42 @@ In production, we have Mailgun_ configured to have your back!

Celery
------

If the project is configured to use Celery as a task scheduler then, by default, tasks are set to run on the main thread when developing locally instead of getting sent to a broker. However, if you have Redis set up on your local machine, you can set the following in ``config/settings/local.py``::

    CELERY_TASK_ALWAYS_EAGER = False

Next, make sure `redis-server` is installed (per the `Getting started with Redis guide`_) and run the server in one terminal::

    $ redis-server

Start the Celery worker by running the following command in another terminal::

    $ celery -A config.celery_app worker --loglevel=info

That Celery worker should be running whenever your app is running, typically as a background process,
so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.

The project comes with a simple task for manual testing purposes, inside `<project_slug>/users/tasks.py`. To queue that task locally, start the Django shell, import the task, and call `delay()` on it::

    $ python manage.py shell
    >>> from <project_slug>.users.tasks import get_users_count
    >>> get_users_count.delay()
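For reference, the bundled demo task looks roughly like this (a sketch; the exact imports and decorator in your generated project may differ):

.. code-block:: python

    # <project_slug>/users/tasks.py - a sketch of the bundled demo task
    from celery import shared_task

    from .models import User


    @shared_task()
    def get_users_count():
        """A pointless Celery task to demonstrate usage."""
        return User.objects.count()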
You can also use Django admin to queue up tasks, thanks to the `django-celerybeat`_ package.

.. _Getting started with Redis guide: https://redis.io/docs/getting-started/
.. _Celery Workers Guide: https://docs.celeryq.dev/en/stable/userguide/workers.html
.. _django-celerybeat: https://django-celery-beat.readthedocs.io/en/latest/
.. _bare-metal-webpack-gulp:

Using Webpack or Gulp
---------------------

If you've opted for Gulp or Webpack as front-end pipeline, the project comes configured with `Sass`_ compilation and `live reloading`_. As you change your Sass/JS source files, the task runner will automatically rebuild the corresponding CSS and JS assets and reload them in your browser without refreshing the page.

#. Make sure that `Node.js`_ v18 is installed on your machine.

#. In the project root, install the JS dependencies with::

    $ npm install

@@ -166,9 +240,12 @@ If you've opted for Gulp as front-end pipeline, the project comes configured wit

    $ npm run dev

   This will start 2 processes in parallel: the static assets build loop on one side, and the Django server on the other.

#. Access your application at the address of the ``node`` service in order to see your correct styles. This is http://localhost:3000 by default.

.. note:: Do NOT access the application using the Django port (8000 by default), as it will result in broken styles and 404s when accessing static assets.

.. _Node.js: http://nodejs.org/download/
.. _Sass: https://sass-lang.com/
@@ -1,14 +1,14 @@
PostgreSQL Backups with Docker
==============================

.. note:: For brevity it is assumed that you will be running the below commands against the local environment; however, this is by no means mandatory, so feel free to switch to ``docker-compose.production.yml`` when needed.

Prerequisites
-------------

#. the project was generated with ``use_docker`` set to ``y``;
#. the stack is up and running: ``docker compose -f docker-compose.local.yml up -d postgres``.
Creating a Backup
@@ -16,7 +16,7 @@ Creating a Backup

To create a backup, run::

    $ docker compose -f docker-compose.local.yml exec postgres backup

Assuming your project's database is named ``my_project``, here is what you will see: ::

@@ -31,7 +31,7 @@ Viewing the Existing Backups

To list existing backups, ::

    $ docker compose -f docker-compose.local.yml exec postgres backups

These are the sample contents of ``/backups``: ::

@@ -55,9 +55,9 @@ With a single backup file copied to ``.`` that would be ::

    $ docker cp 9c5c3f055843:/backups/backup_2018_03_13T09_05_07.sql.gz .

You can also get the container ID using ``docker compose -f docker-compose.local.yml ps -q postgres``, so if you want to automate your backups, you don't have to check the container ID manually every time. Here is the full command ::

    $ docker cp $(docker compose -f docker-compose.local.yml ps -q postgres):/backups ./backups

.. _`command`: https://docs.docker.com/engine/reference/commandline/cp/

@@ -66,7 +66,7 @@ Restoring from the Existing Backup

To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), ::

    $ docker compose -f docker-compose.local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz

You will see something like ::
@@ -92,7 +92,36 @@ You will see something like ::

Backup to Amazon S3
----------------------------------

For uploading your backups to Amazon S3 you can use the aws cli container. There is an upload command for uploading the postgres /backups directory recursively and there is a download command for downloading a specific backup. The default S3 environment variables are used. ::

    $ docker compose -f docker-compose.production.yml run --rm awscli upload
    $ docker compose -f docker-compose.production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz

Remove Backup
----------------------------------

To remove a backup, you can use the ``rmbackup`` command. This will remove the backup from the ``/backups`` directory. ::

    $ docker compose -f docker-compose.local.yml exec postgres rmbackup backup_2018_03_13T09_05_07.sql.gz
Upgrading PostgreSQL
----------------------------------
Upgrading PostgreSQL in your project requires a series of carefully executed steps. Start by halting all containers, excluding the postgres container. Following this, create a backup and proceed to remove the outdated data volume. ::
    $ docker compose -f docker-compose.local.yml down
    $ docker compose -f docker-compose.local.yml up -d postgres
    $ docker compose -f docker-compose.local.yml run --rm postgres backup
    $ docker compose -f docker-compose.local.yml down
    $ docker volume rm my_project_postgres_data
.. note:: Neglecting to remove the old data volume may lead to issues, such as the new postgres container failing to start with errors like ``FATAL: database files are incompatible with server``, and ``could not translate host name "postgres" to address: Name or service not known``.
To complete the upgrade, update the PostgreSQL version in the corresponding Dockerfile (e.g. ``compose/production/postgres/Dockerfile``) and build a new version of PostgreSQL. ::
    $ docker compose -f docker-compose.local.yml build postgres
    $ docker compose -f docker-compose.local.yml up -d postgres
    $ docker compose -f docker-compose.local.yml run --rm postgres restore backup_2018_03_13T09_05_07.sql.gz
    $ docker compose -f docker-compose.local.yml up -d
@@ -11,7 +11,7 @@ After you have set up to `develop locally`_, run the following command from the

If you set up your project to `develop locally with docker`_, run the following command: ::

    $ docker compose -f docker-compose.docs.yml up

Navigate to port 9000 on your host to see the documentation. This will be opened automatically at `localhost`_ for local, non-docker development.
@@ -22,6 +22,6 @@ TODO

Why doesn't this follow the layout from Two Scoops of Django?
-------------------------------------------------------------

You may notice that some elements of this project do not exactly match what we describe in chapter 3 of `Two Scoops of Django 3.x`_. The reason for that is this project, amongst other things, serves as a test bed for trying out new ideas and concepts. Sometimes they work, sometimes they don't, but the end result is that it won't necessarily match precisely what is described in the book I co-authored.

.. _Two Scoops of Django 3.x: https://www.feldroy.com/books/two-scoops-of-django-3-x
@@ -0,0 +1,6 @@
Generate a new cookiecutter-django project: ::

    $ cookiecutter gh:cookiecutter/cookiecutter-django

For more information refer to
:ref:`Project Generation Options <template-options>`.
@@ -27,6 +27,8 @@ Contents

   websocket
   faq
   troubleshooting
   contributing
   maintainer-guide

Indices and tables
------------------
@@ -4,40 +4,30 @@ Linters

.. index:: linters

ruff
----

Ruff is a Python linter and code formatter, written in Rust.
It is an aggregation of flake8, pylint, pyupgrade and many more.

Ruff comes with a linter (``ruff check``) and a formatter (``ruff format``).
The linter is a wrapper around flake8, pylint, and other linters,
and the formatter is a wrapper around black, isort, and other formatters.

To run ruff without modifying your files: ::

    $ ruff format --diff .
    $ ruff check .

Ruff is capable of fixing most of the problems it encounters.
Be sure to commit first before running ``ruff`` so you can restore to a savepoint (and amend afterwards to prevent a double commit): ::

    $ ruff format .
    $ ruff check --fix .
    # be careful with the --unsafe-fixes option, it can break your code
    $ ruff check --fix --unsafe-fixes .

The config for ruff is located in pyproject.toml.
One of the most important options is `tool.ruff.lint.select`.
`select` determines which linters are run. For example, `DJ <https://docs.astral.sh/ruff/rules/#flake8-django-dj>`_ refers to flake8-django.
For a full list of available linters, see `https://docs.astral.sh/ruff/rules/ <https://docs.astral.sh/ruff/rules/>`_
docs/maintainer-guide.md Normal file
@@ -0,0 +1,104 @@
# Maintainer guide
This document is intended for maintainers of the template.
## Automated updates
We use 2 separate services to keep our dependencies up-to-date:
- Dependabot, which manages updates of Python deps of the template, GitHub actions, npm packages and Docker images.
- PyUp, which manages the Python deps for the generated project.
We don't use Dependabot for the generated project deps because our requirements files are templated, and Dependabot fails to parse them. PyUp is -AFAIK- the only service out there that supports having Jinja tags in the requirements file.
Updates for the template should be labelled as `project infrastructure` while the ones about the generated project should be labelled as `update`. This is used to work in conjunction with our changelog script (see later).
## Automation scripts
We have a few workflows which have been automated over time. They usually run using GitHub actions and might need a few small manual actions to work nicely. Some have a few limitations which we should document here.
### CI
`ci.yml`
The CI workflow tries to cover 2 main aspects of the template:
- Check all combinations to make sure that valid files are generated with no major linting issues. Issues which are fixed by an auto-formatter after generation aren't considered major, and we only aim for best effort here. This is under the `test` job.
- Run more in-depth tests on a few combinations, by installing dependencies, running the type checker and the test suite of the generated project. We try to cover docker (`docker` job) and non-docker (`bare` job) setups.
We also run the deployment checks, but we don't do much more beyond that for testing the production setup.
### Django issue checker
`django-issue-checker.yml`
This workflow runs daily, on schedule, checks if there is a new major version of Django (not in the pure SemVer sense) released that we are not running, and lists our dependencies' compatibility.
For example, at time of writing, we use Django 4.2, but the latest version of Django is 5.0, so the workflow created a ["Django 5.0" issue](https://github.com/cookiecutter/cookiecutter-django/issues/4724) in GitHub, with a compatibility table and keeps it up to date every day.
#### Limitations
Here are a few current and past limitations of the script
- When a new dependency is added to the template, the script fails to update an existing issue
- Not sure what happens when a dependency is removed
- ~~Unable to parse classifiers without minor version~~
- ~~Creates an issue even if we are on the latest version~~
### Issue manager
`issue-manager.yml`
A workflow that uses [Sebastian Ramirez' issue-manager](https://github.com/tiangolo/issue-manager) to help us automate issue management. The tag line from the repo explains it well:
> Automatically close issues or Pull Requests that have a label, after a custom delay, if no one replies back.
It runs on a schedule as well as when some actions are taken on issues and pull requests.
We wait 10 days before closing issues, and we have a few customised reasons, which are configured in the workflow itself. The config should be fairly self-explanatory.
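For illustration, the config is a JSON object mapping a label name to a delay and a closing message. The Python dict below is a rough sketch of that shape only; the label names and messages here are hypothetical, and the real values live in the workflow file:

```python
# Rough shape of the issue-manager config; the keys and messages below
# are hypothetical examples, not the actual values from our workflow.
ISSUE_MANAGER_CONFIG = {
    "answered": {
        "delay": 10 * 24 * 60 * 60,  # 10 days, in seconds
        "message": "Assuming the question was answered, closing this now.",
    },
    "waiting": {
        "delay": 10 * 24 * 60 * 60,
        "message": "Closing after 10 days of waiting for the requested info.",
    },
}
```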
### Pre-commit auto-update
`pre-commit-autoupdate.yml`
Runs daily to perform `pre-commit autoupdate` on the template as well as the generated project, and opens a pull request with the changes.
#### Limitations
- The PR is opened by a GitHub action, which means that CI does NOT run. The documentation for the create-pull-request action [explains why](https://github.com/peter-evans/create-pull-request/blob/main/docs/concepts-guidelines.md#triggering-further-workflow-runs).
- Some hooks are also installed as local dependencies (via `requirements/local.txt`), but these are updated separately via PyUp.
### Update changelog
`update-changelog.yml`
Runs daily at 2 AM to update our changelog and create a GitHub release. It runs a custom script which:
- Lists all pull requests merged the day before
- The release name is calendar based, so `YYYY.MM.DD`
- For each PR:
  - Get the PR title to summarize the change
  - Look at the PR labels to classify it in a section of the release notes:
    - anything labelled `project infrastructure` is excluded
    - label `update` goes in section "Updated"
    - label `bug` goes in section "Fixed"
    - label `docs` goes in section "Documentation"
    - default to section "Changed"
With that in mind, when merging changes, it's a good idea to set the labels and rename the PR title to give a good summary of the change, in the context of the changelog.
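As a rough illustration of the rules above, the classification boils down to something like this minimal sketch (the function name is illustrative; the real script works on full pull request objects):

```python
# Minimal sketch of the label-to-section rules described above.
def classify_pull_request(label_names: set[str]) -> str | None:
    if "project infrastructure" in label_names:
        return None  # excluded from the changelog
    if "update" in label_names:
        return "Updated"
    if "bug" in label_names:
        return "Fixed"
    if "docs" in label_names:
        return "Documentation"
    return "Changed"
```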
#### Limitations
- Dependabot updates for npm & Docker have a verbose title; try to rename them to be more readable: `Bump webpack-dev-server from 4.15.1 to 5.0.2 in /{{cookiecutter.project_slug}}` -> `Bump webpack-dev-server to 5.0.2`
- ~~Dependencies updates for the template repo (tox, cookiecutter, etc...) don't need to appear in changelog, and need to be labelled as `project infrastructure` manually. By default, they come from PyUp labelled as `update`.~~
### Update contributors
`update-contributors.yml`
Runs on each push to the master branch. Lists the 5 most recently merged pull requests and extracts their authors. If any of the authors is a new one, it updates `.github/contributors.json`, regenerates `CONTRIBUTORS.md` from it, and pushes the changes back to master.
#### Limitations
- If you merge a pull request from a new contributor, and merge another one right after, the push to master will fail as the remote will be out of date.
- If you merge more than 5 pull requests in a row like this, the new contributor might fail to be added.

View File

@ -24,6 +24,13 @@ author_name:
email: email:
The email address you want to identify yourself in the project. The email address you want to identify yourself in the project.
username_type:
The type of username you want to use in the project. This can be either
``username`` or ``email``. If you choose ``username``, the ``email`` field
will be included. If you choose ``email``, the ``username`` field will be
excluded. It is best practice to always include an email field, so there is
no option for having just the ``username`` field.
domain_name: domain_name:
The domain name you plan to use for your project once it goes live. The domain name you plan to use for your project once it goes live.
Note that it can be safely changed later on whenever you need to. Note that it can be safely changed later on whenever you need to.
@ -46,29 +53,34 @@ timezone:
windows: windows:
Indicates whether the project should be configured for development on Windows. Indicates whether the project should be configured for development on Windows.
use_pycharm: editor:
Indicates whether the project should be configured for development with PyCharm_. Select an editor to use. The choices are:
1. None
2. PyCharm_
3. `VS Code`_
use_docker: use_docker:
Indicates whether the project should be configured to use Docker_ and `Docker Compose`_. Indicates whether the project should be configured to use Docker_, `Docker Compose`_ and `devcontainer`_.
postgresql_version: postgresql_version:
Select a PostgreSQL_ version to use. The choices are: Select a PostgreSQL_ version to use. The choices are:
1. 14 1. 16
2. 13 2. 15
3. 12 3. 14
4. 11 4. 13
5. 10 5. 12
cloud_provider: cloud_provider:
Select a cloud provider for static & media files. The choices are: Select a cloud provider for static & media files. The choices are:
1. AWS_ 1. AWS_
2. GCP_ 2. GCP_
3. None 3. Azure_
4. None
Note that if you choose no cloud provider, media files won't work. If you choose no cloud provider and docker, the production stack will serve the media files via an nginx Docker service. Without Docker, the media files won't work.
mail_service: mail_service:
Select an email service that Django-Anymail provides Select an email service that Django-Anymail provides
@ -94,13 +106,16 @@ frontend_pipeline:
1. None 1. None
2. `Django Compressor`_ 2. `Django Compressor`_
3. `Gulp`_: support Bootstrap recompilation with real-time variables alteration. 3. `Gulp`_
4. `Webpack`_
Both Gulp and Webpack support Bootstrap recompilation with real-time variables alteration.
use_celery: use_celery:
Indicates whether the project should be configured to use Celery_. Indicates whether the project should be configured to use Celery_.
use_mailhog: use_mailpit:
Indicates whether the project should be configured to use MailHog_. Indicates whether the project should be configured to use Mailpit_.
use_sentry: use_sentry:
Indicates whether the project should be configured to use Sentry_. Indicates whether the project should be configured to use Sentry_.
@ -119,6 +134,7 @@ ci_tool:
2. `Travis CI`_ 2. `Travis CI`_
3. `Gitlab CI`_ 3. `Gitlab CI`_
4. `Github Actions`_ 4. `Github Actions`_
5. `Drone CI`_
keep_local_envs_in_vcs: keep_local_envs_in_vcs:
Indicates whether the project's ``.envs/.local/`` should be kept in VCS Indicates whether the project's ``.envs/.local/`` should be kept in VCS
@ -137,16 +153,20 @@ debug:
.. _Apache Software License 2.0: http://www.apache.org/licenses/LICENSE-2.0 .. _Apache Software License 2.0: http://www.apache.org/licenses/LICENSE-2.0
.. _PyCharm: https://www.jetbrains.com/pycharm/ .. _PyCharm: https://www.jetbrains.com/pycharm/
.. _VS Code: https://github.com/microsoft/vscode
.. _Docker: https://github.com/docker/docker .. _Docker: https://github.com/docker/docker
.. _Docker Compose: https://docs.docker.com/compose/ .. _Docker Compose: https://docs.docker.com/compose/
.. _devcontainer: https://containers.dev/
.. _PostgreSQL: https://www.postgresql.org/docs/ .. _PostgreSQL: https://www.postgresql.org/docs/
.. _Gulp: https://github.com/gulpjs/gulp .. _Gulp: https://github.com/gulpjs/gulp
.. _Webpack: https://webpack.js.org
.. _AWS: https://aws.amazon.com/s3/ .. _AWS: https://aws.amazon.com/s3/
.. _GCP: https://cloud.google.com/storage/ .. _GCP: https://cloud.google.com/storage/
.. _Azure: https://azure.microsoft.com/en-us/products/storage/blobs/
.. _Amazon SES: https://aws.amazon.com/ses/ .. _Amazon SES: https://aws.amazon.com/ses/
.. _Mailgun: https://www.mailgun.com .. _Mailgun: https://www.mailgun.com
@ -164,7 +184,7 @@ debug:
.. _Celery: https://github.com/celery/celery .. _Celery: https://github.com/celery/celery
.. _MailHog: https://github.com/mailhog/MailHog .. _Mailpit: https://github.com/axllent/mailpit
.. _Sentry: https://github.com/getsentry/sentry .. _Sentry: https://github.com/getsentry/sentry
@ -176,4 +196,6 @@ debug:
.. _GitLab CI: https://docs.gitlab.com/ee/ci/ .. _GitLab CI: https://docs.gitlab.com/ee/ci/
.. _Drone CI: https://docs.drone.io/pipeline/overview/
.. _Github Actions: https://docs.github.com/en/actions .. _Github Actions: https://docs.github.com/en/actions

View File

@ -1,2 +1,3 @@
sphinx==5.3.0 sphinx==7.3.7
sphinx-rtd-theme==1.1.1 sphinx-rtd-theme==2.0.0
myst-parser==3.0.1

View File

@ -22,7 +22,6 @@ DATABASE_URL DATABASES auto w/ Dock
DJANGO_ADMIN_URL n/a 'admin/' raises error DJANGO_ADMIN_URL n/a 'admin/' raises error
DJANGO_DEBUG DEBUG True False DJANGO_DEBUG DEBUG True False
DJANGO_SECRET_KEY SECRET_KEY auto-generated raises error DJANGO_SECRET_KEY SECRET_KEY auto-generated raises error
DJANGO_SECURE_BROWSER_XSS_FILTER SECURE_BROWSER_XSS_FILTER n/a True
DJANGO_SECURE_SSL_REDIRECT SECURE_SSL_REDIRECT n/a True DJANGO_SECURE_SSL_REDIRECT SECURE_SSL_REDIRECT n/a True
DJANGO_SECURE_CONTENT_TYPE_NOSNIFF SECURE_CONTENT_TYPE_NOSNIFF n/a True DJANGO_SECURE_CONTENT_TYPE_NOSNIFF SECURE_CONTENT_TYPE_NOSNIFF n/a True
DJANGO_SECURE_FRAME_DENY SECURE_FRAME_DENY n/a True DJANGO_SECURE_FRAME_DENY SECURE_FRAME_DENY n/a True
@ -49,6 +48,9 @@ DJANGO_AWS_S3_CUSTOM_DOMAIN AWS_S3_CUSTOM_DOMAIN n/a
DJANGO_AWS_S3_MAX_MEMORY_SIZE AWS_S3_MAX_MEMORY_SIZE n/a 100_000_000 DJANGO_AWS_S3_MAX_MEMORY_SIZE AWS_S3_MAX_MEMORY_SIZE n/a 100_000_000
DJANGO_GCP_STORAGE_BUCKET_NAME GS_BUCKET_NAME n/a raises error DJANGO_GCP_STORAGE_BUCKET_NAME GS_BUCKET_NAME n/a raises error
GOOGLE_APPLICATION_CREDENTIALS n/a n/a raises error GOOGLE_APPLICATION_CREDENTIALS n/a n/a raises error
DJANGO_AZURE_ACCOUNT_KEY AZURE_ACCOUNT_KEY n/a raises error
DJANGO_AZURE_ACCOUNT_NAME AZURE_ACCOUNT_NAME n/a raises error
DJANGO_AZURE_CONTAINER_NAME AZURE_CONTAINER n/a raises error
SENTRY_DSN SENTRY_DSN n/a raises error SENTRY_DSN SENTRY_DSN n/a raises error
SENTRY_ENVIRONMENT n/a n/a production SENTRY_ENVIRONMENT n/a n/a production
SENTRY_TRACES_SAMPLE_RATE n/a n/a 0.0 SENTRY_TRACES_SAMPLE_RATE n/a n/a 0.0
@ -79,3 +81,6 @@ Other Environment Settings
DJANGO_ACCOUNT_ALLOW_REGISTRATION (=True) DJANGO_ACCOUNT_ALLOW_REGISTRATION (=True)
Allow enable or disable user registration through `django-allauth` without disabling other characteristics like authentication and account management. (Django Setting: ACCOUNT_ALLOW_REGISTRATION) Allow enable or disable user registration through `django-allauth` without disabling other characteristics like authentication and account management. (Django Setting: ACCOUNT_ALLOW_REGISTRATION)
DJANGO_ADMIN_FORCE_ALLAUTH (=False)
Force the `admin` sign in process to go through the `django-allauth` workflow.

View File

@ -19,7 +19,7 @@ You will get a readout of the `users` app that has already been set up with test
If you set up your project to `develop locally with docker`_, run the following command: :: If you set up your project to `develop locally with docker`_, run the following command: ::
$ docker-compose -f local.yml run --rm django pytest $ docker compose -f docker-compose.local.yml run --rm django pytest
Targeting particular apps for testing in ``docker`` follows a similar pattern as previously shown above. Targeting particular apps for testing in ``docker`` follows a similar pattern as previously shown above.
@ -28,17 +28,22 @@ Coverage
You should build your tests to provide the highest level of **code coverage**. You can run the ``pytest`` with code ``coverage`` by typing in the following command: :: You should build your tests to provide the highest level of **code coverage**. You can run the ``pytest`` with code ``coverage`` by typing in the following command: ::
$ docker-compose -f local.yml run --rm django coverage run -m pytest $ coverage run -m pytest
Once the tests are complete, in order to see the code coverage, run the following command: :: Once the tests are complete, in order to see the code coverage, run the following command: ::
$ docker-compose -f local.yml run --rm django coverage report $ coverage report
If you're running the project locally with Docker, use these commands instead: ::
$ docker compose -f docker-compose.local.yml run --rm django coverage run -m pytest
$ docker compose -f docker-compose.local.yml run --rm django coverage report
.. note:: .. note::
At the root of the project folder, you will find the `pytest.ini` file. You can use this to customize_ the ``pytest`` to your liking. At the root of the project folder, you will find the `pytest.ini` file. You can use this to customize_ the ``pytest`` to your liking.
There is also the `.coveragerc`. This is the configuration file for the ``coverage`` tool. You can find out more about `configuring`_ ``coverage``. The configuration for ``coverage`` can be found in ``pyproject.toml``. You can find out more about `configuring`_ ``coverage``.
.. seealso:: .. seealso::
@ -53,4 +58,4 @@ Once the tests are complete, in order to see the code coverage, run the followin
.. _develop locally with docker: ./developing-locally-docker.html .. _develop locally with docker: ./developing-locally-docker.html
.. _customize: https://docs.pytest.org/en/latest/customize.html .. _customize: https://docs.pytest.org/en/latest/customize.html
.. _unittest: https://docs.python.org/3/library/unittest.html#module-unittest .. _unittest: https://docs.python.org/3/library/unittest.html#module-unittest
.. _configuring: https://coverage.readthedocs.io/en/v4.5.x/config.html .. _configuring: https://coverage.readthedocs.io/en/latest/config.html

View File

@ -1,5 +1,5 @@
Troubleshooting Troubleshooting
===================================== ===============
This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications. This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications.
@ -24,13 +24,13 @@ Examples of logs::
If you recreate the project multiple times with the same name, Docker would preserve the volumes for the postgres container between projects. Here is what happens: If you recreate the project multiple times with the same name, Docker would preserve the volumes for the postgres container between projects. Here is what happens:
#. You generate the project the first time. The .env postgres file is populated with the random password #. You generate the project the first time. The .env postgres file is populated with the random password
#. You run the docker-compose and the containers are created. The postgres container creates the database based on the .env file credentials #. You run the docker compose and the containers are created. The postgres container creates the database based on the .env file credentials
#. You "regenerate" the project with the same name, so the postgres .env file is populated with a new random password #. You "regenerate" the project with the same name, so the postgres .env file is populated with a new random password
#. You run docker-compose. Since the names of the containers are the same, docker will try to start them (not create them from scratch i.e. it won't execute the Dockerfile to recreate the database). When this happens, it tries to start the database based on the new credentials which do not match the ones that the database was created with, and you get the error message above. #. You run docker compose. Since the names of the containers are the same, docker will try to start them (not create them from scratch i.e. it won't execute the Dockerfile to recreate the database). When this happens, it tries to start the database based on the new credentials which do not match the ones that the database was created with, and you get the error message above.
To fix this, you can either: To fix this, you can either:
- Clear your project-related Docker cache with ``docker-compose -f local.yml down --volumes --rmi all``. - Clear your project-related Docker cache with ``docker compose -f docker-compose.local.yml down --volumes --rmi all``.
- Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_). - Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_).
- Use the `prune`_ command to clear system-wide (use with care!). - Use the `prune`_ command to clear system-wide (use with care!).
@ -38,6 +38,16 @@ To fix this, you can either:
.. _rm: https://docs.docker.com/engine/reference/commandline/volume_rm/ .. _rm: https://docs.docker.com/engine/reference/commandline/volume_rm/
.. _prune: https://docs.docker.com/v17.09/engine/reference/commandline/system_prune/ .. _prune: https://docs.docker.com/v17.09/engine/reference/commandline/system_prune/
Variable is not set. Defaulting to a blank string
-------------------------------------------------
Example::
WARN[0000] The "DJANGO_AWS_STORAGE_BUCKET_NAME" variable is not set. Defaulting to a blank string.
WARN[0000] The "DJANGO_AWS_S3_CUSTOM_DOMAIN" variable is not set. Defaulting to a blank string.
You have probably opted for Docker + Webpack without Whitenoise. This is a known limitation of the combination, which needs a little bit of manual intervention. See the :ref:`dedicated section about it <webpack-whitenoise-limitation>`.
Others Others
------ ------

View File

@ -8,6 +8,7 @@ NOTE:
TODO: restrict Cookiecutter Django project initialization to TODO: restrict Cookiecutter Django project initialization to
Python 3.x environments only Python 3.x environments only
""" """
from __future__ import print_function from __future__ import print_function
import json import json
@ -33,6 +34,24 @@ SUCCESS = "\x1b[1;32m [SUCCESS]: "
DEBUG_VALUE = "debug" DEBUG_VALUE = "debug"
def remove_custom_user_manager_files():
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}",
"users",
"managers.py",
)
)
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}",
"users",
"tests",
"test_managers.py",
)
)
def remove_pycharm_files(): def remove_pycharm_files():
idea_dir_path = ".idea" idea_dir_path = ".idea"
if os.path.exists(idea_dir_path): if os.path.exists(idea_dir_path):
@ -44,12 +63,17 @@ def remove_pycharm_files():
def remove_docker_files(): def remove_docker_files():
shutil.rmtree(".devcontainer")
shutil.rmtree("compose") shutil.rmtree("compose")
file_names = ["local.yml", "production.yml", ".dockerignore"] file_names = [
"docker-compose.local.yml",
"docker-compose.production.yml",
".dockerignore",
]
for file_name in file_names: for file_name in file_names:
os.remove(file_name) os.remove(file_name)
if "{{ cookiecutter.use_pycharm }}".lower() == "y": if "{{ cookiecutter.editor }}" == "PyCharm":
file_names = ["docker_compose_up_django.xml", "docker_compose_up_docs.xml"] file_names = ["docker_compose_up_django.xml", "docker_compose_up_docs.xml"]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(".idea", "runConfigurations", file_name)) os.remove(os.path.join(".idea", "runConfigurations", file_name))
@ -62,29 +86,37 @@ def remove_utility_files():
def remove_heroku_files(): def remove_heroku_files():
file_names = ["Procfile", "runtime.txt", "requirements.txt"] file_names = ["Procfile", "runtime.txt", "requirements.txt"]
for file_name in file_names: for file_name in file_names:
if ( if file_name == "requirements.txt" and "{{ cookiecutter.ci_tool }}".lower() == "travis":
file_name == "requirements.txt"
and "{{ cookiecutter.ci_tool }}".lower() == "travis"
):
# don't remove the file if we are using travisci but not using heroku # don't remove the file if we are using travisci but not using heroku
continue continue
os.remove(file_name) os.remove(file_name)
remove_heroku_build_hooks()
def remove_heroku_build_hooks():
shutil.rmtree("bin") shutil.rmtree("bin")
def remove_sass_files():
shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "static", "sass"))
def remove_gulp_files(): def remove_gulp_files():
file_names = ["gulpfile.js"] file_names = ["gulpfile.js"]
for file_name in file_names: for file_name in file_names:
os.remove(file_name) os.remove(file_name)
remove_sass_files()
def remove_sass_files(): def remove_webpack_files():
shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "static", "sass")) shutil.rmtree("webpack")
remove_vendors_js()
def remove_vendors_js():
vendors_js_path = os.path.join(
"{{ cookiecutter.project_slug }}",
"static",
"js",
"vendors.js",
)
if os.path.exists(vendors_js_path):
os.remove(vendors_js_path)
def remove_packagejson_file(): def remove_packagejson_file():
@ -93,13 +125,105 @@ def remove_packagejson_file():
os.remove(file_name) os.remove(file_name)
def update_package_json(remove_dev_deps=None, remove_keys=None, scripts=None):
remove_dev_deps = remove_dev_deps or []
remove_keys = remove_keys or []
scripts = scripts or {}
with open("package.json", mode="r") as fd:
content = json.load(fd)
for package_name in remove_dev_deps:
content["devDependencies"].pop(package_name)
for key in remove_keys:
content.pop(key)
content["scripts"].update(scripts)
with open("package.json", mode="w") as fd:
json.dump(content, fd, ensure_ascii=False, indent=2)
fd.write("\n")
def handle_js_runner(choice, use_docker, use_async):
if choice == "Gulp":
update_package_json(
remove_dev_deps=[
"@babel/core",
"@babel/preset-env",
"babel-loader",
"concurrently",
"css-loader",
"mini-css-extract-plugin",
"postcss-loader",
"postcss-preset-env",
"sass-loader",
"webpack",
"webpack-bundle-tracker",
"webpack-cli",
"webpack-dev-server",
"webpack-merge",
],
remove_keys=["babel"],
scripts={
"dev": "gulp",
"build": "gulp generate-assets",
},
)
remove_webpack_files()
elif choice == "Webpack":
scripts = {
"dev": "webpack serve --config webpack/dev.config.js",
"build": "webpack --config webpack/prod.config.js",
}
remove_dev_deps = [
"browser-sync",
"cssnano",
"gulp",
"gulp-concat",
"gulp-imagemin",
"gulp-plumber",
"gulp-postcss",
"gulp-rename",
"gulp-sass",
"gulp-uglify-es",
]
if not use_docker:
dev_django_cmd = (
"uvicorn config.asgi:application --reload" if use_async else "python manage.py runserver_plus"
)
scripts.update(
{
"dev": "concurrently npm:dev:*",
"dev:webpack": "webpack serve --config webpack/dev.config.js",
"dev:django": dev_django_cmd,
}
)
else:
remove_dev_deps.append("concurrently")
update_package_json(remove_dev_deps=remove_dev_deps, scripts=scripts)
remove_gulp_files()
def remove_prettier_pre_commit():
with open(".pre-commit-config.yaml", "r") as fd:
content = fd.readlines()
removing = False
new_lines = []
for line in content:
if removing and "- repo:" in line:
removing = False
if "mirrors-prettier" in line:
removing = True
if not removing:
new_lines.append(line)
with open(".pre-commit-config.yaml", "w") as fd:
fd.writelines(new_lines)
def remove_celery_files(): def remove_celery_files():
file_names = [ file_names = [
os.path.join("config", "celery_app.py"), os.path.join("config", "celery_app.py"),
os.path.join("{{ cookiecutter.project_slug }}", "users", "tasks.py"), os.path.join("{{ cookiecutter.project_slug }}", "users", "tasks.py"),
os.path.join( os.path.join("{{ cookiecutter.project_slug }}", "users", "tests", "test_tasks.py"),
"{{ cookiecutter.project_slug }}", "users", "tests", "test_tasks.py"
),
] ]
for file_name in file_names: for file_name in file_names:
os.remove(file_name) os.remove(file_name)
@ -126,9 +250,11 @@ def remove_dotgithub_folder():
shutil.rmtree(".github") shutil.rmtree(".github")
def generate_random_string( def remove_dotdrone_file():
length, using_digits=False, using_ascii_letters=False, using_punctuation=False os.remove(".drone.yml")
):
def generate_random_string(length, using_digits=False, using_ascii_letters=False, using_punctuation=False):
""" """
Example: Example:
opting out for 50 symbol-long, [a-z][A-Z][0-9] string opting out for 50 symbol-long, [a-z][A-Z][0-9] string
@ -222,9 +348,7 @@ def set_postgres_password(file_path, value=None):
def set_celery_flower_user(file_path, value): def set_celery_flower_user(file_path, value):
celery_flower_user = set_flag( celery_flower_user = set_flag(file_path, "!!!SET CELERY_FLOWER_USER!!!", value=value)
file_path, "!!!SET CELERY_FLOWER_USER!!!", value=value
)
return celery_flower_user return celery_flower_user
@ -256,22 +380,14 @@ def set_flags_in_envs(postgres_user, celery_flower_user, debug=False):
set_django_admin_url(production_django_envs_path) set_django_admin_url(production_django_envs_path)
set_postgres_user(local_postgres_envs_path, value=postgres_user) set_postgres_user(local_postgres_envs_path, value=postgres_user)
set_postgres_password( set_postgres_password(local_postgres_envs_path, value=DEBUG_VALUE if debug else None)
local_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_postgres_user(production_postgres_envs_path, value=postgres_user) set_postgres_user(production_postgres_envs_path, value=postgres_user)
set_postgres_password( set_postgres_password(production_postgres_envs_path, value=DEBUG_VALUE if debug else None)
production_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(local_django_envs_path, value=celery_flower_user) set_celery_flower_user(local_django_envs_path, value=celery_flower_user)
set_celery_flower_password( set_celery_flower_password(local_django_envs_path, value=DEBUG_VALUE if debug else None)
local_django_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(production_django_envs_path, value=celery_flower_user) set_celery_flower_user(production_django_envs_path, value=celery_flower_user)
set_celery_flower_password( set_celery_flower_password(production_django_envs_path, value=DEBUG_VALUE if debug else None)
production_django_envs_path, value=DEBUG_VALUE if debug else None
)
def set_flags_in_settings_files(): def set_flags_in_settings_files():
@ -282,6 +398,7 @@ def set_flags_in_settings_files():
def remove_envs_and_associated_files(): def remove_envs_and_associated_files():
shutil.rmtree(".envs") shutil.rmtree(".envs")
os.remove("merge_production_dotenvs_in_dotenv.py") os.remove("merge_production_dotenvs_in_dotenv.py")
shutil.rmtree("tests")
def remove_celery_compose_dirs(): def remove_celery_compose_dirs():
@ -300,25 +417,9 @@ def remove_aws_dockerfile():
def remove_drf_starter_files(): def remove_drf_starter_files():
os.remove(os.path.join("config", "api_router.py")) os.remove(os.path.join("config", "api_router.py"))
shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "users", "api")) shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "users", "api"))
os.remove( os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_drf_urls.py"))
os.path.join( os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_drf_views.py"))
"{{cookiecutter.project_slug}}", "users", "tests", "test_drf_urls.py" os.remove(os.path.join("{{cookiecutter.project_slug}}", "users", "tests", "test_swagger.py"))
)
)
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}", "users", "tests", "test_drf_views.py"
)
)
os.remove(
os.path.join(
"{{cookiecutter.project_slug}}", "users", "tests", "test_swagger.py"
)
)
def remove_storages_module():
os.remove(os.path.join("{{cookiecutter.project_slug}}", "utils", "storages.py"))
def handle_licenses(): def handle_licenses():
@ -365,7 +466,10 @@ def main():
handle_licenses() handle_licenses()
if "{{ cookiecutter.use_pycharm }}".lower() == "n": if "{{ cookiecutter.username_type }}" == "username":
remove_custom_user_manager_files()
if "{{ cookiecutter.editor }}" != "PyCharm":
remove_pycharm_files() remove_pycharm_files()
if "{{ cookiecutter.use_docker }}".lower() == "y": if "{{ cookiecutter.use_docker }}".lower() == "y":
@ -373,26 +477,18 @@ def main():
else: else:
remove_docker_files() remove_docker_files()
if ( if "{{ cookiecutter.use_docker }}".lower() == "y" and "{{ cookiecutter.cloud_provider}}" != "AWS":
"{{ cookiecutter.use_docker }}".lower() == "y"
and "{{ cookiecutter.cloud_provider}}" != "AWS"
):
remove_aws_dockerfile() remove_aws_dockerfile()
if "{{ cookiecutter.use_heroku }}".lower() == "n": if "{{ cookiecutter.use_heroku }}".lower() == "n":
remove_heroku_files() remove_heroku_files()
elif "{{ cookiecutter.frontend_pipeline }}" != "Django Compressor":
remove_heroku_build_hooks()
if ( if "{{ cookiecutter.use_docker }}".lower() == "n" and "{{ cookiecutter.use_heroku }}".lower() == "n":
"{{ cookiecutter.use_docker }}".lower() == "n"
and "{{ cookiecutter.use_heroku }}".lower() == "n"
):
if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y": if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y":
print( print(
INFO + ".env(s) are only utilized when Docker Compose and/or " INFO + ".env(s) are only utilized when Docker Compose and/or "
"Heroku support is enabled so keeping them does not " "Heroku support is enabled so keeping them does not make sense "
"make sense given your current setup." + TERMINATOR "given your current setup." + TERMINATOR
) )
remove_envs_and_associated_files() remove_envs_and_associated_files()
else: else:
@ -401,18 +497,26 @@ def main():
if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y": if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y":
append_to_gitignore_file("!.envs/.local/") append_to_gitignore_file("!.envs/.local/")
if "{{ cookiecutter.frontend_pipeline }}" != "Gulp": if "{{ cookiecutter.frontend_pipeline }}" in ["None", "Django Compressor"]:
remove_gulp_files() remove_gulp_files()
remove_webpack_files()
remove_sass_files()
remove_packagejson_file() remove_packagejson_file()
remove_prettier_pre_commit()
if "{{ cookiecutter.use_docker }}".lower() == "y": if "{{ cookiecutter.use_docker }}".lower() == "y":
remove_node_dockerfile() remove_node_dockerfile()
else:
handle_js_runner(
"{{ cookiecutter.frontend_pipeline }}",
use_docker=("{{ cookiecutter.use_docker }}".lower() == "y"),
use_async=("{{ cookiecutter.use_async }}".lower() == "y"),
)
if "{{ cookiecutter.cloud_provider}}" == "None": if "{{ cookiecutter.cloud_provider }}" == "None" and "{{ cookiecutter.use_docker }}".lower() == "n":
print( print(
WARNING + "You chose not to use a cloud provider, " WARNING + "You chose to not use any cloud providers nor Docker, "
"media files won't be served in production." + TERMINATOR "media files won't be served in production." + TERMINATOR
) )
remove_storages_module()
if "{{ cookiecutter.use_celery }}".lower() == "n": if "{{ cookiecutter.use_celery }}".lower() == "n":
remove_celery_files() remove_celery_files()
@ -428,6 +532,9 @@ def main():
if "{{ cookiecutter.ci_tool }}" != "Github": if "{{ cookiecutter.ci_tool }}" != "Github":
remove_dotgithub_folder() remove_dotgithub_folder()
if "{{ cookiecutter.ci_tool }}" != "Drone":
remove_dotdrone_file()
if "{{ cookiecutter.use_drf }}".lower() == "n": if "{{ cookiecutter.use_drf }}".lower() == "n":
remove_drf_starter_files() remove_drf_starter_files()

View File

@ -7,6 +7,7 @@ NOTE:
TODO: restrict Cookiecutter Django project initialization TODO: restrict Cookiecutter Django project initialization
to Python 3.x environments only to Python 3.x environments only
""" """
from __future__ import print_function from __future__ import print_function
import sys import sys
@ -17,26 +18,28 @@ INFO = "\x1b[1;33m [INFO]: "
HINT = "\x1b[3;33m" HINT = "\x1b[3;33m"
SUCCESS = "\x1b[1;32m [SUCCESS]: " SUCCESS = "\x1b[1;32m [SUCCESS]: "
# The content of this string is evaluated by Jinja, and plays an important role.
# It updates the cookiecutter context to trim leading and trailing spaces
# from domain/email values
"""
{{ cookiecutter.update({ "domain_name": cookiecutter.domain_name | trim }) }}
{{ cookiecutter.update({ "email": cookiecutter.email | trim }) }}
"""
project_slug = "{{ cookiecutter.project_slug }}" project_slug = "{{ cookiecutter.project_slug }}"
if hasattr(project_slug, "isidentifier"): if hasattr(project_slug, "isidentifier"):
assert ( assert project_slug.isidentifier(), "'{}' project slug is not a valid Python identifier.".format(project_slug)
project_slug.isidentifier()
), "'{}' project slug is not a valid Python identifier.".format(project_slug)
assert ( assert project_slug == project_slug.lower(), "'{}' project slug should be all lowercase".format(project_slug)
project_slug == project_slug.lower()
), "'{}' project slug should be all lowercase".format(project_slug)
assert ( assert "\\" not in "{{ cookiecutter.author_name }}", "Don't include backslashes in author name."
"\\" not in "{{ cookiecutter.author_name }}"
), "Don't include backslashes in author name."
if "{{ cookiecutter.use_docker }}".lower() == "n": if "{{ cookiecutter.use_docker }}".lower() == "n":
python_major_version = sys.version_info[0] python_major_version = sys.version_info[0]
if python_major_version == 2: if python_major_version == 2:
print( print(
WARNING + "You're running cookiecutter under Python 2, but the generated " WARNING + "You're running cookiecutter under Python 2, but the generated "
"project requires Python 3.10+. Do you want to proceed (y/n)? " + TERMINATOR "project requires Python 3.12+. Do you want to proceed (y/n)? " + TERMINATOR
) )
yes_options, no_options = frozenset(["y"]), frozenset(["n"]) yes_options, no_options = frozenset(["y"]), frozenset(["n"])
while True: while True:
@ -51,35 +54,16 @@ if "{{ cookiecutter.use_docker }}".lower() == "n":
print( print(
HINT HINT
+ "Please respond with {} or {}: ".format( + "Please respond with {} or {}: ".format(
", ".join( ", ".join(["'{}'".format(o) for o in yes_options if not o == ""]),
["'{}'".format(o) for o in yes_options if not o == ""] ", ".join(["'{}'".format(o) for o in no_options if not o == ""]),
),
", ".join(
["'{}'".format(o) for o in no_options if not o == ""]
),
) )
+ TERMINATOR + TERMINATOR
) )
if ( if "{{ cookiecutter.use_whitenoise }}".lower() == "n" and "{{ cookiecutter.cloud_provider }}" == "None":
"{{ cookiecutter.use_whitenoise }}".lower() == "n" print("You should either use Whitenoise or select a " "Cloud Provider to serve static files")
and "{{ cookiecutter.cloud_provider }}" == "None"
):
print(
"You should either use Whitenoise or select a "
"Cloud Provider to serve static files"
)
sys.exit(1) sys.exit(1)
if ( if "{{ cookiecutter.mail_service }}" == "Amazon SES" and "{{ cookiecutter.cloud_provider }}" != "AWS":
"{{ cookiecutter.cloud_provider }}" == "GCP" print("You should either use AWS or select a different " "Mail Service for sending emails.")
and "{{ cookiecutter.mail_service }}" == "Amazon SES"
) or (
"{{ cookiecutter.cloud_provider }}" == "None"
and "{{ cookiecutter.mail_service }}" == "Amazon SES"
):
print(
"You should either use AWS or select a different "
"Mail Service for sending emails."
)
sys.exit(1) sys.exit(1)

50
pyproject.toml Normal file
View File

@ -0,0 +1,50 @@
# ==== pytest ====
[tool.pytest.ini_options]
addopts = "-v --tb=short"
norecursedirs = [
".tox",
".git",
"*/migrations/*",
"*/static/*",
"docs",
"venv",
"*/{{cookiecutter.project_slug}}/*",
]
# ==== black ====
[tool.black]
line-length = 119
target-version = ['py312']
# ==== isort ====
[tool.isort]
profile = "black"
line_length = 119
known_first_party = [
"tests",
"scripts",
"hooks",
]
# ==== djLint ====
[tool.djlint]
blank_line_after_tag = "load,extends"
close_void_tags = true
format_css = true
format_js = true
# TODO: remove T002 when fixed https://github.com/Riverside-Healthcare/djLint/issues/687
ignore = "H006,H030,H031,T002,T028"
ignore_blocks = "raw"
include = "H017,H035"
indent = 2
max_line_length = 119
profile = "jinja"
[tool.djlint.css]
indent_size = 2
[tool.djlint.js]
indent_size = 2

View File

@ -1,3 +0,0 @@
[pytest]
addopts = -v --tb=short
norecursedirs = .tox .git */migrations/* */static/* docs venv */{{cookiecutter.project_slug}}/*

View File

@ -1,26 +1,26 @@
cookiecutter==2.1.1 cookiecutter==2.6.0
sh==1.14.3; sys_platform != "win32" sh==2.0.6; sys_platform != "win32"
binaryornot==0.4.4 binaryornot==0.4.4
# Code quality # Code quality
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
black==22.10.0 ruff==0.4.4
isort==5.10.1 django-upgrade==1.17.0
flake8==5.0.4 djlint==1.34.1
flake8-isort==5.0.0 pre-commit==3.7.1
pre-commit==2.20.0
# Testing # Testing
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
tox==3.27.0 tox==4.15.0
pytest==7.2.0 pytest==8.2.0
pytest-cookies==0.6.1 pytest-xdist==3.6.1
pytest-instafail==0.4.2 pytest-cookies==0.7.0
pyyaml==6.0 pytest-instafail==0.5.0
pyyaml==6.0.1
# Scripting # Scripting
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
PyGithub==1.57 PyGithub==2.3.0
gitpython==3.1.29 gitpython==3.1.43
jinja2==3.1.2 jinja2==3.1.4
requests==2.28.1 requests==2.31.0

View File

@ -6,6 +6,7 @@ patches, only comparing major and minor version numbers.
This script handles when there are multiple Django versions that need This script handles when there are multiple Django versions that need
to keep up to date. to keep up to date.
""" """
from __future__ import annotations from __future__ import annotations
import os import os
@ -141,9 +142,7 @@ class GitHubManager:
self.requirements_files = ["base", "local", "production"] self.requirements_files = ["base", "local", "production"]
# Format: # Format:
# requirement file name: {package name: (master_version, package_info)} # requirement file name: {package name: (master_version, package_info)}
self.requirements: dict[str, dict[str, tuple[str, dict]]] = { self.requirements: dict[str, dict[str, tuple[str, dict]]] = {x: {} for x in self.requirements_files}
x: {} for x in self.requirements_files
}
def setup(self) -> None: def setup(self) -> None:
self.load_requirements() self.load_requirements()
@ -177,26 +176,19 @@ class GitHubManager:
"is": "issue", "is": "issue",
"in": "title", "in": "title",
} }
issues = list( issues = list(self.github.search_issues("[Django Update]", "created", "desc", **qualifiers))
self.github.search_issues(
"[Django Update]", "created", "desc", **qualifiers
)
)
print(f"Found {len(issues)} issues matching search") print(f"Found {len(issues)} issues matching search")
for issue in issues: for issue in issues:
matches = re.match(r"\[Update Django] Django (\d+.\d+)$", issue.title) matches = re.match(r"\[Update Django] Django (\d+.\d+)$", issue.title)
if not matches: if not matches:
continue continue
issue_version = DjVersion.parse(matches.group(1)) issue_version = DjVersion.parse(matches.group(1))
if self.base_dj_version > issue_version: if self.base_dj_version >= issue_version:
issue.edit(state="closed") self.close_issue(issue)
print(f"Closed issue {issue.title} (ID: [{issue.id}]({issue.url}))")
else: else:
self.existing_issues[issue_version] = issue self.existing_issues[issue_version] = issue
def get_compatibility( def get_compatibility(self, package_name: str, package_info: dict, needed_dj_version: DjVersion):
self, package_name: str, package_info: dict, needed_dj_version: DjVersion
):
""" """
Verify compatibility via setup.py classifiers. If Django is not in the Verify compatibility via setup.py classifiers. If Django is not in the
classifiers, then default compatibility is n/a and OK is . classifiers, then default compatibility is n/a and OK is .
@ -209,9 +201,7 @@ class GitHubManager:
# updated packages, or known releases that will happen but haven't yet # updated packages, or known releases that will happen but haven't yet
if issue := self.existing_issues.get(needed_dj_version): if issue := self.existing_issues.get(needed_dj_version):
if index := issue.body.find(package_name): if index := issue.body.find(package_name):
name, _current, prev_compat, ok = ( name, _current, prev_compat, ok = (s.strip() for s in issue.body[index:].split("|", 4)[:4])
s.strip() for s in issue.body[index:].split("|", 4)[:4]
)
if ok in ("", "", "🕒"): if ok in ("", "", "🕒"):
return prev_compat, ok return prev_compat, ok
@ -223,7 +213,7 @@ class GitHubManager:
for classifier in package_info["info"]["classifiers"]: for classifier in package_info["info"]["classifiers"]:
# Usually in the form of "Framework :: Django :: 3.2" # Usually in the form of "Framework :: Django :: 3.2"
tokens = classifier.split(" ") tokens = classifier.split(" ")
if len(tokens) >= 5 and tokens[2].lower() == "django": if len(tokens) >= 5 and tokens[2].lower() == "django" and "." in tokens[4]:
version = DjVersion.parse(tokens[4]) version = DjVersion.parse(tokens[4])
if len(version) == 2: if len(version) == 2:
supported_dj_versions.append(version) supported_dj_versions.append(version)
@ -248,9 +238,7 @@ class GitHubManager:
] ]
def _get_md_home_page_url(self, package_info: dict): def _get_md_home_page_url(self, package_info: dict):
urls = [ urls = [package_info["info"].get(url_key) for url_key in self.HOME_PAGE_URL_KEYS]
package_info["info"].get(url_key) for url_key in self.HOME_PAGE_URL_KEYS
]
try: try:
return f"[{{}}]({next(item for item in urls if item)})" return f"[{{}}]({next(item for item in urls if item)})"
except StopIteration: except StopIteration:
@ -259,13 +247,9 @@ class GitHubManager:
def generate_markdown(self, needed_dj_version: DjVersion): def generate_markdown(self, needed_dj_version: DjVersion):
requirements = f"{needed_dj_version} requirements tables\n\n" requirements = f"{needed_dj_version} requirements tables\n\n"
for _file in self.requirements_files: for _file in self.requirements_files:
requirements += _TABLE_HEADER.format_map( requirements += _TABLE_HEADER.format_map({"file": _file, "dj_version": needed_dj_version})
{"file": _file, "dj_version": needed_dj_version}
)
for package_name, (version, info) in self.requirements[_file].items(): for package_name, (version, info) in self.requirements[_file].items():
compat_version, icon = self.get_compatibility( compat_version, icon = self.get_compatibility(package_name, info, needed_dj_version)
package_name, info, needed_dj_version
)
requirements += ( requirements += (
f"| {self._get_md_home_page_url(info).format(package_name)} " f"| {self._get_md_home_page_url(info).format(package_name)} "
f"| {version.strip()} " f"| {version.strip()} "
@ -282,11 +266,14 @@ class GitHubManager:
issue.edit(body=description) issue.edit(body=description)
else: else:
print(f"Creating new issue for Django {needed_dj_version}") print(f"Creating new issue for Django {needed_dj_version}")
issue = self.repo.create_issue( issue = self.repo.create_issue(f"[Update Django] Django {needed_dj_version}", description)
f"[Update Django] Django {needed_dj_version}", description
)
issue.add_to_labels(f"django{needed_dj_version}") issue.add_to_labels(f"django{needed_dj_version}")
@staticmethod
def close_issue(issue: Issue):
issue.edit(state="closed")
print(f"Closed issue {issue.title} (ID: [{issue.id}]({issue.url}))")
def generate(self): def generate(self):
for version in self.needed_dj_versions: for version in self.needed_dj_versions:
print(f"Handling GitHub issue for Django {version}") print(f"Handling GitHub issue for Django {version}")
@ -297,21 +284,22 @@ class GitHubManager:
def main(django_max_version=None) -> None: def main(django_max_version=None) -> None:
# Check if there are any djs # Check if there are any djs
current_dj, latest_djs = get_all_latest_django_versions( current_dj, latest_djs = get_all_latest_django_versions(django_max_version=django_max_version)
django_max_version=django_max_version
) # Run the setup, which might close old issues
if not latest_djs:
sys.exit(0)
manager = GitHubManager(current_dj, latest_djs) manager = GitHubManager(current_dj, latest_djs)
manager.setup() manager.setup()
if not latest_djs:
print("No new Django versions to update. Exiting...")
sys.exit(0)
manager.generate() manager.generate()
if __name__ == "__main__": if __name__ == "__main__":
if GITHUB_REPO is None: if GITHUB_REPO is None:
raise RuntimeError( raise RuntimeError("No github repo, please set the environment variable GITHUB_REPOSITORY")
"No github repo, please set the environment variable GITHUB_REPOSITORY"
)
max_version = None max_version = None
last_arg = sys.argv[-1] last_arg = sys.argv[-1]
if CURRENT_FILE.name not in last_arg: if CURRENT_FILE.name not in last_arg:

View File

@ -32,6 +32,9 @@ def main() -> None:
# Group pull requests by type of change # Group pull requests by type of change
grouped_pulls = group_pulls_by_change_type(merged_pulls) grouped_pulls = group_pulls_by_change_type(merged_pulls)
if not any(grouped_pulls.values()):
print("Pull requests merged aren't worth a changelog mention.")
return
# Generate portion of markdown # Generate portion of markdown
release_changes_summary = generate_md(grouped_pulls) release_changes_summary = generate_md(grouped_pulls)
@ -82,14 +85,20 @@ def group_pulls_by_change_type(
grouped_pulls = { grouped_pulls = {
"Changed": [], "Changed": [],
"Fixed": [], "Fixed": [],
"Documentation": [],
"Updated": [], "Updated": [],
} }
for pull in pull_requests_list: for pull in pull_requests_list:
label_names = {label.name for label in pull.labels} label_names = {label.name for label in pull.labels}
if "project infrastructure" in label_names:
# Don't mention it in the changelog
continue
if "update" in label_names: if "update" in label_names:
group_name = "Updated" group_name = "Updated"
elif "bug" in label_names: elif "bug" in label_names:
group_name = "Fixed" group_name = "Fixed"
elif "docs" in label_names:
group_name = "Documentation"
else: else:
group_name = "Changed" group_name = "Changed"
grouped_pulls[group_name].append(pull) grouped_pulls[group_name].append(pull)
@ -148,11 +157,7 @@ def update_git_repo(paths: list[Path], release: str) -> None:
if __name__ == "__main__": if __name__ == "__main__":
if GITHUB_REPO is None: if GITHUB_REPO is None:
raise RuntimeError( raise RuntimeError("No github repo, please set the environment variable GITHUB_REPOSITORY")
"No github repo, please set the environment variable GITHUB_REPOSITORY"
)
if GIT_BRANCH is None: if GIT_BRANCH is None:
raise RuntimeError( raise RuntimeError("No git branch set, please set the GITHUB_REF_NAME environment variable")
"No git branch set, please set the GITHUB_REF_NAME environment variable"
)
main() main()

View File

@ -40,19 +40,13 @@ def iter_recent_authors():
""" """
Fetch users who opened recently merged pull requests. Fetch users who opened recently merged pull requests.
Use Github API to fetch recent authors rather than Use GitHub API to fetch recent authors rather than
git CLI to work with Github usernames. git CLI to work with GitHub usernames.
""" """
repo = Github(login_or_token=GITHUB_TOKEN, per_page=5).get_repo(GITHUB_REPO) repo = Github(login_or_token=GITHUB_TOKEN, per_page=5).get_repo(GITHUB_REPO)
recent_pulls = repo.get_pulls( recent_pulls = repo.get_pulls(state="closed", sort="updated", direction="desc").get_page(0)
state="closed", sort="updated", direction="desc"
).get_page(0)
for pull in recent_pulls: for pull in recent_pulls:
if ( if pull.merged and pull.user.type == "User" and pull.user.login not in BOT_LOGINS:
pull.merged
and pull.user.type == "User"
and pull.user.login not in BOT_LOGINS
):
yield pull.user yield pull.user
@ -96,9 +90,7 @@ def write_md_file(contributors):
core_contributors = [c for c in contributors if c.get("is_core", False)] core_contributors = [c for c in contributors if c.get("is_core", False)]
other_contributors = (c for c in contributors if not c.get("is_core", False)) other_contributors = (c for c in contributors if not c.get("is_core", False))
other_contributors = sorted(other_contributors, key=lambda c: c["name"].lower()) other_contributors = sorted(other_contributors, key=lambda c: c["name"].lower())
content = template.render( content = template.render(core_contributors=core_contributors, other_contributors=other_contributors)
core_contributors=core_contributors, other_contributors=other_contributors
)
file_path = ROOT / "CONTRIBUTORS.md" file_path = ROOT / "CONTRIBUTORS.md"
file_path.write_text(content) file_path.write_text(content)
@ -106,7 +98,5 @@ def write_md_file(contributors):
if __name__ == "__main__": if __name__ == "__main__":
if GITHUB_REPO is None: if GITHUB_REPO is None:
raise RuntimeError( raise RuntimeError("No github repo, please set the environment variable GITHUB_REPOSITORY")
"No github repo, please set the environment variable GITHUB_REPOSITORY"
)
main() main()

View File

@ -1,7 +0,0 @@
[flake8]
exclude = docs
max-line-length = 88
[isort]
profile = black
known_first_party = tests,scripts,hooks

View File

@ -5,18 +5,15 @@ except ImportError:
from distutils.core import setup from distutils.core import setup
# We use calendar versioning # We use calendar versioning
version = "2022.11.07" version = "2024.05.11"
with open("README.rst") as readme_file: with open("README.md") as readme_file:
long_description = readme_file.read() long_description = readme_file.read()
setup( setup(
name="cookiecutter-django", name="cookiecutter-django",
version=version, version=version,
description=( description=("A Cookiecutter template for creating production-ready " "Django projects quickly."),
"A Cookiecutter template for creating production-ready "
"Django projects quickly."
),
long_description=long_description, long_description=long_description,
author="Daniel Roy Greenfeld", author="Daniel Roy Greenfeld",
author_email="pydanny@gmail.com", author_email="pydanny@gmail.com",
@ -27,13 +24,13 @@ setup(
classifiers=[ classifiers=[
"Development Status :: 4 - Beta", "Development Status :: 4 - Beta",
"Environment :: Console", "Environment :: Console",
"Framework :: Django :: 4.0", "Framework :: Django :: 4.2",
"Intended Audience :: Developers", "Intended Audience :: Developers",
"Natural Language :: English", "Natural Language :: English",
"License :: OSI Approved :: BSD License", "License :: OSI Approved :: BSD License",
"Programming Language :: Python", "Programming Language :: Python",
"Programming Language :: Python :: 3", "Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.12",
"Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: CPython",
"Topic :: Software Development", "Topic :: Software Development",
], ],

View File

@ -20,25 +20,17 @@ sudo utility/install_os_dependencies.sh install
# Install Python deps # Install Python deps
pip install -r requirements/local.txt pip install -r requirements/local.txt
# Lint by running pre-commit on all files
# Needs a git repo to find the project root
git init
git add .
pre-commit run --show-diff-on-failure -a
# run the project's tests # run the project's tests
pytest pytest
# Make sure the check doesn't raise any warnings # Make sure the check doesn't raise any warnings
python manage.py check --fail-level WARNING python manage.py check --fail-level WARNING
# Run npm build script if package.json is present
if [ -f "package.json" ] if [ -f "package.json" ]
then then
npm install npm install
if [ -f "gulpfile.js" ] npm run build
then
npm run build
fi
fi fi
# Generate the HTML for the documentation # Generate the HTML for the documentation

View File

@ -1,3 +1,4 @@
import glob
import os import os
import re import re
import sys import sys
@ -20,6 +21,12 @@ if sys.platform.startswith("win"):
elif sys.platform.startswith("darwin") and os.getenv("CI"): elif sys.platform.startswith("darwin") and os.getenv("CI"):
pytest.skip("skipping slow macOS tests on CI", allow_module_level=True) pytest.skip("skipping slow macOS tests on CI", allow_module_level=True)
# Run auto-fixable styles checks - skipped on CI by default. These can be fixed
# automatically by running pre-commit after generation however they are tedious
# to fix in the template, so we don't insist too much in fixing them.
AUTOFIXABLE_STYLES = os.getenv("AUTOFIXABLE_STYLES") == "1"
auto_fixable = pytest.mark.skipif(not AUTOFIXABLE_STYLES, reason="auto-fixable")
@pytest.fixture @pytest.fixture
def context(): def context():
@ -48,21 +55,26 @@ def generate_license_file_titles():
SUPPORTED_COMBINATIONS = [ SUPPORTED_COMBINATIONS = [
*[{"open_source_license": x} for x in generate_license_file_titles()], *[{"open_source_license": x} for x in generate_license_file_titles()],
{"username_type": "username"},
{"username_type": "email"},
{"windows": "y"}, {"windows": "y"},
{"windows": "n"}, {"windows": "n"},
{"use_pycharm": "y"}, {"editor": "None"},
{"use_pycharm": "n"}, {"editor": "PyCharm"},
{"editor": "VS Code"},
{"use_docker": "y"}, {"use_docker": "y"},
{"use_docker": "n"}, {"use_docker": "n"},
{"postgresql_version": "16"},
{"postgresql_version": "15"},
{"postgresql_version": "14"}, {"postgresql_version": "14"},
{"postgresql_version": "13"}, {"postgresql_version": "13"},
{"postgresql_version": "12"}, {"postgresql_version": "12"},
{"postgresql_version": "11"},
{"postgresql_version": "10"},
{"cloud_provider": "AWS", "use_whitenoise": "y"}, {"cloud_provider": "AWS", "use_whitenoise": "y"},
{"cloud_provider": "AWS", "use_whitenoise": "n"}, {"cloud_provider": "AWS", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "use_whitenoise": "y"}, {"cloud_provider": "GCP", "use_whitenoise": "y"},
{"cloud_provider": "GCP", "use_whitenoise": "n"}, {"cloud_provider": "GCP", "use_whitenoise": "n"},
{"cloud_provider": "Azure", "use_whitenoise": "y"},
{"cloud_provider": "Azure", "use_whitenoise": "n"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailgun"}, {"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailgun"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailjet"}, {"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailjet"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mandrill"}, {"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mandrill"},
@ -89,7 +101,16 @@ SUPPORTED_COMBINATIONS = [
{"cloud_provider": "GCP", "mail_service": "SendinBlue"}, {"cloud_provider": "GCP", "mail_service": "SendinBlue"},
{"cloud_provider": "GCP", "mail_service": "SparkPost"}, {"cloud_provider": "GCP", "mail_service": "SparkPost"},
{"cloud_provider": "GCP", "mail_service": "Other SMTP"}, {"cloud_provider": "GCP", "mail_service": "Other SMTP"},
# Note: cloud_providers GCP and None with mail_service Amazon SES is not supported {"cloud_provider": "Azure", "mail_service": "Mailgun"},
{"cloud_provider": "Azure", "mail_service": "Mailjet"},
{"cloud_provider": "Azure", "mail_service": "Mandrill"},
{"cloud_provider": "Azure", "mail_service": "Postmark"},
{"cloud_provider": "Azure", "mail_service": "Sendgrid"},
{"cloud_provider": "Azure", "mail_service": "SendinBlue"},
{"cloud_provider": "Azure", "mail_service": "SparkPost"},
{"cloud_provider": "Azure", "mail_service": "Other SMTP"},
# Note: cloud_providers GCP, Azure, and None
# with mail_service Amazon SES is not supported
{"use_async": "y"}, {"use_async": "y"},
{"use_async": "n"}, {"use_async": "n"},
{"use_drf": "y"}, {"use_drf": "y"},
@ -97,10 +118,11 @@ SUPPORTED_COMBINATIONS = [
{"frontend_pipeline": "None"}, {"frontend_pipeline": "None"},
{"frontend_pipeline": "Django Compressor"}, {"frontend_pipeline": "Django Compressor"},
{"frontend_pipeline": "Gulp"}, {"frontend_pipeline": "Gulp"},
{"frontend_pipeline": "Webpack"},
{"use_celery": "y"}, {"use_celery": "y"},
{"use_celery": "n"}, {"use_celery": "n"},
{"use_mailhog": "y"}, {"use_mailpit": "y"},
{"use_mailhog": "n"}, {"use_mailpit": "n"},
{"use_sentry": "y"}, {"use_sentry": "y"},
{"use_sentry": "n"}, {"use_sentry": "n"},
{"use_whitenoise": "y"}, {"use_whitenoise": "y"},
@ -111,6 +133,7 @@ SUPPORTED_COMBINATIONS = [
{"ci_tool": "Travis"}, {"ci_tool": "Travis"},
{"ci_tool": "Gitlab"}, {"ci_tool": "Gitlab"},
{"ci_tool": "Github"}, {"ci_tool": "Github"},
{"ci_tool": "Drone"},
{"keep_local_envs_in_vcs": "y"}, {"keep_local_envs_in_vcs": "y"},
{"keep_local_envs_in_vcs": "n"}, {"keep_local_envs_in_vcs": "n"},
{"debug": "y"}, {"debug": "y"},
@ -120,6 +143,7 @@ SUPPORTED_COMBINATIONS = [
UNSUPPORTED_COMBINATIONS = [ UNSUPPORTED_COMBINATIONS = [
{"cloud_provider": "None", "use_whitenoise": "n"}, {"cloud_provider": "None", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "mail_service": "Amazon SES"}, {"cloud_provider": "GCP", "mail_service": "Amazon SES"},
{"cloud_provider": "Azure", "mail_service": "Amazon SES"},
{"cloud_provider": "None", "mail_service": "Amazon SES"}, {"cloud_provider": "None", "mail_service": "Amazon SES"},
] ]
@ -129,13 +153,9 @@ def _fixture_id(ctx):
return "-".join(f"{key}:{value}" for key, value in ctx.items()) return "-".join(f"{key}:{value}" for key, value in ctx.items())
def build_files_list(root_dir): def build_files_list(base_dir):
"""Build a list containing absolute paths to the generated files.""" """Build a list containing absolute paths to the generated files."""
return [ return [os.path.join(dirpath, file_path) for dirpath, subdirs, files in os.walk(base_dir) for file_path in files]
os.path.join(dirpath, file_path)
for dirpath, subdirs, files in os.walk(root_dir)
for file_path in files
]
def check_paths(paths): def check_paths(paths):
@ -166,27 +186,78 @@ def test_project_generation(cookies, context, context_override):
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id) @pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_flake8_passes(cookies, context_override): def test_ruff_check_passes(cookies, context_override):
"""Generated project should pass flake8.""" """Generated project should pass ruff check."""
result = cookies.bake(extra_context=context_override) result = cookies.bake(extra_context=context_override)
try: try:
sh.flake8(_cwd=str(result.project_path)) sh.ruff("check", ".", _cwd=str(result.project_path))
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@auto_fixable
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_ruff_format_passes(cookies, context_override):
"""Check whether generated project passes ruff format."""
result = cookies.bake(extra_context=context_override)
try:
sh.ruff(
"format",
".",
_cwd=str(result.project_path),
)
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@auto_fixable
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_isort_passes(cookies, context_override):
"""Check whether generated project passes isort style."""
result = cookies.bake(extra_context=context_override)
try:
sh.isort(_cwd=str(result.project_path))
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@auto_fixable
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_django_upgrade_passes(cookies, context_override):
"""Check whether generated project passes django-upgrade."""
result = cookies.bake(extra_context=context_override)
python_files = [
file_path.removeprefix(f"{result.project_path}/")
for file_path in glob.glob(str(result.project_path / "**" / "*.py"), recursive=True)
]
try:
sh.django_upgrade(
"--target-version",
"4.2",
*python_files,
_cwd=str(result.project_path),
)
except sh.ErrorReturnCode as e: except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode()) pytest.fail(e.stdout.decode())
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id) @pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_black_passes(cookies, context_override): def test_djlint_lint_passes(cookies, context_override):
"""Generated project should pass black.""" """Check whether generated project passes djLint --lint."""
result = cookies.bake(extra_context=context_override) result = cookies.bake(extra_context=context_override)
autofixable_rules = "H014,T001"
# TODO: remove T002 when fixed https://github.com/Riverside-Healthcare/djLint/issues/687
ignored_rules = "H006,H030,H031,T002"
try: try:
sh.black( sh.djlint(
"--check", "--lint",
"--diff", "--ignore",
"--exclude", f"{autofixable_rules},{ignored_rules}",
"migrations",
".", ".",
_cwd=str(result.project_path), _cwd=str(result.project_path),
) )
@ -194,11 +265,23 @@ def test_black_passes(cookies, context_override):
pytest.fail(e.stdout.decode()) pytest.fail(e.stdout.decode())
@auto_fixable
@pytest.mark.parametrize("context_override", SUPPORTED_COMBINATIONS, ids=_fixture_id)
def test_djlint_check_passes(cookies, context_override):
"""Check whether generated project passes djLint --check."""
result = cookies.bake(extra_context=context_override)
try:
sh.djlint("--check", ".", _cwd=str(result.project_path))
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@pytest.mark.parametrize( @pytest.mark.parametrize(
["use_docker", "expected_test_script"], ["use_docker", "expected_test_script"],
[ [
("n", "pytest"), ("n", "pytest"),
("y", "docker-compose -f local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_script): def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_script):
@ -213,7 +296,7 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
with open(f"{result.project_path}/.travis.yml") as travis_yml: with open(f"{result.project_path}/.travis.yml") as travis_yml:
try: try:
yml = yaml.safe_load(travis_yml)["jobs"]["include"] yml = yaml.safe_load(travis_yml)["jobs"]["include"]
assert yml[0]["script"] == ["flake8"] assert yml[0]["script"] == ["ruff check ."]
assert yml[1]["script"] == [expected_test_script] assert yml[1]["script"] == [expected_test_script]
except yaml.YAMLError as e: except yaml.YAMLError as e:
pytest.fail(str(e)) pytest.fail(str(e))
@ -223,12 +306,10 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
["use_docker", "expected_test_script"], ["use_docker", "expected_test_script"],
[ [
("n", "pytest"), ("n", "pytest"),
("y", "docker-compose -f local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
def test_gitlab_invokes_flake8_and_pytest( def test_gitlab_invokes_precommit_and_pytest(cookies, context, use_docker, expected_test_script):
cookies, context, use_docker, expected_test_script
):
context.update({"ci_tool": "Gitlab", "use_docker": use_docker}) context.update({"ci_tool": "Gitlab", "use_docker": use_docker})
result = cookies.bake(extra_context=context) result = cookies.bake(extra_context=context)
@ -240,7 +321,9 @@ def test_gitlab_invokes_flake8_and_pytest(
with open(f"{result.project_path}/.gitlab-ci.yml") as gitlab_yml: with open(f"{result.project_path}/.gitlab-ci.yml") as gitlab_yml:
try: try:
gitlab_config = yaml.safe_load(gitlab_yml) gitlab_config = yaml.safe_load(gitlab_yml)
assert gitlab_config["flake8"]["script"] == ["flake8"] assert gitlab_config["precommit"]["script"] == [
"pre-commit run --show-diff-on-failure --color=always --all-files"
]
assert gitlab_config["pytest"]["script"] == [expected_test_script] assert gitlab_config["pytest"]["script"] == [expected_test_script]
except yaml.YAMLError as e: except yaml.YAMLError as e:
pytest.fail(e) pytest.fail(e)
@ -250,12 +333,10 @@ def test_gitlab_invokes_flake8_and_pytest(
["use_docker", "expected_test_script"], ["use_docker", "expected_test_script"],
[ [
("n", "pytest"), ("n", "pytest"),
("y", "docker-compose -f local.yml run django pytest"), ("y", "docker compose -f docker-compose.local.yml run django pytest"),
], ],
) )
def test_github_invokes_linter_and_pytest( def test_github_invokes_linter_and_pytest(cookies, context, use_docker, expected_test_script):
cookies, context, use_docker, expected_test_script
):
context.update({"ci_tool": "Github", "use_docker": use_docker}) context.update({"ci_tool": "Github", "use_docker": use_docker})
result = cookies.bake(extra_context=context) result = cookies.bake(extra_context=context)
@ -304,17 +385,37 @@ def test_error_if_incompatible(cookies, context, invalid_context):
@pytest.mark.parametrize( @pytest.mark.parametrize(
["use_pycharm", "pycharm_docs_exist"], ["editor", "pycharm_docs_exist"],
[ [
("n", False), ("None", False),
("y", True), ("PyCharm", True),
("VS Code", False),
], ],
) )
def test_pycharm_docs_removed(cookies, context, use_pycharm, pycharm_docs_exist): def test_pycharm_docs_removed(cookies, context, editor, pycharm_docs_exist):
""".""" context.update({"editor": editor})
context.update({"use_pycharm": use_pycharm})
result = cookies.bake(extra_context=context) result = cookies.bake(extra_context=context)
with open(f"{result.project_path}/docs/index.rst") as f: with open(f"{result.project_path}/docs/index.rst") as f:
has_pycharm_docs = "pycharm/configuration" in f.read() has_pycharm_docs = "pycharm/configuration" in f.read()
assert has_pycharm_docs is pycharm_docs_exist assert has_pycharm_docs is pycharm_docs_exist
def test_trim_domain_email(cookies, context):
"""Check that leading and trailing spaces are trimmed in domain and email."""
context.update(
{
"use_docker": "y",
"domain_name": " example.com ",
"email": " me@example.com ",
}
)
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
prod_django_env = result.project_path / ".envs" / ".production" / ".django"
assert "DJANGO_ALLOWED_HOSTS=.example.com" in prod_django_env.read_text()
base_settings = result.project_path / "config" / "settings" / "base.py"
assert '"me@example.com"' in base_settings.read_text()
View File
@ -14,30 +14,39 @@ cd .cache/docker
cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@" cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@"
cd my_awesome_project cd my_awesome_project
# Lint by running pre-commit on all files
# Needs a git repo to find the project root
# We don't have git inside Docker, so run it outside
git init
git add .
pre-commit run --show-diff-on-failure -a
# make sure all images build # make sure all images build
docker-compose -f local.yml build docker compose -f docker-compose.local.yml build
# run the project's type checks # run the project's type checks
docker-compose -f local.yml run django mypy my_awesome_project docker compose -f docker-compose.local.yml run django mypy my_awesome_project
# run the project's tests # run the project's tests
docker-compose -f local.yml run django pytest docker compose -f docker-compose.local.yml run django pytest
# return non-zero status code if there are migrations that have not been created # return non-zero status code if there are migrations that have not been created
docker-compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; } docker compose -f docker-compose.local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; }
# Test support for translations # Test support for translations
docker-compose -f local.yml run django python manage.py makemessages --all docker compose -f docker-compose.local.yml run django python manage.py makemessages --all
# Make sure the check doesn't raise any warnings # Make sure the check doesn't raise any warnings
docker-compose -f local.yml run django python manage.py check --fail-level WARNING docker compose -f docker-compose.local.yml run \
-e DJANGO_SECRET_KEY="$(openssl rand -base64 64)" \
-e REDIS_URL=redis://redis:6379/0 \
-e CELERY_BROKER_URL=redis://redis:6379/0 \
-e DJANGO_AWS_ACCESS_KEY_ID=x \
-e DJANGO_AWS_SECRET_ACCESS_KEY=x \
-e DJANGO_AWS_STORAGE_BUCKET_NAME=x \
-e DJANGO_ADMIN_URL=x \
-e MAILGUN_API_KEY=x \
-e MAILGUN_DOMAIN=x \
django python manage.py check --settings=config.settings.production --deploy --database default --fail-level WARNING
# Generate the HTML for the documentation # Generate the HTML for the documentation
docker-compose -f local.yml run docs make html docker compose -f docker-compose.docs.yml run docs make html
# Run npm build script if package.json is present
if [ -f "package.json" ]
then
docker compose -f docker-compose.local.yml run node npm run build
fi
View File
@ -1,4 +1,5 @@
"""Unit tests for the hooks""" """Unit tests for the hooks"""
import os import os
from pathlib import Path from pathlib import Path
@ -22,7 +23,5 @@ def test_append_to_gitignore_file(working_directory):
gitignore_file.write_text("node_modules/\n") gitignore_file.write_text("node_modules/\n")
append_to_gitignore_file(".envs/*") append_to_gitignore_file(".envs/*")
linesep = os.linesep.encode() linesep = os.linesep.encode()
assert ( assert gitignore_file.read_bytes() == b"node_modules/" + linesep + b".envs/*" + linesep
gitignore_file.read_bytes() == b"node_modules/" + linesep + b".envs/*" + linesep
)
assert gitignore_file.read_text() == "node_modules/\n.envs/*\n" assert gitignore_file.read_text() == "node_modules/\n.envs/*\n"
View File
@ -1,10 +1,11 @@
[tox] [tox]
skipsdist = true skipsdist = true
envlist = py310,black-template envlist = py312,black-template
[testenv] [testenv]
deps = -rrequirements.txt deps = -rrequirements.txt
commands = pytest {posargs:./tests} passenv = AUTOFIXABLE_STYLES
commands = pytest -n auto {posargs:./tests}
[testenv:black-template] [testenv:black-template]
deps = black deps = black
View File
@ -0,0 +1,20 @@
#
# .bashrc.override.sh
#
# persistent bash history
HISTFILE=~/.bash_history
PROMPT_COMMAND="history -a; $PROMPT_COMMAND"
# set some django env vars
source /entrypoint
# restore default shell options
set +o errexit
set +o pipefail
set +o nounset
# start ssh-agent
# https://code.visualstudio.com/docs/remote/troubleshooting
eval "$(ssh-agent -s)"
View File
@ -0,0 +1,70 @@
// For format details, see https://containers.dev/implementors/json_reference/
{
"name": "{{cookiecutter.project_slug}}_dev",
"dockerComposeFile": [
"../docker-compose.local.yml"
],
"init": true,
"mounts": [
{
"source": "./.devcontainer/bash_history",
"target": "/home/dev-user/.bash_history",
"type": "bind"
},
{
"source": "~/.ssh",
"target": "/home/dev-user/.ssh",
"type": "bind"
}
],
// Tells devcontainer.json supporting services / tools whether they should run
// /bin/sh -c "while sleep 1000; do :; done" when starting the container instead of the container's default command
"overrideCommand": false,
"service": "django",
// "remoteEnv": {"PATH": "/home/dev-user/.local/bin:${containerEnv:PATH}"},
"remoteUser": "dev-user",
"workspaceFolder": "/app",
// Set *default* container specific settings.json values on container create.
"customizations": {
{%- if cookiecutter.editor == "VS Code" %}
"vscode": {
"settings": {
"editor.formatOnSave": true,
"[python]": {
"analysis.autoImportCompletions": true,
"analysis.typeCheckingMode": "basic",
"defaultInterpreterPath": "/usr/local/bin/python",
"editor.codeActionsOnSave": {
"source.organizeImports": "always"
},
"editor.defaultFormatter": "charliermarsh.ruff",
"languageServer": "Pylance",
"linting.enabled": true,
"linting.mypyEnabled": true,
"linting.mypyPath": "/usr/local/bin/mypy",
}
},
// https://code.visualstudio.com/docs/remote/devcontainerjson-reference#_vs-code-specific-properties
// Add the IDs of extensions you want installed when the container is created.
"extensions": [
"davidanson.vscode-markdownlint",
"mrmlnc.vscode-duplicate",
"visualstudioexptteam.vscodeintellicode",
"visualstudioexptteam.intellicode-api-usage-examples",
// python
"ms-python.python",
"ms-python.vscode-pylance",
"charliermarsh.ruff",
// django
"batisteo.vscode-django"
]
}
{%- endif %}
},
// Uncomment the next line if you want to start specific services in your Docker Compose config.
// "runServices": [],
// Uncomment the next line if you want to keep your containers running after VS Code shuts down.
// "shutdownAction": "none",
// Uncomment the next line to run commands after the container is created.
"postCreateCommand": "cat .devcontainer/bashrc.override.sh >> ~/.bashrc"
}
View File
@ -8,3 +8,5 @@
.readthedocs.yml .readthedocs.yml
.travis.yml .travis.yml
venv venv
.git
.envs/
View File
@ -0,0 +1,49 @@
kind: pipeline
name: default
environment:
POSTGRES_USER: '{{ cookiecutter.project_slug }}'
POSTGRES_PASSWORD: ''
POSTGRES_DB: 'test_{{ cookiecutter.project_slug }}'
POSTGRES_HOST_AUTH_METHOD: trust
{%- if cookiecutter.use_celery == 'y' %}
CELERY_BROKER_URL: 'redis://redis:6379/0'
{%- endif %}
steps:
- name: lint
pull: if-not-exists
image: python:3.12
environment:
PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit
volumes:
- name: pre-commit cache
path: ${PRE_COMMIT_HOME}
commands:
- export PRE_COMMIT_HOME=$CI_PROJECT_DIR/.cache/pre-commit
- pip install -q pre-commit
- pre-commit run --show-diff-on-failure --color=always --all-files
- name: test
pull: if-not-exists
{%- if cookiecutter.use_docker == 'y' %}
image: docker:25.0
environment:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
commands:
- docker-compose -f docker-compose.local.yml build
- docker-compose -f docker-compose.docs.yml build
- docker-compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker-compose -f docker-compose.local.yml up -d
- docker-compose -f docker-compose.local.yml run django pytest
{%- else %}
image: python:3.12
commands:
- pip install -r requirements/local.txt
- pytest
{%- endif%}
volumes:
- name: pre-commit cache
host:
path: /tmp/drone/cache/pre-commit
View File
@ -22,6 +22,6 @@ trim_trailing_whitespace = false
[Makefile] [Makefile]
indent_style = tab indent_style = tab
[nginx.conf] [default.conf]
indent_style = space indent_style = space
indent_size = 2 indent_size = 2
View File
@ -44,6 +44,12 @@ DJANGO_AWS_STORAGE_BUCKET_NAME=
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
GOOGLE_APPLICATION_CREDENTIALS= GOOGLE_APPLICATION_CREDENTIALS=
DJANGO_GCP_STORAGE_BUCKET_NAME= DJANGO_GCP_STORAGE_BUCKET_NAME=
{% elif cookiecutter.cloud_provider == 'Azure' %}
# Azure
# ------------------------------------------------------------------------------
DJANGO_AZURE_ACCOUNT_KEY=
DJANGO_AZURE_ACCOUNT_NAME=
DJANGO_AZURE_CONTAINER_NAME=
{% endif %} {% endif %}
# django-allauth # django-allauth
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
View File
@ -4,11 +4,11 @@
version: 2 version: 2
updates: updates:
# Update GitHub actions in workflows # Update GitHub actions in workflows
- package-ecosystem: "github-actions" - package-ecosystem: 'github-actions'
directory: "/" directory: '/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
{%- if cookiecutter.use_docker == 'y' %} {%- if cookiecutter.use_docker == 'y' %}
@ -16,80 +16,92 @@ updates:
# We need to specify each Dockerfile in a separate entry because Dependabot doesn't # We need to specify each Dockerfile in a separate entry because Dependabot doesn't
# support wildcards or recursively checking subdirectories. Check this issue for updates: # support wildcards or recursively checking subdirectories. Check this issue for updates:
# https://github.com/dependabot/dependabot-core/issues/2178 # https://github.com/dependabot/dependabot-core/issues/2178
- package-ecosystem: "docker" - package-ecosystem: 'docker'
# Look for a `Dockerfile` in the `compose/local/django` directory # Look for a `Dockerfile` in the `compose/local/django` directory
directory: "compose/local/django/" directory: 'compose/local/django/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Ignore minor version updates (3.10 -> 3.11) but update patch versions
ignore:
- dependency-name: '*'
update-types:
- 'version-update:semver-major'
- 'version-update:semver-minor'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/local/docs` directory # Look for a `Dockerfile` in the `compose/local/docs` directory
directory: "compose/local/docs/" directory: 'compose/local/docs/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Ignore minor version updates (3.10 -> 3.11) but update patch versions
ignore:
- dependency-name: '*'
update-types:
- 'version-update:semver-major'
- 'version-update:semver-minor'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/local/node` directory # Look for a `Dockerfile` in the `compose/local/node` directory
directory: "compose/local/node/" directory: 'compose/local/node/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/production/aws` directory # Look for a `Dockerfile` in the `compose/production/aws` directory
directory: "compose/production/aws/" directory: 'compose/production/aws/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/production/django` directory # Look for a `Dockerfile` in the `compose/production/django` directory
directory: "compose/production/django/" directory: 'compose/production/django/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Ignore minor version updates (3.10 -> 3.11) but update patch versions
ignore:
- dependency-name: '*'
update-types:
- 'version-update:semver-major'
- 'version-update:semver-minor'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/production/postgres` directory # Look for a `Dockerfile` in the `compose/production/postgres` directory
directory: "compose/production/postgres/" directory: 'compose/production/postgres/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
# Enable version updates for Docker - package-ecosystem: 'docker'
- package-ecosystem: "docker"
# Look for a `Dockerfile` in the `compose/production/traefik` directory # Look for a `Dockerfile` in the `compose/production/traefik` directory
directory: "compose/production/traefik/" directory: 'compose/production/traefik/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
{%- endif %} {%- endif %}
# Enable version updates for Python/Pip - Production # Enable version updates for Python/Pip - Production
- package-ecosystem: "pip" - package-ecosystem: 'pip'
# Look for a `requirements.txt` in the `root` directory # Look for a `requirements.txt` in the `root` directory
# also 'setup.cfg', 'runtime.txt' and 'requirements/*.txt' # also 'setup.cfg', 'runtime.txt' and 'requirements/*.txt'
directory: "/" directory: '/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
{%- if cookiecutter.frontend_pipeline == 'Gulp' %} {%- if cookiecutter.frontend_pipeline == 'Gulp' %}
# Enable version updates for javascript/npm # Enable version updates for javascript/npm
- package-ecosystem: "npm" - package-ecosystem: 'npm'
# Look for a `packages.json' in the `root` directory # Look for a `package.json` in the `root` directory
directory: "/" directory: '/'
# Check for updates to GitHub Actions every weekday # Every weekday
schedule: schedule:
interval: "daily" interval: 'daily'
{%- endif %} {%- endif %}
View File
@ -7,12 +7,12 @@ env:
on: on:
pull_request: pull_request:
branches: [ "master", "main" ] branches: ['master', 'main']
paths-ignore: [ "docs/**" ] paths-ignore: ['docs/**']
push: push:
branches: [ "master", "main" ] branches: ['master', 'main']
paths-ignore: [ "docs/**" ] paths-ignore: ['docs/**']
concurrency: concurrency:
group: {% raw %}${{ github.head_ref || github.run_id }}{% endraw %} group: {% raw %}${{ github.head_ref || github.run_id }}{% endraw %}
@ -22,23 +22,21 @@ jobs:
linter: linter:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Checkout Code Repository - name: Checkout Code Repository
uses: actions/checkout@v3 uses: actions/checkout@v4
- name: Set up Python - name: Set up Python
uses: actions/setup-python@v3 uses: actions/setup-python@v5
with: with:
python-version: "3.10" python-version: '3.12'
cache: pip
cache-dependency-path: |
requirements/base.txt
requirements/local.txt
{%- if cookiecutter.open_source_license != 'Not open source' %}
# Consider using pre-commit.ci for open source project
{%- endif %}
- name: Run pre-commit - name: Run pre-commit
uses: pre-commit/action@v2.0.3 uses: pre-commit/action@v3.0.1
# With no caching at all the entire ci process takes 4m 30s to complete! # With no caching at all the entire ci process takes 3m to complete!
pytest: pytest:
runs-on: ubuntu-latest runs-on: ubuntu-latest
{%- if cookiecutter.use_docker == 'n' %} {%- if cookiecutter.use_docker == 'n' %}
@ -51,7 +49,7 @@ jobs:
- 6379:6379 - 6379:6379
{%- endif %} {%- endif %}
postgres: postgres:
image: postgres:12 image: postgres:{{ cookiecutter.postgresql_version }}
ports: ports:
- 5432:5432 - 5432:5432
env: env:
@ -59,35 +57,37 @@ jobs:
env: env:
{%- if cookiecutter.use_celery == 'y' %} {%- if cookiecutter.use_celery == 'y' %}
CELERY_BROKER_URL: "redis://localhost:6379/0" CELERY_BROKER_URL: 'redis://localhost:6379/0'
{%- endif %} {%- endif %}
# postgres://user:password@host:port/database # postgres://user:password@host:port/database
DATABASE_URL: "postgres://postgres:postgres@localhost:5432/postgres" DATABASE_URL: 'postgres://postgres:postgres@localhost:5432/postgres'
{%- endif %} {%- endif %}
steps: steps:
- name: Checkout Code Repository - name: Checkout Code Repository
uses: actions/checkout@v3 uses: actions/checkout@v4
{%- if cookiecutter.use_docker == 'y' %} {%- if cookiecutter.use_docker == 'y' %}
- name: Build the Stack - name: Build the Stack
run: docker-compose -f local.yml build run: docker compose -f docker-compose.local.yml build django
- name: Build the docs
run: docker compose -f docker-compose.docs.yml build docs
- name: Run DB Migrations - name: Run DB Migrations
run: docker-compose -f local.yml run --rm django python manage.py migrate run: docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- name: Run Django Tests - name: Run Django Tests
run: docker-compose -f local.yml run django pytest run: docker compose -f docker-compose.local.yml run django pytest
- name: Tear down the Stack - name: Tear down the Stack
run: docker-compose -f local.yml down run: docker compose -f docker-compose.local.yml down
{%- else %} {%- else %}
- name: Set up Python - name: Set up Python
uses: actions/setup-python@v3 uses: actions/setup-python@v4
with: with:
python-version: "3.10" python-version: '3.12'
cache: pip cache: pip
cache-dependency-path: | cache-dependency-path: |
requirements/base.txt requirements/base.txt
@ -99,5 +99,5 @@ jobs:
pip install -r requirements/local.txt pip install -r requirements/local.txt
- name: Test with pytest - name: Test with pytest
run: pytest run: pytest
{%- endif %} {%- endif %}
View File
@ -161,11 +161,10 @@ typings/
!.vscode/extensions.json !.vscode/extensions.json
*.code-workspace *.code-workspace
# Local History for Visual Studio Code # Local History for devcontainer
.history/ .devcontainer/bash_history
{% if cookiecutter.editor == 'PyCharm' -%}
{% if cookiecutter.use_pycharm == 'y' -%}
# Provided default Pycharm Run/Debug Configurations should be tracked by git # Provided default Pycharm Run/Debug Configurations should be tracked by git
# In case of local modifications made by Pycharm, use update-index command # In case of local modifications made by Pycharm, use update-index command
# for each changed file, like this: # for each changed file, like this:
@ -326,9 +325,12 @@ Session.vim
# Auto-generated tag files # Auto-generated tag files
tags tags
# Redis dump file
dump.rdb
### Project template ### Project template
{%- if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %} {%- if cookiecutter.use_mailpit == 'y' and cookiecutter.use_docker == 'n' %}
MailHog mailpit
{%- endif %} {%- endif %}
{{ cookiecutter.project_slug }}/media/ {{ cookiecutter.project_slug }}/media/
@ -343,4 +345,9 @@ project.css
project.min.css project.min.css
vendors.js vendors.js
*.min.js *.min.js
*.min.js.map
{%- endif %}
{%- if cookiecutter.frontend_pipeline == 'Webpack' %}
{{ cookiecutter.project_slug }}/static/webpack_bundles/
webpack-stats.json
{%- endif %} {%- endif %}
View File
@ -7,46 +7,49 @@ variables:
POSTGRES_PASSWORD: '' POSTGRES_PASSWORD: ''
POSTGRES_DB: 'test_{{ cookiecutter.project_slug }}' POSTGRES_DB: 'test_{{ cookiecutter.project_slug }}'
POSTGRES_HOST_AUTH_METHOD: trust POSTGRES_HOST_AUTH_METHOD: trust
{% if cookiecutter.use_celery == 'y' -%} {%- if cookiecutter.use_celery == 'y' %}
CELERY_BROKER_URL: 'redis://redis:6379/0' CELERY_BROKER_URL: 'redis://redis:6379/0'
{%- endif %} {%- endif %}
flake8: precommit:
stage: lint stage: lint
image: python:3.10-alpine image: python:3.12
variables:
PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit
cache:
paths:
- ${PRE_COMMIT_HOME}
before_script: before_script:
- pip install -q flake8 - pip install -q pre-commit
script: script:
- flake8 - pre-commit run --show-diff-on-failure --color=always --all-files
pytest: pytest:
stage: test stage: test
{% if cookiecutter.use_docker == 'y' -%} {%- if cookiecutter.use_docker == 'y' %}
image: docker/compose:1.29.2 image: docker:25.0
tags: tags:
- docker - docker
services: services:
- docker:dind - docker:dind
before_script: before_script:
- docker-compose -f local.yml build - docker compose -f docker-compose.local.yml build
- docker compose -f docker-compose.docs.yml build
# Ensure celerybeat does not crash due to non-existent tables # Ensure celerybeat does not crash due to non-existent tables
- docker-compose -f local.yml run --rm django python manage.py migrate - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker-compose -f local.yml up -d - docker compose -f docker-compose.local.yml up -d
script: script:
- docker-compose -f local.yml run django pytest - docker compose -f docker-compose.local.yml run django pytest
{%- else -%} {%- else %}
image: python:3.10 image: python:3.12
tags: tags:
- python - python
services: services:
- postgres:{{ cookiecutter.postgresql_version }} - postgres:{{ cookiecutter.postgresql_version }}
variables: variables:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
before_script: before_script:
- pip install -r requirements/local.txt - pip install -r requirements/local.txt
script: script:
- pytest - pytest
{%- endif %} {%- endif %}
View File
@ -10,7 +10,7 @@
<option value="celeryworker"/> <option value="celeryworker"/>
<option value="celerybeat"/> <option value="celerybeat"/>
{%- endif %} {%- endif %}
{%- if cookiecutter.frontend_pipeline == 'Gulp' %} {%- if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
<option value="node"/> <option value="node"/>
{%- endif %} {%- endif %}
</list> </list>
View File
@ -13,7 +13,7 @@
</facet> </facet>
</component> </component>
<component name="NewModuleRootManager"> <component name="NewModuleRootManager">
{% if cookiecutter.frontend_pipeline == 'Gulp' %} {% if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
<content url="file://$MODULE_DIR$"> <content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/node_modules" /> <excludeFolder url="file://$MODULE_DIR$/node_modules" />
</content> </content>
View File
@ -1,36 +1,53 @@
exclude: "^docs/|/migrations/" exclude: '^docs/|/migrations/|devcontainer.json'
default_stages: [commit] default_stages: [commit]
default_language_version:
python: python3.12
repos: repos:
- repo: https://github.com/pre-commit/pre-commit-hooks - repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0 rev: v4.6.0
hooks: hooks:
- id: trailing-whitespace - id: trailing-whitespace
- id: end-of-file-fixer - id: end-of-file-fixer
- id: check-json
- id: check-toml
- id: check-xml
- id: check-yaml - id: check-yaml
- id: debug-statements
- id: check-builtin-literals
- id: check-case-conflict
- id: check-docstring-first
- id: detect-private-key
- repo: https://github.com/asottile/pyupgrade - repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.2.0 rev: v4.0.0-alpha.8
hooks: hooks:
- id: pyupgrade - id: prettier
args: [--py310-plus] args: ['--tab-width', '2', '--single-quote']
exclude: '{{cookiecutter.project_slug}}/templates/'
- repo: https://github.com/psf/black - repo: https://github.com/adamchainz/django-upgrade
rev: 22.10.0 rev: '1.17.0'
hooks: hooks:
- id: black - id: django-upgrade
args: ['--target-version', '4.2']
- repo: https://github.com/PyCQA/isort # Run the Ruff linter.
rev: 5.10.1 - repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.4.4
hooks: hooks:
- id: isort # Linter
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
# Formatter
- id: ruff-format
- repo: https://github.com/PyCQA/flake8 - repo: https://github.com/Riverside-Healthcare/djLint
rev: 5.0.4 rev: v1.34.1
hooks: hooks:
- id: flake8 - id: djlint-reformat-django
args: ["--config=setup.cfg"] - id: djlint-django
additional_dependencies: [flake8-isort]
# sets up .pre-commit-ci.yaml to ensure pre-commit dependencies stay up to date # sets up .pre-commit-ci.yaml to ensure pre-commit dependencies stay up to date
ci: ci:

View File
[MASTER]
load-plugins=pylint_django{% if cookiecutter.use_celery == "y" %}, pylint_celery{% endif %}
django-settings-module=config.settings.local
[FORMAT]
max-line-length=120
[MESSAGES CONTROL]
disable=missing-docstring,invalid-name
[DESIGN]
max-parents=13
[TYPECHECK]
generated-members=REQUEST,acl_users,aq_parent,"[a-zA-Z]+_set{1,2}",save,delete
View File
@ -1,12 +1,20 @@
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2 version: 2
# Set the version of Python and other tools you might need
build:
os: ubuntu-22.04
tools:
python: '3.12'
# Build documentation in the docs/ directory with Sphinx
sphinx: sphinx:
configuration: docs/conf.py configuration: docs/conf.py
build: # Python requirements required to build your docs
image: testing
python: python:
version: 3.10
install: install:
- requirements: requirements/local.txt - requirements: requirements/local.txt
View File
@ -2,7 +2,7 @@ dist: focal
language: python language: python
python: python:
- "3.10" - "3.12"
services: services:
- {% if cookiecutter.use_docker == 'y' %}docker{% else %}postgresql{% endif %} - {% if cookiecutter.use_docker == 'y' %}docker{% else %}postgresql{% endif %}
@ -10,23 +10,24 @@ jobs:
include: include:
- name: "Linter" - name: "Linter"
before_script: before_script:
- pip install -q flake8 - pip install -q ruff
script: script:
- "flake8" - ruff check .
- name: "Django Test" - name: "Django Test"
{%- if cookiecutter.use_docker == 'y' %} {%- if cookiecutter.use_docker == 'y' %}
before_script: before_script:
- docker-compose -v - docker compose -v
- docker -v - docker -v
- docker-compose -f local.yml build - docker compose -f docker-compose.local.yml build
- docker compose -f docker-compose.docs.yml build
# Ensure celerybeat does not crash due to non-existent tables # Ensure celerybeat does not crash due to non-existent tables
- docker-compose -f local.yml run --rm django python manage.py migrate - docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
- docker-compose -f local.yml up -d - docker compose -f docker-compose.local.yml up -d
script: script:
- "docker-compose -f local.yml run django pytest" - docker compose -f docker-compose.local.yml run django pytest
after_failure: after_failure:
- docker-compose -f local.yml logs - docker compose -f docker-compose.local.yml logs
{%- else %} {%- else %}
before_install: before_install:
- sudo apt-get update -qq - sudo apt-get update -qq
@ -37,9 +38,9 @@ jobs:
- sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm - sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm
language: python language: python
python: python:
- "3.10" - "3.12"
install: install:
- pip install -r requirements/local.txt - pip install -r requirements/local.txt
script: script:
- "pytest" - pytest
{%- endif %} {%- endif %}
View File
@ -3,7 +3,7 @@
{{ cookiecutter.description }} {{ cookiecutter.description }}
[![Built with Cookiecutter Django](https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg?logo=cookiecutter)](https://github.com/cookiecutter/cookiecutter-django/) [![Built with Cookiecutter Django](https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg?logo=cookiecutter)](https://github.com/cookiecutter/cookiecutter-django/)
[![Black code style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black) [![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
{%- if cookiecutter.open_source_license != "Not open source" %} {%- if cookiecutter.open_source_license != "Not open source" %}
@ -18,11 +18,11 @@ Moved to [settings](http://cookiecutter-django.readthedocs.io/en/latest/settings
### Setting Up Your Users ### Setting Up Your Users
- To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go. - To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go.
- To create a **superuser account**, use this command: - To create a **superuser account**, use this command:
$ python manage.py createsuperuser $ python manage.py createsuperuser
For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users. For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
@ -56,45 +56,57 @@ This app comes with Celery.
To run a celery worker: To run a celery worker:
``` bash ```bash
cd {{cookiecutter.project_slug}} cd {{cookiecutter.project_slug}}
celery -A config.celery_app worker -l info celery -A config.celery_app worker -l info
``` ```
Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the same folder with *manage.py*, you should be right. Please note: For Celery's import magic to work, it is important _where_ the celery commands are run. If you are in the same folder with _manage.py_, you should be right.
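For Celery to pick a task up, it only needs to live in a `tasks.py` module of one of your installed apps, thanks to the `app.autodiscover_tasks()` call in `config/celery_app.py`. A minimal sketch of such a task — the module location and task body are illustrative:

```python
# Hedged sketch: <your_app>/tasks.py (hypothetical location inside an installed app)
from celery import shared_task


@shared_task
def add(x, y):
    """Trivial task that the worker started above will register and execute."""
    return x + y
```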
To run [periodic tasks](https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html), you'll need to start the celery beat scheduler service. You can start it as a standalone process:
```bash
cd {{cookiecutter.project_slug}}
celery -A config.celery_app beat
```
or you can embed the beat service inside a worker with the `-B` option (not recommended for production use):
```bash
cd {{cookiecutter.project_slug}}
celery -A config.celery_app worker -B -l info
```
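The beat service reads its schedule from Celery configuration. A minimal sketch of a schedule entry, assuming the `app` object exported by `config/celery_app.py` — the schedule name, task path and interval are all illustrative:

```python
# Hedged sketch: registering a periodic task on the beat schedule,
# e.g. appended to config/celery_app.py after the app is created.
from config.celery_app import app

app.conf.beat_schedule = {
    "ping-every-30-seconds": {       # illustrative schedule entry name
        "task": "myapp.tasks.ping",  # hypothetical dotted path to a task
        "schedule": 30.0,            # run every 30 seconds
    },
}
```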
{%- endif %} {%- endif %}
{%- if cookiecutter.use_mailhog == "y" %} {%- if cookiecutter.use_mailpit == "y" %}
### Email Server ### Email Server
{%- if cookiecutter.use_docker == "y" %} {%- if cookiecutter.use_docker == "y" %}
In development, it is often nice to be able to see emails that are being sent from your application. For that reason, a local SMTP server [MailHog](https://github.com/mailhog/MailHog) with a web interface is available as a docker container. In development, it is often nice to be able to see emails that are being sent from your application. For that reason, a local SMTP server [Mailpit](https://github.com/axllent/mailpit) with a web interface is available as a docker container.
Container mailhog will start automatically when you run all docker containers. Container mailpit will start automatically when you run all docker containers.
Please check [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html) for more details on how to start all containers. Please check [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html) for more details on how to start all containers.
With MailHog running, to view messages that are sent by your application, open your browser and go to `http://127.0.0.1:8025` With Mailpit running, to view messages that are sent by your application, open your browser and go to `http://127.0.0.1:8025`
{%- else %} {%- else %}
In development, it is often nice to be able to see emails that are being sent from your application. If you choose to use [MailHog](https://github.com/mailhog/MailHog) when generating the project, a local SMTP server with a web interface will be available. In development, it is often nice to be able to see emails that are being sent from your application. If you choose to use [Mailpit](https://github.com/axllent/mailpit) when generating the project, a local SMTP server with a web interface will be available.
1. [Download the latest MailHog release](https://github.com/mailhog/MailHog/releases) for your OS. 1. [Download the latest Mailpit release](https://github.com/axllent/mailpit/releases) for your OS.
2. Rename the build to `MailHog`. 2. Copy the binary file to the project root.
3. Copy the file to the project root. 3. Make it executable:
4. Make it executable: $ chmod +x mailpit
$ chmod +x MailHog 4. Spin up another terminal window and start it there:
5. Spin up another terminal window and start it there: ./mailpit
./MailHog 5. Check out <http://127.0.0.1:8025/> to see how it goes.
6. Check out <http://127.0.0.1:8025/> to see how it goes.
Now you have your own mail server running locally, ready to receive whatever you send it. Now you have your own mail server running locally, ready to receive whatever you send it.
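Either way, the development settings only need to point Django's SMTP email backend at the local server. A minimal sketch of the relevant settings, assuming Mailpit's default SMTP port 1025 — see `config/settings/local.py` in the generated project for the actual wiring:

```python
# Hedged sketch: local email settings (all standard Django settings).
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "localhost"  # typically "mailpit" when running inside Docker Compose
EMAIL_PORT = 1025  # Mailpit's default SMTP port (assumption)
```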
@ -128,15 +140,16 @@ See detailed [cookiecutter-django Heroku documentation](http://cookiecutter-djan
See detailed [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html). See detailed [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html).
{%- endif %} {%- endif %}
{%- if cookiecutter.frontend_pipeline == 'Gulp' %} {%- if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
### Custom Bootstrap Compilation ### Custom Bootstrap Compilation
The generated CSS is set up with automatic Bootstrap recompilation with variables of your choice. The generated CSS is set up with automatic Bootstrap recompilation with variables of your choice.
Bootstrap v5 is installed using npm and customised by tweaking your variables in `static/sass/custom_bootstrap_vars`. Bootstrap v5 is installed using npm and customised by tweaking your variables in `static/sass/custom_bootstrap_vars`.
You can find a list of available variables [in the bootstrap source](https://github.com/twbs/bootstrap/blob/main/scss/_variables.scss), or get explanations on them in the [Bootstrap docs](https://getbootstrap.com/docs/5.1/customize/sass/). You can find a list of available variables [in the bootstrap source](https://github.com/twbs/bootstrap/blob/v5.1.3/scss/_variables.scss), or get explanations on them in the [Bootstrap docs](https://getbootstrap.com/docs/5.1/customize/sass/).
Bootstrap's javascript as well as its dependencies is concatenated into a single file: `static/js/vendors.js`. Bootstrap's javascript as well as its dependencies are concatenated into a single file: `static/js/vendors.js`.
{%- endif %} {%- endif %}
{% if cookiecutter.open_source_license != "Not open source" %} {% if cookiecutter.open_source_license != "Not open source" %}
View File
@ -1,4 +1,5 @@
#!/usr/bin/env bash #!/usr/bin/env bash
{%- if cookiecutter.frontend_pipeline == "Django Compressor" %}
compress_enabled() { compress_enabled() {
python << END python << END
@ -19,4 +20,7 @@ if compress_enabled
then then
python manage.py compress python manage.py compress
fi fi
{%- endif %}
python manage.py collectstatic --noinput python manage.py collectstatic --noinput
python manage.py compilemessages -i site-packages
View File
@ -1,7 +1,5 @@
ARG PYTHON_VERSION=3.10-slim-bullseye # define an alias for the specific python version used in this file.
FROM docker.io/python:3.12.3-slim-bookworm as python
# define an alias for the specfic python version used in this file.
FROM python:${PYTHON_VERSION} as python
# Python build stage # Python build stage
FROM python as python-build-stage FROM python as python-build-stage
@ -12,7 +10,7 @@ ARG BUILD_ENVIRONMENT=local
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages # dependencies for building Python packages
build-essential \ build-essential \
# psycopg2 dependencies # psycopg dependencies
libpq-dev libpq-dev
# Requirements are installed here to ensure they will be cached. # Requirements are installed here to ensure they will be cached.
@ -35,9 +33,21 @@ ENV BUILD_ENV ${BUILD_ENVIRONMENT}
WORKDIR ${APP_HOME} WORKDIR ${APP_HOME}
{% if cookiecutter.use_docker == "y" %}
# devcontainer dependencies and utils
RUN apt-get update && apt-get install --no-install-recommends -y \
sudo git bash-completion nano ssh
# Create devcontainer user and add it to sudoers
RUN groupadd --gid 1000 dev-user \
&& useradd --uid 1000 --gid dev-user --shell /bin/bash --create-home dev-user \
&& echo dev-user ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/dev-user \
&& chmod 0440 /etc/sudoers.d/dev-user
{% endif %}
# Install required system dependencies # Install required system dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# psycopg2 dependencies # psycopg dependencies
libpq-dev \ libpq-dev \
# Translations dependencies # Translations dependencies
gettext \ gettext \
@ -51,7 +61,7 @@ COPY --from=python-build-stage /usr/src/app/wheels /wheels/
# use wheels to install python dependencies # use wheels to install python dependencies
RUN pip install --no-cache-dir --no-index --find-links=/wheels/ /wheels/* \ RUN pip install --no-cache-dir --no-index --find-links=/wheels/ /wheels/* \
&& rm -rf /wheels/ && rm -rf /wheels/
COPY ./compose/production/django/entrypoint /entrypoint COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint RUN sed -i 's/\r$//g' /entrypoint
View File
@ -5,4 +5,4 @@ set -o nounset
rm -f './celerybeat.pid' rm -f './celerybeat.pid'
celery -A config.celery_app beat -l INFO exec watchfiles --filter python celery.__main__.main --args '-A config.celery_app beat -l INFO'
View File
@ -3,9 +3,6 @@
set -o errexit set -o errexit
set -o nounset set -o nounset
exec watchfiles --filter python celery.__main__.main \
celery \ --args \
-A config.celery_app \ "-A config.celery_app -b \"${CELERY_BROKER_URL}\" flower --basic_auth=\"${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}\""
-b "${CELERY_BROKER_URL}" \
flower \
--basic_auth="${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}"
View File
@ -4,4 +4,4 @@ set -o errexit
set -o nounset set -o nounset
watchfiles celery.__main__.main --args '-A config.celery_app worker -l INFO' exec watchfiles --filter python celery.__main__.main --args '-A config.celery_app worker -l INFO'
View File
@ -7,7 +7,7 @@ set -o nounset
python manage.py migrate python manage.py migrate
{%- if cookiecutter.use_async == 'y' %} {%- if cookiecutter.use_async == 'y' %}
uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html' exec uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'
{%- else %} {%- else %}
python manage.py runserver_plus 0.0.0.0:8000 exec python manage.py runserver_plus 0.0.0.0:8000
{%- endif %} {%- endif %}
View File
@ -1,7 +1,5 @@
ARG PYTHON_VERSION=3.10-slim-bullseye # define an alias for the specific python version used in this file.
FROM docker.io/python:3.12.3-slim-bookworm as python
# define an alias for the specfic python version used in this file.
FROM python:${PYTHON_VERSION} as python
# Python build stage # Python build stage
@ -12,7 +10,7 @@ ENV PYTHONDONTWRITEBYTECODE 1
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages # dependencies for building Python packages
build-essential \ build-essential \
# psycopg2 dependencies # psycopg dependencies
libpq-dev \ libpq-dev \
# cleaning up unused files # cleaning up unused files
&& apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \ && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
@ -37,7 +35,7 @@ ENV PYTHONDONTWRITEBYTECODE 1
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# To run the Makefile # To run the Makefile
make \ make \
# psycopg2 dependencies # psycopg dependencies
libpq-dev \ libpq-dev \
# Translations dependencies # Translations dependencies
gettext \ gettext \
View File
@ -4,4 +4,4 @@ set -o errexit
set -o pipefail set -o pipefail
set -o nounset set -o nounset
make livehtml exec make livehtml
View File
@ -1,4 +1,4 @@
FROM node:16-bullseye-slim FROM docker.io/node:20-bookworm-slim
WORKDIR /app WORKDIR /app
View File
@ -1,4 +1,4 @@
FROM garland/aws-cli-docker:1.15.47 FROM docker.io/garland/aws-cli-docker:1.16.140
COPY ./compose/production/aws/maintenance /usr/local/bin/maintenance COPY ./compose/production/aws/maintenance /usr/local/bin/maintenance
COPY ./compose/production/postgres/maintenance/_sourced /usr/local/bin/maintenance/_sourced COPY ./compose/production/postgres/maintenance/_sourced /usr/local/bin/maintenance/_sourced
View File
@ -3,7 +3,7 @@
### Download a file from your Amazon S3 bucket to the postgres /backups folder ### Download a file from your Amazon S3 bucket to the postgres /backups folder
### ###
### Usage: ### Usage:
### $ docker-compose -f production.yml run --rm awscli <1> ### $ docker compose -f docker-compose.production.yml run --rm awscli <1>
set -o errexit set -o errexit
set -o pipefail set -o pipefail
View File
@ -3,7 +3,7 @@
### Upload the /backups folder to Amazon S3 ### Upload the /backups folder to Amazon S3
### ###
### Usage: ### Usage:
### $ docker-compose -f production.yml run --rm awscli upload ### $ docker compose -f docker-compose.production.yml run --rm awscli upload
set -o errexit set -o errexit
set -o pipefail set -o pipefail
View File
@ -1,7 +1,5 @@
ARG PYTHON_VERSION=3.10-slim-bullseye {% if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] -%}
FROM docker.io/node:20-bookworm-slim as client-builder
{% if cookiecutter.frontend_pipeline == 'Gulp' -%}
FROM node:16-bullseye-slim as client-builder
ARG APP_HOME=/app ARG APP_HOME=/app
WORKDIR ${APP_HOME} WORKDIR ${APP_HOME}
@ -9,12 +7,25 @@ WORKDIR ${APP_HOME}
COPY ./package.json ${APP_HOME} COPY ./package.json ${APP_HOME}
RUN npm install && npm cache clean --force RUN npm install && npm cache clean --force
COPY . ${APP_HOME} COPY . ${APP_HOME}
{%- if cookiecutter.frontend_pipeline == 'Webpack' and cookiecutter.use_whitenoise == 'n' %}
{%- if cookiecutter.cloud_provider == 'AWS' %}
ARG DJANGO_AWS_STORAGE_BUCKET_NAME
ENV DJANGO_AWS_STORAGE_BUCKET_NAME=${DJANGO_AWS_STORAGE_BUCKET_NAME}
ARG DJANGO_AWS_S3_CUSTOM_DOMAIN
ENV DJANGO_AWS_S3_CUSTOM_DOMAIN=${DJANGO_AWS_S3_CUSTOM_DOMAIN}
{%- elif cookiecutter.cloud_provider == 'GCP' %}
ARG DJANGO_GCP_STORAGE_BUCKET_NAME
ENV DJANGO_GCP_STORAGE_BUCKET_NAME=${DJANGO_GCP_STORAGE_BUCKET_NAME}
{%- elif cookiecutter.cloud_provider == 'Azure' %}
ARG DJANGO_AZURE_ACCOUNT_NAME
ENV DJANGO_AZURE_ACCOUNT_NAME=${DJANGO_AZURE_ACCOUNT_NAME}
{%- endif %}
{%- endif %}
RUN npm run build RUN npm run build
{%- endif %} {%- endif %}
# define an alias for the specific python version used in this file.
# define an alias for the specfic python version used in this file. FROM docker.io/python:3.12.3-slim-bookworm as python
FROM python:${PYTHON_VERSION} as python
# Python build stage # Python build stage
FROM python as python-build-stage FROM python as python-build-stage
@ -25,7 +36,7 @@ ARG BUILD_ENVIRONMENT=production
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages # dependencies for building Python packages
build-essential \ build-essential \
# psycopg2 dependencies # psycopg dependencies
libpq-dev libpq-dev
# Requirements are installed here to ensure they will be cached. # Requirements are installed here to ensure they will be cached.
@ -54,7 +65,7 @@ RUN addgroup --system django \
# Install required system dependencies # Install required system dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \ RUN apt-get update && apt-get install --no-install-recommends -y \
# psycopg2 dependencies # psycopg dependencies
libpq-dev \ libpq-dev \
# Translations dependencies # Translations dependencies
gettext \ gettext \
@ -92,22 +103,29 @@ RUN sed -i 's/\r$//g' /start-celerybeat
RUN chmod +x /start-celerybeat RUN chmod +x /start-celerybeat
COPY ./compose/production/django/celery/flower/start /start-flower COPY --chown=django:django ./compose/production/django/celery/flower/start /start-flower
RUN sed -i 's/\r$//g' /start-flower RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower RUN chmod +x /start-flower
{%- endif %} {%- endif %}
# copy application code to WORKDIR # copy application code to WORKDIR
{%- if cookiecutter.frontend_pipeline == 'Gulp' %} {%- if cookiecutter.frontend_pipeline in ['Gulp', 'Webpack'] %}
COPY --from=client-builder --chown=django:django ${APP_HOME} ${APP_HOME} COPY --from=client-builder --chown=django:django ${APP_HOME} ${APP_HOME}
{% else %} {% else %}
COPY --chown=django:django . ${APP_HOME} COPY --chown=django:django . ${APP_HOME}
{%- endif %} {%- endif %}
# make django owner of the WORKDIR directory as well. # make django owner of the WORKDIR directory as well.
RUN chown django:django ${APP_HOME} RUN chown -R django:django ${APP_HOME}
USER django USER django
RUN DATABASE_URL="" \
{%- if cookiecutter.use_celery == "y" %}
CELERY_BROKER_URL="" \
{%- endif %}
DJANGO_SETTINGS_MODULE="config.settings.test" \
python manage.py compilemessages
ENTRYPOINT ["/entrypoint"] ENTRYPOINT ["/entrypoint"]
View File
@ -20,14 +20,14 @@ python << END
import sys import sys
import time import time
import psycopg2 import psycopg
suggest_unrecoverable_after = 30 suggest_unrecoverable_after = 30
start = time.time() start = time.time()
while True: while True:
try: try:
psycopg2.connect( psycopg.connect(
dbname="${POSTGRES_DB}", dbname="${POSTGRES_DB}",
user="${POSTGRES_USER}", user="${POSTGRES_USER}",
password="${POSTGRES_PASSWORD}", password="${POSTGRES_PASSWORD}",
@ -35,7 +35,7 @@ while True:
port="${POSTGRES_PORT}", port="${POSTGRES_PORT}",
) )
break break
except psycopg2.OperationalError as error: except psycopg.OperationalError as error:
sys.stderr.write("Waiting for PostgreSQL to become available...\n") sys.stderr.write("Waiting for PostgreSQL to become available...\n")
if time.time() - start > suggest_unrecoverable_after: if time.time() - start > suggest_unrecoverable_after:
View File
@ -28,7 +28,7 @@ if compress_enabled; then
fi fi
{%- endif %} {%- endif %}
{%- if cookiecutter.use_async == 'y' %} {%- if cookiecutter.use_async == 'y' %}
/usr/local/bin/gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn.workers.UvicornWorker exec /usr/local/bin/gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn.workers.UvicornWorker
{%- else %} {%- else %}
/usr/local/bin/gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app exec /usr/local/bin/gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app
{%- endif %} {%- endif %}
View File
@ -0,0 +1,2 @@
FROM docker.io/nginx:1.17.8-alpine
COPY ./compose/production/nginx/default.conf /etc/nginx/conf.d/default.conf
View File
@ -0,0 +1,7 @@
server {
listen 80;
server_name localhost;
location /media/ {
alias /usr/share/nginx/media/;
}
View File

@ -1,4 +1,4 @@
FROM postgres:{{ cookiecutter.postgresql_version }} FROM docker.io/postgres:{{ cookiecutter.postgresql_version }}
COPY ./compose/production/postgres/maintenance /usr/local/bin/maintenance COPY ./compose/production/postgres/maintenance /usr/local/bin/maintenance
RUN chmod +x /usr/local/bin/maintenance/* RUN chmod +x /usr/local/bin/maintenance/*
View File
@ -4,7 +4,7 @@
### Create a database backup. ### Create a database backup.
### ###
### Usage: ### Usage:
### $ docker-compose -f <environment>.yml (exec |run --rm) postgres backup ### $ docker compose -f <environment>.yml (exec |run --rm) postgres backup
set -o errexit set -o errexit
View File
@ -4,7 +4,7 @@
### View backups.
###
### Usage:
-###     $ docker-compose -f <environment>.yml (exec |run --rm) postgres backups
+###     $ docker compose -f <environment>.yml (exec |run --rm) postgres backups

set -o errexit


@ -7,7 +7,7 @@
### <1> filename of an existing backup.
###
### Usage:
-###     $ docker-compose -f <environment>.yml (exec |run --rm) postgres restore <1>
+###     $ docker compose -f <environment>.yml (exec |run --rm) postgres restore <1>

set -o errexit
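
All three maintenance scripts now document the Compose V2 invocation, where `compose` is a subcommand of `docker`, instead of the retired standalone `docker-compose` binary. A typical round trip — assuming the production compose file is named `production.yml` and using an illustrative backup filename:

```bash
# Create a backup, list what exists, then restore one by name
# (the timestamped filename below is illustrative).
docker compose -f production.yml exec postgres backup
docker compose -f production.yml run --rm postgres backups
docker compose -f production.yml run --rm postgres restore backup_2024_05_13T12_00_00.sql.gz
```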


@ -0,0 +1,36 @@
#!/usr/bin/env bash

### Remove a database backup.
###
### Parameters:
###     <1> filename of a backup to remove.
###
### Usage:
###     $ docker compose -f <environment>.yml (exec |run --rm) postgres rmbackup <1>

set -o errexit
set -o pipefail
set -o nounset

working_dir="$(dirname ${0})"
source "${working_dir}/_sourced/constants.sh"
source "${working_dir}/_sourced/messages.sh"

if [[ -z ${1+x} ]]; then
    message_error "Backup filename is not specified yet it is a required parameter. Make sure you provide one and try again."
    exit 1
fi

backup_filename="${BACKUP_DIR_PATH}/${1}"
if [[ ! -f "${backup_filename}" ]]; then
    message_error "No backup with the specified filename found. Check out the 'backups' maintenance script output to see if there is one and try again."
    exit 1
fi

message_welcome "Removing the '${backup_filename}' backup file..."
rm -r "${backup_filename}"

message_success "The '${backup_filename}' database backup has been removed."
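
The new `rmbackup` script rounds out the maintenance set: it requires a filename argument, verifies the file exists under `BACKUP_DIR_PATH` (defined in the sourced `constants.sh`), and only then deletes it. Example invocation, under the same `production.yml` assumption as above:

```bash
# Remove one backup by its exact filename (illustrative name);
# run the `backups` script first to see which files exist.
docker compose -f production.yml run --rm postgres rmbackup backup_2024_05_13T12_00_00.sql.gz
```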


@ -1,4 +1,4 @@
-FROM traefik:v2.2.11
+FROM docker.io/traefik:2.11.2
RUN mkdir -p /etc/traefik/acme \
    && touch /etc/traefik/acme/acme.json \
    && chmod 600 /etc/traefik/acme/acme.json


@ -4,29 +4,29 @@ log:
entryPoints:
  web:
    # http
-    address: ":80"
+    address: ':80'
    http:
-      # https://docs.traefik.io/routing/entrypoints/#entrypoint
+      # https://doc.traefik.io/traefik/routing/entrypoints/#entrypoint
      redirections:
        entryPoint:
          to: web-secure

  web-secure:
    # https
-    address: ":443"
+    address: ':443'
{%- if cookiecutter.use_celery == 'y' %}

  flower:
-    address: ":5555"
+    address: ':5555'
{%- endif %}

certificatesResolvers:
  letsencrypt:
-    # https://docs.traefik.io/master/https/acme/#lets-encrypt
+    # https://doc.traefik.io/traefik/https/acme/#lets-encrypt
    acme:
-      email: "{{ cookiecutter.email }}"
+      email: '{{ cookiecutter.email }}'
      storage: /etc/traefik/acme/acme.json
-      # https://docs.traefik.io/master/https/acme/#httpchallenge
+      # https://doc.traefik.io/traefik/https/acme/#httpchallenge
      httpChallenge:
        entryPoint: web

@ -34,9 +34,9 @@ http:
  routers:
    web-secure-router:
      {%- if cookiecutter.domain_name.count('.') == 1 %}
-      rule: "Host(`{{ cookiecutter.domain_name }}`) || Host(`www.{{ cookiecutter.domain_name }}`)"
+      rule: 'Host(`{{ cookiecutter.domain_name }}`) || Host(`www.{{ cookiecutter.domain_name }}`)'
      {%- else %}
-      rule: "Host(`{{ cookiecutter.domain_name }}`)"
+      rule: 'Host(`{{ cookiecutter.domain_name }}`)'
      {%- endif %}
      entryPoints:
        - web-secure

@ -44,26 +44,42 @@ http:
        - csrf
      service: django
      tls:
-        # https://docs.traefik.io/master/routing/routers/#certresolver
+        # https://doc.traefik.io/traefik/routing/routers/#certresolver
        certResolver: letsencrypt

    {%- if cookiecutter.use_celery == 'y' %}
    flower-secure-router:
-      rule: "Host(`{{ cookiecutter.domain_name }}`)"
+      rule: 'Host(`{{ cookiecutter.domain_name }}`)'
      entryPoints:
        - flower
      service: flower
      tls:
-        # https://docs.traefik.io/master/routing/routers/#certresolver
+        # https://doc.traefik.io/traefik/master/routing/routers/#certresolver
+        certResolver: letsencrypt
+    {%- endif %}
+
+    {%- if cookiecutter.cloud_provider == 'None' %}
+    web-media-router:
+      {%- if cookiecutter.domain_name.count('.') == 1 %}
+      rule: '(Host(`{{ cookiecutter.domain_name }}`) || Host(`www.{{ cookiecutter.domain_name }}`)) && PathPrefix(`/media/`)'
+      {%- else %}
+      rule: 'Host(`{{ cookiecutter.domain_name }}`) && PathPrefix(`/media/`)'
+      {%- endif %}
+      entryPoints:
+        - web-secure
+      middlewares:
+        - csrf
+      service: django-media
+      tls:
        certResolver: letsencrypt
    {%- endif %}

  middlewares:
    csrf:
-      # https://docs.traefik.io/master/middlewares/headers/#hostsproxyheaders
+      # https://doc.traefik.io/traefik/master/middlewares/http/headers/#hostsproxyheaders
      # https://docs.djangoproject.com/en/dev/ref/csrf/#ajax
      headers:
-        hostsProxyHeaders: ["X-CSRFToken"]
+        hostsProxyHeaders: ['X-CSRFToken']

  services:
    django:

@ -77,9 +93,16 @@ http:
        servers:
          - url: http://flower:5555
    {%- endif %}
+    {%- if cookiecutter.cloud_provider == 'None' %}
+    django-media:
+      loadBalancer:
+        servers:
+          - url: http://nginx:80
+    {%- endif %}

providers:
-  # https://docs.traefik.io/master/providers/file/
+  # https://doc.traefik.io/traefik/master/providers/file/
  file:
    filename: /etc/traefik/traefik.yml
    watch: true
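
Taken together with the nginx pieces above, the new `web-media-router` and `django-media` service make Traefik terminate TLS and hand `/media/` requests to nginx on port 80, while every other route still reaches the Django service. Once deployed with DNS pointing at the host and a certificate issued, a check from outside might look like this — domain and filename are placeholders:

```bash
# Media requests should now be answered by nginx behind Traefik over
# HTTPS; substitute a real domain and an existing uploaded file.
curl -I "https://example.com/media/example.png"
```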

Some files were not shown because too many files have changed in this diff.