diff --git a/README.md b/README.md
new file mode 100644
index 000000000..d8fc58ec4
--- /dev/null
+++ b/README.md
@@ -0,0 +1,247 @@
+# Cookiecutter Django
+
+[Build Status](https://github.com/cookiecutter/cookiecutter-django/actions/workflows/ci.yml?query=branch%3Amaster)
+[Documentation Status](https://cookiecutter-django.readthedocs.io/en/latest/?badge=latest)
+[Updates](https://pyup.io/repos/github/cookiecutter/cookiecutter-django/)
+[Discord](https://discord.gg/uFXweDQc5a)
+[Code Helpers](https://www.codetriage.com/cookiecutter/cookiecutter-django)
+[Code style: black](https://github.com/ambv/black)
+
+Powered by [Cookiecutter](https://github.com/cookiecutter/cookiecutter), Cookiecutter Django is a framework for jumpstarting
+production-ready Django projects quickly.
+
+- Documentation: https://cookiecutter-django.readthedocs.io/en/latest/
+- See [Troubleshooting](https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html) for common errors and obstacles
+- If you have problems with Cookiecutter Django, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues/new) rather than sending
+  emails to the maintainers.
+
+## Features
+
+- For Django 4.0
+- Works with Python 3.10
+- Renders Django projects with 100% starting test coverage
+- Twitter [Bootstrap](https://github.com/twbs/bootstrap) v5
+- [12-Factor](http://12factor.net/) based settings via [django-environ](https://github.com/joke2k/django-environ)
+- Secure by default. We believe in SSL.
+- Optimized development and production settings
+- Registration via [django-allauth](https://github.com/pennersr/django-allauth)
+- Comes with custom user model ready to go
+- Optional basic ASGI setup for Websockets
+- Optional custom static build using Gulp and livereload
+- Send emails via [Anymail](https://github.com/anymail/django-anymail) (using [Mailgun](http://www.mailgun.com/) by default, or Amazon SES if AWS is the selected cloud provider, but switchable)
+- Media storage using Amazon S3, Google Cloud Storage or Azure Storage
+- Docker support using [docker-compose](https://github.com/docker/compose) for development and production (using [Traefik](https://traefik.io/) with [LetsEncrypt](https://letsencrypt.org/) support)
+- [Procfile](https://devcenter.heroku.com/articles/procfile) for deploying to Heroku
+- Instructions for deploying to [PythonAnywhere](https://www.pythonanywhere.com/)
+- Run tests with unittest or pytest
+- Customizable PostgreSQL version
+- Default integration with [pre-commit](https://github.com/pre-commit/pre-commit) for identifying simple issues before submission to code review
+
+## Optional Integrations
+
+*These features can be enabled during initial project setup.*
+
+- Serve static files from Amazon S3, Google Cloud Storage, Azure Storage or [Whitenoise](https://whitenoise.readthedocs.io/)
+- Configuration for [Celery](https://docs.celeryq.dev) and [Flower](https://github.com/mher/flower) (the latter in Docker setup only)
+- Integration with [MailHog](https://github.com/mailhog/MailHog) for local email testing
+- Integration with [Sentry](https://sentry.io/welcome/) for error logging
+
+## Constraints
+
+- Only maintained 3rd party libraries are used.
+- Uses PostgreSQL everywhere: 10.19 - 14.1 ([MySQL fork](https://github.com/mabdullahadeel/cookiecutter-django-mysql) also available).
+- Environment variables for configuration (This won't work with Apache/mod_wsgi).
+
+## Support this Project!
+
+This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
+
+- Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
+- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.
+
+Projects that provide financial support to the maintainers:
+
+------------------------------------------------------------------------
+
+### Two Scoops of Django
+
+Two Scoops of Django 3.x is the best ice cream-themed Django reference in the universe!
+
+### PyUp
+
+
+
+
+
+PyUp brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
+
+## Usage
+
+Let's pretend you want to create a Django project called "redditclone". Rather than using `startproject`
+and then editing the results to include your name, email, and various configuration issues that always get forgotten until the worst possible moment, get [cookiecutter](https://github.com/cookiecutter/cookiecutter) to do all the work.
+
+First, get Cookiecutter. Trust me, it's awesome:
+
+ $ pip install "cookiecutter>=1.7.0"
+
+Now run it against this repo:
+
+ $ cookiecutter https://github.com/cookiecutter/cookiecutter-django
+
+You'll be prompted for some values. Provide them, then a Django project will be created for you.
+
+**Warning**: After this point, change 'Daniel Greenfeld', 'pydanny', etc. to your own information.
+
+Answer the prompts with your own desired [options](http://cookiecutter-django.readthedocs.io/en/latest/project-generation-options.html). For example:
+
+ Cloning into 'cookiecutter-django'...
+ remote: Counting objects: 550, done.
+ remote: Compressing objects: 100% (310/310), done.
+ remote: Total 550 (delta 283), reused 479 (delta 222)
+ Receiving objects: 100% (550/550), 127.66 KiB | 58 KiB/s, done.
+ Resolving deltas: 100% (283/283), done.
+ project_name [My Awesome Project]: Reddit Clone
+ project_slug [reddit_clone]: reddit
+ description [Behold My Awesome Project!]: A reddit clone.
+ author_name [Daniel Roy Greenfeld]: Daniel Greenfeld
+ domain_name [example.com]: myreddit.com
+ email [daniel-greenfeld@example.com]: pydanny@gmail.com
+ version [0.1.0]: 0.0.1
+ Select open_source_license:
+ 1 - MIT
+ 2 - BSD
+ 3 - GPLv3
+ 4 - Apache Software License 2.0
+ 5 - Not open source
+ Choose from 1, 2, 3, 4, 5 [1]: 1
+ timezone [UTC]: America/Los_Angeles
+ windows [n]: n
+ use_pycharm [n]: y
+ use_docker [n]: n
+ Select postgresql_version:
+ 1 - 14
+ 2 - 13
+ 3 - 12
+ 4 - 11
+ 5 - 10
+ Choose from 1, 2, 3, 4, 5 [1]: 1
+    Select cloud_provider:
+    1 - AWS
+    2 - GCP
+    3 - Azure
+    4 - None
+    Choose from 1, 2, 3, 4 [1]: 1
+ Select mail_service:
+ 1 - Mailgun
+ 2 - Amazon SES
+ 3 - Mailjet
+ 4 - Mandrill
+ 5 - Postmark
+ 6 - Sendgrid
+ 7 - SendinBlue
+ 8 - SparkPost
+ 9 - Other SMTP
+ Choose from 1, 2, 3, 4, 5, 6, 7, 8, 9 [1]: 1
+ use_async [n]: n
+ use_drf [n]: y
+ Select frontend_pipeline:
+ 1 - None
+ 2 - Django Compressor
+ 3 - Gulp
+    Choose from 1, 2, 3 [1]: 1
+ use_celery [n]: y
+ use_mailhog [n]: n
+ use_sentry [n]: y
+ use_whitenoise [n]: n
+ use_heroku [n]: y
+ Select ci_tool:
+ 1 - None
+ 2 - Travis
+ 3 - Gitlab
+ 4 - Github
+ Choose from 1, 2, 3, 4 [1]: 4
+ keep_local_envs_in_vcs [y]: y
+ debug [n]: n
+
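If you already know your answers, the generation step can be scripted: cookiecutter accepts values non-interactively. A sketch, using option names from the prompt session above (omitted options fall back to their defaults):

```shell
# Non-interactive generation; option names mirror the prompt session above.
cookiecutter gh:cookiecutter/cookiecutter-django --no-input \
    project_name="Reddit Clone" \
    project_slug="reddit" \
    use_docker=n \
    use_celery=y
```

This is handy in CI, or when regenerating a project repeatedly with the same answers.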
+Enter the project and take a look around:
+
+ $ cd reddit/
+ $ ls
+
+Create a git repo and push it there:
+
+ $ git init
+ $ git add .
+ $ git commit -m "first awesome commit"
+ $ git remote add origin git@github.com:pydanny/redditclone.git
+ $ git push -u origin master
+
+Now take a look at your repo. Don't forget to carefully look at the generated README. Awesome, right?
+
+For local development, see the following:
+
+- [Developing locally](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html)
+- [Developing locally using docker](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html)
+
+## Community
+
+- Have questions? **Before you ask questions anywhere else**, please post your question on [Stack Overflow](http://stackoverflow.com/questions/tagged/cookiecutter-django) under the *cookiecutter-django* tag. We check there periodically for questions.
+- If you think you found a bug or want to request a feature, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues).
+- For anything else, you can chat with us on [Discord](https://discord.gg/uFXweDQc5a).
+
+## For Readers of Two Scoops of Django
+
+You may notice that some elements of this project do not exactly match what we describe in chapter 3. The reason for that is this project, amongst other things, serves as a test bed for trying out new ideas and concepts. Sometimes they work, sometimes they don't, but the end result is that it won't necessarily match precisely what is described in the book I co-authored.
+
+## For PyUp Users
+
+If you are using [PyUp](https://pyup.io) to keep your dependencies updated and secure, use the code *cookiecutter* during checkout to get 15% off every month.
+
+## "Your Stuff"
+
+Scattered throughout the Python and HTML of this project are places marked with "your stuff". This is where third-party libraries are to be integrated with your project.
+
+## For MySQL users
+
+To get full MySQL support in addition to the default PostgreSQL, you can use this fork of cookiecutter-django:
+https://github.com/mabdullahadeel/cookiecutter-django-mysql
+
+## Releases
+
+Need a stable release? You can find them at https://github.com/cookiecutter/cookiecutter-django/releases
+
+## Not Exactly What You Want?
+
+This is what I want. *It might not be what you want.* Don't worry, you have options:
+
+### Fork This
+
+If you have differences in your preferred setup, I encourage you to fork this to create your own version.
+Once you have your fork working, let me know and I'll add it to a '*Similar Cookiecutter Templates*' list here.
+It's up to you whether to rename your fork.
+
+If you do rename your fork, I encourage you to submit it to the following places:
+
+- [cookiecutter](https://github.com/cookiecutter/cookiecutter) so it gets listed in the README as a template.
+- The cookiecutter [grid](https://www.djangopackages.com/grids/g/cookiecutters/) on Django Packages.
+
+### Submit a Pull Request
+
+We accept pull requests if they're small, atomic, and make our own project development
+experience better.
+
+## Articles
+
+- [Cookiecutter Django With Amazon RDS](https://haseeburrehman.com/posts/cookiecutter-django-with-amazon-rds/) - Apr. 2, 2021
+- [Using cookiecutter-django with Google Cloud Storage](https://ahhda.github.io/cloud/gce/django/2019/03/12/using-django-cookiecutter-cloud-storage.html) - Mar. 12, 2019
+- [cookiecutter-django with Nginx, Route 53 and ELB](https://msaizar.com/blog/cookiecutter-django-nginx-route-53-and-elb/) - Feb. 12, 2018
+- [cookiecutter-django and Amazon RDS](https://msaizar.com/blog/cookiecutter-django-and-amazon-rds/) - Feb. 7, 2018
+- [Using Cookiecutter to Jumpstart a Django Project on Windows with PyCharm](https://joshuahunter.com/posts/using-cookiecutter-to-jumpstart-a-django-project-on-windows-with-pycharm/) - May 19, 2017
+- [Exploring with Cookiecutter](http://www.snowboardingcoder.com/django/2016/12/03/exploring-with-cookiecutter/) - Dec. 3, 2016
+- [Django and GitLab - Running Continuous Integration and tests with your FREE account](http://dezoito.github.io/2016/05/11/django-gitlab-continuous-integration-phantomjs.html) - May 11, 2016
+- [Introduction to Cookiecutter-Django](http://krzysztofzuraw.com/blog/2016/django-cookiecutter.html) - Feb. 19, 2016
+- [Development and Deployment of Cookiecutter-Django on Fedora](https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-on-fedora/) - Jan. 18, 2016
+- [Development and Deployment of Cookiecutter-Django via Docker](https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-via-docker/) - Dec. 29, 2015
+- [How to create a Django Application using Cookiecutter and Django 1.8](https://www.swapps.io/blog/how-to-create-a-django-application-using-cookiecutter-and-django-1-8/) - Sept. 12, 2015
+
+Have a blog or online publication? Write about your cookiecutter-django tips and tricks, then send us a pull request with the link.
diff --git a/README.rst b/README.rst
deleted file mode 100644
index 090497319..000000000
--- a/README.rst
+++ /dev/null
@@ -1,324 +0,0 @@
-Cookiecutter Django
-===================
-
-.. image:: https://img.shields.io/github/workflow/status/pydanny/cookiecutter-django/CI/master
- :target: https://github.com/pydanny/cookiecutter-django/actions?query=workflow%3ACI
- :alt: Build Status
-
-.. image:: https://readthedocs.org/projects/cookiecutter-django/badge/?version=latest
- :target: https://cookiecutter-django.readthedocs.io/en/latest/?badge=latest
- :alt: Documentation Status
-
-.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
- :target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
- :alt: Updates
-
-.. image:: https://img.shields.io/badge/cookiecutter-Join%20on%20Slack-green?style=flat&logo=slack
- :target: https://join.slack.com/t/cookie-cutter/shared_invite/enQtNzI0Mzg5NjE5Nzk5LTRlYWI2YTZhYmQ4YmU1Y2Q2NmE1ZjkwOGM0NDQyNTIwY2M4ZTgyNDVkNjMxMDdhZGI5ZGE5YmJjM2M3ODJlY2U
-
-.. image:: https://www.codetriage.com/pydanny/cookiecutter-django/badges/users.svg
- :target: https://www.codetriage.com/pydanny/cookiecutter-django
- :alt: Code Helpers Badge
-
-.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
- :target: https://github.com/ambv/black
- :alt: Code style: black
-
-Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting
-production-ready Django projects quickly.
-
-* Documentation: https://cookiecutter-django.readthedocs.io/en/latest/
-* See Troubleshooting_ for common errors and obstacles
-* If you have problems with Cookiecutter Django, please open issues_ don't send
- emails to the maintainers.
-
-.. _Troubleshooting: https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html
-
-.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
-.. _issues: https://github.com/pydanny/cookiecutter-django/issues/new
-
-Features
----------
-
-* For Django 3.1
-* Works with Python 3.9
-* Renders Django projects with 100% starting test coverage
-* Twitter Bootstrap_ v4 (`maintained Foundation fork`_ also available)
-* 12-Factor_ based settings via django-environ_
-* Secure by default. We believe in SSL.
-* Optimized development and production settings
-* Registration via django-allauth_
-* Comes with custom user model ready to go
-* Optional basic ASGI setup for Websockets
-* Optional custom static build using Gulp and livereload
-* Send emails via Anymail_ (using Mailgun_ by default or Amazon SES if AWS is selected cloud provider, but switchable)
-* Media storage using Amazon S3 or Google Cloud Storage
-* Docker support using docker-compose_ for development and production (using Traefik_ with LetsEncrypt_ support)
-* Procfile_ for deploying to Heroku
-* Instructions for deploying to PythonAnywhere_
-* Run tests with unittest or pytest
-* Customizable PostgreSQL version
-* Default integration with pre-commit_ for identifying simple issues before submission to code review
-
-.. _`maintained Foundation fork`: https://github.com/Parbhat/cookiecutter-django-foundation
-
-
-Optional Integrations
----------------------
-
-*These features can be enabled during initial project setup.*
-
-* Serve static files from Amazon S3, Google Cloud Storage or Whitenoise_
-* Configuration for Celery_ and Flower_ (the latter in Docker setup only)
-* Integration with MailHog_ for local email testing
-* Integration with Sentry_ for error logging
-
-.. _Bootstrap: https://github.com/twbs/bootstrap
-.. _django-environ: https://github.com/joke2k/django-environ
-.. _12-Factor: http://12factor.net/
-.. _django-allauth: https://github.com/pennersr/django-allauth
-.. _django-avatar: https://github.com/grantmcconnaughey/django-avatar
-.. _Procfile: https://devcenter.heroku.com/articles/procfile
-.. _Mailgun: http://www.mailgun.com/
-.. _Whitenoise: https://whitenoise.readthedocs.io/
-.. _Celery: http://www.celeryproject.org/
-.. _Flower: https://github.com/mher/flower
-.. _Anymail: https://github.com/anymail/django-anymail
-.. _MailHog: https://github.com/mailhog/MailHog
-.. _Sentry: https://sentry.io/welcome/
-.. _docker-compose: https://github.com/docker/compose
-.. _PythonAnywhere: https://www.pythonanywhere.com/
-.. _Traefik: https://traefik.io/
-.. _LetsEncrypt: https://letsencrypt.org/
-.. _pre-commit: https://github.com/pre-commit/pre-commit
-
-Constraints
------------
-
-* Only maintained 3rd party libraries are used.
-* Uses PostgreSQL everywhere (10.16 - 13.2)
-* Environment variables for configuration (This won't work with Apache/mod_wsgi).
-
-Support this Project!
-----------------------
-
-This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
-
-* Daniel Roy Greenfeld, Project Lead (`GitHub `_, `Patreon `_): expertise in Django and AWS ELB.
-
-* Nikita Shupeyko, Core Developer (`GitHub `_): expertise in Python/Django, hands-on DevOps and frontend experience.
-
-Projects that provide financial support to the maintainers:
-
-
-
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
-.. image:: https://cdn.shopify.com/s/files/1/0304/6901/products/Two-Scoops-of-Django-3-Alpha-Cover_540x_26507b15-e489-470b-8a97-02773dd498d1_1080x.jpg
- :name: Two Scoops of Django 3.x
- :align: center
- :alt: Two Scoops of Django
- :target: https://www.feldroy.com/products//two-scoops-of-django-3-x
-
-Two Scoops of Django 3.x is the best ice cream-themed Django reference in the universe!
-
-pyup
-~~~~~~~~~~~~~~~~~~
-
-.. image:: https://pyup.io/static/images/logo.png
- :name: pyup
- :align: center
- :alt: pyup
- :target: https://pyup.io/
-
-Pyup brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
-
-Usage
-------
-
-Let's pretend you want to create a Django project called "redditclone". Rather than using ``startproject``
-and then editing the results to include your name, email, and various configuration issues that always get forgotten until the worst possible moment, get cookiecutter_ to do all the work.
-
-First, get Cookiecutter. Trust me, it's awesome::
-
- $ pip install "cookiecutter>=1.7.0"
-
-Now run it against this repo::
-
- $ cookiecutter https://github.com/pydanny/cookiecutter-django
-
-You'll be prompted for some values. Provide them, then a Django project will be created for you.
-
-**Warning**: After this point, change 'Daniel Greenfeld', 'pydanny', etc to your own information.
-
-Answer the prompts with your own desired options_. For example::
-
- Cloning into 'cookiecutter-django'...
- remote: Counting objects: 550, done.
- remote: Compressing objects: 100% (310/310), done.
- remote: Total 550 (delta 283), reused 479 (delta 222)
- Receiving objects: 100% (550/550), 127.66 KiB | 58 KiB/s, done.
- Resolving deltas: 100% (283/283), done.
- project_name [Project Name]: Reddit Clone
- project_slug [reddit_clone]: reddit
- author_name [Daniel Roy Greenfeld]: Daniel Greenfeld
- email [you@example.com]: pydanny@gmail.com
- description [Behold My Awesome Project!]: A reddit clone.
- domain_name [example.com]: myreddit.com
- version [0.1.0]: 0.0.1
- timezone [UTC]: America/Los_Angeles
- use_whitenoise [n]: n
- use_celery [n]: y
- use_mailhog [n]: n
- use_sentry [n]: y
- use_pycharm [n]: y
- windows [n]: n
- use_docker [n]: n
- use_heroku [n]: y
- use_compressor [n]: y
- Select postgresql_version:
- 1 - 13.2
- 2 - 12.6
- 3 - 11.11
- 4 - 10.16
- Choose from 1, 2, 3, 4, 5 [1]: 1
- Select js_task_runner:
- 1 - None
- 2 - Gulp
- Choose from 1, 2 [1]: 1
- Select cloud_provider:
- 1 - AWS
- 2 - GCP
- 3 - None
- Choose from 1, 2, 3 [1]: 1
- custom_bootstrap_compilation [n]: n
- Select open_source_license:
- 1 - MIT
- 2 - BSD
- 3 - GPLv3
- 4 - Apache Software License 2.0
- 5 - Not open source
- Choose from 1, 2, 3, 4, 5 [1]: 1
- keep_local_envs_in_vcs [y]: y
- debug[n]: n
-
-Enter the project and take a look around::
-
- $ cd reddit/
- $ ls
-
-Create a git repo and push it there::
-
- $ git init
- $ git add .
- $ git commit -m "first awesome commit"
- $ git remote add origin git@github.com:pydanny/redditclone.git
- $ git push -u origin master
-
-Now take a look at your repo. Don't forget to carefully look at the generated README. Awesome, right?
-
-For local development, see the following:
-
-* `Developing locally`_
-* `Developing locally using docker`_
-
-.. _options: http://cookiecutter-django.readthedocs.io/en/latest/project-generation-options.html
-.. _`Developing locally`: http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html
-.. _`Developing locally using docker`: http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html
-
-Community
------------
-
-* Have questions? **Before you ask questions anywhere else**, please post your question on `Stack Overflow`_ under the *cookiecutter-django* tag. We check there periodically for questions.
-* If you think you found a bug or want to request a feature, please open an issue_.
-* For anything else, you can chat with us on `Slack`_.
-
-.. _`Stack Overflow`: http://stackoverflow.com/questions/tagged/cookiecutter-django
-.. _`issue`: https://github.com/pydanny/cookiecutter-django/issues
-.. _`Slack`: https://join.slack.com/t/cookie-cutter/shared_invite/enQtNzI0Mzg5NjE5Nzk5LTRlYWI2YTZhYmQ4YmU1Y2Q2NmE1ZjkwOGM0NDQyNTIwY2M4ZTgyNDVkNjMxMDdhZGI5ZGE5YmJjM2M3ODJlY2U
-
-For Readers of Two Scoops of Django
---------------------------------------------
-
-You may notice that some elements of this project do not exactly match what we describe in chapter 3. The reason for that is this project, amongst other things, serves as a test bed for trying out new ideas and concepts. Sometimes they work, sometimes they don't, but the end result is that it won't necessarily match precisely what is described in the book I co-authored.
-
-For pyup.io Users
------------------
-
-If you are using `pyup.io`_ to keep your dependencies updated and secure, use the code *cookiecutter* during checkout to get 15% off every month.
-
-.. _`pyup.io`: https://pyup.io
-
-"Your Stuff"
--------------
-
-Scattered throughout the Python and HTML of this project are places marked with "your stuff". This is where third-party libraries are to be integrated with your project.
-
-Releases
---------
-
-Need a stable release? You can find them at https://github.com/pydanny/cookiecutter-django/releases
-
-
-Not Exactly What You Want?
----------------------------
-
-This is what I want. *It might not be what you want.* Don't worry, you have options:
-
-Fork This
-~~~~~~~~~~
-
-If you have differences in your preferred setup, I encourage you to fork this to create your own version.
-Once you have your fork working, let me know and I'll add it to a '*Similar Cookiecutter Templates*' list here.
-It's up to you whether or not to rename your fork.
-
-If you do rename your fork, I encourage you to submit it to the following places:
-
-* cookiecutter_ so it gets listed in the README as a template.
-* The cookiecutter grid_ on Django Packages.
-
-.. _cookiecutter: https://github.com/cookiecutter/cookiecutter
-.. _grid: https://www.djangopackages.com/grids/g/cookiecutters/
-
-Submit a Pull Request
-~~~~~~~~~~~~~~~~~~~~~~
-
-We accept pull requests if they're small, atomic, and make our own project development
-experience better.
-
-Articles
----------
-
-* `Using cookiecutter-django with Google Cloud Storage`_ - Mar. 12, 2019
-* `cookiecutter-django with Nginx, Route 53 and ELB`_ - Feb. 12, 2018
-* `cookiecutter-django and Amazon RDS`_ - Feb. 7, 2018
-* `Using Cookiecutter to Jumpstart a Django Project on Windows with PyCharm`_ - May 19, 2017
-* `Exploring with Cookiecutter`_ - Dec. 3, 2016
-* `Introduction to Cookiecutter-Django`_ - Feb. 19, 2016
-* `Django and GitLab - Running Continuous Integration and tests with your FREE account`_ - May. 11, 2016
-* `Development and Deployment of Cookiecutter-Django on Fedora`_ - Jan. 18, 2016
-* `Development and Deployment of Cookiecutter-Django via Docker`_ - Dec. 29, 2015
-* `How to create a Django Application using Cookiecutter and Django 1.8`_ - Sept. 12, 2015
-
-Have a blog or online publication? Write about your cookiecutter-django tips and tricks, then send us a pull request with the link.
-
-.. _`Using cookiecutter-django with Google Cloud Storage`: https://ahhda.github.io/cloud/gce/django/2019/03/12/using-django-cookiecutter-cloud-storage.html
-.. _`cookiecutter-django with Nginx, Route 53 and ELB`: https://msaizar.com/blog/cookiecutter-django-nginx-route-53-and-elb/
-.. _`cookiecutter-django and Amazon RDS`: https://msaizar.com/blog/cookiecutter-django-and-amazon-rds/
-.. _`Exploring with Cookiecutter`: http://www.snowboardingcoder.com/django/2016/12/03/exploring-with-cookiecutter/
-.. _`Using Cookiecutter to Jumpstart a Django Project on Windows with PyCharm`: https://joshuahunter.com/posts/using-cookiecutter-to-jumpstart-a-django-project-on-windows-with-pycharm/
-
-.. _`Development and Deployment of Cookiecutter-Django via Docker`: https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-via-docker/
-.. _`Development and Deployment of Cookiecutter-Django on Fedora`: https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-on-fedora/
-.. _`How to create a Django Application using Cookiecutter and Django 1.8`: https://www.swapps.io/blog/how-to-create-a-django-application-using-cookiecutter-and-django-1-8/
-.. _`Introduction to Cookiecutter-Django`: http://krzysztofzuraw.com/blog/2016/django-cookiecutter.html
-.. _`Django and GitLab - Running Continuous Integration and tests with your FREE account`: http://dezoito.github.io/2016/05/11/django-gitlab-continuous-integration-phantomjs.html
-
-Code of Conduct
----------------
-
-Everyone interacting in the Cookiecutter project's codebases, issue trackers, chat
-rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.
-
-
-.. _`PyPA Code of Conduct`: https://www.pypa.io/en/latest/code-of-conduct/
diff --git a/cookiecutter.json b/cookiecutter.json
index 0f9a2c769..970411013 100644
--- a/cookiecutter.json
+++ b/cookiecutter.json
@@ -18,18 +18,16 @@
"use_pycharm": "n",
"use_docker": "n",
"postgresql_version": [
- "13.2",
- "12.6",
- "11.11",
- "10.16"
- ],
- "js_task_runner": [
- "None",
- "Gulp"
+ "14",
+ "13",
+ "12",
+ "11",
+ "10"
],
"cloud_provider": [
"AWS",
"GCP",
+ "Azure",
"None"
],
"mail_service": [
@@ -45,8 +43,11 @@
],
"use_async": "n",
"use_drf": "n",
- "custom_bootstrap_compilation": "n",
- "use_compressor": "n",
+ "frontend_pipeline": [
+ "None",
+ "Django Compressor",
+ "Gulp"
+ ],
"use_celery": "n",
"use_mailhog": "n",
"use_sentry": "n",
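If you generate projects often, defaults for the renamed options above can be pre-seeded in cookiecutter's user config file. A sketch with hypothetical values: cookiecutter reads `~/.cookiecutterrc`, and the keys are the prompt names from `cookiecutter.json`:

```yaml
# ~/.cookiecutterrc (hypothetical values)
default_context:
  author_name: "Your Name"
  postgresql_version: "14"
  cloud_provider: "AWS"
  frontend_pipeline: "Django Compressor"
```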
diff --git a/docs/conf.py b/docs/conf.py
index 469aa12d4..b53e6a7e7 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -7,10 +7,7 @@
#
# All configuration values have a default; values that are commented out
# serve to show the default.
-
from datetime import datetime
-import os
-import sys
now = datetime.now()
@@ -42,7 +39,7 @@ master_doc = "index"
# General information about the project.
project = "Cookiecutter Django"
-copyright = "2013-{}, Daniel Roy Greenfeld".format(now.year)
+copyright = f"2013-{now.year}, Daniel Roy Greenfeld"
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@@ -92,7 +89,7 @@ pygments_style = "sphinx"
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
-html_theme = "default"
+html_theme = "sphinx_rtd_theme"
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -242,7 +239,8 @@ texinfo_documents = [
"Cookiecutter Django documentation",
"Daniel Roy Greenfeld",
"Cookiecutter Django",
- "A Cookiecutter template for creating production-ready Django projects quickly.",
+ "A Cookiecutter template for creating production-ready "
+ "Django projects quickly.",
"Miscellaneous",
)
]
diff --git a/docs/deployment-on-heroku.rst b/docs/deployment-on-heroku.rst
index 0681a50ca..71fb45dda 100644
--- a/docs/deployment-on-heroku.rst
+++ b/docs/deployment-on-heroku.rst
@@ -3,23 +3,24 @@ Deployment on Heroku
.. index:: Heroku
-Commands to run
----------------
+Script
+------
Run these commands to deploy the project to Heroku:
.. code-block:: bash
- heroku create --buildpack heroku/python
+ heroku create --buildpack heroku/python
- heroku addons:create heroku-postgresql:hobby-dev
+ heroku addons:create heroku-postgresql:mini
# On Windows use double quotes for the time zone, e.g.
# heroku pg:backups schedule --at "02:00 America/Los_Angeles" DATABASE_URL
heroku pg:backups schedule --at '02:00 America/Los_Angeles' DATABASE_URL
heroku pg:promote DATABASE_URL
- heroku addons:create heroku-redis:hobby-dev
+ heroku addons:create heroku-redis:mini
+ # Assuming you chose Mailgun as mail service (see below for others)
heroku addons:create mailgun:starter
heroku config:set PYTHONHASHSEED=random
@@ -53,11 +54,25 @@ Run these commands to deploy the project to Heroku:
heroku open
+Notes
+-----
+
+Email Service
++++++++++++++
+
+The script above assumes that you've chosen Mailgun as your email service. If you want to use another one, check the `documentation for django-anymail `_ to find out which environment variables to set. Heroku provides other `add-ons for emails `_ (e.g. Sendgrid) which can be configured with a similar one-line command.
.. warning::
.. include:: mailgun.rst
+Heroku & Docker
++++++++++++++++
+
+Although Heroku has some sort of `Docker support`_, it's not supported by cookiecutter-django.
+We invite you to follow Heroku's documentation on the subject.
+
+.. _Docker support: https://devcenter.heroku.com/articles/build-docker-images-heroku-yml
Optional actions
----------------
@@ -97,7 +112,7 @@ Or add the DSN for your account, if you already have one:
Gulp & Bootstrap compilation
++++++++++++++++++++++++++++
-If you've opted for a custom bootstrap build, you'll most likely need to setup
+If you've opted for Gulp, you'll most likely need to set up
your app to use `multiple buildpacks`_: one for Python & one for Node.js:
.. code-block:: bash
@@ -111,11 +126,3 @@ which runs Gulp in cookiecutter-django.
If things don't work, please refer to the Heroku docs.
.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
-
-About Heroku & Docker
----------------------
-
-Although Heroku has some sort of `Docker support`_, it's not supported by cookiecutter-django.
-We invite you to follow Heroku documentation about it.
-
-.. _Docker support: https://devcenter.heroku.com/articles/build-docker-images-heroku-yml
diff --git a/docs/deployment-on-pythonanywhere.rst b/docs/deployment-on-pythonanywhere.rst
index 67da158ba..7984d7b5e 100644
--- a/docs/deployment-on-pythonanywhere.rst
+++ b/docs/deployment-on-pythonanywhere.rst
@@ -15,7 +15,7 @@ Full instructions follow, but here's a high-level view.
2. Set your config variables in the *postactivate* script
-3. Run the *manage.py* ``migrate`` and ``collectstatic`` {%- if cookiecutter.use_compressor == "y" %}and ``compress`` {%- endif %}commands
+3. Run the *manage.py* ``migrate`` and ``collectstatic`` commands. If you've opted for django-compressor, also run ``compress``
4. Add an entry to the PythonAnywhere *Web tab*
@@ -25,7 +25,6 @@ Full instructions follow, but here's a high-level view.
Once you've been through this one-off config, future deployments are much simpler: just ``git pull`` and then hit the "Reload" button :)
-
Getting your code and dependencies installed on PythonAnywhere
--------------------------------------------------------------
@@ -35,11 +34,10 @@ Make sure your project is fully committed and pushed up to Bitbucket or Github o
git clone # you can also use hg
cd my-project-name
- mkvirtualenv --python=/usr/bin/python3.9 my-project-name
+ mkvirtualenv --python=/usr/bin/python3.10 my-project-name
pip install -r requirements/production.txt # may take a few minutes
-
Setting environment variables in the console
--------------------------------------------
@@ -57,7 +55,7 @@ Set environment variables via the virtualenv "postactivate" script (this will se
vi $VIRTUAL_ENV/bin/postactivate
-**TIP:** *If you don't like vi, you can also edit this file via the PythonAnywhere "Files" menu; look in the ".virtualenvs" folder*.
+.. note:: If you don't like vi, you can also edit this file via the PythonAnywhere "Files" menu; look in the ".virtualenvs" folder.
Add these exports
@@ -73,13 +71,14 @@ Add these exports
export DJANGO_AWS_ACCESS_KEY_ID=
export DJANGO_AWS_SECRET_ACCESS_KEY=
export DJANGO_AWS_STORAGE_BUCKET_NAME=
- export DATABASE_URL=''
+ export DATABASE_URL=''
+ export REDIS_URL=''
-**NOTE:** *The AWS details are not required if you're using whitenoise or the built-in pythonanywhere static files service, but you do need to set them to blank, as above.*
+.. note:: The AWS details are not required if you're using whitenoise or the built-in pythonanywhere static files service, but you do need to set them to blank, as above.
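Why blank values still matter: the production settings read these variables unconditionally, so a missing variable fails at startup while a blank one is merely falsy. The snippet below is a simplified sketch of that behaviour (the real project uses django-environ, not plain ``os.environ``); the variable name matches the export above.

```python
import os

# Sketch only: a blank export makes the lookup succeed while keeping the
# value falsy, so storage code that checks "if aws_access_key_id" stays off.
os.environ.setdefault("DJANGO_AWS_ACCESS_KEY_ID", "")

aws_access_key_id = os.environ["DJANGO_AWS_ACCESS_KEY_ID"]  # no KeyError when blank
print(repr(aws_access_key_id))
```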
-Database setup:
----------------
+Database setup
+--------------
Go to the PythonAnywhere **Databases tab** and configure your database.
@@ -109,18 +108,26 @@ Now run the migration, and collectstatic:
source $VIRTUAL_ENV/bin/postactivate
python manage.py migrate
python manage.py collectstatic
- {%- if cookiecutter.use_compressor == "y" %}python manage.py compress {%- endif %}
+ # if using django-compressor:
+ python manage.py compress
# and, optionally
python manage.py createsuperuser
+Redis
+-----
+
+PythonAnywhere does NOT offer a built-in solution for Redis. However, the production setup from Cookiecutter Django uses Redis as a cache and requires one.
+
+We recommend signing up for a separate service offering hosted Redis (e.g. Redislab) and using the URL they provide.
+
Configure the PythonAnywhere Web Tab
------------------------------------
Go to the PythonAnywhere **Web tab**, hit **Add new web app**, and choose **Manual Config**, and then the version of Python you used for your virtualenv.
-**NOTE:** *If you're using a custom domain (not on \*.pythonanywhere.com), then you'll need to set up a CNAME with your domain registrar.*
+.. note:: If you're using a custom domain (not on \*.pythonanywhere.com), then you'll need to set up a CNAME with your domain registrar.
When you're redirected back to the web app config screen, set the **path to your virtualenv**. If you used virtualenvwrapper as above, you can just enter its name.
@@ -153,15 +160,14 @@ Click through to the **WSGI configuration file** link (near the top) and edit th
Back on the Web tab, hit **Reload**, and your app should be live!
-**NOTE:** *you may see security warnings until you set up your SSL certificates. If you
-want to suppress them temporarily, set DJANGO_SECURE_SSL_REDIRECT to blank. Follow
-the instructions here to get SSL set up: https://help.pythonanywhere.com/pages/SSLOwnDomains/*
+.. note:: You may see security warnings until you set up your SSL certificates. If you want to suppress them temporarily, set ``DJANGO_SECURE_SSL_REDIRECT`` to blank. Follow `these instructions `_ to get SSL set up.
+
Optional: static files
----------------------
-If you want to use the PythonAnywhere static files service instead of using whitenoise or S3, you'll find its configuration section on the Web tab. Essentially you'll need an entry to match your ``STATIC_URL`` and ``STATIC_ROOT`` settings. There's more info here: https://help.pythonanywhere.com/pages/DjangoStaticFiles
+If you want to use the PythonAnywhere static files service instead of using whitenoise or S3, you'll find its configuration section on the Web tab. Essentially you'll need an entry to match your ``STATIC_URL`` and ``STATIC_ROOT`` settings. There's more info `in this article `_.
Future deployments
@@ -176,8 +182,9 @@ For subsequent deployments, the procedure is much simpler. In a Bash console:
git pull
python manage.py migrate
python manage.py collectstatic
- {%- if cookiecutter.use_compressor == "y" %}python manage.py compress {%- endif %}
+ # if using django-compressor:
+ python manage.py compress
And then go to the Web tab and hit **Reload**
-**TIP:** *if you're really keen, you can set up git-push based deployments: https://blog.pythonanywhere.com/87/*
+.. note:: If you're really keen, you can set up git-push based deployments: https://blog.pythonanywhere.com/87/
diff --git a/docs/developing-locally-docker.rst b/docs/developing-locally-docker.rst
index 72999be31..a7d77e108 100644
--- a/docs/developing-locally-docker.rst
+++ b/docs/developing-locally-docker.rst
@@ -3,9 +3,6 @@ Getting Up and Running Locally With Docker
.. index:: Docker
-The steps below will get you up and running with a local development environment.
-All of these commands assume you are in the root of your generated project.
-
.. note::
If you're new to Docker, please be aware that some resources are cached system-wide
@@ -18,11 +15,17 @@ Prerequisites
* Docker; if you don't have it yet, follow the `installation instructions`_;
* Docker Compose; refer to the official documentation for the `installation guide`_.
-* Pre-commit; refer to the official documentation for the [pre-commit](https://pre-commit.com/#install).
+* Pre-commit; refer to the official `pre-commit`_ documentation.
+* Cookiecutter; refer to the official GitHub repository of `Cookiecutter`_.
.. _`installation instructions`: https://docs.docker.com/install/#supported-platforms
.. _`installation guide`: https://docs.docker.com/compose/install/
.. _`pre-commit`: https://pre-commit.com/#install
+.. _`Cookiecutter`: https://github.com/cookiecutter/cookiecutter
+
+Before Getting Started
+----------------------
+.. include:: generate-project-block.rst
Build the Stack
---------------
@@ -167,16 +170,18 @@ docker
The ``container_name`` from the yml file can be used to check on containers with docker commands, for example: ::
- $ docker logs worker
- $ docker top worker
+ $ docker logs <project_slug>_local_celeryworker
+ $ docker top <project_slug>_local_celeryworker
+
+Notice that the ``container_name`` is generated dynamically, using your project slug as a prefix.
+
Mailhog
~~~~~~~
When developing locally you can go with MailHog_ for email testing provided ``use_mailhog`` was set to ``y`` on setup. To proceed,
-#. make sure ``mailhog`` container is up and running;
+#. make sure the ``<project_slug>_local_mailhog`` container is up and running;
#. open up ``http://127.0.0.1:8025``.
@@ -188,7 +193,7 @@ Celery tasks in local development
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When not using docker Celery tasks are set to run in Eager mode, so that a full stack is not needed. When using docker the task scheduler will be used by default.
-If you need tasks to be executed on the main thread during development set CELERY_TASK_ALWAYS_EAGER = True in config/settings/local.py.
+If you need tasks to be executed on the main thread during development set ``CELERY_TASK_ALWAYS_EAGER = True`` in ``config/settings/local.py``.
Possible uses could be for testing, or ease of profiling with DJDT.
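As an aside, eager mode can be illustrated with a plain-Python sketch (names here are hypothetical; real Celery tasks use ``@app.task`` and a message broker):

```python
# Hypothetical sketch of what CELERY_TASK_ALWAYS_EAGER changes: eager tasks
# run inline on the main thread; non-eager tasks go to a broker for a worker.
CELERY_TASK_ALWAYS_EAGER = True

pending_queue = []  # stand-in for the message broker

def delay(task, *args):
    if CELERY_TASK_ALWAYS_EAGER:
        return task(*args)              # executed immediately; easy to debug/profile
    pending_queue.append((task, args))  # a worker process would pick this up
    return None

def add(x, y):
    return x + y

result = delay(add, 2, 3)
print(result)  # -> 5 in eager mode
```

Running inline is what makes tasks visible to tools like DJDT during development.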
@@ -213,20 +218,20 @@ Developing locally with HTTPS
Increasingly it is becoming necessary to develop software in a secure environment in order that there are very few changes when deploying to production. Recently Facebook changed their policies for apps/sites that use Facebook login which requires the use of an HTTPS URL for the OAuth redirect URL. So if you want to use the ``users`` application with a OAuth provider such as Facebook, securing your communication to the local development environment will be necessary.
-In order to create a secure environment, we need to have a trusted SSL certficate installed in our Docker application.
+In order to create a secure environment, we need to have a trusted SSL certificate installed in our Docker application.
#. **Let's Encrypt**
-
- The official line from Let’s Encrypt is:
- [For local development section] ... The best option: Generate your own certificate, either self-signed or signed by a local root, and trust it in your operating system’s trust store. Then use that certificate in your local web server. See below for details.
+ The official line from Let’s Encrypt is:
+
+ [For local development section] ... The best option: Generate your own certificate, either self-signed or signed by a local root, and trust it in your operating system’s trust store. Then use that certificate in your local web server. See below for details.
See `letsencrypt.org - certificates-for-localhost`_
.. _`letsencrypt.org - certificates-for-localhost`: https://letsencrypt.org/docs/certificates-for-localhost/
#. **mkcert: Valid Https Certificates For Localhost**
-
+
`mkcert`_ is a simple by design tool that hides all the arcane knowledge required to generate valid TLS certificates. It works for any hostname or IP, including localhost. It supports macOS, Linux, and Windows, and Firefox, Chrome and Java. It even works on mobile devices with a couple manual steps.
See https://blog.filippo.io/mkcert-valid-https-certificates-for-localhost/
@@ -261,11 +266,11 @@ local.yml
restart: always
depends_on:
- django
-
+
...
#. Link the ``nginx-proxy`` to ``django`` through environment variables.
-
+
``django`` already has an ``.env`` file connected to it. Add the following variables. You should do this especially if you are working with a team and you want to keep your local environment details to yourself.
::
diff --git a/docs/developing-locally.rst b/docs/developing-locally.rst
index 83bc1342c..2b9438059 100644
--- a/docs/developing-locally.rst
+++ b/docs/developing-locally.rst
@@ -9,7 +9,7 @@ Setting Up Development Environment
Make sure to have the following on your host:
-* Python 3.9
+* Python 3.10
* PostgreSQL_.
* Redis_, if using Celery
* Cookiecutter_
@@ -18,15 +18,14 @@ First things first.
#. Create a virtualenv: ::
- $ python3.9 -m venv
+ $ python3.10 -m venv
#. Activate the virtualenv you have just created: ::
$ source /bin/activate
-#. Install cookiecutter-django: ::
-
- $ cookiecutter gh:pydanny/cookiecutter-django
+#.
+ .. include:: generate-project-block.rst
#. Install development requirements: ::
@@ -42,7 +41,9 @@ First things first.
#. Create a new PostgreSQL database using createdb_: ::
- $ createdb -U postgres --password
+ $ createdb --username=postgres <project_slug>
+
+ ``project_slug`` is what you have entered as the project_slug at the setup stage.
.. note::
@@ -81,13 +82,13 @@ First things first.
or if you're running asynchronously: ::
- $ uvicorn config.asgi:application --host 0.0.0.0 --reload
+ $ uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'
.. _PostgreSQL: https://www.postgresql.org/download/
.. _Redis: https://redis.io/download
.. _CookieCutter: https://github.com/cookiecutter/cookiecutter
.. _createdb: https://www.postgresql.org/docs/current/static/app-createdb.html
-.. _initial PostgreSQL set up: http://suite.opengeo.org/docs/latest/dataadmin/pgGettingStarted/firstconnect.html
+.. _initial PostgreSQL set up: https://web.archive.org/web/20190303010033/http://suite.opengeo.org/docs/latest/dataadmin/pgGettingStarted/firstconnect.html
.. _postgres documentation: https://www.postgresql.org/docs/current/static/auth-pg-hba-conf.html
.. _pre-commit: https://pre-commit.com/
.. _direnv: https://direnv.net/
@@ -140,22 +141,55 @@ In production, we have Mailgun_ configured to have your back!
Celery
------
-If the project is configured to use Celery as a task scheduler then by default tasks are set to run on the main thread
-when developing locally. If you have the appropriate setup on your local machine then set the following
-in ``config/settings/local.py``::
+If the project is configured to use Celery as a task scheduler then, by default, tasks are set to run on the main thread when developing locally instead of getting sent to a broker. However, if you have Redis set up on your local machine, you can set the following in ``config/settings/local.py``::
CELERY_TASK_ALWAYS_EAGER = False
-
-To run Celery locally, make sure redis-server is installed (instructions are available at https://redis.io/topics/quickstart), run the server in one terminal with `redis-server`, and then start celery in another terminal with the following command::
-
- celery -A config.celery_app worker --loglevel=info
+
+Next, make sure ``redis-server`` is installed (per the `Getting started with Redis`_ guide) and run the server in one terminal::
+
+ $ redis-server
+
+Start the Celery worker by running the following command in another terminal::
+
+ $ celery -A config.celery_app worker --loglevel=info
+
+That Celery worker should be running whenever your app is running, typically as a background process,
+so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.
+
+The project comes with a simple task for manual testing purposes, inside ``<project_slug>/users/tasks.py``. To queue that task locally, start the Django shell, import the task, and call ``delay()`` on it::
+
+ $ python manage.py shell
+ >>> from <project_slug>.users.tasks import get_users_count
+ >>> get_users_count.delay()
+
+You can also use Django admin to queue up tasks, thanks to the `django-celery-beat`_ package.
+
+.. _Getting started with Redis: https://redis.io/docs/getting-started/
+.. _Celery Workers Guide: https://docs.celeryq.dev/en/stable/userguide/workers.html
+.. _django-celery-beat: https://django-celery-beat.readthedocs.io/en/latest/
Sass Compilation & Live Reloading
---------------------------------
-If you’d like to take advantage of live reloading and Sass compilation you can do so with a little
-bit of preparation, see :ref:`sass-compilation-live-reload`.
+If you've opted for Gulp as your front-end pipeline, the project comes configured with `Sass`_ compilation and `live reloading`_. As you change your Sass/JS source files, the task runner will automatically rebuild the corresponding CSS and JS assets and reload them in your browser without refreshing the page.
+
+#. Make sure that `Node.js`_ v16 is installed on your machine.
+#. In the project root, install the JS dependencies with::
+
+ $ npm install
+
+#. Now - with your virtualenv activated - start the application by running::
+
+ $ npm run dev
+
+ The app will now run with live reloading enabled, applying front-end changes dynamically.
+
+.. note:: The task will start 2 processes in parallel: the static assets build loop on one side, and the Django server on the other. You do NOT need to run Django as you would normally with ``manage.py runserver``.
+
+.. _Node.js: http://nodejs.org/download/
+.. _Sass: https://sass-lang.com/
+.. _live reloading: https://browsersync.io
Summary
-------
diff --git a/docs/document.rst b/docs/document.rst
index a30979094..974c66c69 100644
--- a/docs/document.rst
+++ b/docs/document.rst
@@ -13,7 +13,7 @@ If you set up your project to `develop locally with docker`_, run the following
$ docker-compose -f local.yml up docs
-Navigate to port 7000 on your host to see the documentation. This will be opened automatically at `localhost`_ for local, non-docker development.
+Navigate to port 9000 on your host to see the documentation. This will be opened automatically at `localhost`_ for local, non-docker development.
Note: using Docker for documentation sets up a temporary SQLite file by setting the environment variable ``DATABASE_URL=sqlite:///readthedocs.db`` in ``docs/conf.py`` to avoid a dependency on PostgreSQL.
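As a minimal sketch of how a ``DATABASE_URL`` like that decomposes (django-environ performs the real parsing; ``urlparse`` just exposes the pieces):

```python
from urllib.parse import urlparse

# Sketch only: shows which components the SQLite URL from docs/conf.py carries.
parts = urlparse("sqlite:///readthedocs.db")
print(parts.scheme)  # -> "sqlite"
print(parts.path)    # -> "/readthedocs.db" (the on-disk file readthedocs.db)
```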
@@ -36,7 +36,7 @@ To setup your documentation on `ReadTheDocs`_, you must
Additionally, you can auto-build Pull Request previews, but `you must enable it`_.
-.. _localhost: http://localhost:7000/
+.. _localhost: http://localhost:9000/
.. _Sphinx: https://www.sphinx-doc.org/en/master/index.html
.. _develop locally: ./developing-locally.html
.. _develop locally with docker: ./developing-locally-docker.html
diff --git a/docs/faq.rst b/docs/faq.rst
index 59e82465f..52a99467c 100644
--- a/docs/faq.rst
+++ b/docs/faq.rst
@@ -6,11 +6,11 @@ FAQ
Why is there a django.contrib.sites directory in Cookiecutter Django?
---------------------------------------------------------------------
-It is there to add a migration so you don't have to manually change the ``sites.Site`` record from ``example.com`` to whatever your domain is. Instead, your ``{{cookiecutter.domain_name}}`` and {{cookiecutter.project_name}} value is placed by **Cookiecutter** in the domain and name fields respectively.
+It is there to add a migration so you don't have to manually change the ``sites.Site`` record from ``example.com`` to whatever your domain is. Instead, your ``{{cookiecutter.domain_name}}`` and ``{{cookiecutter.project_name}}`` value is placed by **Cookiecutter** in the domain and name fields respectively.
See `0003_set_site_domain_and_name.py`_.
-.. _`0003_set_site_domain_and_name.py`: https://github.com/pydanny/cookiecutter-django/blob/master/%7B%7Bcookiecutter.project_slug%7D%7D/%7B%7Bcookiecutter.project_slug%7D%7D/contrib/sites/migrations/0003_set_site_domain_and_name.py
+.. _`0003_set_site_domain_and_name.py`: https://github.com/cookiecutter/cookiecutter-django/blob/master/%7B%7Bcookiecutter.project_slug%7D%7D/%7B%7Bcookiecutter.project_slug%7D%7D/contrib/sites/migrations/0003_set_site_domain_and_name.py
Why aren't you using just one configuration file (12-Factor App)
diff --git a/docs/generate-project-block.rst b/docs/generate-project-block.rst
new file mode 100644
index 000000000..2842b551d
--- /dev/null
+++ b/docs/generate-project-block.rst
@@ -0,0 +1,7 @@
+Generate a new cookiecutter-django project: ::
+
+ $ cookiecutter gh:cookiecutter/cookiecutter-django
+
+For more information refer to
+:ref:`Project Generation Options <template-options>`.
+
diff --git a/docs/index.rst b/docs/index.rst
index f62184643..dae641d10 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -1,13 +1,14 @@
.. cookiecutter-django documentation master file.
Welcome to Cookiecutter Django's documentation!
-====================================================================
+===============================================
-A Cookiecutter_ template for Django.
+Powered by Cookiecutter_, Cookiecutter Django is a project template for jumpstarting production-ready Django projects. The template offers a number of generation options; we invite you to check the :ref:`dedicated page <template-options>` to learn more about each of them.
.. _cookiecutter: https://github.com/cookiecutter/cookiecutter
-Contents:
+Contents
+--------
.. toctree::
:maxdepth: 2
@@ -28,7 +29,7 @@ Contents:
troubleshooting
Indices and tables
-==================
+------------------
* :ref:`genindex`
* :ref:`search`
diff --git a/docs/live-reloading-and-sass-compilation.rst b/docs/live-reloading-and-sass-compilation.rst
deleted file mode 100644
index a55b4fd8c..000000000
--- a/docs/live-reloading-and-sass-compilation.rst
+++ /dev/null
@@ -1,24 +0,0 @@
-.. _sass-compilation-live-reload:
-
-Sass Compilation & Live Reloading
-=================================
-
-If you'd like to take advantage of `live reload`_ and Sass compilation:
-
-- Make sure that nodejs_ is installed. Then in the project root run::
-
- $ npm install
-
-.. _nodejs: http://nodejs.org/download/
-
-- Now you just need::
-
- $ npm run dev
-
-The base app will now run as it would with the usual ``manage.py runserver`` but with live reloading and Sass compilation enabled.
-When changing your Sass files, they will be automatically recompiled and change will be reflected in your browser without refreshing.
-
-To get live reloading to work you'll probably need to install an `appropriate browser extension`_
-
-.. _live reload: http://livereload.com/
-.. _appropriate browser extension: http://livereload.com/extensions/
diff --git a/docs/mailgun.rst b/docs/mailgun.rst
index 1f34e3c86..b5f49b625 100644
--- a/docs/mailgun.rst
+++ b/docs/mailgun.rst
@@ -1,7 +1,7 @@
If your email server used to send email isn't configured properly (Mailgun by default),
attempting to send an email will cause an Internal Server Error.
-By default, django-allauth is setup to `have emails verifications mandatory`_,
+By default, ``django-allauth`` is set up to `have emails verifications mandatory`_,
which means it'll send a verification email when an unverified user tries to
log-in or when someone tries to sign-up.
diff --git a/docs/project-generation-options.rst b/docs/project-generation-options.rst
index a2573b94d..0560badd3 100644
--- a/docs/project-generation-options.rst
+++ b/docs/project-generation-options.rst
@@ -1,6 +1,12 @@
+.. _template-options:
+
Project Generation Options
==========================
+This page describes all the template options that you will be prompted for by the `cookiecutter CLI`_ prior to generating your project.
+
+.. _cookiecutter CLI: https://github.com/cookiecutter/cookiecutter
+
project_name:
Your project's human-readable name, capitals and spaces allowed.
@@ -49,23 +55,19 @@ use_docker:
postgresql_version:
Select a PostgreSQL_ version to use. The choices are:
- 1. 13.2
- 2. 12.6
- 3. 11.11
- 4. 10.16
-
-js_task_runner:
- Select a JavaScript task runner. The choices are:
-
- 1. None
- 2. Gulp_
+ 1. 14
+ 2. 13
+ 3. 12
+ 4. 11
+ 5. 10
cloud_provider:
Select a cloud provider for static & media files. The choices are:
1. AWS_
2. GCP_
- 3. None
+ 3. Azure_
+ 4. None
Note that if you choose no cloud provider, media files won't work.
@@ -88,13 +90,12 @@ use_async:
use_drf:
Indicates whether the project should be configured to use `Django Rest Framework`_.
-custom_bootstrap_compilation:
- Indicates whether the project should support Bootstrap recompilation
- via the selected JavaScript task runner's task. This can be useful
- for real-time Bootstrap variable alteration.
+frontend_pipeline:
+ Select a pipeline to compile and optimise frontend assets (JS, CSS, ...):
-use_compressor:
- Indicates whether the project should be configured to use `Django Compressor`_.
+ 1. None
+ 2. `Django Compressor`_
+ 3. `Gulp`_: supports Bootstrap recompilation with real-time variable alteration.
use_celery:
Indicates whether the project should be configured to use Celery_.
@@ -147,6 +148,7 @@ debug:
.. _AWS: https://aws.amazon.com/s3/
.. _GCP: https://cloud.google.com/storage/
+.. _Azure: https://azure.microsoft.com/en-us/products/storage/blobs/
.. _Amazon SES: https://aws.amazon.com/ses/
.. _Mailgun: https://www.mailgun.com
diff --git a/docs/requirements.txt b/docs/requirements.txt
new file mode 100644
index 000000000..2cc8302a2
--- /dev/null
+++ b/docs/requirements.txt
@@ -0,0 +1,2 @@
+sphinx==5.3.0
+sphinx-rtd-theme==1.1.1
diff --git a/docs/settings.rst b/docs/settings.rst
index 7563f50d2..4691adbbd 100644
--- a/docs/settings.rst
+++ b/docs/settings.rst
@@ -46,8 +46,12 @@ DJANGO_AWS_SECRET_ACCESS_KEY AWS_SECRET_ACCESS_KEY n/a
DJANGO_AWS_STORAGE_BUCKET_NAME AWS_STORAGE_BUCKET_NAME n/a raises error
DJANGO_AWS_S3_REGION_NAME AWS_S3_REGION_NAME n/a None
DJANGO_AWS_S3_CUSTOM_DOMAIN AWS_S3_CUSTOM_DOMAIN n/a None
+DJANGO_AWS_S3_MAX_MEMORY_SIZE AWS_S3_MAX_MEMORY_SIZE n/a 100_000_000
DJANGO_GCP_STORAGE_BUCKET_NAME GS_BUCKET_NAME n/a raises error
GOOGLE_APPLICATION_CREDENTIALS n/a n/a raises error
+DJANGO_AZURE_ACCOUNT_KEY AZURE_ACCOUNT_KEY n/a raises error
+DJANGO_AZURE_ACCOUNT_NAME AZURE_ACCOUNT_NAME n/a raises error
+DJANGO_AZURE_CONTAINER_NAME AZURE_CONTAINER n/a raises error
SENTRY_DSN SENTRY_DSN n/a raises error
SENTRY_ENVIRONMENT n/a n/a production
SENTRY_TRACES_SAMPLE_RATE n/a n/a 0.0
diff --git a/docs/testing.rst b/docs/testing.rst
index dd6fcb48f..bea45c6dd 100644
--- a/docs/testing.rst
+++ b/docs/testing.rst
@@ -28,10 +28,15 @@ Coverage
You should build your tests to provide the highest level of **code coverage**. You can run the ``pytest`` with code ``coverage`` by typing in the following command: ::
- $ docker-compose -f local.yml run --rm django coverage run -m pytest
+ $ coverage run -m pytest
Once the tests are complete, in order to see the code coverage, run the following command: ::
+ $ coverage report
+
+If you're running the project locally with Docker, use these commands instead: ::
+
+ $ docker-compose -f local.yml run --rm django coverage run -m pytest
$ docker-compose -f local.yml run --rm django coverage report
.. note::
@@ -53,4 +58,4 @@ Once the tests are complete, in order to see the code coverage, run the followin
.. _develop locally with docker: ./developing-locally-docker.html
.. _customize: https://docs.pytest.org/en/latest/customize.html
.. _unittest: https://docs.python.org/3/library/unittest.html#module-unittest
-.. _configuring: https://coverage.readthedocs.io/en/v4.5.x/config.html
+.. _configuring: https://coverage.readthedocs.io/en/latest/config.html
diff --git a/docs/troubleshooting.rst b/docs/troubleshooting.rst
index 2df9db344..708acb4a7 100644
--- a/docs/troubleshooting.rst
+++ b/docs/troubleshooting.rst
@@ -47,5 +47,5 @@ Others
#. To create a new app using the recommended directory structure, run `django-admin startapp --template=../startapp_template myappname` from inside the project root. This template can be customized to suit your needs. (see `startapp`_)
-.. _#528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
+.. _#528: https://github.com/cookiecutter/cookiecutter-django/issues/528#issuecomment-212650373
.. _startapp: https://docs.djangoproject.com/en/dev/ref/django-admin/#startapp
diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py
index 1c85288f3..d3113e24c 100644
--- a/hooks/post_gen_project.py
+++ b/hooks/post_gen_project.py
@@ -5,7 +5,8 @@ NOTE:
can potentially be run in Python 2.x environment
(at least so we presume in `pre_gen_project.py`).
-TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only
+TODO: restrict Cookiecutter Django project initialization to
+ Python 3.x environments only
"""
from __future__ import print_function
@@ -59,6 +60,10 @@ def remove_docker_files():
file_names = ["local.yml", "production.yml", ".dockerignore"]
for file_name in file_names:
os.remove(file_name)
+ if "{{ cookiecutter.use_pycharm }}".lower() == "y":
+ file_names = ["docker_compose_up_django.xml", "docker_compose_up_docs.xml"]
+ for file_name in file_names:
+ os.remove(os.path.join(".idea", "runConfigurations", file_name))
def remove_utility_files():
@@ -86,6 +91,11 @@ def remove_gulp_files():
file_names = ["gulpfile.js"]
for file_name in file_names:
os.remove(file_name)
+ remove_sass_files()
+
+
+def remove_sass_files():
+ shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "static", "sass"))
def remove_packagejson_file():
@@ -128,13 +138,6 @@ def remove_dotgithub_folder():
shutil.rmtree(".github")
-def append_to_project_gitignore(path):
- gitignore_file_path = ".gitignore"
- with open(gitignore_file_path, "a") as gitignore_file:
- gitignore_file.write(path)
- gitignore_file.write(os.linesep)
-
-
def generate_random_string(
length, using_digits=False, using_ascii_letters=False, using_punctuation=False
):
@@ -165,8 +168,8 @@ def set_flag(file_path, flag, value=None, formatted=None, *args, **kwargs):
random_string = generate_random_string(*args, **kwargs)
if random_string is None:
print(
- "We couldn't find a secure pseudo-random number generator on your system. "
- "Please, make sure to manually {} later.".format(flag)
+ "We couldn't find a secure pseudo-random number generator on your "
+ "system. Please, make sure to manually {} later.".format(flag)
)
random_string = flag
if formatted is not None:
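The ``generate_random_string`` helper that ``set_flag`` relies on can be sketched as follows. This is an illustrative reconstruction based on the signature and the warning message above; the hook's actual implementation may differ in details such as which punctuation characters it allows.

```python
import random
import string

def generate_random_string(
    length, using_digits=False, using_ascii_letters=False, using_punctuation=False
):
    """Illustrative sketch of the hook's helper; the real code may differ."""
    symbols = ""
    if using_digits:
        symbols += string.digits
    if using_ascii_letters:
        symbols += string.ascii_letters
    if using_punctuation:
        symbols += string.punctuation
    if not symbols:
        return None
    try:
        # The "secure pseudo-random number generator" the warning refers to.
        rng = random.SystemRandom()
    except NotImplementedError:
        return None  # set_flag then falls back to the flag placeholder
    return "".join(rng.choice(symbols) for _ in range(length))

secret = generate_random_string(64, using_digits=True, using_ascii_letters=True)
print(len(secret))  # -> 64
```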
@@ -249,10 +252,10 @@ def set_celery_flower_password(file_path, value=None):
return celery_flower_password
-def append_to_gitignore_file(s):
+def append_to_gitignore_file(ignored_line):
with open(".gitignore", "a") as gitignore_file:
- gitignore_file.write(s)
- gitignore_file.write(os.linesep)
+ gitignore_file.write(ignored_line)
+ gitignore_file.write("\n")
def set_flags_in_envs(postgres_user, celery_flower_user, debug=False):
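The rewritten ``append_to_gitignore_file`` can be exercised on its own; note that writing a literal ``"\n"`` instead of ``os.linesep`` keeps generated ``.gitignore`` files LF-only on every platform. A small usage sketch in a throwaway directory:

```python
import os
import tempfile

def append_to_gitignore_file(ignored_line):
    # Same logic as the hook: append the line and terminate it with "\n".
    with open(".gitignore", "a") as gitignore_file:
        gitignore_file.write(ignored_line)
        gitignore_file.write("\n")

previous_dir = os.getcwd()
with tempfile.TemporaryDirectory() as tmp:
    os.chdir(tmp)
    try:
        append_to_gitignore_file("!.envs/.local/")
        append_to_gitignore_file(".env")
        with open(".gitignore") as f:
            content = f.read()
    finally:
        os.chdir(previous_dir)

print(content)  # -> "!.envs/.local/\n.env\n"
```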
@@ -291,6 +294,7 @@ def set_flags_in_settings_files():
def remove_envs_and_associated_files():
shutil.rmtree(".envs")
os.remove("merge_production_dotenvs_in_dotenv.py")
+ shutil.rmtree("tests")
def remove_celery_compose_dirs():
@@ -319,6 +323,11 @@ def remove_drf_starter_files():
"{{cookiecutter.project_slug}}", "users", "tests", "test_drf_views.py"
)
)
+ os.remove(
+ os.path.join(
+ "{{cookiecutter.project_slug}}", "users", "tests", "test_swagger.py"
+ )
+ )
shutil.rmtree(os.path.join("startapp_template", "api"))
os.remove(os.path.join("startapp_template", "tests", "test_drf_urls.py-tpl"))
os.remove(os.path.join("startapp_template", "tests", "test_drf_views.py-tpl"))
@@ -353,13 +362,13 @@ def main():
if (
"{{ cookiecutter.use_docker }}".lower() == "y"
- and "{{ cookiecutter.cloud_provider}}".lower() != "aws"
+ and "{{ cookiecutter.cloud_provider}}" != "AWS"
):
remove_aws_dockerfile()
if "{{ cookiecutter.use_heroku }}".lower() == "n":
remove_heroku_files()
- elif "{{ cookiecutter.use_compressor }}".lower() == "n":
+ elif "{{ cookiecutter.frontend_pipeline }}" != "Django Compressor":
remove_heroku_build_hooks()
if (
@@ -379,13 +388,13 @@ def main():
if "{{ cookiecutter.keep_local_envs_in_vcs }}".lower() == "y":
append_to_gitignore_file("!.envs/.local/")
- if "{{ cookiecutter.js_task_runner}}".lower() == "none":
+ if "{{ cookiecutter.frontend_pipeline }}" != "Gulp":
remove_gulp_files()
remove_packagejson_file()
if "{{ cookiecutter.use_docker }}".lower() == "y":
remove_node_dockerfile()
- if "{{ cookiecutter.cloud_provider}}".lower() == "none":
+ if "{{ cookiecutter.cloud_provider}}" == "None":
print(
WARNING + "You chose not to use a cloud provider, "
"media files won't be served in production." + TERMINATOR
@@ -397,13 +406,13 @@ def main():
if "{{ cookiecutter.use_docker }}".lower() == "y":
remove_celery_compose_dirs()
- if "{{ cookiecutter.ci_tool }}".lower() != "travis":
+ if "{{ cookiecutter.ci_tool }}" != "Travis":
remove_dottravisyml_file()
- if "{{ cookiecutter.ci_tool }}".lower() != "gitlab":
+ if "{{ cookiecutter.ci_tool }}" != "Gitlab":
remove_dotgitlabciyml_file()
- if "{{ cookiecutter.ci_tool }}".lower() != "github":
+ if "{{ cookiecutter.ci_tool }}" != "Github":
remove_dotgithub_folder()
if "{{ cookiecutter.use_drf }}".lower() == "n":
diff --git a/hooks/pre_gen_project.py b/hooks/pre_gen_project.py
index 70c29557c..c3eef1e43 100644
--- a/hooks/pre_gen_project.py
+++ b/hooks/pre_gen_project.py
@@ -4,7 +4,8 @@ NOTE:
as the whole Cookiecutter Django project initialization
can potentially be run in Python 2.x environment.
-TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only
+TODO: restrict Cookiecutter Django project initialization
+ to Python 3.x environments only
"""
from __future__ import print_function
@@ -35,11 +36,11 @@ if "{{ cookiecutter.use_docker }}".lower() == "n":
if python_major_version == 2:
print(
WARNING + "You're running cookiecutter under Python 2, but the generated "
- "project requires Python 3.9+. Do you want to proceed (y/n)? " + TERMINATOR
+ "project requires Python 3.10+. Do you want to proceed (y/n)? " + TERMINATOR
)
yes_options, no_options = frozenset(["y"]), frozenset(["n"])
while True:
- choice = raw_input().lower()
+ choice = raw_input().lower() # noqa: F821
if choice in yes_options:
break
@@ -65,18 +66,17 @@ if (
and "{{ cookiecutter.cloud_provider }}" == "None"
):
print(
- "You should either use Whitenoise or select a Cloud Provider to serve static files"
+ "You should either use Whitenoise or select a "
+ "Cloud Provider to serve static files"
)
sys.exit(1)
if (
- "{{ cookiecutter.cloud_provider }}" == "GCP"
- and "{{ cookiecutter.mail_service }}" == "Amazon SES"
-) or (
- "{{ cookiecutter.cloud_provider }}" == "None"
- and "{{ cookiecutter.mail_service }}" == "Amazon SES"
+ "{{ cookiecutter.mail_service }}" == "Amazon SES"
+ and "{{ cookiecutter.cloud_provider }}" != "AWS"
):
print(
- "You should either use AWS or select a different Mail Service for sending emails."
+ "You should either use AWS or select a different "
+ "Mail Service for sending emails."
)
sys.exit(1)
diff --git a/pytest.ini b/pytest.ini
index 03ca13891..52506f47d 100644
--- a/pytest.ini
+++ b/pytest.ini
@@ -1,4 +1,3 @@
[pytest]
addopts = -v --tb=short
-python_paths = .
norecursedirs = .tox .git */migrations/* */static/* docs venv */{{cookiecutter.project_slug}}/*
diff --git a/requirements.txt b/requirements.txt
index 7ba8f8e41..2b7c69d33 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,24 +1,26 @@
-cookiecutter==1.7.3
-sh==1.14.2
+cookiecutter==2.1.1
+sh==1.14.3; sys_platform != "win32"
binaryornot==0.4.4
# Code quality
# ------------------------------------------------------------------------------
-black==21.9b0
-isort==5.9.3
-flake8==3.9.2
-flake8-isort==4.0.0
-pre-commit==2.15.0
+black==22.12.0
+isort==5.12.0
+flake8==6.0.0
+flake8-isort==6.0.0
+pre-commit==3.0.1
# Testing
# ------------------------------------------------------------------------------
-tox==3.24.4
-pytest==6.2.5
+tox==4.4.2
+pytest==7.2.1
pytest-cookies==0.6.1
pytest-instafail==0.4.2
-pyyaml==5.4.1
+pyyaml==6.0
# Scripting
# ------------------------------------------------------------------------------
-PyGithub==1.55
-jinja2==3.0.2
+PyGithub==1.57
+gitpython==3.1.30
+jinja2==3.1.2
+requests==2.28.2
diff --git a/scripts/__init__.py b/scripts/__init__.py
index 8b1378917..e69de29bb 100644
--- a/scripts/__init__.py
+++ b/scripts/__init__.py
@@ -1 +0,0 @@
-
diff --git a/scripts/create_django_issue.py b/scripts/create_django_issue.py
new file mode 100644
index 000000000..5809f393d
--- /dev/null
+++ b/scripts/create_django_issue.py
@@ -0,0 +1,320 @@
+"""
+Creates an issue containing a table that tracks whether each dependency
+supports the latest Django version. "Latest" ignores patch releases:
+only major and minor version numbers are compared.
+
+The script also handles the case where several Django versions need
+to be kept up to date at once.
+"""
+from __future__ import annotations
+
+import os
+import re
+import sys
+from collections.abc import Iterable
+from pathlib import Path
+from typing import TYPE_CHECKING, Any, NamedTuple
+
+import requests
+from github import Github
+
+if TYPE_CHECKING:
+ from github.Issue import Issue
+
+CURRENT_FILE = Path(__file__)
+ROOT = CURRENT_FILE.parents[1]
+REQUIREMENTS_DIR = ROOT / "{{cookiecutter.project_slug}}" / "requirements"
+GITHUB_TOKEN = os.getenv("GITHUB_TOKEN", None)
+GITHUB_REPO = os.getenv("GITHUB_REPOSITORY", None)
+
+
+class DjVersion(NamedTuple):
+ """
+ Wrapper to parse, compare and render Django versions.
+
+ Only keeps track of (major, minor) versions, excluding patches and pre-releases.
+ """
+
+ major: int
+ minor: int
+
+ def __str__(self) -> str:
+ """To render as string."""
+ return f"{self.major}.{self.minor}"
+
+ @classmethod
+ def parse(cls, version_str: str) -> DjVersion:
+ """Parse interesting values from the version string."""
+ major, minor, *_ = version_str.split(".")
+ return cls(major=int(major), minor=int(minor))
+
+ @classmethod
+ def parse_to_tuple(cls, version_str: str):
+ version = cls.parse(version_str=version_str)
+ return version.major, version.minor
+
+
+def get_package_info(package: str) -> dict:
+ """Get package metadata using PyPI API."""
+ # "django" converts to "Django" on redirect
+ r = requests.get(f"https://pypi.org/pypi/{package}/json", allow_redirects=True)
+ if not r.ok:
+ print(f"Couldn't find package: {package}")
+ sys.exit(1)
+ return r.json()
+
+
+def get_django_versions() -> Iterable[DjVersion]:
+ """List all django versions."""
+ django_package_info: dict[str, Any] = get_package_info("django")
+ releases = django_package_info["releases"].keys()
+ for release_str in releases:
+ if release_str.replace(".", "").isdigit():
+ # Exclude pre-releases with non-numeric characters in version
+ yield DjVersion.parse(release_str)
+
+
+def get_name_and_version(requirements_line: str) -> tuple[str, ...]:
+ """Get the name a version of a package from a line in the requirement file."""
+ full_name, version = requirements_line.split(" ", 1)[0].split("==")
+ name_without_extras = full_name.split("[", 1)[0]
+ return name_without_extras, version
+
+
+def get_all_latest_django_versions(
+ django_max_version: tuple[int, int] | None = None,
+) -> tuple[DjVersion, list[DjVersion]]:
+ """
+ Grab all Django versions that warrant a GitHub issue: those with a
+ higher major or minor version than the one pinned in base.txt.
+ """
+ _django_max_version = (99, 99)
+ if django_max_version:
+ _django_max_version = django_max_version
+
+ print("Fetching all Django versions from PyPI")
+ base_txt = REQUIREMENTS_DIR / "base.txt"
+ with base_txt.open() as f:
+ for line in f.readlines():
+ if "django==" in line.lower():
+ break
+ else:
+ print(f"django not found in {base_txt}") # Huh...?
+ sys.exit(1)
+
+ # Begin parsing and verification
+ _, current_version_str = get_name_and_version(line)
+ # Get a tuple of (major, minor) - ignoring patch version
+ current_minor_version = DjVersion.parse(current_version_str)
+ newer_versions: set[DjVersion] = set()
+ for django_version in get_django_versions():
+ if current_minor_version < django_version <= _django_max_version:
+ newer_versions.add(django_version)
+
+ return current_minor_version, sorted(newer_versions, reverse=True)
+
+
+_TABLE_HEADER = """
+
+## {file}.txt
+
+| Name | Version in Master | {dj_version} Compatible Version | OK |
+| ---- | :---------------: | :-----------------------------: | :-: |
+"""
+VITAL_BUT_UNKNOWN = [
+ "django-environ", # not updated often
+]
+
+
+class GitHubManager:
+ def __init__(self, base_dj_version: DjVersion, needed_dj_versions: list[DjVersion]):
+ self.github = Github(GITHUB_TOKEN)
+ self.repo = self.github.get_repo(GITHUB_REPO)
+
+ self.base_dj_version = base_dj_version
+ self.needed_dj_versions = needed_dj_versions
+ # (major+minor) Version and description
+ self.existing_issues: dict[DjVersion, Issue] = {}
+
+ # Load all requirements from our requirements files and preload their
+ # package information like a cache:
+ self.requirements_files = ["base", "local", "production"]
+ # Format:
+ # requirement file name: {package name: (master_version, package_info)}
+ self.requirements: dict[str, dict[str, tuple[str, dict]]] = {
+ x: {} for x in self.requirements_files
+ }
+
+ def setup(self) -> None:
+ self.load_requirements()
+ self.load_existing_issues()
+
+ def load_requirements(self):
+ print("Reading requirements")
+ for requirements_file in self.requirements_files:
+ with (REQUIREMENTS_DIR / f"{requirements_file}.txt").open() as f:
+ for line in f.readlines():
+ if (
+ "==" in line
+ and not line.startswith("{%")
+ and not line.startswith(" #")
+ and not line.startswith("#")
+ and not line.startswith(" ")
+ ):
+ name, version = get_name_and_version(line)
+ self.requirements[requirements_file][name] = (
+ version,
+ get_package_info(name),
+ )
+
+ def load_existing_issues(self):
+ """Closes the issue if the base Django version is greater than needed"""
+ print("Load existing issues from GitHub")
+ qualifiers = {
+ "repo": GITHUB_REPO,
+ "author": "app/github-actions",
+ "state": "open",
+ "is": "issue",
+ "in": "title",
+ }
+ issues = list(
+ self.github.search_issues(
+ "[Django Update]", "created", "desc", **qualifiers
+ )
+ )
+ print(f"Found {len(issues)} issues matching search")
+ for issue in issues:
+ matches = re.match(r"\[Update Django] Django (\d+\.\d+)$", issue.title)
+ if not matches:
+ continue
+ issue_version = DjVersion.parse(matches.group(1))
+ if self.base_dj_version > issue_version:
+ issue.edit(state="closed")
+ print(f"Closed issue {issue.title} (ID: [{issue.id}]({issue.url}))")
+ else:
+ self.existing_issues[issue_version] = issue
+
+ def get_compatibility(
+ self, package_name: str, package_info: dict, needed_dj_version: DjVersion
+ ):
+ """
+ Verify compatibility via setup.py classifiers. If Django is not in the
+ classifiers, then default compatibility is n/a and OK is ✅.
+
+ If it's a package that's vital but known to not be updated often, we give it
+ a ❓. If a package has ❓ or 🕒, then we allow manual update. Automatic updates
+ only include ❌ and ✅.
+ """
+ # If the issue already exists, reuse the previous result for packages
+ # already marked good to go, manually updated, or awaiting a known release
+ if issue := self.existing_issues.get(needed_dj_version):
+ if (index := issue.body.find(package_name)) != -1:
+ name, _current, prev_compat, ok = (
+ s.strip() for s in issue.body[index:].split("|", 4)[:4]
+ )
+ if ok in ("✅", "❓", "🕒"):
+ return prev_compat, ok
+
+ if package_name in VITAL_BUT_UNKNOWN:
+ return "", "❓"
+
+ # Check classifiers if it includes Django
+ supported_dj_versions: list[DjVersion] = []
+ for classifier in package_info["info"]["classifiers"]:
+ # Usually in the form of "Framework :: Django :: 3.2"
+ tokens = classifier.split(" ")
+ if len(tokens) >= 5 and tokens[2].lower() == "django":
+ version = DjVersion.parse(tokens[4])
+ if len(version) == 2:
+ supported_dj_versions.append(version)
+
+ if supported_dj_versions:
+ if any(v >= needed_dj_version for v in supported_dj_versions):
+ return package_info["info"]["version"], "✅"
+ else:
+ return "", "❌"
+
+ # No Django classifier present; assume it isn't a Django-specific lib.
+ # Notable exceptions such as pylint-django must be checked manually.
+ return "n/a", "✅"
+
+ HOME_PAGE_URL_KEYS = [
+ "home_page",
+ "project_url",
+ "docs_url",
+ "package_url",
+ "release_url",
+ "bugtrack_url",
+ ]
+
+ def _get_md_home_page_url(self, package_info: dict):
+ urls = [
+ package_info["info"].get(url_key) for url_key in self.HOME_PAGE_URL_KEYS
+ ]
+ try:
+ return f"[{{}}]({next(item for item in urls if item)})"
+ except StopIteration:
+ return "{}"
+
+ def generate_markdown(self, needed_dj_version: DjVersion):
+ requirements = f"{needed_dj_version} requirements tables\n\n"
+ for _file in self.requirements_files:
+ requirements += _TABLE_HEADER.format_map(
+ {"file": _file, "dj_version": needed_dj_version}
+ )
+ for package_name, (version, info) in self.requirements[_file].items():
+ compat_version, icon = self.get_compatibility(
+ package_name, info, needed_dj_version
+ )
+ requirements += (
+ f"| {self._get_md_home_page_url(info).format(package_name)} "
+ f"| {version.strip()} "
+ f"| {compat_version.strip()} "
+ f"| {icon} "
+ f"|\n"
+ )
+
+ return requirements
+
+ def create_or_edit_issue(self, needed_dj_version: DjVersion, description: str):
+ if issue := self.existing_issues.get(needed_dj_version):
+ print(f"Editing issue #{issue.number} for Django {needed_dj_version}")
+ issue.edit(body=description)
+ else:
+ print(f"Creating new issue for Django {needed_dj_version}")
+ issue = self.repo.create_issue(
+ f"[Update Django] Django {needed_dj_version}", description
+ )
+ issue.add_to_labels(f"django{needed_dj_version}")
+
+ def generate(self):
+ for version in self.needed_dj_versions:
+ print(f"Handling GitHub issue for Django {version}")
+ md_content = self.generate_markdown(version)
+ print(f"Generated markdown:\n\n{md_content}")
+ self.create_or_edit_issue(version, md_content)
+
+
+def main(django_max_version=None) -> None:
+ # Check if there are any newer Django versions to report
+ current_dj, latest_djs = get_all_latest_django_versions(
+ django_max_version=django_max_version
+ )
+ if not latest_djs:
+ sys.exit(0)
+ manager = GitHubManager(current_dj, latest_djs)
+ manager.setup()
+ manager.generate()
+
+
+if __name__ == "__main__":
+ if GITHUB_REPO is None:
+ raise RuntimeError(
+ "No github repo, please set the environment variable GITHUB_REPOSITORY"
+ )
+ max_version = None
+ last_arg = sys.argv[-1]
+ if CURRENT_FILE.name not in last_arg:
+ max_version = DjVersion.parse_to_tuple(version_str=last_arg)
+
+ main(django_max_version=max_version)
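The `DjVersion` wrapper in the new script leans on `NamedTuple`'s field-by-field ordering; a condensed, self-contained sketch of that idea:

```python
from __future__ import annotations

from typing import NamedTuple


class DjVersion(NamedTuple):
    """(major, minor) only; patch and pre-release parts are discarded."""

    major: int
    minor: int

    def __str__(self) -> str:
        return f"{self.major}.{self.minor}"

    @classmethod
    def parse(cls, version_str: str) -> DjVersion:
        # "4.1.5" -> DjVersion(4, 1); any extra components are ignored
        major, minor, *_ = version_str.split(".")
        return cls(major=int(major), minor=int(minor))


# Tuple ordering makes comparison and sorting work with no extra code:
versions = sorted(DjVersion.parse(v) for v in ("4.1.5", "3.2.18", "4.0"))
```

Because `NamedTuple` compares field by field, `DjVersion(4, 0) < DjVersion(4, 1)` holds automatically, which is all the script needs to find versions newer than the pinned one.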
diff --git a/scripts/update_changelog.py b/scripts/update_changelog.py
index 08ff5b095..b50d25066 100644
--- a/scripts/update_changelog.py
+++ b/scripts/update_changelog.py
@@ -1,22 +1,31 @@
+import datetime as dt
import os
+import re
+from collections.abc import Iterable
from pathlib import Path
+
+import git
+import github.PullRequest
+import github.Repository
from github import Github
from jinja2 import Template
-import datetime as dt
CURRENT_FILE = Path(__file__)
ROOT = CURRENT_FILE.parents[1]
-GITHUB_TOKEN = os.getenv("GITHUB_TOKEN", None)
-
-# Generate changelog for PRs merged yesterday
-MERGED_DATE = dt.date.today() - dt.timedelta(days=1)
+GITHUB_TOKEN = os.getenv("GITHUB_TOKEN")
+GITHUB_REPO = os.getenv("GITHUB_REPOSITORY")
+GIT_BRANCH = os.getenv("GITHUB_REF_NAME")
def main() -> None:
"""
Script entry point.
"""
- merged_pulls = list(iter_pulls())
+ # Generate changelog for PRs merged yesterday
+ merged_date = dt.date.today() - dt.timedelta(days=1)
+ repo = Github(login_or_token=GITHUB_TOKEN).get_repo(GITHUB_REPO)
+ merged_pulls = list(iter_pulls(repo, merged_date))
+ print(f"Merged pull requests: {merged_pulls}")
if not merged_pulls:
print("Nothing was merged, existing.")
return
@@ -25,30 +34,50 @@ def main() -> None:
grouped_pulls = group_pulls_by_change_type(merged_pulls)
# Generate portion of markdown
- rendered_content = generate_md(grouped_pulls)
+ release_changes_summary = generate_md(grouped_pulls)
+ print(f"Summary of changes: {release_changes_summary}")
# Update CHANGELOG.md file
- file_path = ROOT / "CHANGELOG.md"
- old_content = file_path.read_text()
- updated_content = old_content.replace(
- "",
- f"\n\n{rendered_content}",
+ release = f"{merged_date:%Y.%m.%d}"
+ changelog_path = ROOT / "CHANGELOG.md"
+ write_changelog(changelog_path, release, release_changes_summary)
+ print(f"Wrote {changelog_path}")
+
+ # Update version
+ setup_py_path = ROOT / "setup.py"
+ update_version(setup_py_path, release)
+ print(f"Updated version in {setup_py_path}")
+
+ # Commit changes, create tag and push
+ update_git_repo([changelog_path, setup_py_path], release)
+
+ # Create GitHub release
+ github_release = repo.create_git_release(
+ tag=release,
+ name=release,
+ message=release_changes_summary,
)
- file_path.write_text(updated_content)
+ print(f"Created release on GitHub {github_release}")
-def iter_pulls():
+def iter_pulls(
+ repo: github.Repository.Repository,
+ merged_date: dt.date,
+) -> Iterable[github.PullRequest.PullRequest]:
"""Fetch merged pull requests at the date we're interested in."""
- repo = Github(login_or_token=GITHUB_TOKEN).get_repo("pydanny/cookiecutter-django")
recent_pulls = repo.get_pulls(
- state="closed", sort="updated", direction="desc"
+ state="closed",
+ sort="updated",
+ direction="desc",
).get_page(0)
for pull in recent_pulls:
- if pull.merged and pull.merged_at.date() == MERGED_DATE:
+ if pull.merged and pull.merged_at.date() == merged_date:
yield pull
-def group_pulls_by_change_type(pull_requests_list):
+def group_pulls_by_change_type(
+ pull_requests_list: list[github.PullRequest.PullRequest],
+) -> dict[str, list[github.PullRequest.PullRequest]]:
"""Group pull request by change type."""
grouped_pulls = {
"Changed": [],
@@ -56,7 +85,7 @@ def group_pulls_by_change_type(pull_requests_list):
"Updated": [],
}
for pull in pull_requests_list:
- label_names = {l.name for l in pull.labels}
+ label_names = {label.name for label in pull.labels}
if "update" in label_names:
group_name = "Updated"
elif "bug" in label_names:
@@ -67,12 +96,63 @@ def group_pulls_by_change_type(pull_requests_list):
return grouped_pulls
-def generate_md(grouped_pulls):
+def generate_md(grouped_pulls: dict[str, list[github.PullRequest.PullRequest]]) -> str:
"""Generate markdown file from Jinja template."""
changelog_template = ROOT / ".github" / "changelog-template.md"
template = Template(changelog_template.read_text(), autoescape=True)
- return template.render(merge_date=MERGED_DATE, grouped_pulls=grouped_pulls)
+ return template.render(grouped_pulls=grouped_pulls)
+
+
+def write_changelog(file_path: Path, release: str, content: str) -> None:
+ """Write Release details to the changelog file."""
+ content = f"## {release}\n{content}"
+ old_content = file_path.read_text()
+ updated_content = old_content.replace(
+ "",
+ f"\n\n{content}",
+ )
+ file_path.write_text(updated_content)
+
+
+def update_version(file_path: Path, release: str) -> None:
+ """Update template version in setup.py."""
+ old_content = file_path.read_text()
+ updated_content = re.sub(
+ r'\nversion = "\d+\.\d+\.\d+"\n',
+ f'\nversion = "{release}"\n',
+ old_content,
+ )
+ file_path.write_text(updated_content)
+
+
+def update_git_repo(paths: list[Path], release: str) -> None:
+ """Commit, tag changes in git repo and push to origin."""
+ repo = git.Repo(ROOT)
+ for path in paths:
+ repo.git.add(path)
+ message = f"Release {release}"
+
+ user = repo.git.config("--get", "user.name")
+ email = repo.git.config("--get", "user.email")
+
+ repo.git.commit(
+ m=message,
+ author=f"{user} <{email}>",
+ )
+ repo.git.tag("-a", release, m=message)
+ server = f"https://{GITHUB_TOKEN}@github.com/{GITHUB_REPO}.git"
+ print(f"Pushing changes to {GIT_BRANCH} branch of {GITHUB_REPO}")
+ repo.git.push(server, GIT_BRANCH)
+ repo.git.push("--tags", server, GIT_BRANCH)
if __name__ == "__main__":
+ if GITHUB_REPO is None:
+ raise RuntimeError(
+ "No github repo, please set the environment variable GITHUB_REPOSITORY"
+ )
+ if GIT_BRANCH is None:
+ raise RuntimeError(
+ "No git branch set, please set the GITHUB_REF_NAME environment variable"
+ )
main()
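The new `update_version` step rewrites the calendar version in place with a single `re.sub`; a minimal sketch of that substitution (the sample file content here is ours):

```python
import re


def update_version(content: str, release: str) -> str:
    """Replace the `version = "YYYY.MM.DD"` assignment in setup.py-style text."""
    return re.sub(
        r'\nversion = "\d+\.\d+\.\d+"\n',
        f'\nversion = "{release}"\n',
        content,
    )


old = 'name = "cookiecutter-django"\nversion = "2023.01.27"\nrest = 1\n'
new = update_version(old, "2023.02.01")
```

Anchoring the pattern on the surrounding newlines keeps the substitution from touching any other dotted number in the file.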
diff --git a/scripts/update_contributors.py b/scripts/update_contributors.py
index 0236daae3..76ccf60ad 100644
--- a/scripts/update_contributors.py
+++ b/scripts/update_contributors.py
@@ -1,5 +1,7 @@
import json
+import os
from pathlib import Path
+
from github import Github
from github.NamedUser import NamedUser
from jinja2 import Template
@@ -7,6 +9,8 @@ from jinja2 import Template
CURRENT_FILE = Path(__file__)
ROOT = CURRENT_FILE.parents[1]
BOT_LOGINS = ["pyup-bot"]
+GITHUB_TOKEN = os.getenv("GITHUB_TOKEN", None)
+GITHUB_REPO = os.getenv("GITHUB_REPOSITORY", None)
def main() -> None:
@@ -39,7 +43,7 @@ def iter_recent_authors():
Use Github API to fetch recent authors rather than
git CLI to work with Github usernames.
"""
- repo = Github(per_page=5).get_repo("pydanny/cookiecutter-django")
+ repo = Github(login_or_token=GITHUB_TOKEN, per_page=5).get_repo(GITHUB_REPO)
recent_pulls = repo.get_pulls(
state="closed", sort="updated", direction="desc"
).get_page(0)
@@ -101,4 +105,8 @@ def write_md_file(contributors):
if __name__ == "__main__":
+ if GITHUB_REPO is None:
+ raise RuntimeError(
+ "No github repo, please set the environment variable GITHUB_REPOSITORY"
+ )
main()
diff --git a/setup.cfg b/setup.cfg
new file mode 100644
index 000000000..dd8f1ef3c
--- /dev/null
+++ b/setup.cfg
@@ -0,0 +1,7 @@
+[flake8]
+exclude = docs
+max-line-length = 88
+
+[isort]
+profile = black
+known_first_party = tests,scripts,hooks
diff --git a/setup.py b/setup.py
index 64e923aef..5adbbc664 100644
--- a/setup.py
+++ b/setup.py
@@ -1,21 +1,11 @@
#!/usr/bin/env python
-
-import os
-import sys
-
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
-# Our version ALWAYS matches the version of Django we support
-# If Django has a new release, we branch, tag, then update this setting after the tag.
-version = "3.1.13"
-
-if sys.argv[-1] == "tag":
- os.system(f'git tag -a {version} -m "version {version}"')
- os.system("git push --tags")
- sys.exit()
+# We use calendar versioning
+version = "2023.01.27"
with open("README.rst") as readme_file:
long_description = readme_file.read()
@@ -23,24 +13,27 @@ with open("README.rst") as readme_file:
setup(
name="cookiecutter-django",
version=version,
- description="A Cookiecutter template for creating production-ready Django projects quickly.",
+ description=(
+ "A Cookiecutter template for creating production-ready "
+ "Django projects quickly."
+ ),
long_description=long_description,
author="Daniel Roy Greenfeld",
author_email="pydanny@gmail.com",
- url="https://github.com/pydanny/cookiecutter-django",
+ url="https://github.com/cookiecutter/cookiecutter-django",
packages=[],
license="BSD",
zip_safe=False,
classifiers=[
"Development Status :: 4 - Beta",
"Environment :: Console",
- "Framework :: Django :: 3.0",
+ "Framework :: Django :: 4.0",
"Intended Audience :: Developers",
"Natural Language :: English",
"License :: OSI Approved :: BSD License",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: Software Development",
],
diff --git a/tests/__init__.py b/tests/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/tests/test_bare.sh b/tests/test_bare.sh
index 1f52d91b6..05da9328b 100755
--- a/tests/test_bare.sh
+++ b/tests/test_bare.sh
@@ -6,19 +6,12 @@
set -o errexit
set -x
-# Install modern pip with new resolver:
-# https://blog.python.org/2020/11/pip-20-3-release-new-resolver.html
-pip install 'pip>=20.3'
-
-# install test requirements
-pip install -r requirements.txt
-
# create a cache directory
mkdir -p .cache/bare
cd .cache/bare
# create the project using the default settings in cookiecutter.json
-cookiecutter ../../ --no-input --overwrite-if-exists use_docker=n $@
+cookiecutter ../../ --no-input --overwrite-if-exists use_docker=n "$@"
cd my_awesome_project
# Install OS deps
@@ -35,3 +28,18 @@ pre-commit run --show-diff-on-failure -a
# run the project's tests
pytest
+
+# Make sure the check doesn't raise any warnings
+python manage.py check --fail-level WARNING
+
+if [ -f "package.json" ]
+then
+ npm install
+ if [ -f "gulpfile.js" ]
+ then
+ npm run build
+ fi
+fi
+
+# Generate the HTML for the documentation
+cd docs && make html
diff --git a/tests/test_cookiecutter_generation.py b/tests/test_cookiecutter_generation.py
index 0e36e83ac..3a881cc64 100755
--- a/tests/test_cookiecutter_generation.py
+++ b/tests/test_cookiecutter_generation.py
@@ -1,15 +1,25 @@
import os
import re
+import sys
import pytest
-from cookiecutter.exceptions import FailedHookException
-import sh
+
+try:
+ import sh
+except (ImportError, ModuleNotFoundError):
+ sh = None # sh doesn't support Windows
import yaml
from binaryornot.check import is_binary
+from cookiecutter.exceptions import FailedHookException
PATTERN = r"{{(\s?cookiecutter)[.](.*?)}}"
RE_OBJ = re.compile(PATTERN)
+if sys.platform.startswith("win"):
+ pytest.skip("sh doesn't support windows", allow_module_level=True)
+elif sys.platform.startswith("darwin") and os.getenv("CI"):
+ pytest.skip("skipping slow macOS tests on CI", allow_module_level=True)
+
@pytest.fixture
def context():
@@ -37,14 +47,17 @@ SUPPORTED_COMBINATIONS = [
{"use_pycharm": "n"},
{"use_docker": "y"},
{"use_docker": "n"},
- {"postgresql_version": "13.2"},
- {"postgresql_version": "12.6"},
- {"postgresql_version": "11.11"},
- {"postgresql_version": "10.16"},
+ {"postgresql_version": "14"},
+ {"postgresql_version": "13"},
+ {"postgresql_version": "12"},
+ {"postgresql_version": "11"},
+ {"postgresql_version": "10"},
{"cloud_provider": "AWS", "use_whitenoise": "y"},
{"cloud_provider": "AWS", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "use_whitenoise": "y"},
{"cloud_provider": "GCP", "use_whitenoise": "n"},
+ {"cloud_provider": "Azure", "use_whitenoise": "y"},
+ {"cloud_provider": "Azure", "use_whitenoise": "n"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailgun"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mailjet"},
{"cloud_provider": "None", "use_whitenoise": "y", "mail_service": "Mandrill"},
@@ -71,17 +84,23 @@ SUPPORTED_COMBINATIONS = [
{"cloud_provider": "GCP", "mail_service": "SendinBlue"},
{"cloud_provider": "GCP", "mail_service": "SparkPost"},
{"cloud_provider": "GCP", "mail_service": "Other SMTP"},
- # Note: cloud_providers GCP and None with mail_service Amazon SES is not supported
+ {"cloud_provider": "Azure", "mail_service": "Mailgun"},
+ {"cloud_provider": "Azure", "mail_service": "Mailjet"},
+ {"cloud_provider": "Azure", "mail_service": "Mandrill"},
+ {"cloud_provider": "Azure", "mail_service": "Postmark"},
+ {"cloud_provider": "Azure", "mail_service": "Sendgrid"},
+ {"cloud_provider": "Azure", "mail_service": "SendinBlue"},
+ {"cloud_provider": "Azure", "mail_service": "SparkPost"},
+ {"cloud_provider": "Azure", "mail_service": "Other SMTP"},
+ # Note: cloud_providers GCP, Azure, and None
+ # with mail_service Amazon SES is not supported
{"use_async": "y"},
{"use_async": "n"},
{"use_drf": "y"},
{"use_drf": "n"},
- {"js_task_runner": "None"},
- {"js_task_runner": "Gulp"},
- {"custom_bootstrap_compilation": "y"},
- {"custom_bootstrap_compilation": "n"},
- {"use_compressor": "y"},
- {"use_compressor": "n"},
+ {"frontend_pipeline": "None"},
+ {"frontend_pipeline": "Django Compressor"},
+ {"frontend_pipeline": "Gulp"},
{"use_celery": "y"},
{"use_celery": "n"},
{"use_mailhog": "y"},
@@ -105,20 +124,21 @@ SUPPORTED_COMBINATIONS = [
UNSUPPORTED_COMBINATIONS = [
{"cloud_provider": "None", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "mail_service": "Amazon SES"},
+ {"cloud_provider": "Azure", "mail_service": "Amazon SES"},
{"cloud_provider": "None", "mail_service": "Amazon SES"},
]
def _fixture_id(ctx):
- """Helper to get a user friendly test name from the parametrized context."""
+ """Helper to get a user-friendly test name from the parametrized context."""
return "-".join(f"{key}:{value}" for key, value in ctx.items())
-def build_files_list(root_dir):
+def build_files_list(base_dir):
"""Build a list containing absolute paths to the generated files."""
return [
os.path.join(dirpath, file_path)
- for dirpath, subdirs, files in os.walk(root_dir)
+ for dirpath, subdirs, files in os.walk(base_dir)
for file_path in files
]
@@ -130,7 +150,7 @@ def check_paths(paths):
if is_binary(path):
continue
- for line in open(path, "r"):
+ for line in open(path):
match = RE_OBJ.search(line)
assert match is None, f"cookiecutter variable not replaced in {path}"
@@ -142,10 +162,10 @@ def test_project_generation(cookies, context, context_override):
result = cookies.bake(extra_context={**context, **context_override})
assert result.exit_code == 0
assert result.exception is None
- assert result.project.basename == context["project_slug"]
- assert result.project.isdir()
+ assert result.project_path.name == context["project_slug"]
+ assert result.project_path.is_dir()
- paths = build_files_list(str(result.project))
+ paths = build_files_list(str(result.project_path))
assert paths
check_paths(paths)
@@ -156,7 +176,7 @@ def test_flake8_passes(cookies, context_override):
result = cookies.bake(extra_context=context_override)
try:
- sh.flake8(_cwd=str(result.project))
+ sh.flake8(_cwd=str(result.project_path))
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@@ -168,7 +188,12 @@ def test_black_passes(cookies, context_override):
try:
sh.black(
- "--check", "--diff", "--exclude", "migrations", _cwd=str(result.project)
+ "--check",
+ "--diff",
+ "--exclude",
+ "migrations",
+ ".",
+ _cwd=str(result.project_path),
)
except sh.ErrorReturnCode as e:
pytest.fail(e.stdout.decode())
@@ -187,10 +212,10 @@ def test_travis_invokes_pytest(cookies, context, use_docker, expected_test_scrip
assert result.exit_code == 0
assert result.exception is None
- assert result.project.basename == context["project_slug"]
- assert result.project.isdir()
+ assert result.project_path.name == context["project_slug"]
+ assert result.project_path.is_dir()
- with open(f"{result.project}/.travis.yml", "r") as travis_yml:
+ with open(f"{result.project_path}/.travis.yml") as travis_yml:
try:
yml = yaml.safe_load(travis_yml)["jobs"]["include"]
assert yml[0]["script"] == ["flake8"]
@@ -214,10 +239,10 @@ def test_gitlab_invokes_flake8_and_pytest(
assert result.exit_code == 0
assert result.exception is None
- assert result.project.basename == context["project_slug"]
- assert result.project.isdir()
+ assert result.project_path.name == context["project_slug"]
+ assert result.project_path.is_dir()
- with open(f"{result.project}/.gitlab-ci.yml", "r") as gitlab_yml:
+ with open(f"{result.project_path}/.gitlab-ci.yml") as gitlab_yml:
try:
gitlab_config = yaml.safe_load(gitlab_yml)
assert gitlab_config["flake8"]["script"] == ["flake8"]
@@ -241,10 +266,10 @@ def test_github_invokes_linter_and_pytest(
assert result.exit_code == 0
assert result.exception is None
- assert result.project.basename == context["project_slug"]
- assert result.project.isdir()
+ assert result.project_path.name == context["project_slug"]
+ assert result.project_path.is_dir()
- with open(f"{result.project}/.github/workflows/ci.yml", "r") as github_yml:
+ with open(f"{result.project_path}/.github/workflows/ci.yml") as github_yml:
try:
github_config = yaml.safe_load(github_yml)
linter_present = False
@@ -264,7 +289,7 @@ def test_github_invokes_linter_and_pytest(
@pytest.mark.parametrize("slug", ["project slug", "Project_Slug"])
def test_invalid_slug(cookies, context, slug):
- """Invalid slug should failed pre-generation hook."""
+ """Invalid slug should fail pre-generation hook."""
context.update({"project_slug": slug})
result = cookies.bake(extra_context=context)
@@ -295,6 +320,6 @@ def test_pycharm_docs_removed(cookies, context, use_pycharm, pycharm_docs_exist)
context.update({"use_pycharm": use_pycharm})
result = cookies.bake(extra_context=context)
- with open(f"{result.project}/docs/index.rst", "r") as f:
+ with open(f"{result.project_path}/docs/index.rst") as f:
has_pycharm_docs = "pycharm/configuration" in f.read()
assert has_pycharm_docs is pycharm_docs_exist
diff --git a/tests/test_docker.sh b/tests/test_docker.sh
index 001ef06d0..b3663bd2c 100755
--- a/tests/test_docker.sh
+++ b/tests/test_docker.sh
@@ -6,15 +6,12 @@
set -o errexit
set -x
-# install test requirements
-pip install -r requirements.txt
-
# create a cache directory
mkdir -p .cache/docker
cd .cache/docker
# create the project using the default settings in cookiecutter.json
-cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y $@
+cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y "$@"
cd my_awesome_project
# Lint by running pre-commit on all files
@@ -24,6 +21,9 @@ git init
git add .
pre-commit run --show-diff-on-failure -a
+# make sure all images build
+docker-compose -f local.yml build
+
# run the project's type checks
docker-compose -f local.yml run django mypy my_awesome_project
@@ -35,3 +35,9 @@ docker-compose -f local.yml run django python manage.py makemigrations --dry-run
# Test support for translations
docker-compose -f local.yml run django python manage.py makemessages --all
+
+# Make sure the check doesn't raise any warnings
+docker-compose -f local.yml run django python manage.py check --fail-level WARNING
+
+# Generate the HTML for the documentation
+docker-compose -f local.yml run docs make html
diff --git a/tests/test_hooks.py b/tests/test_hooks.py
new file mode 100644
index 000000000..7ca752722
--- /dev/null
+++ b/tests/test_hooks.py
@@ -0,0 +1,28 @@
+"""Unit tests for the hooks"""
+import os
+from pathlib import Path
+
+import pytest
+
+from hooks.post_gen_project import append_to_gitignore_file
+
+
+@pytest.fixture()
+def working_directory(tmp_path):
+ prev_cwd = Path.cwd()
+ os.chdir(tmp_path)
+ try:
+ yield tmp_path
+ finally:
+ os.chdir(prev_cwd)
+
+
+def test_append_to_gitignore_file(working_directory):
+ gitignore_file = working_directory / ".gitignore"
+ gitignore_file.write_text("node_modules/\n")
+ append_to_gitignore_file(".envs/*")
+ linesep = os.linesep.encode()
+ assert (
+ gitignore_file.read_bytes() == b"node_modules/" + linesep + b".envs/*" + linesep
+ )
+ assert gitignore_file.read_text() == "node_modules/\n.envs/*\n"
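The double assertion above works because of Python's newline translation: text mode writes `"\n"` as `os.linesep` and reads it back as `"\n"`. A small standalone sketch of that round trip (not part of the hooks themselves):

```python
import os
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / ".gitignore"
    with path.open("w") as f:  # text mode: "\n" is written as os.linesep
        f.write("node_modules/\n.envs/*\n")
    raw = path.read_bytes()
    text = path.read_text()

# On disk, each line ends with the platform separator...
linesep = os.linesep.encode()
assert raw == b"node_modules/" + linesep + b".envs/*" + linesep
# ...but text-mode reads translate it back to "\n" on every platform.
assert text == "node_modules/\n.envs/*\n"
```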
diff --git a/tox.ini b/tox.ini
index 0afe3931d..0400e4f91 100644
--- a/tox.ini
+++ b/tox.ini
@@ -1,6 +1,6 @@
[tox]
skipsdist = true
-envlist = py39,black-template
+envlist = py310,black-template
[testenv]
deps = -rrequirements.txt
diff --git a/{{cookiecutter.project_slug}}/.dockerignore b/{{cookiecutter.project_slug}}/.dockerignore
index 5518e60af..7369480e3 100644
--- a/{{cookiecutter.project_slug}}/.dockerignore
+++ b/{{cookiecutter.project_slug}}/.dockerignore
@@ -8,3 +8,4 @@
.readthedocs.yml
.travis.yml
venv
+.git
diff --git a/{{cookiecutter.project_slug}}/.editorconfig b/{{cookiecutter.project_slug}}/.editorconfig
index 261407067..6a9a5c45d 100644
--- a/{{cookiecutter.project_slug}}/.editorconfig
+++ b/{{cookiecutter.project_slug}}/.editorconfig
@@ -12,7 +12,7 @@ trim_trailing_whitespace = true
indent_style = space
indent_size = 4
-[*.{html,css,scss,json,yml}]
+[*.{html,css,scss,json,yml,xml}]
indent_style = space
indent_size = 2
diff --git a/{{cookiecutter.project_slug}}/.envs/.production/.django b/{{cookiecutter.project_slug}}/.envs/.production/.django
index e7e8461c9..ad652c9ad 100644
--- a/{{cookiecutter.project_slug}}/.envs/.production/.django
+++ b/{{cookiecutter.project_slug}}/.envs/.production/.django
@@ -44,6 +44,12 @@ DJANGO_AWS_STORAGE_BUCKET_NAME=
# ------------------------------------------------------------------------------
GOOGLE_APPLICATION_CREDENTIALS=
DJANGO_GCP_STORAGE_BUCKET_NAME=
+{% elif cookiecutter.cloud_provider == 'Azure' %}
+# Azure
+# ------------------------------------------------------------------------------
+DJANGO_AZURE_ACCOUNT_KEY=
+DJANGO_AZURE_ACCOUNT_NAME=
+DJANGO_AZURE_CONTAINER_NAME=
{% endif %}
# django-allauth
# ------------------------------------------------------------------------------
diff --git a/{{cookiecutter.project_slug}}/.github/dependabot.yml b/{{cookiecutter.project_slug}}/.github/dependabot.yml
index 8e8ac8663..420a63cdc 100644
--- a/{{cookiecutter.project_slug}}/.github/dependabot.yml
+++ b/{{cookiecutter.project_slug}}/.github/dependabot.yml
@@ -1,7 +1,95 @@
+# Config for Dependabot updates. See Documentation here:
+# https://help.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
+
version: 2
updates:
- # Update Github actions in workflows
+ # Update GitHub actions in workflows
- package-ecosystem: "github-actions"
directory: "/"
+ # Check for updates to GitHub Actions every weekday
schedule:
interval: "daily"
+
+{%- if cookiecutter.use_docker == 'y' %}
+
+ # Enable version updates for Docker
+ # We need to specify each Dockerfile in a separate entry because Dependabot doesn't
+ # support wildcards or recursively checking subdirectories. Check this issue for updates:
+ # https://github.com/dependabot/dependabot-core/issues/2178
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/local/django` directory
+ directory: "compose/local/django/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/local/docs` directory
+ directory: "compose/local/docs/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/local/node` directory
+ directory: "compose/local/node/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/production/aws` directory
+ directory: "compose/production/aws/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/production/django` directory
+ directory: "compose/production/django/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/production/postgres` directory
+ directory: "compose/production/postgres/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+ # Enable version updates for Docker
+ - package-ecosystem: "docker"
+ # Look for a `Dockerfile` in the `compose/production/traefik` directory
+ directory: "compose/production/traefik/"
+ # Check for updates to Docker images every weekday
+ schedule:
+ interval: "daily"
+
+{%- endif %}
+
+ # Enable version updates for Python/Pip - Production
+ - package-ecosystem: "pip"
+ # Look for a `requirements.txt` in the `root` directory,
+ # as well as `setup.cfg`, `runtime.txt` and `requirements/*.txt`
+ directory: "/"
+ # Check for updates to pip requirements every weekday
+ schedule:
+ interval: "daily"
+
+{%- if cookiecutter.frontend_pipeline == 'Gulp' %}
+
+ # Enable version updates for JavaScript/npm
+ - package-ecosystem: "npm"
+ # Look for a `package.json` in the `root` directory
+ directory: "/"
+ # Check for updates to npm packages every weekday
+ schedule:
+ interval: "daily"
+
+{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
index 1e17068ee..0790187bd 100644
--- a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
+++ b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
@@ -14,6 +14,9 @@ on:
branches: [ "master", "main" ]
paths-ignore: [ "docs/**" ]
+concurrency:
+ group: {% raw %}${{ github.head_ref || github.run_id }}{% endraw %}
+ cancel-in-progress: true
jobs:
linter:
@@ -21,18 +24,19 @@ jobs:
steps:
- name: Checkout Code Repository
- uses: actions/checkout@v2
+ uses: actions/checkout@v3
- - name: Set up Python 3.9
- uses: actions/setup-python@v2
+ - name: Set up Python
+ uses: actions/setup-python@v3
with:
- python-version: 3.9
+ python-version: "3.10"
+ cache: pip
+ cache-dependency-path: |
+ requirements/base.txt
+ requirements/local.txt
- # Run all pre-commit hooks on all the files.
- # Getting only staged files can be tricky in case a new PR is opened
- # since the action is run on a branch in detached head state
- - name: Install and Run Pre-commit
- uses: pre-commit/action@v2.0.0
+ - name: Run pre-commit
+ uses: pre-commit/action@v2.0.3
# With no caching at all the entire ci process takes 4m 30s to complete!
pytest:
@@ -64,7 +68,7 @@ jobs:
steps:
- name: Checkout Code Repository
- uses: actions/checkout@v2
+ uses: actions/checkout@v3
{%- if cookiecutter.use_docker == 'y' %}
- name: Build the Stack
@@ -80,27 +84,14 @@ jobs:
run: docker-compose -f local.yml down
{%- else %}
- - name: Set up Python 3.9
- uses: actions/setup-python@v2
+ - name: Set up Python
+ uses: actions/setup-python@v3
with:
- python-version: 3.9
-
- - name: Get pip cache dir
- id: pip-cache-location
- run: |
- echo "::set-output name=dir::$(pip cache dir)"
- {%- raw %}
-
- - name: Cache pip Project Dependencies
- uses: actions/cache@v2
- with:
- # Get the location of pip cache dir
- path: ${{ steps.pip-cache-location.outputs.dir }}
- # Look to see if there is a cache hit for the corresponding requirements file
- key: ${{ runner.os }}-pip-${{ hashFiles('**/local.txt') }}
- restore-keys: |
- ${{ runner.os }}-pip-
- {%- endraw %}
+ python-version: "3.10"
+ cache: pip
+ cache-dependency-path: |
+ requirements/base.txt
+ requirements/local.txt
- name: Install Dependencies
run: |
diff --git a/{{cookiecutter.project_slug}}/.gitignore b/{{cookiecutter.project_slug}}/.gitignore
index 613e8beab..17e9249c0 100644
--- a/{{cookiecutter.project_slug}}/.gitignore
+++ b/{{cookiecutter.project_slug}}/.gitignore
@@ -326,14 +326,25 @@ Session.vim
# Auto-generated tag files
tags
+# Redis dump file
+dump.rdb
+
### Project template
-{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
+{%- if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
MailHog
{%- endif %}
{{ cookiecutter.project_slug }}/media/
.pytest_cache/
-{% if cookiecutter.use_docker == 'y' %}
+{%- if cookiecutter.use_docker == 'y' %}
.ipython/
{%- endif %}
+
+{%- if cookiecutter.frontend_pipeline == 'Gulp' %}
+project.css
+project.min.css
+vendors.js
+*.min.js
+*.min.js.map
+{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/.gitlab-ci.yml b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
index 711bfc392..dbb65fb73 100644
--- a/{{cookiecutter.project_slug}}/.gitlab-ci.yml
+++ b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
@@ -13,7 +13,7 @@ variables:
flake8:
stage: lint
- image: python:3.9-alpine
+ image: python:3.10-alpine
before_script:
- pip install -q flake8
script:
@@ -22,7 +22,7 @@ flake8:
pytest:
stage: test
{% if cookiecutter.use_docker == 'y' -%}
- image: docker/compose:latest
+ image: docker/compose:1.29.2
tags:
- docker
services:
@@ -35,7 +35,7 @@ pytest:
script:
- docker-compose -f local.yml run django pytest
{%- else -%}
- image: python:3.9
+ image: python:3.10
tags:
- python
services:
diff --git a/{{cookiecutter.project_slug}}/.idea/runConfigurations/docker_compose_up_django.xml b/{{cookiecutter.project_slug}}/.idea/runConfigurations/docker_compose_up_django.xml
new file mode 100644
index 000000000..ad3b6a35a
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/.idea/runConfigurations/docker_compose_up_django.xml
@@ -0,0 +1,23 @@
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/{{cookiecutter.project_slug}}/.idea/{{cookiecutter.project_slug}}.iml b/{{cookiecutter.project_slug}}/.idea/{{cookiecutter.project_slug}}.iml
index d408765a5..98759fa1b 100644
--- a/{{cookiecutter.project_slug}}/.idea/{{cookiecutter.project_slug}}.iml
+++ b/{{cookiecutter.project_slug}}/.idea/{{cookiecutter.project_slug}}.iml
@@ -13,12 +13,12 @@
- {% if cookiecutter.js_task_runner != 'None' %}
-
+ {% if cookiecutter.frontend_pipeline == 'Gulp' %}
+
- {% else %}
-
+ {% else %}
+
{% endif %}
diff --git a/{{cookiecutter.project_slug}}/.pre-commit-config.yaml b/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
index 6f1bdcbc6..a273bdf17 100644
--- a/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
+++ b/{{cookiecutter.project_slug}}/.pre-commit-config.yaml
@@ -1,35 +1,39 @@
-exclude: 'docs|node_modules|migrations|.git|.tox'
+exclude: "^docs/|/migrations/"
default_stages: [commit]
-fail_fast: true
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
- rev: v4.0.1
+ rev: v4.4.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
+ - repo: https://github.com/asottile/pyupgrade
+ rev: v3.3.1
+ hooks:
+ - id: pyupgrade
+ args: [--py310-plus]
+
- repo: https://github.com/psf/black
- rev: 21.9b0
+ rev: 22.12.0
hooks:
- id: black
- - repo: https://github.com/timothycrosley/isort
- rev: 5.9.3
+ - repo: https://github.com/PyCQA/isort
+ rev: 5.12.0
hooks:
- id: isort
- - repo: https://gitlab.com/pycqa/flake8
- rev: 3.9.2
+ - repo: https://github.com/PyCQA/flake8
+ rev: 6.0.0
hooks:
- id: flake8
- args: ['--config=setup.cfg']
+ args: ["--config=setup.cfg"]
additional_dependencies: [flake8-isort]
-
# sets up .pre-commit-ci.yaml to ensure pre-commit dependencies stay up to date
ci:
- autoupdate_schedule: weekly
- skip: []
- submodules: false
+ autoupdate_schedule: weekly
+ skip: []
+ submodules: false
diff --git a/{{cookiecutter.project_slug}}/.pylintrc b/{{cookiecutter.project_slug}}/.pylintrc
index 6f195c6ef..9d604334b 100644
--- a/{{cookiecutter.project_slug}}/.pylintrc
+++ b/{{cookiecutter.project_slug}}/.pylintrc
@@ -1,6 +1,6 @@
[MASTER]
load-plugins=pylint_django{% if cookiecutter.use_celery == "y" %}, pylint_celery{% endif %}
-django-settings-module=config.settings.base
+django-settings-module=config.settings.local
[FORMAT]
max-line-length=120
diff --git a/{{cookiecutter.project_slug}}/.readthedocs.yml b/{{cookiecutter.project_slug}}/.readthedocs.yml
index b4cf0c080..e943a5fa9 100644
--- a/{{cookiecutter.project_slug}}/.readthedocs.yml
+++ b/{{cookiecutter.project_slug}}/.readthedocs.yml
@@ -7,6 +7,6 @@ build:
image: testing
python:
- version: 3.9
+ version: 3.10
install:
- requirements: requirements/local.txt
diff --git a/{{cookiecutter.project_slug}}/.travis.yml b/{{cookiecutter.project_slug}}/.travis.yml
index a684da225..326d78392 100644
--- a/{{cookiecutter.project_slug}}/.travis.yml
+++ b/{{cookiecutter.project_slug}}/.travis.yml
@@ -2,7 +2,7 @@ dist: focal
language: python
python:
- - "3.9"
+ - "3.10"
services:
- {% if cookiecutter.use_docker == 'y' %}docker{% else %}postgresql{% endif %}
@@ -37,7 +37,7 @@ jobs:
- sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm
language: python
python:
- - "3.9"
+ - "3.10"
install:
- pip install -r requirements/local.txt
script:
diff --git a/{{cookiecutter.project_slug}}/Procfile b/{{cookiecutter.project_slug}}/Procfile
index 274108d14..2f2fbe927 100644
--- a/{{cookiecutter.project_slug}}/Procfile
+++ b/{{cookiecutter.project_slug}}/Procfile
@@ -1,10 +1,10 @@
release: python manage.py migrate
-{% if cookiecutter.use_async == "y" -%}
+{%- if cookiecutter.use_async == "y" %}
web: gunicorn config.asgi:application -k uvicorn.workers.UvicornWorker
{%- else %}
web: gunicorn config.wsgi:application
{%- endif %}
-{% if cookiecutter.use_celery == "y" -%}
+{%- if cookiecutter.use_celery == "y" %}
worker: REMAP_SIGTERM=SIGQUIT celery -A config.celery_app worker --loglevel=info
beat: REMAP_SIGTERM=SIGQUIT celery -A config.celery_app beat --loglevel=info
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/README.md b/{{cookiecutter.project_slug}}/README.md
new file mode 100644
index 000000000..f7c29fb22
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/README.md
@@ -0,0 +1,140 @@
+# {{cookiecutter.project_name}}
+
+{{ cookiecutter.description }}
+
+[](https://github.com/cookiecutter/cookiecutter-django/)
+[](https://github.com/ambv/black)
+
+{%- if cookiecutter.open_source_license != "Not open source" %}
+
+License: {{cookiecutter.open_source_license}}
+{%- endif %}
+
+## Settings
+
+Moved to [settings](http://cookiecutter-django.readthedocs.io/en/latest/settings.html).
+
+## Basic Commands
+
+### Setting Up Your Users
+
+- To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go.
+
+- To create a **superuser account**, use this command:
+
+ $ python manage.py createsuperuser
+
+For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
+
+### Type checks
+
+Running type checks with mypy:
+
+ $ mypy {{cookiecutter.project_slug}}
+
+### Test coverage
+
+To run the tests, check your test coverage, and generate an HTML coverage report:
+
+ $ coverage run -m pytest
+ $ coverage html
+ $ open htmlcov/index.html
+
+#### Running tests with pytest
+
+ $ pytest
+
+### Live reloading and Sass CSS compilation
+
+Moved to [Live reloading and SASS compilation](https://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html#sass-compilation-live-reloading).
+
+{%- if cookiecutter.use_celery == "y" %}
+
+### Celery
+
+This app comes with Celery.
+
+To run a celery worker:
+
+``` bash
+cd {{cookiecutter.project_slug}}
+celery -A config.celery_app worker -l info
+```
+
+Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the same folder as *manage.py*, you should be good to go.
+
+{%- endif %}
+{%- if cookiecutter.use_mailhog == "y" %}
+
+### Email Server
+
+{%- if cookiecutter.use_docker == "y" %}
+
+In development, it is often nice to be able to see emails that are being sent from your application. For that reason, a local SMTP server with a web interface, [MailHog](https://github.com/mailhog/MailHog), is available as a Docker container.
+
+The MailHog container will start automatically when you run all the Docker containers.
+Please check the [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html) for more details on how to start all containers.
+
+With MailHog running, to view messages sent by your application, open your browser and go to `http://127.0.0.1:8025`.
+{%- else %}
+
+In development, it is often nice to be able to see emails that are being sent from your application. If you choose to use [MailHog](https://github.com/mailhog/MailHog) when generating the project, a local SMTP server with a web interface will be available.
+
+1. [Download the latest MailHog release](https://github.com/mailhog/MailHog/releases) for your OS.
+
+2. Rename the build to `MailHog`.
+
+3. Copy the file to the project root.
+
+4. Make it executable:
+
+ $ chmod +x MailHog
+
+5. Spin up another terminal window and start it there:
+
+ ./MailHog
+
+6. Check out `http://127.0.0.1:8025` to see how it goes.
+
+Now you have your own mail server running locally, ready to receive whatever you send it.
+
+{%- endif %}
+
+{%- endif %}
+{%- if cookiecutter.use_sentry == "y" %}
+
+### Sentry
+
+Sentry is an error logging aggregator service. You can sign up for a free account at <https://sentry.io/signup/?code=cookiecutter> or download and host it yourself.
+The system is set up with reasonable defaults, including 404 logging and integration with the WSGI application.
+
+You must set the DSN URL in production.
+{%- endif %}
+
+## Deployment
+
+The following details how to deploy this application.
+{%- if cookiecutter.use_heroku.lower() == "y" %}
+
+### Heroku
+
+See detailed [cookiecutter-django Heroku documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-on-heroku.html).
+
+{%- endif %}
+{%- if cookiecutter.use_docker.lower() == "y" %}
+
+### Docker
+
+See detailed [cookiecutter-django Docker documentation](http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html).
+
+{%- endif %}
+{%- if cookiecutter.frontend_pipeline == 'Gulp' %}
+
+### Custom Bootstrap Compilation
+
+The generated CSS is set up with automatic Bootstrap recompilation using variables of your choice.
+Bootstrap v5 is installed using npm and customised by tweaking your variables in `static/sass/custom_bootstrap_vars`.
+
+You can find a list of available variables [in the bootstrap source](https://github.com/twbs/bootstrap/blob/main/scss/_variables.scss), or get explanations on them in the [Bootstrap docs](https://getbootstrap.com/docs/5.1/customize/sass/).
+
+Bootstrap's JavaScript, as well as its dependencies, is concatenated into a single file: `static/js/vendors.js`.
+{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/README.rst b/{{cookiecutter.project_slug}}/README.rst
deleted file mode 100644
index aa4e48d3d..000000000
--- a/{{cookiecutter.project_slug}}/README.rst
+++ /dev/null
@@ -1,175 +0,0 @@
-{{cookiecutter.project_name}}
-{{ '=' * cookiecutter.project_name|length }}
-
-{{cookiecutter.description}}
-
-.. image:: https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg?logo=cookiecutter
- :target: https://github.com/pydanny/cookiecutter-django/
- :alt: Built with Cookiecutter Django
-.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
- :target: https://github.com/ambv/black
- :alt: Black code style
-{%- if cookiecutter.open_source_license != "Not open source" %}
-
-:License: {{cookiecutter.open_source_license}}
-{%- endif %}
-
-Settings
---------
-
-Moved to settings_.
-
-.. _settings: http://cookiecutter-django.readthedocs.io/en/latest/settings.html
-
-Basic Commands
---------------
-
-Setting Up Your Users
-^^^^^^^^^^^^^^^^^^^^^
-
-* To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go.
-
-* To create an **superuser account**, use this command::
-
- $ python manage.py createsuperuser
-
-For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
-
-Type checks
-^^^^^^^^^^^
-
-Running type checks with mypy:
-
-::
-
- $ mypy {{cookiecutter.project_slug}}
-
-Test coverage
-^^^^^^^^^^^^^
-
-To run the tests, check your test coverage, and generate an HTML coverage report::
-
- $ coverage run -m pytest
- $ coverage html
- $ open htmlcov/index.html
-
-Running tests with py.test
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-::
-
- $ pytest
-
-Live reloading and Sass CSS compilation
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Moved to `Live reloading and SASS compilation`_.
-
-.. _`Live reloading and SASS compilation`: http://cookiecutter-django.readthedocs.io/en/latest/live-reloading-and-sass-compilation.html
-
-{%- if cookiecutter.use_celery == "y" %}
-
-Celery
-^^^^^^
-
-This app comes with Celery.
-
-To run a celery worker:
-
-.. code-block:: bash
-
- cd {{cookiecutter.project_slug}}
- celery -A config.celery_app worker -l info
-
-Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the same folder with *manage.py*, you should be right.
-
-{%- endif %}
-{%- if cookiecutter.use_mailhog == "y" %}
-
-Email Server
-^^^^^^^^^^^^
-{%- if cookiecutter.use_docker == 'y' %}
-
-In development, it is often nice to be able to see emails that are being sent from your application. For that reason local SMTP server `MailHog`_ with a web interface is available as docker container.
-
-Container mailhog will start automatically when you will run all docker containers.
-Please check `cookiecutter-django Docker documentation`_ for more details how to start all containers.
-
-With MailHog running, to view messages that are sent by your application, open your browser and go to ``http://127.0.0.1:8025``
-{%- else %}
-
-In development, it is often nice to be able to see emails that are being sent from your application. If you choose to use `MailHog`_ when generating the project a local SMTP server with a web interface will be available.
-
-#. `Download the latest MailHog release`_ for your OS.
-
-#. Rename the build to ``MailHog``.
-
-#. Copy the file to the project root.
-
-#. Make it executable: ::
-
- $ chmod +x MailHog
-
-#. Spin up another terminal window and start it there: ::
-
- ./MailHog
-
-#. Check out ``_ to see how it goes.
-
-Now you have your own mail server running locally, ready to receive whatever you send it.
-
-.. _`Download the latest MailHog release`: https://github.com/mailhog/MailHog/releases
-{%- endif %}
-
-.. _mailhog: https://github.com/mailhog/MailHog
-{%- endif %}
-{%- if cookiecutter.use_sentry == "y" %}
-
-Sentry
-^^^^^^
-
-Sentry is an error logging aggregator service. You can sign up for a free account at https://sentry.io/signup/?code=cookiecutter or download and host it yourself.
-The system is setup with reasonable defaults, including 404 logging and integration with the WSGI application.
-
-You must set the DSN url in production.
-{%- endif %}
-
-Deployment
-----------
-
-The following details how to deploy this application.
-{%- if cookiecutter.use_heroku.lower() == "y" %}
-
-Heroku
-^^^^^^
-
-See detailed `cookiecutter-django Heroku documentation`_.
-
-.. _`cookiecutter-django Heroku documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-on-heroku.html
-{%- endif %}
-{%- if cookiecutter.use_docker.lower() == "y" %}
-
-Docker
-^^^^^^
-
-See detailed `cookiecutter-django Docker documentation`_.
-
-.. _`cookiecutter-django Docker documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html
-{%- endif %}
-{%- if cookiecutter.custom_bootstrap_compilation == "y" %}
-Custom Bootstrap Compilation
-^^^^^^
-
-The generated CSS is set up with automatic Bootstrap recompilation with variables of your choice.
-Bootstrap v4 is installed using npm and customised by tweaking your variables in ``static/sass/custom_bootstrap_vars``.
-
-You can find a list of available variables `in the bootstrap source`_, or get explanations on them in the `Bootstrap docs`_.
-
-{%- if cookiecutter.js_task_runner == 'Gulp' %}
-Bootstrap's javascript as well as its dependencies is concatenated into a single file: ``static/js/vendors.js``.
-{%- endif %}
-
-.. _in the bootstrap source: https://github.com/twbs/bootstrap/blob/v4-dev/scss/_variables.scss
-.. _Bootstrap docs: https://getbootstrap.com/docs/4.1/getting-started/theming/
-
-{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile b/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
index f1a489a30..3ea6b2d4c 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
@@ -1,4 +1,4 @@
-ARG PYTHON_VERSION=3.9-slim-buster
+ARG PYTHON_VERSION=3.10-slim-bullseye
# define an alias for the specific python version used in this file.
FROM python:${PYTHON_VERSION} as python
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/celery/beat/start b/{{cookiecutter.project_slug}}/compose/local/django/celery/beat/start
index c04a7365e..61f83968b 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/celery/beat/start
+++ b/{{cookiecutter.project_slug}}/compose/local/django/celery/beat/start
@@ -5,4 +5,4 @@ set -o nounset
rm -f './celerybeat.pid'
-celery -A config.celery_app beat -l INFO
+exec watchfiles celery.__main__.main --args '-A config.celery_app beat -l INFO'
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/celery/flower/start b/{{cookiecutter.project_slug}}/compose/local/django/celery/flower/start
index bd3c9f2fd..ac3cc6b36 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/celery/flower/start
+++ b/{{cookiecutter.project_slug}}/compose/local/django/celery/flower/start
@@ -3,9 +3,6 @@
set -o errexit
set -o nounset
-
-celery \
- -A config.celery_app \
- -b "${CELERY_BROKER_URL}" \
- flower \
- --basic_auth="${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}"
+exec watchfiles celery.__main__.main \
+ --args \
+ "-A config.celery_app -b \"${CELERY_BROKER_URL}\" flower --basic_auth=\"${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}\""
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/celery/worker/start b/{{cookiecutter.project_slug}}/compose/local/django/celery/worker/start
index d7b63cd41..16341fdd1 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/celery/worker/start
+++ b/{{cookiecutter.project_slug}}/compose/local/django/celery/worker/start
@@ -4,4 +4,4 @@ set -o errexit
set -o nounset
-watchgod celery.__main__.main --args -A config.celery_app worker -l INFO
+exec watchfiles celery.__main__.main --args '-A config.celery_app worker -l INFO'
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/start b/{{cookiecutter.project_slug}}/compose/local/django/start
index 9cbb6c897..ec57dc8e4 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/start
+++ b/{{cookiecutter.project_slug}}/compose/local/django/start
@@ -7,7 +7,7 @@ set -o nounset
python manage.py migrate
{%- if cookiecutter.use_async == 'y' %}
-uvicorn config.asgi:application --host 0.0.0.0 --reload
+exec uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'
{%- else %}
-python manage.py runserver_plus 0.0.0.0:8000
+exec python manage.py runserver_plus 0.0.0.0:8000
{%- endif %}
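A recurring change in these start scripts is prefixing the final command with `exec`. Without it, bash stays alive as the container's main process and the `SIGTERM` sent by `docker stop` may never reach the actual server, forcing Docker to wait out its timeout and `SIGKILL` the container. A minimal illustration (the `python3` command is just a stand-in):

```shell
#!/usr/bin/env bash
set -o errexit
set -o nounset

# `exec` replaces this shell with the command below, so the process
# keeps the shell's PID and receives container signals directly.
exec python3 -c 'import os; print(os.getpid())'
```

The same reasoning applies to the celery, flower, and docs start scripts changed in this patch.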
diff --git a/{{cookiecutter.project_slug}}/compose/local/docs/Dockerfile b/{{cookiecutter.project_slug}}/compose/local/docs/Dockerfile
index fbb5ce9d0..c45d18c95 100644
--- a/{{cookiecutter.project_slug}}/compose/local/docs/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/local/docs/Dockerfile
@@ -1,28 +1,61 @@
-FROM python:3.9-slim-buster
+ARG PYTHON_VERSION=3.10-slim-bullseye
+
+# define an alias for the specific python version used in this file.
+FROM python:${PYTHON_VERSION} as python
+
+
+# Python build stage
+FROM python as python-build-stage
-ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1
-RUN apt-get update \
- # dependencies for building Python packages
- && apt-get install -y build-essential \
- # psycopg2 dependencies
- && apt-get install -y libpq-dev \
- # Translations dependencies
- && apt-get install -y gettext \
- # Uncomment below lines to enable Sphinx output to latex and pdf
- # && apt-get install -y texlive-latex-recommended \
- # && apt-get install -y texlive-fonts-recommended \
- # && apt-get install -y texlive-latex-extra \
- # && apt-get install -y latexmk \
- # cleaning up unused files
- && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
- && rm -rf /var/lib/apt/lists/*
+RUN apt-get update && apt-get install --no-install-recommends -y \
+ # dependencies for building Python packages
+ build-essential \
+ # psycopg2 dependencies
+ libpq-dev \
+ # cleaning up unused files
+ && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
+ && rm -rf /var/lib/apt/lists/*
# Requirements are installed here to ensure they will be cached.
COPY ./requirements /requirements
-# All imports needed for autodoc.
-RUN pip install -r /requirements/local.txt -r /requirements/production.txt
+
+# create python dependency wheels
+RUN pip wheel --no-cache-dir --wheel-dir /usr/src/app/wheels \
+ -r /requirements/local.txt -r /requirements/production.txt \
+ && rm -rf /requirements
+
+
+# Python 'run' stage
+FROM python as python-run-stage
+
+ARG BUILD_ENVIRONMENT
+ENV PYTHONUNBUFFERED 1
+ENV PYTHONDONTWRITEBYTECODE 1
+
+RUN apt-get update && apt-get install --no-install-recommends -y \
+ # To run the Makefile
+ make \
+ # psycopg2 dependencies
+ libpq-dev \
+ # Translations dependencies
+ gettext \
+ # Uncomment below lines to enable Sphinx output to latex and pdf
+ # texlive-latex-recommended \
+ # texlive-fonts-recommended \
+ # texlive-latex-extra \
+ # latexmk \
+ # cleaning up unused files
+ && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
+ && rm -rf /var/lib/apt/lists/*
+
+# copy python dependency wheels from python-build-stage
+COPY --from=python-build-stage /usr/src/app/wheels /wheels
+
+# use wheels to install python dependencies
+RUN pip install --no-cache /wheels/* \
+ && rm -rf /wheels
COPY ./compose/local/docs/start /start-docs
RUN sed -i 's/\r$//g' /start-docs
diff --git a/{{cookiecutter.project_slug}}/compose/local/docs/start b/{{cookiecutter.project_slug}}/compose/local/docs/start
index fd2e0de6a..96a94f566 100644
--- a/{{cookiecutter.project_slug}}/compose/local/docs/start
+++ b/{{cookiecutter.project_slug}}/compose/local/docs/start
@@ -4,4 +4,4 @@ set -o errexit
set -o pipefail
set -o nounset
-make livehtml
+exec make livehtml
diff --git a/{{cookiecutter.project_slug}}/compose/local/node/Dockerfile b/{{cookiecutter.project_slug}}/compose/local/node/Dockerfile
index f9976e206..8062fa689 100644
--- a/{{cookiecutter.project_slug}}/compose/local/node/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/local/node/Dockerfile
@@ -1,4 +1,4 @@
-FROM node:10-stretch-slim
+FROM node:16-bullseye-slim
WORKDIR /app
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile b/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
index 5f1bc78b3..4652f0898 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
@@ -1,7 +1,7 @@
-ARG PYTHON_VERSION=3.9-slim-buster
+ARG PYTHON_VERSION=3.10-slim-bullseye
-{% if cookiecutter.js_task_runner == 'Gulp' -%}
-FROM node:10-stretch-slim as client-builder
+{% if cookiecutter.frontend_pipeline == 'Gulp' -%}
+FROM node:16-bullseye-slim as client-builder
ARG APP_HOME=/app
WORKDIR ${APP_HOME}
@@ -99,7 +99,7 @@ RUN chmod +x /start-flower
# copy application code to WORKDIR
-{%- if cookiecutter.js_task_runner == 'Gulp' %}
+{%- if cookiecutter.frontend_pipeline == 'Gulp' %}
COPY --from=client-builder --chown=django:django ${APP_HOME} ${APP_HOME}
{% else %}
COPY --chown=django:django . ${APP_HOME}
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/celery/beat/start b/{{cookiecutter.project_slug}}/compose/production/django/celery/beat/start
index 20b93123a..42ddca910 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/celery/beat/start
+++ b/{{cookiecutter.project_slug}}/compose/production/django/celery/beat/start
@@ -5,4 +5,4 @@ set -o pipefail
set -o nounset
-celery -A config.celery_app beat -l INFO
+exec celery -A config.celery_app beat -l INFO
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/celery/flower/start b/{{cookiecutter.project_slug}}/compose/production/django/celery/flower/start
index bd3c9f2fd..4180d6778 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/celery/flower/start
+++ b/{{cookiecutter.project_slug}}/compose/production/django/celery/flower/start
@@ -4,7 +4,7 @@ set -o errexit
set -o nounset
-celery \
+exec celery \
-A config.celery_app \
-b "${CELERY_BROKER_URL}" \
flower \
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/celery/worker/start b/{{cookiecutter.project_slug}}/compose/production/django/celery/worker/start
index 38fb77f35..af0c8f7b5 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/celery/worker/start
+++ b/{{cookiecutter.project_slug}}/compose/production/django/celery/worker/start
@@ -5,4 +5,4 @@ set -o pipefail
set -o nounset
-celery -A config.celery_app worker -l INFO
+exec celery -A config.celery_app worker -l INFO
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/entrypoint b/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
index 95ab8297a..2fbcad955 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
+++ b/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
@@ -16,30 +16,34 @@ if [ -z "${POSTGRES_USER}" ]; then
fi
export DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
-postgres_ready() {
python << END
import sys
+import time
import psycopg2
-try:
- psycopg2.connect(
- dbname="${POSTGRES_DB}",
- user="${POSTGRES_USER}",
- password="${POSTGRES_PASSWORD}",
- host="${POSTGRES_HOST}",
- port="${POSTGRES_PORT}",
- )
-except psycopg2.OperationalError:
- sys.exit(-1)
-sys.exit(0)
+suggest_unrecoverable_after = 30
+start = time.time()
+while True:
+ try:
+ psycopg2.connect(
+ dbname="${POSTGRES_DB}",
+ user="${POSTGRES_USER}",
+ password="${POSTGRES_PASSWORD}",
+ host="${POSTGRES_HOST}",
+ port="${POSTGRES_PORT}",
+ )
+ break
+ except psycopg2.OperationalError as error:
+ sys.stderr.write("Waiting for PostgreSQL to become available...\n")
+
+ if time.time() - start > suggest_unrecoverable_after:
+ sys.stderr.write(" This is taking longer than expected. The following exception may be indicative of an unrecoverable error: '{}'\n".format(error))
+
+ time.sleep(1)
END
-}
-until postgres_ready; do
- >&2 echo 'Waiting for PostgreSQL to become available...'
- sleep 1
-done
+
>&2 echo 'PostgreSQL is available'
exec "$@"
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/start b/{{cookiecutter.project_slug}}/compose/production/django/start
index 1a41ed48d..73f686bd7 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/start
+++ b/{{cookiecutter.project_slug}}/compose/production/django/start
@@ -6,7 +6,7 @@ set -o nounset
python /app/manage.py collectstatic --noinput
-{% if cookiecutter.use_whitenoise == 'y' and cookiecutter.use_compressor == 'y' %}
+{% if cookiecutter.use_whitenoise == 'y' and cookiecutter.frontend_pipeline == 'Django Compressor' %}
compress_enabled() {
python << END
import sys
@@ -27,8 +27,8 @@ if compress_enabled; then
python /app/manage.py compress
fi
{%- endif %}
-{% if cookiecutter.use_async == 'y' %}
-/usr/local/bin/gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn.workers.UvicornWorker
-{% else %}
-/usr/local/bin/gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app
+{%- if cookiecutter.use_async == 'y' %}
+exec /usr/local/bin/gunicorn config.asgi --bind 0.0.0.0:5000 --chdir=/app -k uvicorn.workers.UvicornWorker
+{%- else %}
+exec /usr/local/bin/gunicorn config.wsgi --bind 0.0.0.0:5000 --chdir=/app
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/config/asgi.py b/{{cookiecutter.project_slug}}/config/asgi.py
index 8c99bbf53..65e76ca0a 100644
--- a/{{cookiecutter.project_slug}}/config/asgi.py
+++ b/{{cookiecutter.project_slug}}/config/asgi.py
@@ -15,8 +15,8 @@ from django.core.asgi import get_asgi_application
# This allows easy placement of apps within the interior
# {{ cookiecutter.project_slug }} directory.
-ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent
-sys.path.append(str(ROOT_DIR / "{{ cookiecutter.project_slug }}"))
+BASE_DIR = Path(__file__).resolve(strict=True).parent.parent
+sys.path.append(str(BASE_DIR / "{{ cookiecutter.project_slug }}"))
# If DJANGO_SETTINGS_MODULE is unset, default to the local settings
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
diff --git a/{{cookiecutter.project_slug}}/config/settings/base.py b/{{cookiecutter.project_slug}}/config/settings/base.py
index 640d8b62c..44b03fa03 100644
--- a/{{cookiecutter.project_slug}}/config/settings/base.py
+++ b/{{cookiecutter.project_slug}}/config/settings/base.py
@@ -5,15 +5,15 @@ from pathlib import Path
import environ
-ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent.parent
+BASE_DIR = Path(__file__).resolve(strict=True).parent.parent.parent
# {{ cookiecutter.project_slug }}/
-APPS_DIR = ROOT_DIR / "{{ cookiecutter.project_slug }}"
+APPS_DIR = BASE_DIR / "{{ cookiecutter.project_slug }}"
env = environ.Env()
READ_DOT_ENV_FILE = env.bool("DJANGO_READ_DOT_ENV_FILE", default=False)
if READ_DOT_ENV_FILE:
# OS environment variables take precedence over variables from .env
- env.read_env(str(ROOT_DIR / ".env"))
+ env.read_env(str(BASE_DIR / ".env"))
# GENERAL
# ------------------------------------------------------------------------------
@@ -30,12 +30,10 @@ LANGUAGE_CODE = "en-us"
SITE_ID = 1
# https://docs.djangoproject.com/en/dev/ref/settings/#use-i18n
USE_I18N = True
-# https://docs.djangoproject.com/en/dev/ref/settings/#use-l10n
-USE_L10N = True
# https://docs.djangoproject.com/en/dev/ref/settings/#use-tz
USE_TZ = True
# https://docs.djangoproject.com/en/dev/ref/settings/#locale-paths
-LOCALE_PATHS = [str(ROOT_DIR / "locale")]
+LOCALE_PATHS = [str(BASE_DIR / "locale")]
# DATABASES
# ------------------------------------------------------------------------------
@@ -44,10 +42,15 @@ LOCALE_PATHS = [str(ROOT_DIR / "locale")]
DATABASES = {"default": env.db("DATABASE_URL")}
{%- else %}
DATABASES = {
- "default": env.db("DATABASE_URL", default="postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}"),
+ "default": env.db(
+ "DATABASE_URL",
+ default="postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}",
+ ),
}
{%- endif %}
DATABASES["default"]["ATOMIC_REQUESTS"] = True
+# https://docs.djangoproject.com/en/stable/ref/settings/#std:setting-DEFAULT_AUTO_FIELD
+DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
# URLS
# ------------------------------------------------------------------------------
@@ -71,6 +74,7 @@ DJANGO_APPS = [
]
THIRD_PARTY_APPS = [
"crispy_forms",
+ "crispy_bootstrap5",
"allauth",
"allauth.account",
"allauth.socialaccount",
@@ -81,11 +85,12 @@ THIRD_PARTY_APPS = [
"rest_framework",
"rest_framework.authtoken",
"corsheaders",
+ "drf_spectacular",
{%- endif %}
]
LOCAL_APPS = [
- "{{ cookiecutter.project_slug }}.users.apps.UsersConfig",
+ "{{ cookiecutter.project_slug }}.users",
# Your stuff: custom apps go here
]
# https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
@@ -154,7 +159,7 @@ MIDDLEWARE = [
# STATIC
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#static-root
-STATIC_ROOT = str(ROOT_DIR / "staticfiles")
+STATIC_ROOT = str(BASE_DIR / "staticfiles")
# https://docs.djangoproject.com/en/dev/ref/settings/#static-url
STATIC_URL = "/static/"
# https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
@@ -179,15 +184,11 @@ TEMPLATES = [
{
# https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-TEMPLATES-BACKEND
"BACKEND": "django.template.backends.django.DjangoTemplates",
- # https://docs.djangoproject.com/en/dev/ref/settings/#template-dirs
+ # https://docs.djangoproject.com/en/dev/ref/settings/#dirs
"DIRS": [str(APPS_DIR / "templates")],
+ # https://docs.djangoproject.com/en/dev/ref/settings/#app-dirs
+ "APP_DIRS": True,
"OPTIONS": {
- # https://docs.djangoproject.com/en/dev/ref/settings/#template-loaders
- # https://docs.djangoproject.com/en/dev/ref/templates/api/#loader-types
- "loaders": [
- "django.template.loaders.filesystem.Loader",
- "django.template.loaders.app_directories.Loader",
- ],
# https://docs.djangoproject.com/en/dev/ref/settings/#template-context-processors
"context_processors": [
"django.template.context_processors.debug",
@@ -198,7 +199,7 @@ TEMPLATES = [
"django.template.context_processors.static",
"django.template.context_processors.tz",
"django.contrib.messages.context_processors.messages",
- "{{ cookiecutter.project_slug }}.utils.context_processors.settings_context",
+ "{{cookiecutter.project_slug}}.users.context_processors.allauth_settings",
],
},
}
@@ -208,7 +209,8 @@ TEMPLATES = [
FORM_RENDERER = "django.forms.renderers.TemplatesSetting"
# http://django-crispy-forms.readthedocs.io/en/latest/install.html#template-packs
-CRISPY_TEMPLATE_PACK = "bootstrap4"
+CRISPY_TEMPLATE_PACK = "bootstrap5"
+CRISPY_ALLOWED_TEMPLATE_PACKS = "bootstrap5"
# FIXTURES
# ------------------------------------------------------------------------------
@@ -273,26 +275,37 @@ LOGGING = {
# Celery
# ------------------------------------------------------------------------------
if USE_TZ:
- # http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-timezone
+ # https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-timezone
CELERY_TIMEZONE = TIME_ZONE
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#result-extended
+CELERY_RESULT_EXTENDED = True
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#result-backend-always-retry
+# https://github.com/celery/celery/pull/6122
+CELERY_RESULT_BACKEND_ALWAYS_RETRY = True
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#result-backend-max-retries
+CELERY_RESULT_BACKEND_MAX_RETRIES = 10
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ["json"]
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = "json"
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = "json"
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-time-limit
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_TIME_LIMIT = 5 * 60
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-soft-time-limit
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-soft-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_SOFT_TIME_LIMIT = 60
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#beat-scheduler
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#beat-scheduler
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#worker-send-task-events
+CELERY_WORKER_SEND_TASK_EVENTS = True
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std-setting-task_send_sent_event
+CELERY_TASK_SEND_SENT_EVENT = True
{%- endif %}
# django-allauth
@@ -306,9 +319,13 @@ ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_EMAIL_VERIFICATION = "mandatory"
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.AccountAdapter"
+# https://django-allauth.readthedocs.io/en/latest/forms.html
+ACCOUNT_FORMS = {"signup": "{{cookiecutter.project_slug}}.users.forms.UserSignupForm"}
# https://django-allauth.readthedocs.io/en/latest/configuration.html
SOCIALACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.SocialAccountAdapter"
-{% if cookiecutter.use_compressor == 'y' -%}
+# https://django-allauth.readthedocs.io/en/latest/forms.html
+SOCIALACCOUNT_FORMS = {"signup": "{{cookiecutter.project_slug}}.users.forms.UserSocialSignupForm"}
+{% if cookiecutter.frontend_pipeline == 'Django Compressor' -%}
# django-compressor
# ------------------------------------------------------------------------------
# https://django-compressor.readthedocs.io/en/latest/quickstart/#installation
@@ -325,11 +342,20 @@ REST_FRAMEWORK = {
"rest_framework.authentication.TokenAuthentication",
),
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
+ "DEFAULT_SCHEMA_CLASS": "drf_spectacular.openapi.AutoSchema",
}
# django-cors-headers - https://github.com/adamchainz/django-cors-headers#setup
CORS_URLS_REGEX = r"^/api/.*$"
+# By default, the Swagger UI is only available to admin users. Change the permission classes to alter that.
+# See more configuration options at https://drf-spectacular.readthedocs.io/en/latest/settings.html#settings
+SPECTACULAR_SETTINGS = {
+ "TITLE": "{{ cookiecutter.project_name }} API",
+ "DESCRIPTION": "Documentation of API endpoints of {{ cookiecutter.project_name }}",
+ "VERSION": "1.0.0",
+ "SERVE_PERMISSIONS": ["rest_framework.permissions.IsAdminUser"],
+}
{%- endif %}
# Your stuff...
# ------------------------------------------------------------------------------
diff --git a/{{cookiecutter.project_slug}}/config/settings/local.py b/{{cookiecutter.project_slug}}/config/settings/local.py
index 3ce150e1c..a5fe0f71c 100644
--- a/{{cookiecutter.project_slug}}/config/settings/local.py
+++ b/{{cookiecutter.project_slug}}/config/settings/local.py
@@ -69,7 +69,7 @@ if env("USE_DOCKER") == "yes":
hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
INTERNAL_IPS += [".".join(ip.split(".")[:-1] + ["1"]) for ip in ips]
- {%- if cookiecutter.js_task_runner == 'Gulp' %}
+ {%- if cookiecutter.frontend_pipeline == 'Gulp' %}
try:
_, _, ips = socket.gethostbyname_ex("node")
INTERNAL_IPS.extend(ips)
@@ -88,10 +88,10 @@ INSTALLED_APPS += ["django_extensions"] # noqa F405
# Celery
# ------------------------------------------------------------------------------
{% if cookiecutter.use_docker == 'n' -%}
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-always-eager
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-always-eager
CELERY_TASK_ALWAYS_EAGER = True
{%- endif %}
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-eager-propagates
+# https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-eager-propagates
CELERY_TASK_EAGER_PROPAGATES = True
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/config/settings/production.py b/{{cookiecutter.project_slug}}/config/settings/production.py
index a9e38b07a..5de0529e2 100644
--- a/{{cookiecutter.project_slug}}/config/settings/production.py
+++ b/{{cookiecutter.project_slug}}/config/settings/production.py
@@ -2,11 +2,11 @@
import logging
import sentry_sdk
-from sentry_sdk.integrations.django import DjangoIntegration
-from sentry_sdk.integrations.logging import LoggingIntegration
{%- if cookiecutter.use_celery == 'y' %}
from sentry_sdk.integrations.celery import CeleryIntegration
-{% endif %}
+{%- endif %}
+from sentry_sdk.integrations.django import DjangoIntegration
+from sentry_sdk.integrations.logging import LoggingIntegration
from sentry_sdk.integrations.redis import RedisIntegration
{% endif -%}
@@ -22,8 +22,6 @@ ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=["{{ cookiecutter.domai
# DATABASES
# ------------------------------------------------------------------------------
-DATABASES["default"] = env.db("DATABASE_URL") # noqa F405
-DATABASES["default"]["ATOMIC_REQUESTS"] = True # noqa F405
DATABASES["default"]["CONN_MAX_AGE"] = env.int("CONN_MAX_AGE", default=60) # noqa F405
# CACHES
@@ -88,6 +86,11 @@ AWS_S3_OBJECT_PARAMETERS = {
"CacheControl": f"max-age={_AWS_EXPIRY}, s-maxage={_AWS_EXPIRY}, must-revalidate"
}
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
+AWS_S3_MAX_MEMORY_SIZE = env.int(
+ "DJANGO_AWS_S3_MAX_MEMORY_SIZE",
+ default=100_000_000, # 100MB
+)
+# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_S3_REGION_NAME = env("DJANGO_AWS_S3_REGION_NAME", default=None)
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#cloudfront
AWS_S3_CUSTOM_DOMAIN = env("DJANGO_AWS_S3_CUSTOM_DOMAIN", default=None)
@@ -95,6 +98,10 @@ aws_s3_domain = AWS_S3_CUSTOM_DOMAIN or f"{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws
{% elif cookiecutter.cloud_provider == 'GCP' %}
GS_BUCKET_NAME = env("DJANGO_GCP_STORAGE_BUCKET_NAME")
GS_DEFAULT_ACL = "publicRead"
+{% elif cookiecutter.cloud_provider == 'Azure' %}
+AZURE_ACCOUNT_KEY = env("DJANGO_AZURE_ACCOUNT_KEY")
+AZURE_ACCOUNT_NAME = env("DJANGO_AZURE_ACCOUNT_NAME")
+AZURE_CONTAINER = env("DJANGO_AZURE_CONTAINER_NAME")
{% endif -%}
{% if cookiecutter.cloud_provider != 'None' or cookiecutter.use_whitenoise == 'y' -%}
@@ -111,6 +118,9 @@ STATIC_URL = f"https://{aws_s3_domain}/static/"
STATICFILES_STORAGE = "{{cookiecutter.project_slug}}.utils.storages.StaticRootGoogleCloudStorage"
COLLECTFAST_STRATEGY = "collectfast.strategies.gcloud.GoogleCloudStrategy"
STATIC_URL = f"https://storage.googleapis.com/{GS_BUCKET_NAME}/static/"
+{% elif cookiecutter.cloud_provider == 'Azure' -%}
+STATICFILES_STORAGE = "{{cookiecutter.project_slug}}.utils.storages.StaticRootAzureStorage"
+STATIC_URL = f"https://{AZURE_ACCOUNT_NAME}.blob.core.windows.net/static/"
{% endif -%}
# MEDIA
@@ -121,26 +131,17 @@ MEDIA_URL = f"https://{aws_s3_domain}/media/"
{%- elif cookiecutter.cloud_provider == 'GCP' %}
DEFAULT_FILE_STORAGE = "{{cookiecutter.project_slug}}.utils.storages.MediaRootGoogleCloudStorage"
MEDIA_URL = f"https://storage.googleapis.com/{GS_BUCKET_NAME}/media/"
+{%- elif cookiecutter.cloud_provider == 'Azure' %}
+DEFAULT_FILE_STORAGE = "{{cookiecutter.project_slug}}.utils.storages.MediaRootAzureStorage"
+MEDIA_URL = f"https://{AZURE_ACCOUNT_NAME}.blob.core.windows.net/media/"
{%- endif %}
-# TEMPLATES
-# ------------------------------------------------------------------------------
-# https://docs.djangoproject.com/en/dev/ref/settings/#templates
-TEMPLATES[-1]["OPTIONS"]["loaders"] = [ # type: ignore[index] # noqa F405
- (
- "django.template.loaders.cached.Loader",
- [
- "django.template.loaders.filesystem.Loader",
- "django.template.loaders.app_directories.Loader",
- ],
- )
-]
-
# EMAIL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#default-from-email
DEFAULT_FROM_EMAIL = env(
- "DJANGO_DEFAULT_FROM_EMAIL", default="{{cookiecutter.project_name}} "
+ "DJANGO_DEFAULT_FROM_EMAIL",
+ default="{{cookiecutter.project_name}} ",
)
# https://docs.djangoproject.com/en/dev/ref/settings/#server-email
SERVER_EMAIL = env("DJANGO_SERVER_EMAIL", default=DEFAULT_FROM_EMAIL)
@@ -179,7 +180,6 @@ EMAIL_BACKEND = "anymail.backends.mailjet.EmailBackend"
ANYMAIL = {
"MAILJET_API_KEY": env("MAILJET_API_KEY"),
"MAILJET_SECRET_KEY": env("MAILJET_SECRET_KEY"),
- "MAILJET_API_URL": env("MAILJET_API_URL", default="https://api.mailjet.com/v3"),
}
{%- elif cookiecutter.mail_service == 'Mandrill' %}
# https://anymail.readthedocs.io/en/stable/esps/mandrill/
@@ -202,8 +202,6 @@ ANYMAIL = {
EMAIL_BACKEND = "anymail.backends.sendgrid.EmailBackend"
ANYMAIL = {
"SENDGRID_API_KEY": env("SENDGRID_API_KEY"),
- "SENDGRID_GENERATE_MESSAGE_ID": env("SENDGRID_GENERATE_MESSAGE_ID"),
- "SENDGRID_MERGE_FIELD_FORMAT": env("SENDGRID_MERGE_FIELD_FORMAT"),
"SENDGRID_API_URL": env("SENDGRID_API_URL", default="https://api.sendgrid.com/v3/"),
}
{%- elif cookiecutter.mail_service == 'SendinBlue' %}
@@ -230,7 +228,7 @@ EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
ANYMAIL = {}
{%- endif %}
-{% if cookiecutter.use_compressor == 'y' -%}
+{% if cookiecutter.frontend_pipeline == 'Django Compressor' -%}
# django-compressor
# ------------------------------------------------------------------------------
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_ENABLED
@@ -238,7 +236,7 @@ COMPRESS_ENABLED = env.bool("COMPRESS_ENABLED", default=True)
{%- if cookiecutter.cloud_provider == 'None' %}
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_STORAGE
COMPRESS_STORAGE = "compressor.storage.GzipCompressorFileStorage"
-{%- elif cookiecutter.cloud_provider in ('AWS', 'GCP') and cookiecutter.use_whitenoise == 'n' %}
+{%- elif cookiecutter.cloud_provider in ('AWS', 'GCP', 'Azure') and cookiecutter.use_whitenoise == 'n' %}
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_STORAGE
COMPRESS_STORAGE = STATICFILES_STORAGE
{%- endif %}
@@ -370,5 +368,15 @@ sentry_sdk.init(
traces_sample_rate=env.float("SENTRY_TRACES_SAMPLE_RATE", default=0.0),
)
{% endif %}
+{% if cookiecutter.use_drf == "y" -%}
+
+# django-rest-framework
+# -------------------------------------------------------------------------------
+# Tools that generate code samples can use SERVERS to point to the correct domain
+SPECTACULAR_SETTINGS["SERVERS"] = [ # noqa F405
+ {"url": "https://{{ cookiecutter.domain_name }}", "description": "Production server"}
+]
+
+{%- endif %}
# Your stuff...
# ------------------------------------------------------------------------------
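The `SPECTACULAR_SETTINGS["SERVERS"]` line works because `production.py` star-imports `base.py`, so both modules share the same dict object and the production layer can extend it in place. A sketch with illustrative values:

```python
# base.py layer (values here are illustrative, not the rendered template's).
SPECTACULAR_SETTINGS = {
    "TITLE": "My Project API",
    "DESCRIPTION": "Documentation of API endpoints of My Project",
    "VERSION": "1.0.0",
    "SERVE_PERMISSIONS": ["rest_framework.permissions.IsAdminUser"],
}

# production.py layer: advertise the public server URL to schema consumers
# by mutating the same dict the star-import bound.
SPECTACULAR_SETTINGS["SERVERS"] = [
    {"url": "https://example.com", "description": "Production server"}
]
```

Code-sample generators reading the OpenAPI schema then point at the production domain rather than the request host.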
diff --git a/{{cookiecutter.project_slug}}/config/settings/test.py b/{{cookiecutter.project_slug}}/config/settings/test.py
index 222597ad9..198b5aa5d 100644
--- a/{{cookiecutter.project_slug}}/config/settings/test.py
+++ b/{{cookiecutter.project_slug}}/config/settings/test.py
@@ -20,22 +20,14 @@ TEST_RUNNER = "django.test.runner.DiscoverRunner"
# https://docs.djangoproject.com/en/dev/ref/settings/#password-hashers
PASSWORD_HASHERS = ["django.contrib.auth.hashers.MD5PasswordHasher"]
-# TEMPLATES
-# ------------------------------------------------------------------------------
-TEMPLATES[-1]["OPTIONS"]["loaders"] = [ # type: ignore[index] # noqa F405
- (
- "django.template.loaders.cached.Loader",
- [
- "django.template.loaders.filesystem.Loader",
- "django.template.loaders.app_directories.Loader",
- ],
- )
-]
-
# EMAIL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = "django.core.mail.backends.locmem.EmailBackend"
+# DEBUGGING FOR TEMPLATES
+# ------------------------------------------------------------------------------
+TEMPLATES[0]["OPTIONS"]["debug"] = True # type: ignore # noqa F405
+
# Your stuff...
# ------------------------------------------------------------------------------
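`test.py` keeps the `MD5PasswordHasher` swap because Django's default PBKDF2 hasher deliberately burns CPU on every hash, which is what you want in production and what slows down test fixtures that create users. A sketch of the cost difference using the standard library (the iteration count is illustrative, not Django's exact default):

```python
import hashlib

# PBKDF2 stretches one password over hundreds of thousands of HMAC rounds;
# MD5 is a single pass. Tests trade hash strength for speed.
strong = hashlib.pbkdf2_hmac("sha256", b"password", b"salt", 390_000)
fast = hashlib.md5(b"salt" + b"password").digest()
```

Neither hash ever leaves the test database, so the weakened hasher is safe there and only there.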
diff --git a/{{cookiecutter.project_slug}}/config/urls.py b/{{cookiecutter.project_slug}}/config/urls.py
index 168d77a8b..ab42cc103 100644
--- a/{{cookiecutter.project_slug}}/config/urls.py
+++ b/{{cookiecutter.project_slug}}/config/urls.py
@@ -8,6 +8,7 @@ from django.urls import include, path
from django.views import defaults as default_views
from django.views.generic import TemplateView
{%- if cookiecutter.use_drf == 'y' %}
+from drf_spectacular.views import SpectacularAPIView, SpectacularSwaggerView
from rest_framework.authtoken.views import obtain_auth_token
{%- endif %}
@@ -35,6 +36,12 @@ urlpatterns += [
path("api/", include("config.api_router")),
# DRF auth token
path("auth-token/", obtain_auth_token),
+ path("api/schema/", SpectacularAPIView.as_view(), name="api-schema"),
+ path(
+ "api/docs/",
+ SpectacularSwaggerView.as_view(url_name="api-schema"),
+ name="api-docs",
+ ),
]
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/config/wsgi.py b/{{cookiecutter.project_slug}}/config/wsgi.py
index a7de581ca..3fd809ef3 100644
--- a/{{cookiecutter.project_slug}}/config/wsgi.py
+++ b/{{cookiecutter.project_slug}}/config/wsgi.py
@@ -21,8 +21,8 @@ from django.core.wsgi import get_wsgi_application
# This allows easy placement of apps within the interior
# {{ cookiecutter.project_slug }} directory.
-ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent
-sys.path.append(str(ROOT_DIR / "{{ cookiecutter.project_slug }}"))
+BASE_DIR = Path(__file__).resolve(strict=True).parent.parent
+sys.path.append(str(BASE_DIR / "{{ cookiecutter.project_slug }}"))
# We defer to a DJANGO_SETTINGS_MODULE already in the environment. This breaks
# if running multiple sites in the same mod_wsgi process. To fix this, use
# mod_wsgi daemon mode with each site in its own daemon process, or use
diff --git a/{{cookiecutter.project_slug}}/docs/Makefile b/{{cookiecutter.project_slug}}/docs/Makefile
index 0b56e1f86..cf080e476 100644
--- a/{{cookiecutter.project_slug}}/docs/Makefile
+++ b/{{cookiecutter.project_slug}}/docs/Makefile
@@ -3,8 +3,8 @@
# You can set these variables from the command line, and also
# from the environment for the first two.
-SPHINXOPTS ?=
-SPHINXBUILD ?= sphinx-build -c .
+SPHINXOPTS ?=
+SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = ./_build
{%- if cookiecutter.use_docker == 'y' %}
@@ -17,24 +17,20 @@ APP = ../{{cookiecutter.project_slug}}
# Put it first so that "make" without argument is like "make help".
help:
- @$(SPHINXBUILD) help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+ @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) -c .
# Build, watch and serve docs with live reload
livehtml:
- sphinx-autobuild -b html
+ sphinx-autobuild -b html
{%- if cookiecutter.use_docker == 'y' %} --host 0.0.0.0
- {%- else %} --open-browser
- {%- endif %} --port 7000 --watch $(APP) -c . $(SOURCEDIR) $(BUILDDIR)/html
+ {%- else %} --open-browser
+ {%- endif %} --port 9000 --watch $(APP) -c . $(SOURCEDIR) $(BUILDDIR)/html
# Outputs rst files from django application code
apidocs:
- {%- if cookiecutter.use_docker == 'y' %}
- sphinx-apidoc -o $(SOURCEDIR)/api /app
- {%- else %}
- sphinx-apidoc -o $(SOURCEDIR)/api ../{{cookiecutter.project_slug}}
- {%- endif %}
+ sphinx-apidoc -o $(SOURCEDIR)/api $(APP)
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
- @$(SPHINXBUILD) -b $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+ @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) -c .
diff --git a/{{cookiecutter.project_slug}}/docs/howto.rst b/{{cookiecutter.project_slug}}/docs/howto.rst
index 0ef90d023..7f2d26a1e 100644
--- a/{{cookiecutter.project_slug}}/docs/howto.rst
+++ b/{{cookiecutter.project_slug}}/docs/howto.rst
@@ -28,7 +28,7 @@ Docstrings to Documentation
The sphinx extension `apidoc `_ is used to automatically document code using signatures and docstrings.
-Numpy or Google style docstrings will be picked up from project files and availble for documentation. See the `Napoleon `_ extension for details.
+Numpy or Google style docstrings will be picked up from project files and available for documentation. See the `Napoleon `_ extension for details.
For an in-use example, see the `page source <_sources/users.rst.txt>`_ for :ref:`users`.
diff --git a/{{cookiecutter.project_slug}}/docs/make.bat b/{{cookiecutter.project_slug}}/docs/make.bat
index b19f42c6a..6cd1129f0 100644
--- a/{{cookiecutter.project_slug}}/docs/make.bat
+++ b/{{cookiecutter.project_slug}}/docs/make.bat
@@ -32,7 +32,7 @@ if errorlevel 9009 (
goto end
:livehtml
-sphinx-autobuild -b html --open-browser -p 7000 --watch %APP% -c . %SOURCEDIR% %BUILDDIR%/html
+sphinx-autobuild -b html --open-browser -p 9000 --watch %APP% -c . %SOURCEDIR% %BUILDDIR%/html
GOTO :EOF
:apidocs
diff --git a/{{cookiecutter.project_slug}}/gulpfile.js b/{{cookiecutter.project_slug}}/gulpfile.js
index 56a08e8fc..40d367bcd 100644
--- a/{{cookiecutter.project_slug}}/gulpfile.js
+++ b/{{cookiecutter.project_slug}}/gulpfile.js
@@ -9,9 +9,7 @@ const pjson = require('./package.json')
// Plugins
const autoprefixer = require('autoprefixer')
const browserSync = require('browser-sync').create()
-{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
const concat = require('gulp-concat')
-{% endif %}
const cssnano = require ('cssnano')
const imagemin = require('gulp-imagemin')
const pixrem = require('pixrem')
@@ -19,7 +17,7 @@ const plumber = require('gulp-plumber')
const postcss = require('gulp-postcss')
const reload = browserSync.reload
const rename = require('gulp-rename')
-const sass = require('gulp-sass')
+const sass = require('gulp-sass')(require('sass'))
const spawn = require('child_process').spawn
const uglify = require('gulp-uglify-es').default
@@ -29,14 +27,11 @@ function pathsConfig(appName) {
const vendorsRoot = 'node_modules'
return {
- {%- if cookiecutter.custom_bootstrap_compilation == 'y' %}
bootstrapSass: `${vendorsRoot}/bootstrap/scss`,
vendorsJs: [
- `${vendorsRoot}/jquery/dist/jquery.slim.js`,
- `${vendorsRoot}/popper.js/dist/umd/popper.js`,
+ `${vendorsRoot}/@popperjs/core/dist/umd/popper.js`,
`${vendorsRoot}/bootstrap/dist/js/bootstrap.js`,
],
- {%- endif %}
app: this.app,
templates: `${this.app}/templates`,
css: `${this.app}/static/css`,
@@ -47,7 +42,7 @@ function pathsConfig(appName) {
}
}
-var paths = pathsConfig()
+const paths = pathsConfig()
////////////////////////////////
// Tasks
@@ -55,21 +50,19 @@ var paths = pathsConfig()
// Styles autoprefixing and minification
function styles() {
- var processCss = [
+ const processCss = [
autoprefixer(), // adds vendor prefixes
pixrem(), // add fallbacks for rem units
]
- var minifyCss = [
+ const minifyCss = [
cssnano({ preset: 'default' }) // minify result
]
return src(`${paths.sass}/project.scss`)
.pipe(sass({
includePaths: [
- {%- if cookiecutter.custom_bootstrap_compilation == 'y' %}
paths.bootstrapSass,
- {%- endif %}
paths.sass
]
}).on('error', sass.logError))
@@ -90,18 +83,16 @@ function scripts() {
.pipe(dest(paths.js))
}
-{%- if cookiecutter.custom_bootstrap_compilation == 'y' %}
// Vendor Javascript minification
function vendorScripts() {
- return src(paths.vendorsJs)
+ return src(paths.vendorsJs, { sourcemaps: true })
.pipe(concat('vendors.js'))
.pipe(dest(paths.js))
.pipe(plumber()) // Checks for errors
.pipe(uglify()) // Minifies the js
.pipe(rename({ suffix: '.min' }))
- .pipe(dest(paths.js))
+ .pipe(dest(paths.js, { sourcemaps: '.' }))
}
-{%- endif %}
// Image compression
function imgCompression() {
@@ -113,7 +104,7 @@ function imgCompression() {
{%- if cookiecutter.use_async == 'y' -%}
// Run django server
function asyncRunServer() {
- var cmd = spawn('gunicorn', [
+ const cmd = spawn('gunicorn', [
'config.asgi', '-k', 'uvicorn.workers.UvicornWorker', '--reload'
], {stdio: 'inherit'}
)
@@ -124,7 +115,7 @@ function asyncRunServer() {
{%- else %}
// Run django server
function runServer(cb) {
- var cmd = spawn('python', ['manage.py', 'runserver'], {stdio: 'inherit'})
+ const cmd = spawn('python', ['manage.py', 'runserver'], {stdio: 'inherit'})
cmd.on('close', function(code) {
console.log('runServer exited with code ' + code)
cb(code)
@@ -134,31 +125,33 @@ function runServer(cb) {
// Browser sync server for live reload
function initBrowserSync() {
- browserSync.init(
- [
- `${paths.css}/*.css`,
- `${paths.js}/*.js`,
- `${paths.templates}/*.html`
- ], {
- // https://www.browsersync.io/docs/options/#option-proxy
+ browserSync.init(
+ [
+ `${paths.css}/*.css`,
+ `${paths.js}/*.js`,
+ `${paths.templates}/*.html`
+ ], {
+ {%- if cookiecutter.use_docker == 'y' %}
+ // https://www.browsersync.io/docs/options/#option-open
+ // Disable as it doesn't work from inside a container
+ open: false,
+ {%- endif %}
+ // https://www.browsersync.io/docs/options/#option-proxy
+ proxy: {
{%- if cookiecutter.use_docker == 'n' %}
- proxy: 'localhost:8000'
+ target: '127.0.0.1:8000',
{%- else %}
- proxy: {
- target: 'django:8000',
- proxyReq: [
- function(proxyReq, req) {
- // Assign proxy "host" header same as current request at Browsersync server
- proxyReq.setHeader('Host', req.headers.host)
- }
- ]
- },
- // https://www.browsersync.io/docs/options/#option-open
- // Disable as it doesn't work from inside a container
- open: false
+ target: 'django:8000',
{%- endif %}
+ proxyReq: [
+ function(proxyReq, req) {
+ // Set the proxy "Host" header to match the incoming request at the Browsersync server
+ proxyReq.setHeader('Host', req.headers.host)
+ }
+ ]
}
- )
+ }
+ )
}
// Watch
@@ -172,7 +165,7 @@ function watchPaths() {
const generateAssets = parallel(
styles,
scripts,
- {%- if cookiecutter.custom_bootstrap_compilation == 'y' %}vendorScripts,{% endif %}
+ vendorScripts,
imgCompression
)
diff --git a/{{cookiecutter.project_slug}}/local.yml b/{{cookiecutter.project_slug}}/local.yml
index 8cce827c1..38b9d77d3 100644
--- a/{{cookiecutter.project_slug}}/local.yml
+++ b/{{cookiecutter.project_slug}}/local.yml
@@ -1,8 +1,8 @@
version: '3'
volumes:
- local_postgres_data: {}
- local_postgres_data_backups: {}
+ {{ cookiecutter.project_slug }}_local_postgres_data: {}
+ {{ cookiecutter.project_slug }}_local_postgres_data_backups: {}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
@@ -10,9 +10,12 @@ services:
context: .
dockerfile: ./compose/local/django/Dockerfile
image: {{ cookiecutter.project_slug }}_local_django
- container_name: django
+ container_name: {{ cookiecutter.project_slug }}_local_django
depends_on:
- postgres
+ {%- if cookiecutter.use_celery == 'y' %}
+ - redis
+ {%- endif %}
{%- if cookiecutter.use_mailhog == 'y' %}
- mailhog
{%- endif %}
@@ -30,16 +33,16 @@ services:
context: .
dockerfile: ./compose/production/postgres/Dockerfile
image: {{ cookiecutter.project_slug }}_production_postgres
- container_name: postgres
+ container_name: {{ cookiecutter.project_slug }}_local_postgres
volumes:
- - local_postgres_data:/var/lib/postgresql/data:Z
- - local_postgres_data_backups:/backups:z
+ - {{ cookiecutter.project_slug }}_local_postgres_data:/var/lib/postgresql/data
+ - {{ cookiecutter.project_slug }}_local_postgres_data_backups:/backups
env_file:
- ./.envs/.local/.postgres
docs:
image: {{ cookiecutter.project_slug }}_local_docs
- container_name: docs
+ container_name: {{ cookiecutter.project_slug }}_local_docs
build:
context: .
dockerfile: ./compose/local/docs/Dockerfile
@@ -50,13 +53,13 @@ services:
- ./config:/app/config:z
- ./{{ cookiecutter.project_slug }}:/app/{{ cookiecutter.project_slug }}:z
ports:
- - "7000:7000"
+ - "9000:9000"
command: /start-docs
{%- if cookiecutter.use_mailhog == 'y' %}
mailhog:
image: mailhog/mailhog:v1.0.0
- container_name: mailhog
+ container_name: {{ cookiecutter.project_slug }}_local_mailhog
ports:
- "8025:8025"
@@ -65,12 +68,12 @@ services:
redis:
image: redis:6
- container_name: redis
+ container_name: {{ cookiecutter.project_slug }}_local_redis
celeryworker:
<<: *django
image: {{ cookiecutter.project_slug }}_local_celeryworker
- container_name: celeryworker
+ container_name: {{ cookiecutter.project_slug }}_local_celeryworker
depends_on:
- redis
- postgres
@@ -83,7 +86,7 @@ services:
celerybeat:
<<: *django
image: {{ cookiecutter.project_slug }}_local_celerybeat
- container_name: celerybeat
+ container_name: {{ cookiecutter.project_slug }}_local_celerybeat
depends_on:
- redis
- postgres
@@ -96,20 +99,20 @@ services:
flower:
<<: *django
image: {{ cookiecutter.project_slug }}_local_flower
- container_name: flower
+ container_name: {{ cookiecutter.project_slug }}_local_flower
ports:
- "5555:5555"
command: /start-flower
{%- endif %}
- {%- if cookiecutter.js_task_runner == 'Gulp' %}
+ {%- if cookiecutter.frontend_pipeline == 'Gulp' %}
node:
build:
context: .
dockerfile: ./compose/local/node/Dockerfile
image: {{ cookiecutter.project_slug }}_local_node
- container_name: node
+ container_name: {{ cookiecutter.project_slug }}_local_node
depends_on:
- django
volumes:
diff --git a/{{cookiecutter.project_slug}}/merge_production_dotenvs_in_dotenv.py b/{{cookiecutter.project_slug}}/merge_production_dotenvs_in_dotenv.py
index d1170eff6..35139fb2e 100644
--- a/{{cookiecutter.project_slug}}/merge_production_dotenvs_in_dotenv.py
+++ b/{{cookiecutter.project_slug}}/merge_production_dotenvs_in_dotenv.py
@@ -1,67 +1,26 @@
import os
+from collections.abc import Sequence
from pathlib import Path
-from typing import Sequence
-import pytest
-
-ROOT_DIR_PATH = Path(__file__).parent.resolve()
-PRODUCTION_DOTENVS_DIR_PATH = ROOT_DIR_PATH / ".envs" / ".production"
-PRODUCTION_DOTENV_FILE_PATHS = [
- PRODUCTION_DOTENVS_DIR_PATH / ".django",
- PRODUCTION_DOTENVS_DIR_PATH / ".postgres",
+BASE_DIR = Path(__file__).parent.resolve()
+PRODUCTION_DOTENVS_DIR = BASE_DIR / ".envs" / ".production"
+PRODUCTION_DOTENV_FILES = [
+ PRODUCTION_DOTENVS_DIR / ".django",
+ PRODUCTION_DOTENVS_DIR / ".postgres",
]
-DOTENV_FILE_PATH = ROOT_DIR_PATH / ".env"
+DOTENV_FILE = BASE_DIR / ".env"
def merge(
- output_file_path: str, merged_file_paths: Sequence[str], append_linesep: bool = True
+ output_file: Path,
+ files_to_merge: Sequence[Path],
) -> None:
- with open(output_file_path, "w") as output_file:
- for merged_file_path in merged_file_paths:
- with open(merged_file_path, "r") as merged_file:
- merged_file_content = merged_file.read()
- output_file.write(merged_file_content)
- if append_linesep:
- output_file.write(os.linesep)
-
-
-def main():
- merge(DOTENV_FILE_PATH, PRODUCTION_DOTENV_FILE_PATHS)
-
-
-@pytest.mark.parametrize("merged_file_count", range(3))
-@pytest.mark.parametrize("append_linesep", [True, False])
-def test_merge(tmpdir_factory, merged_file_count: int, append_linesep: bool):
- tmp_dir_path = Path(str(tmpdir_factory.getbasetemp()))
-
- output_file_path = tmp_dir_path / ".env"
-
- expected_output_file_content = ""
- merged_file_paths = []
- for i in range(merged_file_count):
- merged_file_ord = i + 1
-
- merged_filename = ".service{}".format(merged_file_ord)
- merged_file_path = tmp_dir_path / merged_filename
-
- merged_file_content = merged_filename * merged_file_ord
-
- with open(merged_file_path, "w+") as file:
- file.write(merged_file_content)
-
- expected_output_file_content += merged_file_content
- if append_linesep:
- expected_output_file_content += os.linesep
-
- merged_file_paths.append(merged_file_path)
-
- merge(output_file_path, merged_file_paths, append_linesep)
-
- with open(output_file_path, "r") as output_file:
- actual_output_file_content = output_file.read()
-
- assert actual_output_file_content == expected_output_file_content
+ merged_content = ""
+ for merge_file in files_to_merge:
+ merged_content += merge_file.read_text()
+ merged_content += os.linesep
+ output_file.write_text(merged_content)
if __name__ == "__main__":
- main()
+ merge(DOTENV_FILE, PRODUCTION_DOTENV_FILES)
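The rewritten `merge()` above is small enough to exercise standalone. This sketch reproduces the new implementation and demonstrates its behavior with illustrative file names (the `.django`/`.postgres` contents here are made up for the example):

```python
import os
import tempfile
from collections.abc import Sequence
from pathlib import Path


def merge(output_file: Path, files_to_merge: Sequence[Path]) -> None:
    # Concatenate each dotenv file's contents, appending a newline after each.
    merged_content = ""
    for merge_file in files_to_merge:
        merged_content += merge_file.read_text()
        merged_content += os.linesep
    output_file.write_text(merged_content)


# Example usage with temporary files standing in for .envs/.production/*
with tempfile.TemporaryDirectory() as tmp:
    tmp_path = Path(tmp)
    (tmp_path / ".django").write_text("DEBUG=False")
    (tmp_path / ".postgres").write_text("POSTGRES_DB=app")
    output = tmp_path / ".env"
    merge(output, [tmp_path / ".django", tmp_path / ".postgres"])
    assert output.read_text() == f"DEBUG=False{os.linesep}POSTGRES_DB=app{os.linesep}"
```

Note that a trailing newline follows every merged file, including the last one, which matches the `([""], "\n")` case in the new test file.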
diff --git a/{{cookiecutter.project_slug}}/package.json b/{{cookiecutter.project_slug}}/package.json
index 6edf2e114..bff0a34af 100644
--- a/{{cookiecutter.project_slug}}/package.json
+++ b/{{cookiecutter.project_slug}}/package.json
@@ -3,36 +3,31 @@
"version": "{{ cookiecutter.version }}",
"dependencies": {},
"devDependencies": {
- {% if cookiecutter.js_task_runner == 'Gulp' -%}
- {% if cookiecutter.custom_bootstrap_compilation == 'y' -%}
- "bootstrap": "4.3.1",
+ "bootstrap": "^5.1.3",
"gulp-concat": "^2.6.1",
- "jquery": "3.3.1",
- "popper.js": "1.14.3",
- {% endif -%}
- "autoprefixer": "^9.4.7",
- "browser-sync": "^2.14.0",
- "cssnano": "^4.1.10",
- "gulp": "^4.0.0",
- "gulp-imagemin": "^5.0.3",
+ "@popperjs/core": "^2.10.2",
+ "autoprefixer": "^10.4.0",
+ "browser-sync": "^2.27.7",
+ "cssnano": "^5.0.11",
+ "gulp": "^4.0.2",
+ "gulp-imagemin": "^7.1.0",
"gulp-plumber": "^1.2.1",
- "gulp-postcss": "^8.0.0",
- "gulp-rename": "^1.2.2",
- "gulp-sass": "^4.0.2",
- "gulp-uglify-es": "^1.0.4",
- "pixrem": "^5.0.0"
- {%- endif %}
+ "gulp-postcss": "^9.0.1",
+ "gulp-rename": "^2.0.0",
+ "gulp-sass": "^5.0.0",
+ "gulp-uglify-es": "^3.0.0",
+ "pixrem": "^5.0.0",
+ "postcss": "^8.3.11",
+ "sass": "^1.43.4"
},
"engines": {
- "node": ">=8"
+ "node": "16"
},
"browserslist": [
"last 2 versions"
],
"scripts": {
- {% if cookiecutter.js_task_runner == 'Gulp' -%}
"dev": "gulp",
"build": "gulp generate-assets"
- {%- endif %}
}
}
diff --git a/{{cookiecutter.project_slug}}/production.yml b/{{cookiecutter.project_slug}}/production.yml
index ea4292a0d..cc4c9bca1 100644
--- a/{{cookiecutter.project_slug}}/production.yml
+++ b/{{cookiecutter.project_slug}}/production.yml
@@ -25,8 +25,8 @@ services:
dockerfile: ./compose/production/postgres/Dockerfile
image: {{ cookiecutter.project_slug }}_production_postgres
volumes:
- - production_postgres_data:/var/lib/postgresql/data:Z
- - production_postgres_data_backups:/backups:z
+ - production_postgres_data:/var/lib/postgresql/data
+ - production_postgres_data_backups:/backups
env_file:
- ./.envs/.production/.postgres
@@ -38,7 +38,7 @@ services:
depends_on:
- django
volumes:
- - production_traefik:/etc/traefik/acme:z
+ - production_traefik:/etc/traefik/acme
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
diff --git a/{{cookiecutter.project_slug}}/pytest.ini b/{{cookiecutter.project_slug}}/pytest.ini
index e3b2248d9..969c7921e 100644
--- a/{{cookiecutter.project_slug}}/pytest.ini
+++ b/{{cookiecutter.project_slug}}/pytest.ini
@@ -1,6 +1,6 @@
[pytest]
addopts = --ds=config.settings.test --reuse-db
python_files = tests.py test_*.py
-{%- if cookiecutter.js_task_runner != 'None' %}
+{%- if cookiecutter.frontend_pipeline == 'Gulp' %}
norecursedirs = node_modules
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/requirements/base.txt b/{{cookiecutter.project_slug}}/requirements/base.txt
index 54b88cf58..b695d5af8 100644
--- a/{{cookiecutter.project_slug}}/requirements/base.txt
+++ b/{{cookiecutter.project_slug}}/requirements/base.txt
@@ -1,45 +1,48 @@
-pytz==2021.3 # https://github.com/stub42/pytz
-python-slugify==5.0.2 # https://github.com/un33k/python-slugify
-Pillow==8.3.2 # https://github.com/python-pillow/Pillow
-{%- if cookiecutter.use_compressor == "y" %}
+pytz==2022.7.1 # https://github.com/stub42/pytz
+python-slugify==7.0.0 # https://github.com/un33k/python-slugify
+Pillow==9.4.0 # https://github.com/python-pillow/Pillow
+{%- if cookiecutter.frontend_pipeline == 'Django Compressor' %}
{%- if cookiecutter.windows == 'y' and cookiecutter.use_docker == 'n' %}
-rcssmin==1.0.6 --install-option="--without-c-extensions" # https://github.com/ndparker/rcssmin
+rcssmin==1.1.0 --install-option="--without-c-extensions" # https://github.com/ndparker/rcssmin
{%- else %}
-rcssmin==1.0.6 # https://github.com/ndparker/rcssmin
+rcssmin==1.1.1 # https://github.com/ndparker/rcssmin
{%- endif %}
{%- endif %}
-argon2-cffi==21.1.0 # https://github.com/hynek/argon2_cffi
+argon2-cffi==21.3.0 # https://github.com/hynek/argon2_cffi
{%- if cookiecutter.use_whitenoise == 'y' %}
-whitenoise==5.3.0 # https://github.com/evansd/whitenoise
+whitenoise==6.3.0 # https://github.com/evansd/whitenoise
{%- endif %}
-redis==3.5.3 # https://github.com/andymccurdy/redis-py
+redis==4.4.2 # https://github.com/redis/redis-py
{%- if cookiecutter.use_docker == "y" or cookiecutter.windows == "n" %}
-hiredis==2.0.0 # https://github.com/redis/hiredis-py
+hiredis==2.1.1 # https://github.com/redis/hiredis-py
{%- endif %}
{%- if cookiecutter.use_celery == "y" %}
-celery==5.1.2 # pyup: < 6.0 # https://github.com/celery/celery
-django-celery-beat==2.2.1 # https://github.com/celery/django-celery-beat
+celery==5.2.7 # pyup: < 6.0 # https://github.com/celery/celery
+django-celery-beat==2.4.0 # https://github.com/celery/django-celery-beat
{%- if cookiecutter.use_docker == 'y' %}
-flower==1.0.0 # https://github.com/mher/flower
+flower==1.2.0 # https://github.com/mher/flower
{%- endif %}
{%- endif %}
{%- if cookiecutter.use_async == 'y' %}
-uvicorn[standard]==0.15.0 # https://github.com/encode/uvicorn
+uvicorn[standard]==0.20.0 # https://github.com/encode/uvicorn
{%- endif %}
# Django
# ------------------------------------------------------------------------------
-django==3.1.13 # pyup: < 3.2 # https://www.djangoproject.com/
-django-environ==0.7.0 # https://github.com/joke2k/django-environ
-django-model-utils==4.1.1 # https://github.com/jazzband/django-model-utils
-django-allauth==0.45.0 # https://github.com/pennersr/django-allauth
-django-crispy-forms==1.13.0 # https://github.com/django-crispy-forms/django-crispy-forms
-{%- if cookiecutter.use_compressor == "y" %}
-django-compressor==2.4.1 # https://github.com/django-compressor/django-compressor
+django==4.0.8 # pyup: < 4.1 # https://www.djangoproject.com/
+django-environ==0.9.0 # https://github.com/joke2k/django-environ
+django-model-utils==4.3.1 # https://github.com/jazzband/django-model-utils
+django-allauth==0.52.0 # https://github.com/pennersr/django-allauth
+django-crispy-forms==1.14.0 # https://github.com/django-crispy-forms/django-crispy-forms
+crispy-bootstrap5==0.7 # https://github.com/django-crispy-forms/crispy-bootstrap5
+{%- if cookiecutter.frontend_pipeline == 'Django Compressor' %}
+django-compressor==4.3.1 # https://github.com/django-compressor/django-compressor
{%- endif %}
-django-redis==5.0.0 # https://github.com/jazzband/django-redis
-{%- if cookiecutter.use_drf == "y" %}
+django-redis==5.2.0 # https://github.com/jazzband/django-redis
+{%- if cookiecutter.use_drf == 'y' %}
# Django REST Framework
-djangorestframework==3.12.4 # https://github.com/encode/django-rest-framework
-django-cors-headers==3.10.0 # https://github.com/adamchainz/django-cors-headers
+djangorestframework==3.14.0 # https://github.com/encode/django-rest-framework
+django-cors-headers==3.13.0 # https://github.com/adamchainz/django-cors-headers
+# DRF-spectacular for API documentation
+drf-spectacular==0.25.1 # https://github.com/tfranzel/drf-spectacular
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/requirements/local.txt b/{{cookiecutter.project_slug}}/requirements/local.txt
index 7bb4d00be..ef0342a32 100644
--- a/{{cookiecutter.project_slug}}/requirements/local.txt
+++ b/{{cookiecutter.project_slug}}/requirements/local.txt
@@ -1,48 +1,48 @@
-r base.txt
-Werkzeug==2.0.2 # https://github.com/pallets/werkzeug
-ipdb==0.13.9 # https://github.com/gotcha/ipdb
+Werkzeug[watchdog]==2.2.2 # https://github.com/pallets/werkzeug
+ipdb==0.13.11 # https://github.com/gotcha/ipdb
{%- if cookiecutter.use_docker == 'y' %}
-psycopg2==2.9.1 # https://github.com/psycopg/psycopg2
+psycopg2==2.9.5 # https://github.com/psycopg/psycopg2
{%- else %}
-psycopg2-binary==2.9.1 # https://github.com/psycopg/psycopg2
+psycopg2-binary==2.9.5 # https://github.com/psycopg/psycopg2
{%- endif %}
{%- if cookiecutter.use_async == 'y' or cookiecutter.use_celery == 'y' %}
-watchgod==0.7 # https://github.com/samuelcolvin/watchgod
+watchfiles==0.18.1 # https://github.com/samuelcolvin/watchfiles
{%- endif %}
# Testing
# ------------------------------------------------------------------------------
-mypy==0.910 # https://github.com/python/mypy
-django-stubs==1.8.0 # https://github.com/typeddjango/django-stubs
-pytest==6.2.5 # https://github.com/pytest-dev/pytest
-pytest-sugar==0.9.4 # https://github.com/Frozenball/pytest-sugar
+mypy==0.982 # https://github.com/python/mypy
+django-stubs==1.14.0 # https://github.com/typeddjango/django-stubs
+pytest==7.2.1 # https://github.com/pytest-dev/pytest
+pytest-sugar==0.9.6 # https://github.com/Frozenball/pytest-sugar
{%- if cookiecutter.use_drf == "y" %}
-djangorestframework-stubs==1.4.0 # https://github.com/typeddjango/djangorestframework-stubs
+djangorestframework-stubs==1.8.0 # https://github.com/typeddjango/djangorestframework-stubs
{%- endif %}
# Documentation
# ------------------------------------------------------------------------------
-sphinx==4.2.0 # https://github.com/sphinx-doc/sphinx
+sphinx==5.3.0 # https://github.com/sphinx-doc/sphinx
sphinx-autobuild==2021.3.14 # https://github.com/GaretJax/sphinx-autobuild
# Code quality
# ------------------------------------------------------------------------------
-flake8==3.9.2 # https://github.com/PyCQA/flake8
-flake8-isort==4.0.0 # https://github.com/gforcada/flake8-isort
-coverage==6.0.2 # https://github.com/nedbat/coveragepy
-black==21.9b0 # https://github.com/psf/black
-pylint-django==2.4.4 # https://github.com/PyCQA/pylint-django
+flake8==6.0.0 # https://github.com/PyCQA/flake8
+flake8-isort==6.0.0 # https://github.com/gforcada/flake8-isort
+coverage==7.1.0 # https://github.com/nedbat/coveragepy
+black==22.12.0 # https://github.com/psf/black
+pylint-django==2.5.3 # https://github.com/PyCQA/pylint-django
{%- if cookiecutter.use_celery == 'y' %}
pylint-celery==0.3 # https://github.com/PyCQA/pylint-celery
{%- endif %}
-pre-commit==2.15.0 # https://github.com/pre-commit/pre-commit
+pre-commit==3.0.1 # https://github.com/pre-commit/pre-commit
# Django
# ------------------------------------------------------------------------------
-factory-boy==3.2.0 # https://github.com/FactoryBoy/factory_boy
+factory-boy==3.2.1 # https://github.com/FactoryBoy/factory_boy
-django-debug-toolbar==3.2.2 # https://github.com/jazzband/django-debug-toolbar
-django-extensions==3.1.3 # https://github.com/django-extensions/django-extensions
-django-coverage-plugin==2.0.1 # https://github.com/nedbat/django_coverage_plugin
-pytest-django==4.4.0 # https://github.com/pytest-dev/pytest-django
+django-debug-toolbar==3.8.1 # https://github.com/jazzband/django-debug-toolbar
+django-extensions==3.2.1 # https://github.com/django-extensions/django-extensions
+django-coverage-plugin==3.0.0 # https://github.com/nedbat/django_coverage_plugin
+pytest-django==4.5.2 # https://github.com/pytest-dev/pytest-django
diff --git a/{{cookiecutter.project_slug}}/requirements/production.txt b/{{cookiecutter.project_slug}}/requirements/production.txt
index 3d70a8a4c..6936adffd 100644
--- a/{{cookiecutter.project_slug}}/requirements/production.txt
+++ b/{{cookiecutter.project_slug}}/requirements/production.txt
@@ -3,40 +3,42 @@
-r base.txt
gunicorn==20.1.0 # https://github.com/benoitc/gunicorn
-psycopg2==2.9.1 # https://github.com/psycopg/psycopg2
+psycopg2==2.9.5 # https://github.com/psycopg/psycopg2
{%- if cookiecutter.use_whitenoise == 'n' %}
Collectfast==2.2.0 # https://github.com/antonagestam/collectfast
{%- endif %}
{%- if cookiecutter.use_sentry == "y" %}
-sentry-sdk==1.4.3 # https://github.com/getsentry/sentry-python
+sentry-sdk==1.14.0 # https://github.com/getsentry/sentry-python
{%- endif %}
{%- if cookiecutter.use_docker == "n" and cookiecutter.windows == "y" %}
-hiredis==2.0.0 # https://github.com/redis/hiredis-py
+hiredis==2.1.1 # https://github.com/redis/hiredis-py
{%- endif %}
# Django
# ------------------------------------------------------------------------------
{%- if cookiecutter.cloud_provider == 'AWS' %}
-django-storages[boto3]==1.12.1 # https://github.com/jschneier/django-storages
+django-storages[boto3]==1.13.2 # https://github.com/jschneier/django-storages
{%- elif cookiecutter.cloud_provider == 'GCP' %}
-django-storages[google]==1.12.1 # https://github.com/jschneier/django-storages
+django-storages[google]==1.13.2 # https://github.com/jschneier/django-storages
+{%- elif cookiecutter.cloud_provider == 'Azure' %}
+django-storages[azure]==1.13.2 # https://github.com/jschneier/django-storages
{%- endif %}
{%- if cookiecutter.mail_service == 'Mailgun' %}
-django-anymail[mailgun]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[mailgun]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Amazon SES' %}
-django-anymail[amazon_ses]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[amazon_ses]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Mailjet' %}
-django-anymail[mailjet]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[mailjet]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Mandrill' %}
-django-anymail[mandrill]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[mandrill]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Postmark' %}
-django-anymail[postmark]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[postmark]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Sendgrid' %}
-django-anymail[sendgrid]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[sendgrid]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'SendinBlue' %}
-django-anymail[sendinblue]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[sendinblue]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'SparkPost' %}
-django-anymail[sparkpost]==8.4 # https://github.com/anymail/django-anymail
+django-anymail[sparkpost]==9.0 # https://github.com/anymail/django-anymail
{%- elif cookiecutter.mail_service == 'Other SMTP' %}
-django-anymail==8.4 # https://github.com/anymail/django-anymail
+django-anymail==9.0 # https://github.com/anymail/django-anymail
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/runtime.txt b/{{cookiecutter.project_slug}}/runtime.txt
index 2153d1e15..69b0ccfc8 100644
--- a/{{cookiecutter.project_slug}}/runtime.txt
+++ b/{{cookiecutter.project_slug}}/runtime.txt
@@ -1 +1 @@
-python-3.9.7
+python-3.10.8
diff --git a/{{cookiecutter.project_slug}}/setup.cfg b/{{cookiecutter.project_slug}}/setup.cfg
index dad64e1fa..7ee60215a 100644
--- a/{{cookiecutter.project_slug}}/setup.cfg
+++ b/{{cookiecutter.project_slug}}/setup.cfg
@@ -1,10 +1,10 @@
[flake8]
max-line-length = 120
-exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules,venv
+exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules,venv,.venv
[pycodestyle]
max-line-length = 120
-exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules,venv
+exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules,venv,.venv
[isort]
line_length = 88
@@ -18,7 +18,7 @@ force_grid_wrap = 0
use_parentheses = true
[mypy]
-python_version = 3.9
+python_version = 3.10
check_untyped_defs = True
ignore_missing_imports = True
warn_unused_ignores = True
@@ -34,7 +34,7 @@ django_settings_module = config.settings.test
ignore_errors = True
[coverage:run]
-include = {{cookiecutter.project_slug}}/*
+include = {{cookiecutter.project_slug}}/**
omit = *migrations*, *tests*
plugins =
django_coverage_plugin
diff --git a/{{cookiecutter.project_slug}}/tests/test_merge_production_dotenvs_in_dotenv.py b/{{cookiecutter.project_slug}}/tests/test_merge_production_dotenvs_in_dotenv.py
new file mode 100644
index 000000000..c0e68f60a
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/tests/test_merge_production_dotenvs_in_dotenv.py
@@ -0,0 +1,34 @@
+from pathlib import Path
+
+import pytest
+
+from merge_production_dotenvs_in_dotenv import merge
+
+
+@pytest.mark.parametrize(
+ ("input_contents", "expected_output"),
+ [
+ ([], ""),
+ ([""], "\n"),
+ (["JANE=doe"], "JANE=doe\n"),
+ (["SEP=true", "AR=ator"], "SEP=true\nAR=ator\n"),
+ (["A=0", "B=1", "C=2"], "A=0\nB=1\nC=2\n"),
+ (["X=x\n", "Y=y", "Z=z\n"], "X=x\n\nY=y\nZ=z\n\n"),
+ ],
+)
+def test_merge(
+ tmp_path: Path,
+ input_contents: list[str],
+ expected_output: str,
+):
+ output_file = tmp_path / ".env"
+
+ files_to_merge = []
+ for num, input_content in enumerate(input_contents, start=1):
+ merge_file = tmp_path / f".service{num}"
+ merge_file.write_text(input_content)
+ files_to_merge.append(merge_file)
+
+ merge(output_file, files_to_merge)
+
+ assert output_file.read_text() == expected_output
diff --git a/{{cookiecutter.project_slug}}/utility/requirements-bullseye.apt b/{{cookiecutter.project_slug}}/utility/requirements-bullseye.apt
new file mode 100644
index 000000000..60f602873
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/utility/requirements-bullseye.apt
@@ -0,0 +1,23 @@
+##basic build dependencies of various Django apps for Debian Bullseye 11.x
+#build-essential metapackage installs: make, gcc, g++
+build-essential
+#required for translations
+gettext
+python3-dev
+
+##shared dependencies of:
+##Pillow, pylibmc
+zlib1g-dev
+
+##PostgreSQL and psycopg2 dependencies
+libpq-dev
+
+##Pillow dependencies
+libtiff5-dev
+libjpeg62-turbo-dev
+libfreetype6-dev
+liblcms2-dev
+libwebp-dev
+
+##django-extensions
+libgraphviz-dev
diff --git a/{{cookiecutter.project_slug}}/utility/requirements-jammy.apt b/{{cookiecutter.project_slug}}/utility/requirements-jammy.apt
new file mode 100644
index 000000000..63d1587e6
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/utility/requirements-jammy.apt
@@ -0,0 +1,23 @@
+##basic build dependencies of various Django apps for Ubuntu Jammy 22.04
+#build-essential metapackage installs: make, gcc, g++
+build-essential
+#required for translations
+gettext
+python3-dev
+
+##shared dependencies of:
+##Pillow, pylibmc
+zlib1g-dev
+
+##PostgreSQL and psycopg2 dependencies
+libpq-dev
+
+##Pillow dependencies
+libtiff5-dev
+libjpeg8-dev
+libfreetype6-dev
+liblcms2-dev
+libwebp-dev
+
+##django-extensions
+graphviz-dev
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/__init__.py b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/__init__.py
index b4056707a..fb6532709 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/__init__.py
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/__init__.py
@@ -1,7 +1,5 @@
__version__ = "{{ cookiecutter.version }}"
__version_info__ = tuple(
- [
- int(num) if num.isdigit() else num
- for num in __version__.replace("-", ".", 1).split(".")
- ]
+ int(num) if num.isdigit() else num
+ for num in __version__.replace("-", ".", 1).split(".")
)
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/conftest.py b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/conftest.py
index 335648e07..7095a4714 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/conftest.py
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/conftest.py
@@ -10,5 +10,5 @@ def media_storage(settings, tmpdir):
@pytest.fixture
-def user() -> User:
+def user(db) -> User:
return UserFactory()
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0003_set_site_domain_and_name.py b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0003_set_site_domain_and_name.py
index 8f4a8f997..080c734bb 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0003_set_site_domain_and_name.py
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0003_set_site_domain_and_name.py
@@ -7,23 +7,52 @@ from django.conf import settings
from django.db import migrations
+def _update_or_create_site_with_sequence(site_model, connection, domain, name):
+ """Update or create the site with default ID and keep the DB sequence in sync."""
+ site, created = site_model.objects.update_or_create(
+ id=settings.SITE_ID,
+ defaults={
+ "domain": domain,
+ "name": name,
+ },
+ )
+ if created:
+ # We provided the ID explicitly when creating the Site entry, so the DB
+ # sequence that auto-generates IDs wasn't used and is now out of sync. If we
+ # don't do anything, we'll get a unique constraint violation the next time a
+ # site is created.
+ # To avoid this, we need to manually update the DB sequence and make sure it's
+ # greater than the maximum existing ID.
+ max_id = site_model.objects.order_by('-id').first().id
+ with connection.cursor() as cursor:
+ cursor.execute("SELECT last_value from django_site_id_seq")
+ (current_id,) = cursor.fetchone()
+ if current_id <= max_id:
+ cursor.execute(
+ "alter sequence django_site_id_seq restart with %s",
+ [max_id + 1],
+ )
+
+
def update_site_forward(apps, schema_editor):
"""Set site domain and name."""
Site = apps.get_model("sites", "Site")
- Site.objects.update_or_create(
- id=settings.SITE_ID,
- defaults={
- "domain": "{{cookiecutter.domain_name}}",
- "name": "{{cookiecutter.project_name}}",
- },
+ _update_or_create_site_with_sequence(
+ Site,
+ schema_editor.connection,
+ "{{cookiecutter.domain_name}}",
+ "{{cookiecutter.project_name}}",
)
def update_site_backward(apps, schema_editor):
"""Revert site domain and name to default."""
Site = apps.get_model("sites", "Site")
- Site.objects.update_or_create(
- id=settings.SITE_ID, defaults={"domain": "example.com", "name": "example.com"}
+ _update_or_create_site_with_sequence(
+ Site,
+ schema_editor.connection,
+ "example.com",
+ "example.com",
)
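The sequence-sync logic this migration adds boils down to one comparison: if the sequence's last value has not moved past the largest existing row ID, restart it just above that ID. A database-free sketch of that rule, with a hypothetical helper name standing in for the SQL against `django_site_id_seq`:

```python
def next_sequence_restart(current_seq_value: int, max_existing_id: int):
    """Mirror the migration's check, without a database.

    Inserting a row with an explicit ID bypasses the sequence, so the
    sequence must be bumped past the maximum ID. Returns the value to
    restart the sequence with, or None if it is already ahead.
    """
    if current_seq_value <= max_existing_id:
        return max_existing_id + 1
    return None


# Sequence still at 1 after an explicit id=1 insert: restart at 2.
assert next_sequence_restart(1, 1) == 2
# Sequence already ahead of every row: leave it alone.
assert next_sequence_restart(5, 3) is None
```

In the migration itself this decision drives an `ALTER SEQUENCE ... RESTART WITH` statement on PostgreSQL; the sketch only captures the arithmetic.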
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0004_alter_options_ordering_domain.py b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0004_alter_options_ordering_domain.py
index 00c9a74d3..f7118ca81 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0004_alter_options_ordering_domain.py
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/contrib/sites/migrations/0004_alter_options_ordering_domain.py
@@ -6,12 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
- ('sites', '0003_set_site_domain_and_name'),
+ ("sites", "0003_set_site_domain_and_name"),
]
operations = [
migrations.AlterModelOptions(
- name='site',
- options={'ordering': ['domain'], 'verbose_name': 'site', 'verbose_name_plural': 'sites'},
+ name="site",
+ options={
+ "ordering": ["domain"],
+ "verbose_name": "site",
+ "verbose_name_plural": "sites",
+ },
),
]
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/static/sass/project.scss b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/static/sass/project.scss
index 9fbf1b746..370096bb3 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/static/sass/project.scss
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/static/sass/project.scss
@@ -1,8 +1,5 @@
-{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
@import "custom_bootstrap_vars";
@import "bootstrap";
-{% endif %}
-
// project specific CSS goes here
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/email.html b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/email.html
index 07b5789e6..1faa2b9fd 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/email.html
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/email.html
@@ -66,17 +66,14 @@ window.addEventListener('DOMContentLoaded',function() {
const message = "{% translate 'Do you really want to remove the selected e-mail address?' %}";
const actions = document.getElementsByName('action_remove');
if (actions.length) {
- actions[0].addEventListener("click", function(e) {
+ actions[0].addEventListener("click",function(e) {
if (!confirm(message)) {
e.preventDefault();
}
});
}
+ Array.from(document.getElementsByClassName('form-group')).forEach(x => x.classList.remove('row'));
});
-
-document.addEventListener('DOMContentLoaded', function() {
- $('.form-group').removeClass('row');
-})
{% endblock %}
{%- endraw %}
diff --git a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/login.html b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/login.html
index 8c7b5ae60..25a292eda 100644
--- a/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/login.html
+++ b/{{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}/templates/account/login.html
@@ -13,25 +13,37 @@
{% get_providers as socialaccount_providers %}
{% if socialaccount_providers %}
-
{% blocktranslate with site.name as site_name %}Please sign in with one
-of your existing third party accounts. Or, sign up
-for a {{ site_name }} account and sign in below:{% endblocktranslate %}
+
+ {% translate "Please sign in with one of your existing third party accounts:" %}
+ {% if ACCOUNT_ALLOW_REGISTRATION %}
+ {% blocktranslate trimmed %}
+ Or, sign up
+ for a {{ site_name }} account and sign in below:
+ {% endblocktranslate %}
+ {% endif %}
+
-
+
-
- {% include "socialaccount/snippets/provider_list.html" with process="login" %}
-
+
+ {% include "socialaccount/snippets/provider_list.html" with process="login" %}
+
-
{% translate 'or' %}
+
{% translate "or" %}
-
+
-{% include "socialaccount/snippets/login_extra.html" %}
+ {% include "socialaccount/snippets/login_extra.html" %}
{% else %}
-
{% blocktranslate %}If you have not created an account yet, then please
-sign up first.{% endblocktranslate %}
+ {% if ACCOUNT_ALLOW_REGISTRATION %}
+
+ {% blocktranslate trimmed %}
+ If you have not created an account yet, then please
+ sign up first.
+ {% endblocktranslate %}
+