Merge pull request #4 from pydanny/master

UPDATE from pydanny:master
Jimmy Gitonga 2020-02-02 08:50:08 +03:00 committed by GitHub
commit c4479031e9
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
114 changed files with 2106 additions and 1430 deletions

12
.github/FUNDING.yml vendored Normal file

@ -0,0 +1,12 @@
# These are supported funding model platforms
github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
patreon: danielroygreenfeld
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
custom: ['https://www.patreon.com/browniebroke']

17
.pyup.yml Normal file

@ -0,0 +1,17 @@
# configure updates globally
# default: all
# allowed: all, insecure, False
update: all
# configure dependency pinning globally
# default: True
# allowed: True, False
pin: True
# Specify requirement files by hand; pyup seems to struggle to
# find the ones in the project_slug folder
requirements:
- "requirements.txt"
- "{{cookiecutter.project_slug}}/requirements/base.txt"
- "{{cookiecutter.project_slug}}/requirements/local.txt"
- "{{cookiecutter.project_slug}}/requirements/production.txt"


@ -1,22 +1,37 @@
sudo: required
dist: xenial
services:
- docker
language: python
python: 3.6
env:
- TOX_ENV=py36
python: 3.7
before_install:
- docker-compose -v
- docker -v
script:
- tox -e $TOX_ENV
- sh tests/test_docker.sh
matrix:
include:
- name: Test results
script: tox -e py37
- name: Run flake8 on result
script: tox -e flake8
- name: Run black on result
script: tox -e black
- name: Black template
script: tox -e black-template
- name: Basic Docker
script: sh tests/test_docker.sh
- name: Docker with Celery
script: sh tests/test_docker.sh use_celery=y
- name: Bare metal
script: sh tests/test_bare.sh use_celery=y use_compressor=y
services:
- postgresql
- redis-server
env:
- CELERY_BROKER_URL=redis://localhost:6379/0
install:
- pip install tox


@ -1,6 +1,258 @@
# Change Log
All enhancements and patches to Cookiecutter Django will be documented in this file.
This project adheres to [Semantic Versioning](http://semver.org/).
## [2020-01-23]
### Changed
- Fix UserFactory to set the password if provided (@BoPeng)
- Update documentation files with latest Sphinx (@howiezhao)
## [2020-01-12]
### Changed
- Fix mypy setup and added django-stubs (@danifus)
- Add Gitlab CI as option (@ikhomutov)
## [2020-01-11]
### Changed
- Speed up & reduce size for production Django image (@maxp)
- Bumped runtime version for Heroku (@Isaac12x)
- Added Debian 10 (Buster) OS dependencies (@ddiazpinto)
- Update Traefik to v2 (@blaxpy)
- Switched Docker images from Alpine based to Debian based (@trungdong)
## [2019-10-06]
### Changed
- Default Python version is now 3.7 (@nicolas471)
## [2019-10-04]
### Fixed
- Fix static files handling on GCP (@caioariede)
## [2019-10-03]
### Fixed
- Fix incompatible combination between Whitenoise and no cloud provider (@caioariede)
## [2019-07-09]
### Fixed
- Always use test settings in pytest (@danihodovic)
- Remove gunicorn from `INSTALLED_APPS` (@danihodovic)
- Remove `EMAIL_HOST` and `EMAIL_PORT` with locmem backend (@danihodovic)
### Added
- Add `EMAIL_TIMEOUT` (@danihodovic)
## [2019-06-22]
### Fixed
- Remove redundant template debug setting (@danihodovic)
## [2019-06-19]
### Fixed
- Fix removal of carriage returns in docker scripts (@timclaessens)
## [2019-06-15]
### Fixed
- Issue with Pycharm setup for running things in Docker compose (@foarsitter)
## [2019-06-06]
### Changed
- Update generated Travis config (@browniebroke)
## [2019-06-03]
### Added
- Installed `django-celery-beat` to keep scheduled tasks in DB (@keyvanm)
## [2019-05-28]
### Changed
- Use GCP acronym rather than inconsistent GCE/GCS (@tanoabeleyra)
## [2019-05-27]
### Changed
- Made cloud provider optional (@tanoabeleyra)
- Updated to Django 2.2.1 (@browniebroke)
### Fixed
- Celery worker-related setting names (@browniebroke)
## [2019-05-18]
### Removed
- Remove the user list view (@browniebroke)
### Fixed
- Static storage default ACL (@browniebroke)
## [2019-05-17]
### Fixed
- Added `LocaleMiddleware` to the list of middlewares (@tanoabeleyra)
- Added `LOCALE_PATH` to settings (@tanoabeleyra)
## [2019-05-16]
### Changed
- Users app to have a translated verbose name (@tanoabeleyra)
- Logging configuration for local (@browniebroke)
## [2019-05-08]
### Changed
- Upgraded to Django 2.1 (@browniebroke)
## [2019-04-07]
### Added
- Support for Google Cloud Storage (@ahhda)
## [2019-04-03]
### Added
- Command to back up DB to AWS S3 (@foarsitter)
## [2019-03-25]
### Added
- Node image to run Gulp with Docker (@browniebroke)
## [2019-03-19]
### Changed
- Replaced Caddy with Traefik (@demestav)
## [2019-03-11]
### Changed
- Sentry integration from Raven to Sentry-SDK (@gfabricio)
- Made Redis config conditional on Celery locally (@demestav)
## [2019-03-11]
### Added
- Automatic migrations on Heroku (@yunti)
## [2019-03-06]
### Fixed
- Missing script tag in Travis config (@btknu)
## [2019-03-02]
### Changed
- Celery eager setting in local setting with Docker (@keithjeb)
## [2019-03-01]
### Updated
- All NPM dependencies (@takkaria)
## [2018-11-13]
### Changed
- Security settings in Dev (@carlmjohnson)
## [2018-11-20]
### Fixed
- Passing the CSRF header from the reverse proxy to Django server for DRF (@hpbruna)
## [2018-11-12]
### Fixed
- Initialisation of Celery app (@glasslion)
## [2018-10-24]
### Fixed
- Persisting of IPython history between sessions (@davitovmasyan)
### Added
- Postgres 10.5 option (@jleclanche)
## [2018-09-18]
### Added
- Included `mypy` in dependencies and run it in tests (@apirobot)
## [2018-09-18]
### Fixed
- Avoid `$` in environment variables to work around a bug in django-environ (@browniebroke)
## [2018-09-16]
### Fixed
- Bug in ordering of Middleware for production config (@ChrisPappalardo)
## [2018-09-12]
### Fixed
- URLs for Static and Media for S3 buckets in regions other than N. Virginia (@umrashrf)
## [2018-09-09]
### Changed
- Name of static and media storage classes (@sfdye)
## [2018-09-01]
### Changed
- Make static and media storage fully-fledged classes (@erfaan)
## [2018-08-28]
### Fixed
- Running tests in docker test script (@apirobot)
## [2018-07-23]
### Changed
- Test commands to use pytest (@jcass77)
### Removed
- Some leftover hacks from Bootstrap v4 beta in `project.js` (@hendrikschneider)
## [2018-07-12]
### Changed
- Upgraded to Bootstrap 4.1.1 (@mostaszewski)
## [2018-06-25]
### Added
- Flower integration with Docker (@webyneter)
## [2018-06-25]
### Changed
- Rewrite user app test to use a pytest style (@webyneter)
## [2018-06-21]
### Added
- Extend & update Celery config (@webyneter & @apirobot)
## [2018-05-25]
### Fixed
- Build issues due to incompatibility between libressl & openssl (@SassanoM)
## [2018-05-21]
### Changed
- Updated Caddy to 0.11 and pin its version (@webyneter)
## [2018-05-14]
### Changed
- Replace `awesome-slugify` by `python-slugify` (@hongquan)
- Migrate to Django 2.0+ URL style (@saschalalala)
## [2018-05-05]
### Fixed
- Postgres backup & restore commands (@webyneter)
## [2018-04-10]
### Changed
- Simplify configuration (@danidee10)
## [2018-04-08]
### Added
- Adopt Black code style (@pydanny)
## [2018-03-27]
### Fixed
- Simplified extra Celery config generated when opted out (@webyneter)
## [2018-03-21]
### Removed
- Remove Opbeat support (@sfdye)
## [2018-03-16]
### Fixed
- Install `psycopg2-binary` when using Docker locally (@browniebroke)
## [2018-03-14]
### Fixed
- Fixed and improved Postgres backup & restore scripts (@webyneter)
## [2018-03-10]
### Changed
- Simplify Mailgun setting (@browniebroke)
## [2018-03-06]
### Changed
- Convert string formatting to f-strings (@sfdye)
## [2018-03-01]
### Changed
- Celery to use JSON serialization by default (@adammsteele)
- Use Docker version from Travis to run tests (@browniebroke)
## [2018-02-16]
### Changed


@ -39,9 +39,9 @@ To run all tests using various versions of python in virtualenvs defined in tox.
It is possible to test with a specific version of python. To do this, the command
is::
$ tox -e py36
$ tox -e py37
This will run py.test with the python3.6 interpreter, for example.
This will run py.test with the python3.7 interpreter, for example.
To run a particular test with tox against your current Python version::


@ -7,9 +7,9 @@ Core Developers
These contributors have commit flags for the repository,
and are able to accept and merge pull requests.
=========================== ================ ===========
=========================== ================= ===========
Name Github Twitter
=========================== ================ ===========
=========================== ================= ===========
Daniel Roy Greenfeld `@pydanny`_ @pydanny
Audrey Roy Greenfeld* `@audreyr`_ @audreyr
Fábio C. Barrionuevo da Luz `@luzfcb`_ @luzfcb
@ -19,7 +19,7 @@ Burhan Khalid `@burhan`_ @burhan
Nikita Shupeyko `@webyneter`_ @webyneter
Bruno Alla               `@browniebroke`_ @_BrunoAlla
Wan Liuyang `@sfdye`_ @sfdye
=========================== ================ ===========
=========================== ================= ===========
*Audrey is also the creator of Cookiecutter. Audrey and
Daniel are on the Cookiecutter core team.*
@ -42,11 +42,12 @@ Listed in alphabetical order.
Name Github Twitter
========================== ============================ ==============
18 `@dezoito`_
2O4 `@2O4`_
a7p `@a7p`_
Aaron Eikenberry `@aeikenberry`_
Adam Bogdał `@bogdal`_
Adam Dobrawy `@ad-m`_
Adam Steele `@adammsteele`
Adam Steele `@adammsteele`_
Agam Dua
Alberto Sanchez `@alb3rto`_
Alex Tsai `@caffodian`_
@ -64,15 +65,18 @@ Listed in alphabetical order.
Areski Belaid `@areski`_
Ashley Camba
Barclay Gauld `@yunti`_
Ben Warren `@bwarren2`
Bartek `@btknu`_
Ben Lopatin
Ben Warren `@bwarren2`_
Benjamin Abel
Bert de Miranda `@bertdemiranda`_
Bo Lopker `@blopker`_
Bo Peng `@BoPeng`_
Bouke Haarsma
Brent Payne `@brentpayne`_ @brentpayne
Bartek `@btknu`
Bruce Olivier `@bolivierjr`_
Burhan Khalid            `@burhan`_                   @burhan
Caio Ariede `@caioariede`_ @caioariede
Carl Johnson `@carlmjohnson`_ @carlmjohnson
Catherine Devlin `@catherinedevlin`_
Cédric Gaspoz `@cgaspoz`_
@ -84,33 +88,49 @@ Listed in alphabetical order.
Christopher Clarke `@chrisdev`_
Cole Mackenzie `@cmackenzie1`_
Collederas `@Collederas`_
Craig Margieson `@cmargieson`_
Cristian Vargas `@cdvv7788`_
Cullen Rhodes `@c-rhodes`_
Curtis St Pierre `@curtisstpierre`_ @cstpierre1388
Dan Shultz `@shultz`_
Dani Hodovic `@danihodovic`_
Daniel Hepper `@dhepper`_ @danielhepper
Daniel Hillier `@danifus`_
Daniele Tricoli `@eriol`_
David Díaz `@ddiazpinto`_ @DavidDiazPinto
Davit Tovmasyan `@davitovmasyan`_
Davur Clementsen `@dsclementsen`_ @davur
Delio Castillo `@jangeador`_ @jangeador
Demetris Stavrou `@demestav`_
Denis Bobrov `@delneg`_
Denis Orehovsky `@apirobot`_
Dónal Adams `@epileptic-fish`_
Denis Savran `@blaxpy`_
Diane Chen `@purplediane`_ @purplediane88
Dónal Adams `@epileptic-fish`_
Dong Huynh `@trungdong`_
Emanuel Calso `@bloodpet`_ @bloodpet
Eraldo Energy `@eraldo`_
Eric Groom `@ericgroom`_
Eyad Al Sibai `@eyadsibai`_
Felipe Arruda `@arruda`_
Florian Idelberger `@step21`_ @windrush
Garry Cairns `@garry-cairns`_
Garry Polley `@garrypolley`_
Gilbishkosma `@Gilbishkosma`_
Hamish Durkin `@durkode`_
Hana Quadara `@hanaquadara`_
Harry Moreno `@morenoh149`_ @morenoh149
Harry Percival `@hjwp`_
Hendrik Schneider `@hendrikschneider`_
Henrique G. G. Pereira `@ikkebr`_
Howie Zhao `@howiezhao`_
Ian Lee `@IanLee1521`_
Irfan Ahmad `@erfaan`_ @erfaan
Isaac12x `@Isaac12x`_
Ivan Khomutov `@ikhomutov`_
Jan Van Bruggen `@jvanbrug`_
Jelmer Draaijer `@foarsitter`_
Jerome Caisip `@jeromecaisip`_
Jens Nilsson `@phiberjenz`_
Jerome Leclanche `@jleclanche`_ @Adys
Jimmy Gitonga `@afrowave`_ @afrowave
@ -120,13 +140,16 @@ Listed in alphabetical order.
Kaido Kert `@kaidokert`_
kappataumu `@kappataumu`_ @kappataumu
Kaveh `@ka7eh`_
Keith Bailey `@keithjeb`_
Keith Webber `@townie`_
Kevin A. Stone
Kevin Ndung'u `@kevgathuku`_
Keith Webber `@townie`_
Keyvan Mosharraf `@keyvanm`_
Krzysztof Szumny `@noisy`_
Krzysztof Żuraw `@krzysztofzuraw`_
Leonardo Jimenez `@xpostudio4`_
Leo won `@leollon`_
Leo Zhou `@glasslion`_
Leonardo Jimenez `@xpostudio4`_
Lin Xianyi `@iynaix`_
Luis Nell `@originell`_
Lukas Klein
@ -137,6 +160,7 @@ Listed in alphabetical order.
Mateusz Ostaszewski `@mostaszewski`_
Mathijs Hoogland `@MathijsHoogland`_
Matt Braymer-Hayes `@mattayes`_ @mattayes
Matt Knapper `@mknapper1`_
Matt Linares
Matt Menzenski `@menzenski`_
Matt Warren `@mfwarren`_
@ -144,66 +168,86 @@ Listed in alphabetical order.
Meghan Heintz `@dot2dotseurat`_
Mesut Yılmaz `@myilmaz`_
Michael Gecht `@mimischi`_ @_mischi
Michael Samoylov `@msamoylov`_
Min ho Kim `@minho42`_
mozillazg `@mozillazg`_
Nico Stefani `@nicolas471`_ @moby_dick91
Oleg Russkin `@rolep`_
Pablo `@oubiga`_
Parbhat Puri `@parbhat`_
Peter Bittner `@bittner`_
Peter Coles `@mrcoles`_
Philipp Matthies `@canonnervio`_
Pierre Chiquet `@pchiquet`_
Raphael Pierzina `@hackebrot`_
Raony Guimarães Corrêa `@raonyguimaraes`_
Raphael Pierzina `@hackebrot`_
Reggie Riser `@reggieriser`_
René Muhl `@rm--`_
Roman Afanaskin `@siauPatrick`_
Roman Osipenko `@romanosipenko`_
Russell Davies
Sascha `@saschalalala` @saschalalala
Sam Collins `@MightySCollins`_
Sascha `@saschalalala`_ @saschalalala
Shupeyko Nikita `@webyneter`_
Sławek Ehlert `@slafs`_
Srinivas Nyayapati `@shireenrao`_
stepmr `@stepmr`_
Steve Steiner `@ssteinerX`_
Sule Marshall `@suledev`_
Tano Abeleyra `@tanoabeleyra`_
Taylor Baldwin
Théo Segonds `@show0k`_
Tim Claessens `@timclaessens`_
Tim Freund `@timfreund`_
Tom Atkins `@knitatoms`_
Tom Offermann
Travis McNeill `@Travistock`_ @tavistock_esq
Tubo Shi `@Tubo`_
Umair Ashraf `@umrashrf`_ @fabumair
Vadim Iskuchekov `@Egregors`_ @egregors
Vitaly Babiy
Vivian Guillen `@viviangb`_
Vlad Doster `@vladdoster`_
Will Farley `@goldhand`_ @g01dhand
William Archinal `@archinal`_
Xaver Y.R. Chen `@yrchen`_ @yrchen
Yaroslav Halchenko
Denis Bobrov `@delneg`_
Philipp Matthies `@canonnervio`_
Vadim Iskuchekov `@Egregors`_ @egregors
Keith Bailey `@keithjeb`_
Yuchen Xie `@mapx`_
========================== ============================ ==============
.. _@a7p: https://github.com/a7p
.. _@2O4: https://github.com/2O4
.. _@ad-m: https://github.com/ad-m
.. _@adammsteele: https://github.com/adammsteele
.. _@aeikenberry: https://github.com/aeikenberry
.. _@afrowave: https://github.com/afrowave
.. _@ahhda: https://github.com/ahhda
.. _@alb3rto: https://github.com/alb3rto
.. _@ameistad: https://github.com/ameistad
.. _@amjith: https://github.com/amjith
.. _@andor-pierdelacabeza: https://github.com/andor-pierdelacabeza
.. _@andresgz: https://github.com/andresgz
.. _@antoniablair: https://github.com/antoniablair
.. _@apirobot: https://github.com/apirobot
.. _@archinal: https://github.com/archinal
.. _@areski: https://github.com/areski
.. _@arruda: https://github.com/arruda
.. _@bertdemiranda: https://github.com/bertdemiranda
.. _@bittner: https://github.com/bittner
.. _@blaxpy: https://github.com/blaxpy
.. _@bloodpet: https://github.com/bloodpet
.. _@blopker: https://github.com/blopker
.. _@bogdal: https://github.com/bogdal
.. _@bolivierjr: https://github.com/bolivierjr
.. _@BoPeng: https://github.com/BoPeng
.. _@brentpayne: https://github.com/brentpayne
.. _@btknu: https://github.com/btknu
.. _@burhan: https://github.com/burhan
.. _@bwarren2: https://github.com/bwarren2
.. _@c-rhodes: https://github.com/c-rhodes
.. _@caffodian: https://github.com/caffodian
.. _@canonnervio: https://github.com/canonnervio
.. _@caioariede: https://github.com/caioariede
.. _@carlmjohnson: https://github.com/carlmjohnson
.. _@catherinedevlin: https://github.com/catherinedevlin
.. _@ccurvey: https://github.com/ccurvey
@ -213,94 +257,119 @@ Listed in alphabetical order.
.. _@ChrisPappalardo: https://github.com/ChrisPappalardo
.. _@chuckus: https://github.com/chuckus
.. _@cmackenzie1: https://github.com/cmackenzie1
.. _@cmargieson: https://github.com/cmargieson
.. _@Collederas: https://github.com/Collederas
.. _@curtisstpierre: https://github.com/curtisstpierre
.. _@dadokkio: https://github.com/dadokkio
.. _@danihodovic: https://github.com/danihodovic
.. _@danifus: https://github.com/danifus
.. _@davitovmasyan: https://github.com/davitovmasyan
.. _@ddiazpinto: https://github.com/ddiazpinto
.. _@delneg: https://github.com/delneg
.. _@demestav: https://github.com/demestav
.. _@dezoito: https://github.com/dezoito
.. _@dhepper: https://github.com/dhepper
.. _@dot2dotseurat: https://github.com/dot2dotseurat
.. _@dsclementsen: https://github.com/dsclementsen
.. _@durkode: https://github.com/durkode
.. _@Egregors: https://github.com/Egregors
.. _@epileptic-fish: https://github.com/epileptic-fish
.. _@eraldo: https://github.com/eraldo
.. _@erfaan: https://github.com/erfaan
.. _@ericgroom: https://github.com/ericgroom
.. _@eriol: https://github.com/eriol
.. _@eyadsibai: https://github.com/eyadsibai
.. _@flyudvik: https://github.com/flyudvik
.. _@foarsitter: https://github.com/foarsitter
.. _@garry-cairns: https://github.com/garry-cairns
.. _@garrypolley: https://github.com/garrypolley
.. _@goldhand: https://github.com/goldhand
.. _@Gilbishkosma: https://github.com/Gilbishkosma
.. _@glasslion: https://github.com/glasslion
.. _@goldhand: https://github.com/goldhand
.. _@hackebrot: https://github.com/hackebrot
.. _@hairychris: https://github.com/hairychris
.. _@hanaquadara: https://github.com/hanaquadara
.. _@hendrikschneider: https://github.com/hendrikschneider
.. _@hjwp: https://github.com/hjwp
.. _@howiezhao: https://github.com/howiezhao
.. _@IanLee1521: https://github.com/IanLee1521
.. _@ikhomutov: https://github.com/ikhomutov
.. _@ikkebr: https://github.com/ikkebr
.. _@Isaac12x: https://github.com/Isaac12x
.. _@iynaix: https://github.com/iynaix
.. _@jangeador: https://github.com/jangeador
.. _@jazztpt: https://github.com/jazztpt
.. _@jcass77: https://github.com/jcass77
.. _@jeromecaisip: https://github.com/jeromecaisip
.. _@jleclanche: https://github.com/jleclanche
.. _@juliocc: https://github.com/juliocc
.. _@jvanbrug: https://github.com/jvanbrug
.. _@ka7eh: https://github.com/ka7eh
.. _@kaidokert: https://github.com/kaidokert
.. _@kappataumu: https://github.com/kappataumu
.. _@keithjeb: https://github.com/keithjeb
.. _@kevgathuku: https://github.com/kevgathuku
.. _@keyvanm: https://github.com/keyvanm
.. _@knitatoms: https://github.com/knitatoms
.. _@krzysztofzuraw: https://github.com/krzysztofzuraw
.. _@msaizar: https://github.com/msaizar
.. _@leollon: https://github.com/leollon
.. _@MathijsHoogland: https://github.com/MathijsHoogland
.. _@mapx: https://github.com/mapx
.. _@mattayes: https://github.com/mattayes
.. _@menzenski: https://github.com/menzenski
.. _@mostaszewski: https://github.com/mostaszewski
.. _@mfwarren: https://github.com/mfwarren
.. _@MightySCollins: https://github.com/MightySCollins
.. _@mimischi: https://github.com/mimischi
.. _@minho42: https://github.com/minho42
.. _@mjsisley: https://github.com/mjsisley
.. _@myilmaz: https://github.com/myilmaz
.. _@mknapper1: https://github.com/mknapper1
.. _@morenoh149: https://github.com/morenoh149
.. _@mostaszewski: https://github.com/mostaszewski
.. _@mozillazg: https://github.com/mozillazg
.. _@mrcoles: https://github.com/mrcoles
.. _@msaizar: https://github.com/msaizar
.. _@msamoylov: https://github.com/msamoylov
.. _@myilmaz: https://github.com/myilmaz
.. _@nicolas471: https://github.com/nicolas471
.. _@noisy: https://github.com/noisy
.. _@originell: https://github.com/originell
.. _@oubiga: https://github.com/oubiga
.. _@parbhat: https://github.com/parbhat
.. _@pchiquet: https://github.com/pchiquet
.. _@phiberjenz: https://github.com/phiberjenz
.. _@purplediane: https://github.com/purplediane
.. _@raonyguimaraes: https://github.com/raonyguimaraes
.. _@reggieriser: https://github.com/reggieriser
.. _@rm--: https://github.com/rm--
.. _@rolep: https://github.com/rolep
.. _@romanosipenko: https://github.com/romanosipenko
.. _@saschalalala: https://github.com/saschalalala
.. _@shireenrao: https://github.com/shireenrao
.. _@show0k: https://github.com/show0k
.. _@shultz: https://github.com/shultz
.. _@siauPatrick: https://github.com/siauPatrick
.. _@sladinji: https://github.com/sladinji
.. _@slafs: https://github.com/slafs
.. _@ssteinerX: https://github.com/ssteinerx
.. _@step21: https://github.com/step21
.. _@stepmr: https://github.com/stepmr
.. _@suledev: https://github.com/suledev
.. _@takkaria: https://github.com/takkaria
.. _@tanoabeleyra: https://github.com/tanoabeleyra
.. _@timclaessens: https://github.com/timclaessens
.. _@timfreund: https://github.com/timfreund
.. _@townie: https://github.com/townie
.. _@Travistock: https://github.com/Tavistock
.. _@trungdong: https://github.com/trungdong
.. _@Tubo: https://github.com/tubo
.. _@umrashrf: https://github.com/umrashrf
.. _@viviangb: https://github.com/viviangb
.. _@vladdoster: https://github.com/vladdoster
.. _@xpostudio4: https://github.com/xpostudio4
.. _@yrchen: https://github.com/yrchen
.. _@yunti: https://github.com/yunti
.. _@zcho: https://github.com/zcho
.. _@phiberjenz: https://github.com/phiberjenz
.. _@sladinji: https://github.com/sladinji
.. _@andresgz: https://github.com/andresgz
.. _@jangeador: https://github.com/jangeador
.. _@townie: https://github.com/townie
.. _@MightySCollins: https://github.com/MightySCollins
.. _@dadokkio: https://github.com/dadokkio
.. _@bwarren2: https://github.com/bwarren2
.. _@bertdemiranda: https://github.com/bertdemiranda
.. _@brentpayne: https://github.com/brentpayne
.. _@afrowave: https://github.com/afrowave
.. _@pchiquet: https://github.com/pchiquet
.. _@delneg: https://github.com/delneg
.. _@purplediane: https://github.com/purplediane
.. _@umrashrf: https://github.com/umrashrf
.. _@ahhda: https://github.com/ahhda
.. _@keithjeb: https://github.com/keithjeb
.. _@btknu: https://github.com/btknu
Special Thanks
~~~~~~~~~~~~~~


@ -9,8 +9,8 @@ Cookiecutter Django
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://badges.gitter.im/Join Chat.svg
:target: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. image:: https://img.shields.io/badge/cookiecutter-Join%20on%20Slack-green?style=flat&logo=slack
:target: https://join.slack.com/t/cookie-cutter/shared_invite/enQtNzI0Mzg5NjE5Nzk5LTRlYWI2YTZhYmQ4YmU1Y2Q2NmE1ZjkwOGM0NDQyNTIwY2M4ZTgyNDVkNjMxMDdhZGI5ZGE5YmJjM2M3ODJlY2U
.. image:: https://www.codetriage.com/pydanny/cookiecutter-django/badges/users.svg
:target: https://www.codetriage.com/pydanny/cookiecutter-django
@ -36,10 +36,10 @@ production-ready Django projects quickly.
Features
---------
* For Django 2.0
* Works with Python 3.6
* For Django 2.2
* Works with Python 3.7
* Renders Django projects with 100% starting test coverage
* Twitter Bootstrap_ v4.1.1 (`maintained Foundation fork`_ also available)
* Twitter Bootstrap_ v4 (`maintained Foundation fork`_ also available)
* 12-Factor_ based settings via django-environ_
* Secure by default. We believe in SSL.
* Optimized development and production settings
@ -47,12 +47,13 @@ Features
* Comes with custom user model ready to go
* Optional custom static build using Gulp and livereload
* Send emails via Anymail_ (using Mailgun_ by default, but switchable)
* Media storage using Amazon S3
* Docker support using docker-compose_ for development and production (using Caddy_ with LetsEncrypt_ support)
* Media storage using Amazon S3 or Google Cloud Storage
* Docker support using docker-compose_ for development and production (using Traefik_ with LetsEncrypt_ support)
* Procfile_ for deploying to Heroku
* Instructions for deploying to PythonAnywhere_
* Run tests with unittest or py.test
* Run tests with unittest or pytest
* Customizable PostgreSQL version
* Default integration with pre-commit_ for identifying simple issues before submission to code review
.. _`maintained Foundation fork`: https://github.com/Parbhat/cookiecutter-django-foundation
@ -62,7 +63,7 @@ Optional Integrations
*These features can be enabled during initial project setup.*
* Serve static files from Amazon S3 or Whitenoise_
* Serve static files from Amazon S3, Google Cloud Storage or Whitenoise_
* Configuration for Celery_ and Flower_ (the latter in Docker setup only)
* Integration with MailHog_ for local email testing
* Integration with Sentry_ for error logging
@ -82,15 +83,16 @@ Optional Integrations
.. _Sentry: https://sentry.io/welcome/
.. _docker-compose: https://github.com/docker/compose
.. _PythonAnywhere: https://www.pythonanywhere.com/
.. _Caddy: https://caddyserver.com/
.. _Traefik: https://traefik.io/
.. _LetsEncrypt: https://letsencrypt.org/
.. _pre-commit: https://github.com/pre-commit/pre-commit
Constraints
-----------
* Only maintained 3rd party libraries are used.
* Uses PostgreSQL everywhere (9.2+)
* Environment variables for configuration (This won't work with Apache/mod_wsgi except on AWS ELB).
* Uses PostgreSQL everywhere (9.4 - 11.3)
* Environment variables for configuration (This won't work with Apache/mod_wsgi).
Support this Project!
----------------------
@ -106,7 +108,7 @@ Projects that provide financial support to the maintainers:
Two Scoops of Django 1.11
~~~~~~~~~~~~~~~~~~~~~~~~~
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/products/tsd-111-alpha_medium.jpg?v=1499531513
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/products/2017-06-29-tsd11-sticker-02.png
:name: Two Scoops of Django 1.11 Cover
:align: center
:alt: Two Scoops of Django
@ -155,7 +157,7 @@ Answer the prompts with your own desired options_. For example::
project_slug [reddit_clone]: reddit
author_name [Daniel Roy Greenfeld]: Daniel Greenfeld
email [you@example.com]: pydanny@gmail.com
description [A short description of the project.]: A reddit clone.
description [Behold My Awesome Project!]: A reddit clone.
domain_name [example.com]: myreddit.com
version [0.1.0]: 0.0.1
timezone [UTC]: America/Los_Angeles
@ -169,18 +171,21 @@ Answer the prompts with your own desired options_. For example::
use_heroku [n]: y
use_compressor [n]: y
Select postgresql_version:
1 - 10.3
2 - 10.2
3 - 10.1
4 - 9.6
5 - 9.5
6 - 9.4
7 - 9.3
Choose from 1, 2, 3, 4 [1]: 1
1 - 11.3
2 - 10.8
3 - 9.6
4 - 9.5
5 - 9.4
Choose from 1, 2, 3, 4, 5 [1]: 1
Select js_task_runner:
1 - None
2 - Gulp
Choose from 1, 2 [1]: 1
Select cloud_provider:
1 - AWS
2 - GCP
3 - None
Choose from 1, 2, 3 [1]: 1
custom_bootstrap_compilation [n]: n
Select open_source_license:
1 - MIT
@ -221,11 +226,11 @@ Community
* Have questions? **Before you ask questions anywhere else**, please post your question on `Stack Overflow`_ under the *cookiecutter-django* tag. We check there periodically for questions.
* If you think you found a bug or want to request a feature, please open an issue_.
* For anything else, you can chat with us on `Gitter`_.
* For anything else, you can chat with us on `Slack`_.
.. _`Stack Overflow`: http://stackoverflow.com/questions/tagged/cookiecutter-django
.. _`issue`: https://github.com/pydanny/cookiecutter-django/issues
.. _`Gitter`: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. _`Slack`: https://join.slack.com/t/cookie-cutter/shared_invite/enQtNzI0Mzg5NjE5Nzk5LTRlYWI2YTZhYmQ4YmU1Y2Q2NmE1ZjkwOGM0NDQyNTIwY2M4ZTgyNDVkNjMxMDdhZGI5ZGE5YmJjM2M3ODJlY2U
For Readers of Two Scoops of Django
--------------------------------------------


@ -18,20 +18,22 @@
"use_pycharm": "n",
"use_docker": "n",
"postgresql_version": [
"10.5",
"10.4",
"10.3",
"10.2",
"10.1",
"11.3",
"10.8",
"9.6",
"9.5",
"9.4",
"9.3"
"9.4"
],
"js_task_runner": [
"None",
"Gulp"
],
"cloud_provider": [
"AWS",
"GCP",
"None"
],
"use_drf": "n",
"custom_bootstrap_compilation": "n",
"use_compressor": "n",
"use_celery": "n",
@ -39,7 +41,11 @@
"use_sentry": "n",
"use_whitenoise": "n",
"use_heroku": "n",
"use_travisci": "n",
"ci_tool": [
"None",
"Travis",
"Gitlab"
],
"keep_local_envs_in_vcs": "y",
"debug": "n"


@ -42,7 +42,7 @@ master_doc = "index"
# General information about the project.
project = "Cookiecutter Django"
copyright = "2013-2018, Daniel Roy Greenfeld".format(now.year)
copyright = "2013-{}, Daniel Roy Greenfeld".format(now.year)
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the


@ -3,6 +3,9 @@ Deployment on Heroku
.. index:: Heroku
Commands to run
---------------
Run these commands to deploy the project to Heroku:
.. code-block:: bash
@ -17,11 +20,8 @@ Run these commands to deploy the project to Heroku:
heroku addons:create heroku-redis:hobby-dev
# If using mailgun:
heroku addons:create mailgun:starter
heroku addons:create sentry:f1
heroku config:set PYTHONHASHSEED=random
heroku config:set WEB_CONCURRENCY=4
@ -30,7 +30,7 @@ Run these commands to deploy the project to Heroku:
heroku config:set DJANGO_SETTINGS_MODULE=config.settings.production
heroku config:set DJANGO_SECRET_KEY="$(openssl rand -base64 64)"
# Generating a 32 character-long random string without any of the visually similiar characters "IOl01":
# Generating a 32 character-long random string without any of the visually similar characters "IOl01":
heroku config:set DJANGO_ADMIN_URL="$(openssl rand -base64 4096 | tr -dc 'A-HJ-NP-Za-km-z2-9' | head -c 32)/"
# Set this to your Heroku app url, e.g. 'bionic-beaver-28392.herokuapp.com'
@ -52,3 +52,70 @@ Run these commands to deploy the project to Heroku:
heroku run python manage.py check --deploy
heroku open
.. warning::
.. include:: mailgun.rst
Optional actions
----------------
Celery
++++++
Celery requires a few extra environment variables to be fully operational. The worker is also declared
in the ``Procfile``, but it is turned off by default:
.. code-block:: bash
# Set the broker URL to Redis
heroku config:set CELERY_BROKER_URL=`heroku config:get REDIS_URL`
# Scale dyno to 1 instance
heroku ps:scale worker=1
Sentry
++++++
If you've opted for Sentry error tracking, you can either install it through the `Sentry add-on`_:
.. code-block:: bash
heroku addons:create sentry:f1
Or add the DSN for your account, if you already have one:
.. code-block:: bash
heroku config:set SENTRY_DSN=https://xxxx@sentry.io/12345
.. _Sentry add-on: https://elements.heroku.com/addons/sentry
Gulp & Bootstrap compilation
++++++++++++++++++++++++++++
If you've opted for a custom Bootstrap build, you'll most likely need to set up
your app to use `multiple buildpacks`_: one for Python & one for Node.js:
.. code-block:: bash
heroku buildpacks:add --index 1 heroku/nodejs
At the time of writing, this should do the trick: during deployment,
Heroku should run ``npm install`` and then ``npm build``,
which runs Gulp in cookiecutter-django.
If things don't work, please refer to the Heroku docs.
.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
About Heroku & Docker
---------------------
Although Heroku has some sort of `Docker support`_, it's not supported by cookiecutter-django.
We invite you to follow Heroku documentation about it.
.. _Docker support: https://devcenter.heroku.com/articles/build-docker-images-heroku-yml


@ -29,13 +29,13 @@ Once you've been through this one-off config, future deployments are much simple
Getting your code and dependencies installed on PythonAnywhere
--------------------------------------------------------------
Make sure your project is fully commited and pushed up to Bitbucket or Github or wherever it may be. Then, log into your PythonAnywhere account, open up a **Bash** console, clone your repo, and create a virtualenv:
Make sure your project is fully committed and pushed up to Bitbucket or Github or wherever it may be. Then, log into your PythonAnywhere account, open up a **Bash** console, clone your repo, and create a virtualenv:
.. code-block:: bash
git clone <my-repo-url> # you can also use hg
cd my-project-name
mkvirtualenv --python=/usr/bin/python3.6 my-project-name
mkvirtualenv --python=/usr/bin/python3.7 my-project-name
pip install -r requirements/production.txt # may take a few minutes
@ -153,7 +153,7 @@ Back on the Web tab, hit **Reload**, and your app should be live!
**NOTE:** *you may see security warnings until you set up your SSL certificates. If you
want to supress them temporarily, set DJANGO_SECURE_SSL_REDIRECT to blank. Follow
want to suppress them temporarily, set DJANGO_SECURE_SSL_REDIRECT to blank. Follow
the instructions here to get SSL set up: https://help.pythonanywhere.com/pages/SSLOwnDomains/*


@ -7,8 +7,8 @@ Deployment with Docker
Prerequisites
-------------
* Docker 1.10+.
* Docker Compose 1.6+
* Docker 17.05+.
* Docker Compose 1.17+
Understanding the Docker Compose Setup
@ -19,7 +19,7 @@ Before you begin, check out the ``production.yml`` file in the root of this proj
* ``django``: your application running behind ``Gunicorn``;
* ``postgres``: PostgreSQL database with the application's relational data;
* ``redis``: Redis instance for caching;
* ``caddy``: Caddy web server with HTTPS on by default.
* ``traefik``: Traefik reverse proxy with HTTPS on by default.
Provided you have opted for Celery (via setting ``use_celery`` to ``y``) there are three more services:
@ -35,7 +35,15 @@ Configuring the Stack
The majority of services above are configured through the use of environment variables. Just check out :ref:`envs` and you will know the drill.
To obtain logs and information about crashes in a production setup, make sure that you have access to an external Sentry instance (e.g. by creating an account with `sentry.io`_), and set the ``SENTRY_DSN`` variable.
To obtain logs and information about crashes in a production setup, make sure that you have access to an external Sentry instance (e.g. by creating an account with `sentry.io`_), and set the ``SENTRY_DSN`` variable. Logs of level `logging.ERROR` are sent as Sentry events. Therefore, in order to send a Sentry event use:
.. code-block:: python
import logging
logging.error("This event is sent to Sentry", extra={"<example_key>": "<example_value>"})
The `extra` parameter allows you to send additional information about the context of this error.
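For example, to capture a handled exception together with its traceback, ``logging.exception`` (which logs at ``ERROR`` level) works the same way. The snippet below is an illustrative sketch (the function name and extra keys are made up), not code from the generated project:

.. code-block:: python

    import logging

    logger = logging.getLogger(__name__)

    def charge_customer(customer_id):
        try:
            ...  # call out to the payment provider here
        except Exception:
            # An ERROR-level log with the traceback attached, so the Sentry
            # integration turns it into an event carrying this context.
            logger.exception("Charging customer failed", extra={"customer_id": customer_id})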
You will probably also need to set up the mail backend, for example by adding a `Mailgun`_ API key and a `Mailgun`_ sender domain; otherwise, the account creation view will crash with a 500 error when the backend attempts to send an email to the account owner.
@ -43,6 +51,11 @@ You will probably also need to setup the Mail backend, for example by adding a `
.. _Mailgun: https://mailgun.com
.. warning::
.. include:: mailgun.rst
Optional: Use AWS IAM Role for EC2 instance
-------------------------------------------
@ -63,11 +76,11 @@ It is always better to deploy a site behind HTTPS and will become crucial as the
* Access to the Django admin is set up by default to require HTTPS in production or once *live*.
The Caddy web server used in the default configuration will get you a valid certificate from Lets Encrypt and update it automatically. All you need to do to enable this is to make sure that your DNS records are pointing to the server Caddy runs on.
The Traefik reverse proxy used in the default configuration will get you a valid certificate from Lets Encrypt and update it automatically. All you need to do to enable this is to make sure that your DNS records are pointing to the server Traefik runs on.
You can read more about this here at `Automatic HTTPS`_ in the Caddy docs.
You can read more about this feature and how to configure it, at `Automatic HTTPS`_ in the Traefik docs.
.. _Automatic HTTPS: https://caddyserver.com/docs/automatic-https
.. _Automatic HTTPS: https://docs.traefik.io/configuration/acme/
(Optional) Postgres Data Volume Modifications
@ -112,7 +125,7 @@ If you want to scale your application, run::
docker-compose -f production.yml scale django=4
docker-compose -f production.yml scale celeryworker=2
.. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``caddy``.
.. warning:: don't try to scale ``postgres``, ``celerybeat``, or ``traefik``.
To see how your containers are doing run::
@ -139,8 +152,10 @@ If you are using ``supervisor``, you can use this file as a starting point::
Move it to ``/etc/supervisor/conf.d/{{cookiecutter.project_slug}}.conf`` and run::
supervisorctl reread
supervisorctl update
supervisorctl start {{cookiecutter.project_slug}}
For status check, run::
supervisorctl status


@ -6,6 +6,12 @@ Getting Up and Running Locally With Docker
The steps below will get you up and running with a local development environment.
All of these commands assume you are in the root of your generated project.
.. note::
If you're new to Docker, please be aware that some resources are cached system-wide
and might reappear if you generate a project multiple times with the same name (e.g.
:ref:`this issue with Postgres <docker-postgres-auth-failed>`).
Prerequisites
-------------
@ -17,17 +23,6 @@ Prerequisites
.. _`installation guide`: https://docs.docker.com/compose/install/
Attention, Windows Users
------------------------
Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docker containers for Windows users, while it is required by the generated Django project. To fix this, add ``psycopg2`` to the list of requirements inside ``requirements/base.txt``::
# Python-PostgreSQL Database Adapter
psycopg2==2.6.2
Doing this will prevent the project from being installed in an Windows-only environment (thus without usage of Docker). If you want to use this project without Docker, make sure to remove ``psycopg2`` from the requirements again.
Build the Stack
---------------
@ -105,7 +100,6 @@ The most important thing for us here now is ``env_file`` section enlisting ``./.
│   ├── .django
│   └── .postgres
└── .production
├── .caddy
├── .django
└── .postgres
@ -120,7 +114,7 @@ Consider the aforementioned ``.envs/.local/.postgres``: ::
POSTGRES_USER=XgOWtQtJecsAbaIyslwGvFvPawftNaqO
POSTGRES_PASSWORD=jSljDz4whHuwO3aJIgVBrqEml5Ycbghorep4uVJ4xjDYQu0LfuTZdctj7y0YcCLu
The three envs we are presented with here are ``POSTGRES_DB``, ``POSTGRES_USER``, and ``POSTGRES_PASSWORD`` (by the way, their values have also been generated for you). You might have figured out already where these definitions will end up; it's all the same with ``django`` and ``caddy`` service container envs.
The three envs we are presented with here are ``POSTGRES_DB``, ``POSTGRES_USER``, and ``POSTGRES_PASSWORD`` (by the way, their values have also been generated for you). You might have figured out already where these definitions will end up; it's all the same with ``django`` service container envs.
One final touch: should you ever need to merge ``.envs/.production/*`` into a single ``.env``, run ``merge_production_dotenvs_in_dotenv.py``: ::
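Conceptually, that script just concatenates the per-service files under ``.envs/.production/`` into one ``.env``. The snippet below is a rough sketch of the idea, not the shipped implementation:

.. code-block:: python

    from pathlib import Path

    def merge_dotenvs(source_dir=".envs/.production", target=".env"):
        """Concatenate every env file under source_dir into a single .env file."""
        parts = [p.read_text() for p in sorted(Path(source_dir).iterdir()) if p.is_file()]
        Path(target).write_text("\n".join(parts))

    merge_dotenvs()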


@ -9,7 +9,7 @@ Setting Up Development Environment
Make sure to have the following on your host:
* Python 3.6
* Python 3.7
* PostgreSQL_.
* Redis_, if using Celery
@ -17,7 +17,7 @@ First things first.
#. Create a virtualenv: ::
$ python3.6 -m venv <virtual env path>
$ python3.7 -m venv <virtual env path>
#. Activate the virtualenv you have just created: ::
@ -26,6 +26,12 @@ First things first.
#. Install development requirements: ::
$ pip install -r requirements/local.txt
$ pre-commit install
.. note::
``pre-commit`` is configured in the generated project by default.
For details on ``pre-commit``, see the `pre-commit site <https://pre-commit.com/>`_.
#. Create a new PostgreSQL database using createdb_: ::
@ -120,12 +126,12 @@ In production, we have Mailgun_ configured to have your back!
Celery
------
If the project is configured to use Celery as a task scheduler then by default tasks are set to run on the main thread
when developing locally. If you have the appropriate setup on your local machine then set
when developing locally. If you have the appropriate setup on your local machine then set the following
in ``config/settings/local.py``::
CELERY_TASK_ALWAYS_EAGER = False
in /config/settings/local.py
CELERY_TASK_ALWAYS_EAGER = False
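To see the difference, a minimal task like the one below can be used (a sketch; it assumes the Celery application object is exposed as ``app`` in ``config/celery_app.py``, the module referenced elsewhere in this commit). With the local default of ``CELERY_TASK_ALWAYS_EAGER = True``, ``add.delay(2, 3)`` runs inline in the current process; with the setting at ``False`` it is queued to the broker and picked up by a worker:

.. code-block:: python

    # Hypothetical example task, e.g. in <project_slug>/users/tasks.py
    from config.celery_app import app

    @app.task()
    def add(x, y):
        return x + y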
Sass Compilation & Live Reloading


@ -85,3 +85,11 @@ You will see something like ::
# ...
ALTER TABLE
SUCCESS: The 'my_project' database has been restored from the '/backups/backup_2018_03_13T09_05_07.sql.gz' backup.
Backup to Amazon S3
----------------------------------
For uploading your backups to Amazon S3 you can use the aws cli container. There is an ``upload`` command for uploading the postgres ``/backups`` directory recursively, and a ``download`` command for downloading a specific backup. The default S3 environment variables are used. ::
$ docker-compose -f production.yml run --rm awscli upload
$ docker-compose -f production.yml run --rm awscli download backup_2018_03_13T09_05_07.sql.gz

45
docs/document.rst Normal file

@ -0,0 +1,45 @@
.. _document:
Document
=========
This project uses the Sphinx_ documentation generator.
After you have set up your project to `develop locally`_, run the following commands to generate the HTML documentation: ::
$ sphinx-build docs/ docs/_build/html/
If you set up your project to `develop locally with docker`_, run the following command: ::
$ docker-compose -f local.yml run --rm django sphinx-build docs/ docs/_build/html/
Generate API documentation
----------------------------
Sphinx can automatically generate documentation from docstrings. To enable this feature, follow these steps:
1. Add Sphinx extension in ``docs/conf.py`` file, like below: ::
extensions = [
'sphinx.ext.autodoc',
]
2. Uncomment the following lines in the ``docs/conf.py`` file: ::
# import django
# sys.path.insert(0, os.path.abspath('..'))
# os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
# django.setup()
3. Run the following command: ::
$ sphinx-apidoc -f -o ./docs/modules/ ./tpub/ migrations/*
If you set up your project to `develop locally with docker`_, run the following command: ::
$ docker-compose -f local.yml run --rm django sphinx-apidoc -f -o ./docs/modules ./tpub/ migrations/*
4. Regenerate HTML documentation as written above.
.. _Sphinx: https://www.sphinx-doc.org/en/master/index.html
.. _develop locally: ./developing-locally.html
.. _develop locally with docker: ./developing-locally-docker.html
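As a hypothetical illustration, once autodoc is enabled, a docstring such as the one below is extracted and rendered in the generated API documentation:

.. code-block:: python

    def get_full_name(user):
        """Return the user's full name.

        Sphinx's autodoc extension picks up this docstring and renders it on
        the pages produced by ``sphinx-apidoc`` above.

        :param user: a ``User`` instance.
        :returns: first and last name separated by a space.
        """
        return f"{user.first_name} {user.last_name}"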


@ -18,6 +18,7 @@ Contents:
settings
linters
testing
document
deployment-on-pythonanywhere
deployment-on-heroku
deployment-with-docker


@ -25,7 +25,7 @@ This is included in flake8's checks, but you can also run it separately to see a
The config for pylint is located in .pylintrc. It specifies:
* Use the pylint_common and pylint_django plugins. If using Celery, also use pylint_celery.
* Use the pylint_django plugin. If using Celery, also use pylint_celery.
* Set max line length to 120 chars
* Disable linting messages for missing docstring and invalid name
* max-parents=13

13
docs/mailgun.rst Normal file

@ -0,0 +1,13 @@
If the email server used to send email isn't configured properly (Mailgun by default),
attempting to send an email will cause an Internal Server Error.
By default, django-allauth is set up to `have email verification mandatory`_,
which means it will send a verification email when an unverified user tries to
log in or when someone signs up.
This may happen just after you've set up your Mailgun account, which runs in a
sandbox subdomain by default. Either add your email to the list of authorized recipients
or verify your domain.
.. _have email verification mandatory: https://django-allauth.readthedocs.io/en/latest/configuration.html?highlight=ACCOUNT_EMAIL_VERIFICATION
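If you only want to experiment locally before the email backend is configured, one option (not part of the generated settings, shown purely to illustrate the allauth setting involved) is to relax the requirement in your local settings:

.. code-block:: python

    # Hypothetical local-only override; the generated project defaults to "mandatory".
    ACCOUNT_EMAIL_VERIFICATION = "none"  # or "optional"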


@ -49,13 +49,11 @@ use_docker:
postgresql_version:
Select a PostgreSQL_ version to use. The choices are:
1. 10.3
2. 10.2
3. 10.1
4. 9.6
5. 9.5
6. 9.4
7. 9.3
1. 11.3
2. 10.8
3. 9.6
4. 9.5
5. 9.4
js_task_runner:
Select a JavaScript task runner. The choices are:
@ -63,6 +61,15 @@ js_task_runner:
1. None
2. Gulp_
cloud_provider:
Select a cloud provider for static & media files. The choices are:
1. AWS_
2. GCP_
3. None
Note that if you choose no cloud provider, media files won't work.
custom_bootstrap_compilation:
Indicates whether the project should support Bootstrap recompilation
via the selected JavaScript task runner's task. This can be useful
@ -87,8 +94,12 @@ use_heroku:
Indicates whether the project should be configured so as to be deployable
to Heroku_.
use_travisci:
Indicates whether the project should be configured to use `Travis CI`_.
ci_tool:
Select a CI tool for running tests. The choices are:
1. None
2. Travis_
3. Gitlab_
keep_local_envs_in_vcs:
Indicates whether the project's ``.envs/.local/`` should be kept in VCS
@ -115,6 +126,9 @@ debug:
.. _Gulp: https://github.com/gulpjs/gulp
.. _AWS: https://aws.amazon.com/s3/
.. _GCP: https://cloud.google.com/storage/
.. _Django Compressor: https://github.com/django-compressor/django-compressor
.. _Celery: https://github.com/celery/celery
@ -128,3 +142,6 @@ debug:
.. _Heroku: https://github.com/heroku/heroku-buildpack-python
.. _Travis CI: https://travis-ci.org/
.. _GitLab CI: https://docs.gitlab.com/ee/ci/


@ -44,11 +44,14 @@ CELERY_BROKER_URL CELERY_BROKER_URL auto w/ Dock
DJANGO_AWS_ACCESS_KEY_ID AWS_ACCESS_KEY_ID n/a raises error
DJANGO_AWS_SECRET_ACCESS_KEY AWS_SECRET_ACCESS_KEY n/a raises error
DJANGO_AWS_STORAGE_BUCKET_NAME AWS_STORAGE_BUCKET_NAME n/a raises error
DJANGO_AWS_S3_REGION_NAME AWS_S3_REGION_NAME n/a None
DJANGO_GCP_STORAGE_BUCKET_NAME GS_BUCKET_NAME n/a raises error
GOOGLE_APPLICATION_CREDENTIALS n/a n/a raises error
SENTRY_DSN SENTRY_DSN n/a raises error
DJANGO_SENTRY_CLIENT SENTRY_CLIENT n/a raven.contrib.django.raven_compat.DjangoClient
DJANGO_SENTRY_LOG_LEVEL SENTRY_LOG_LEVEL n/a logging.INFO
MAILGUN_API_KEY MAILGUN_ACCESS_KEY n/a raises error
MAILGUN_API_KEY MAILGUN_API_KEY n/a raises error
MAILGUN_DOMAIN MAILGUN_SENDER_DOMAIN n/a raises error
MAILGUN_API_URL n/a n/a "https://api.mailgun.net/v3"
======================================= =========================== ============================================== ======================================================================
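For reference, the settings modules read these variables with django-environ (the project's 12-factor settings layer, per the README). A minimal sketch of how such a mapping typically looks, not the exact generated code:

.. code-block:: python

    import environ

    env = environ.Env()

    # No default: a missing variable raises ImproperlyConfigured ("raises error" above).
    AWS_STORAGE_BUCKET_NAME = env("DJANGO_AWS_STORAGE_BUCKET_NAME")

    # With a default, the value in the last column is used when the variable is unset.
    MAILGUN_API_URL = env("MAILGUN_API_URL", default="https://api.mailgun.net/v3")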
--------------------------


@ -19,20 +19,20 @@ You will get a readout of the `users` app that has already been set up with test
If you set up your project to `develop locally with docker`_, run the following command: ::
$ docker-compose -f local.yml run django pytest
$ docker-compose -f local.yml run --rm django pytest
Targetting particular apps for testing in ``docker`` follows a similar pattern as previously shown above.
Targeting particular apps for testing in ``docker`` follows a similar pattern as previously shown above.
Coverage
--------
You should build your tests to provide the highest level of **code coverage**. You can run the ``pytest`` with code ``coverage`` by typing in the following command: ::
$ docker-compose -f local.yml run django coverage run -m pytest
$ docker-compose -f local.yml run --rm django coverage run -m pytest
Once the tests are complete, in order to see the code coverage, run the following command: ::
$ docker-compose -f local.yml run django coverage report
$ docker-compose -f local.yml run --rm django coverage report
.. note::
@ -49,8 +49,8 @@ Once the tests are complete, in order to see the code coverage, run the followin
Since this is a fresh install, and there are no tests built using the Python `unittest`_ library yet, you should get feedback that says there were no tests carried out.
.. _Pytest: https://docs.pytest.org/en/latest/example/simple.html
.. _develop locally: ../developing-locally.rst
.. _develop locally with docker: ..../developing-locally-docker.rst
.. _develop locally: ./developing-locally.html
.. _develop locally with docker: ./developing-locally-docker.html
.. _customize: https://docs.pytest.org/en/latest/customize.html
.. _unittest: https://docs.python.org/3/library/unittest.html#module-unittest
.. _configuring: https://coverage.readthedocs.io/en/v4.5.x/config.html


@ -3,12 +3,48 @@ Troubleshooting
This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications.
Server Error on sign-up/log-in
------------------------------
Make sure you have configured the mail backend (e.g. Mailgun) by adding the API key and sender domain
.. include:: mailgun.rst
.. _docker-postgres-auth-failed:
Docker: Postgres authentication failed
--------------------------------------
Examples of logs::
postgres_1 | 2018-06-07 19:11:23.963 UTC [81] FATAL: password authentication failed for user "pydanny"
postgres_1 | 2018-06-07 19:11:23.963 UTC [81] DETAIL: Password does not match for user "pydanny".
postgres_1 | Connection matched pg_hba.conf line 95: "host all all all md5"
If you recreate the project multiple times with the same name, Docker preserves the volumes for the postgres container between projects. Here is what happens:
#. You generate the project the first time. The .env postgres file is populated with the random password
#. You run the docker-compose and the containers are created. The postgres container creates the database based on the .env file credentials
#. You "regenerate" the project with the same name, so the postgres .env file is populated with a new random password
#. You run docker-compose. Since the names of the containers are the same, docker will try to start them (not create them from scratch i.e. it won't execute the Dockerfile to recreate the database). When this happens, it tries to start the database based on the new credentials which do not match the ones that the database was created with, and you get the error message above.
To fix this, you can either:
- Clear your project-related Docker cache with ``docker-compose -f local.yml down --volumes --rmi all``.
- Use the Docker volume sub-commands to find volumes (`ls`_) and remove them (`rm`_).
- Use the `prune`_ command to clear unused Docker resources system-wide (use with care!).
.. _ls: https://docs.docker.com/engine/reference/commandline/volume_ls/
.. _rm: https://docs.docker.com/engine/reference/commandline/volume_rm/
.. _prune: https://docs.docker.com/v17.09/engine/reference/commandline/system_prune/
Others
------
#. ``project_slug`` must be a valid Python module name or you will have issues on imports.
#. ``jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'now'.``: please upgrade your cookiecutter version to >= 1.4 (see `#528`_)
#. Internal server error on user registration: make sure you have configured the mail backend (e.g. Mailgun) by adding the API key and sender domain
#. New apps not getting created in project root: This is the expected behavior, because cookiecutter-django does not change the way that django startapp works; you'll have to fix this manually (see `#1725`_)
.. _#528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
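On the first point, a project slug is valid exactly when Python accepts it as an identifier, which is the check the pre-generation hook in this commit performs:

.. code-block:: python

    >>> "my_project".isidentifier()
    True
    >>> "my-project".isidentifier()  # a hyphenated slug cannot be imported
    False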


@ -32,10 +32,7 @@ DEBUG_VALUE = "debug"
def remove_open_source_files():
file_names = [
"CONTRIBUTORS.txt",
"LICENSE",
]
file_names = ["CONTRIBUTORS.txt", "LICENSE"]
for file_name in file_names:
os.remove(file_name)
@ -71,7 +68,10 @@ def remove_utility_files():
def remove_heroku_files():
file_names = ["Procfile", "runtime.txt", "requirements.txt"]
for file_name in file_names:
if file_name == "requirements.txt" and "{{ cookiecutter.use_travisci }}".lower() == "y":
if (
file_name == "requirements.txt"
and "{{ cookiecutter.ci_tool }}".lower() == "travis"
):
# don't remove the file if we are using travisci but not using heroku
continue
os.remove(file_name)
@ -89,14 +89,26 @@ def remove_packagejson_file():
os.remove(file_name)
def remove_celery_app():
shutil.rmtree(os.path.join("{{ cookiecutter.project_slug }}", "taskapp"))
def remove_celery_files():
file_names = [
os.path.join("config", "celery_app.py"),
os.path.join("{{ cookiecutter.project_slug }}", "users", "tasks.py"),
os.path.join(
"{{ cookiecutter.project_slug }}", "users", "tests", "test_tasks.py"
),
]
for file_name in file_names:
os.remove(file_name)
def remove_dottravisyml_file():
os.remove(".travis.yml")
def remove_dotgitlabciyml_file():
os.remove(".gitlab-ci.yml")
def append_to_project_gitignore(path):
gitignore_file_path = ".gitignore"
with open(gitignore_file_path, "a") as gitignore_file:
@ -183,11 +195,7 @@ def generate_postgres_user(debug=False):
def set_postgres_user(file_path, value):
postgres_user = set_flag(
file_path,
"!!!SET POSTGRES_USER!!!",
value=value,
)
postgres_user = set_flag(file_path, "!!!SET POSTGRES_USER!!!", value=value)
return postgres_user
@ -205,9 +213,7 @@ def set_postgres_password(file_path, value=None):
def set_celery_flower_user(file_path, value):
celery_flower_user = set_flag(
file_path,
"!!!SET CELERY_FLOWER_USER!!!",
value=value,
file_path, "!!!SET CELERY_FLOWER_USER!!!", value=value
)
return celery_flower_user
@ -230,11 +236,7 @@ def append_to_gitignore_file(s):
gitignore_file.write(os.linesep)
def set_flags_in_envs(
postgres_user,
celery_flower_user,
debug=False,
):
def set_flags_in_envs(postgres_user, celery_flower_user, debug=False):
local_django_envs_path = os.path.join(".envs", ".local", ".django")
production_django_envs_path = os.path.join(".envs", ".production", ".django")
local_postgres_envs_path = os.path.join(".envs", ".local", ".postgres")
@ -244,14 +246,22 @@ def set_flags_in_envs(
set_django_admin_url(production_django_envs_path)
set_postgres_user(local_postgres_envs_path, value=postgres_user)
set_postgres_password(local_postgres_envs_path, value=DEBUG_VALUE if debug else None)
set_postgres_password(
local_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_postgres_user(production_postgres_envs_path, value=postgres_user)
set_postgres_password(production_postgres_envs_path, value=DEBUG_VALUE if debug else None)
set_postgres_password(
production_postgres_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(local_django_envs_path, value=celery_flower_user)
set_celery_flower_password(local_django_envs_path, value=DEBUG_VALUE if debug else None)
set_celery_flower_password(
local_django_envs_path, value=DEBUG_VALUE if debug else None
)
set_celery_flower_user(production_django_envs_path, value=celery_flower_user)
set_celery_flower_password(production_django_envs_path, value=DEBUG_VALUE if debug else None)
set_celery_flower_password(
production_django_envs_path, value=DEBUG_VALUE if debug else None
)
def set_flags_in_settings_files():
@ -269,6 +279,19 @@ def remove_celery_compose_dirs():
shutil.rmtree(os.path.join("compose", "production", "django", "celery"))
def remove_node_dockerfile():
shutil.rmtree(os.path.join("compose", "local", "node"))
def remove_aws_dockerfile():
shutil.rmtree(os.path.join("compose", "production", "aws"))
def remove_drf_starter_files():
os.remove(os.path.join("config", "api_router.py"))
shutil.rmtree(os.path.join("{{cookiecutter.project_slug}}", "users", "api"))
def main():
debug = "{{ cookiecutter.debug }}".lower() == "y"
@ -292,6 +315,12 @@ def main():
else:
remove_docker_files()
if (
"{{ cookiecutter.use_docker }}".lower() == "y"
and "{{ cookiecutter.cloud_provider}}".lower() != "aws"
):
remove_aws_dockerfile()
if "{{ cookiecutter.use_heroku }}".lower() == "n":
remove_heroku_files()
@ -315,30 +344,29 @@ def main():
if "{{ cookiecutter.js_task_runner}}".lower() == "none":
remove_gulp_files()
remove_packagejson_file()
if (
"{{ cookiecutter.js_task_runner }}".lower() != "none"
and "{{ cookiecutter.use_docker }}".lower() == "y"
):
if "{{ cookiecutter.use_docker }}".lower() == "y":
remove_node_dockerfile()
if "{{ cookiecutter.cloud_provider}}".lower() == "none":
print(
WARNING
+ "Docker and {} JS task runner ".format(
"{{ cookiecutter.js_task_runner }}".lower().capitalize()
)
+ "working together not supported yet. "
"You can continue using the generated project like you "
"normally would, however you would need to add a JS "
"task runner service to your Docker Compose configuration "
"manually." + TERMINATOR
WARNING + "You chose not to use a cloud provider, "
"media files won't be served in production." + TERMINATOR
)
if "{{ cookiecutter.use_celery }}".lower() == "n":
remove_celery_app()
remove_celery_files()
if "{{ cookiecutter.use_docker }}".lower() == "y":
remove_celery_compose_dirs()
if "{{ cookiecutter.use_travisci }}".lower() == "n":
if "{{ cookiecutter.ci_tool }}".lower() != "travis":
remove_dottravisyml_file()
if "{{ cookiecutter.ci_tool }}".lower() != "gitlab":
remove_dotgitlabciyml_file()
if "{{ cookiecutter.use_drf }}".lower() == "n":
remove_drf_starter_files()
print(SUCCESS + "Project initialized, keep up the good work!" + TERMINATOR)

View File

@ -18,19 +18,24 @@ SUCCESS = "\x1b[1;32m [SUCCESS]: "
project_slug = "{{ cookiecutter.project_slug }}"
if hasattr(project_slug, "isidentifier"):
assert project_slug.isidentifier(), "'{}' project slug is not a valid Python identifier.".format(
project_slug
)
assert (
project_slug.isidentifier()
), "'{}' project slug is not a valid Python identifier.".format(project_slug)
assert "\\" not in "{{ cookiecutter.author_name }}", "Don't include backslashes in author name."
assert (
project_slug == project_slug.lower()
), "'{}' project slug should be all lowercase".format(project_slug)
assert (
"\\" not in "{{ cookiecutter.author_name }}"
), "Don't include backslashes in author name."
if "{{ cookiecutter.use_docker }}".lower() == "n":
python_major_version = sys.version_info[0]
if python_major_version == 2:
print(
WARNING + "Cookiecutter Django does not support Python 2. "
"Stability is guaranteed with Python 3.6+ only, "
"are you sure you want to proceed (y/n)? " + TERMINATOR
WARNING + "You're running cookiecutter under Python 2, but the generated "
"project requires Python 3.7+. Do you want to proceed (y/n)? " + TERMINATOR
)
yes_options, no_options = frozenset(["y"]), frozenset(["n"])
while True:
@ -54,3 +59,12 @@ if "{{ cookiecutter.use_docker }}".lower() == "n":
)
+ TERMINATOR
)
if (
"{{ cookiecutter.use_whitenoise }}".lower() == "n"
and "{{ cookiecutter.cloud_provider }}" == "None"
):
print(
"You should either use Whitenoise or select a Cloud Provider to serve static files"
)
sys.exit(1)

View File

@ -1,3 +1,7 @@
[pytest]
addopts = -x --tb=short
python_paths = .
norecursedirs = .tox .git */migrations/* */static/* docs venv */{{cookiecutter.project_slug}}/*
markers =
flake8: Run flake8 on all possible template combinations
black: Run black on all possible template combinations

View File

@ -1,14 +1,17 @@
cookiecutter==1.6.0
cookiecutter==1.7.0
sh==1.12.14
binaryornot==0.4.4
# Code quality
# ------------------------------------------------------------------------------
flake8==3.7.6
black==19.10b0
flake8==3.7.9
# Testing
# ------------------------------------------------------------------------------
tox==3.6.1
pytest==4.3.1
pytest-cookies==0.3.0
pyyaml==5.1
tox==3.14.3
pytest==5.3.5
pytest_cases==1.12.1
pytest-cookies==0.4.0
pytest-xdist==1.31.0
pyyaml==5.3

View File

@ -10,10 +10,10 @@ except ImportError:
# Our version ALWAYS matches the version of Django we support
# If Django has a new release, we branch, tag, then update this setting after the tag.
version = "2.0.2"
version = "2.2.1"
if sys.argv[-1] == "tag":
os.system('git tag -a %s -m "version %s"' % (version, version))
os.system(f'git tag -a {version} -m "version {version}"')
os.system("git push --tags")
sys.exit()
@ -34,13 +34,13 @@ setup(
classifiers=[
"Development Status :: 4 - Beta",
"Environment :: Console",
"Framework :: Django :: 2.0",
"Framework :: Django :: 2.2",
"Intended Audience :: Developers",
"Natural Language :: English",
"License :: OSI Approved :: BSD License",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: Software Development",
],

26
tests/test_bare.sh Executable file
View File

@ -0,0 +1,26 @@
#!/bin/sh
# this is a very simple script that tests the bare-metal (non-Docker) configuration for cookiecutter-django
# it is meant to be run from the root directory of the repository, eg:
# sh tests/test_bare.sh
set -o errexit
# install test requirements
pip install -r requirements.txt
# create a cache directory
mkdir -p .cache/bare
cd .cache/bare
# create the project using the default settings in cookiecutter.json
cookiecutter ../../ --no-input --overwrite-if-exists use_docker=n $@
cd my_awesome_project
# Install OS deps
sudo utility/install_os_dependencies.sh install
# Install Python deps
pip install -r requirements/local.txt
# run the project's tests
pytest

View File

@ -1,12 +1,14 @@
import os
import re
import sh
import yaml
import pytest
from cookiecutter.exceptions import FailedHookException
from pytest_cases import pytest_fixture_plus
import sh
import yaml
from binaryornot.check import is_binary
PATTERN = "{{(\s?cookiecutter)[.](.*?)}}"
PATTERN = r"{{(\s?cookiecutter)[.](.*?)}}"
RE_OBJ = re.compile(PATTERN)
@ -24,6 +26,51 @@ def context():
}
@pytest_fixture_plus
@pytest.mark.parametrize("windows", ["y", "n"], ids=lambda yn: f"win:{yn}")
@pytest.mark.parametrize("use_docker", ["y", "n"], ids=lambda yn: f"docker:{yn}")
@pytest.mark.parametrize("use_celery", ["y", "n"], ids=lambda yn: f"celery:{yn}")
@pytest.mark.parametrize("use_mailhog", ["y", "n"], ids=lambda yn: f"mailhog:{yn}")
@pytest.mark.parametrize("use_sentry", ["y", "n"], ids=lambda yn: f"sentry:{yn}")
@pytest.mark.parametrize("use_compressor", ["y", "n"], ids=lambda yn: f"cmpr:{yn}")
@pytest.mark.parametrize("use_drf", ["y", "n"], ids=lambda yn: f"drf:{yn}")
@pytest.mark.parametrize(
"use_whitenoise,cloud_provider",
[
("y", "AWS"),
("y", "GCP"),
("y", "None"),
("n", "AWS"),
("n", "GCP"),
# no whitenoise + no cloud provider is not supported
],
ids=lambda id: f"wnoise:{id[0]}-cloud:{id[1]}",
)
def context_combination(
windows,
use_docker,
use_celery,
use_mailhog,
use_sentry,
use_compressor,
use_whitenoise,
use_drf,
cloud_provider,
):
"""Fixture that parametrize the function where it's used."""
return {
"windows": windows,
"use_docker": use_docker,
"use_compressor": use_compressor,
"use_celery": use_celery,
"use_mailhog": use_mailhog,
"use_sentry": use_sentry,
"use_whitenoise": use_whitenoise,
"use_drf": use_drf,
"cloud_provider": cloud_provider,
}
def build_files_list(root_dir):
"""Build a list containing absolute paths to the generated files."""
return [
@ -48,8 +95,13 @@ def check_paths(paths):
assert match is None, msg.format(path)
def test_default_configuration(cookies, context):
result = cookies.bake(extra_context=context)
def test_project_generation(cookies, context, context_combination):
"""
Test that the project is generated and fully rendered.
This is parametrized for each combination from the ``context_combination`` fixture.
"""
result = cookies.bake(extra_context={**context, **context_combination})
assert result.exit_code == 0
assert result.exception is None
assert result.project.basename == context["project_slug"]
@ -60,27 +112,14 @@ def test_default_configuration(cookies, context):
check_paths(paths)
@pytest.fixture(params=["use_mailhog", "use_celery", "windows"])
def feature_context(request, context):
context.update({request.param: "y"})
return context
@pytest.mark.flake8
def test_flake8_passes(cookies, context_combination):
"""
Generated project should pass flake8.
def test_enabled_features(cookies, feature_context):
result = cookies.bake(extra_context=feature_context)
assert result.exit_code == 0
assert result.exception is None
assert result.project.basename == feature_context["project_slug"]
assert result.project.isdir()
paths = build_files_list(str(result.project))
assert paths
check_paths(paths)
def test_flake8_compliance(cookies):
"""generated project should pass flake8"""
result = cookies.bake()
This is parametrized for each combination from the ``context_combination`` fixture.
"""
result = cookies.bake(extra_context=context_combination)
try:
sh.flake8(str(result.project))
@ -88,8 +127,23 @@ def test_flake8_compliance(cookies):
pytest.fail(e)
@pytest.mark.black
def test_black_passes(cookies, context_combination):
"""
Generated project should pass black.
This is parametrized for each combination from the ``context_combination`` fixture.
"""
result = cookies.bake(extra_context=context_combination)
try:
sh.black("--check", "--diff", "--exclude", "migrations", f"{result.project}/")
except sh.ErrorReturnCode as e:
pytest.fail(e)
def test_travis_invokes_pytest(cookies, context):
context.update({"use_travisci": "y"})
context.update({"ci_tool": "Travis"})
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
@ -97,8 +151,46 @@ def test_travis_invokes_pytest(cookies, context):
assert result.project.basename == context["project_slug"]
assert result.project.isdir()
with open(f'{result.project}/.travis.yml', 'r') as travis_yml:
with open(f"{result.project}/.travis.yml", "r") as travis_yml:
try:
assert yaml.load(travis_yml)['script'] == ['pytest']
assert yaml.load(travis_yml)["script"] == ["pytest"]
except yaml.YAMLError as e:
pytest.fail(e)
def test_gitlab_invokes_flake8_and_pytest(cookies, context):
context.update({"ci_tool": "Gitlab"})
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
assert result.exception is None
assert result.project.basename == context["project_slug"]
assert result.project.isdir()
with open(f"{result.project}/.gitlab-ci.yml", "r") as gitlab_yml:
try:
gitlab_config = yaml.load(gitlab_yml)
assert gitlab_config["flake8"]["script"] == ["flake8"]
assert gitlab_config["pytest"]["script"] == ["pytest"]
except yaml.YAMLError as e:
pytest.fail(e)
@pytest.mark.parametrize("slug", ["project slug", "Project_Slug"])
def test_invalid_slug(cookies, context, slug):
"""Invalid slug should failed pre-generation hook."""
context.update({"project_slug": slug})
result = cookies.bake(extra_context=context)
assert result.exit_code != 0
assert isinstance(result.exception, FailedHookException)
def test_no_whitenoise_and_no_cloud_provider(cookies, context):
"""It should not generate project if neither whitenoise or cloud provider are set"""
context.update({"use_whitenoise": "n", "cloud_provider": "None"})
result = cookies.bake(extra_context=context)
assert result.exit_code != 0
assert isinstance(result.exception, FailedHookException)

View File

@ -3,6 +3,8 @@
# it is meant to be run from the root directory of the repository, eg:
# sh tests/test_docker.sh
set -o errexit
# install test requirements
pip install -r requirements.txt
@ -11,12 +13,15 @@ mkdir -p .cache/docker
cd .cache/docker
# create the project using the default settings in cookiecutter.json
cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y
cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y $@
cd my_awesome_project
# run the project's type checks
docker-compose -f local.yml run django mypy my_awesome_project
# Run black with --check option
docker-compose -f local.yml run django black --check --diff --exclude 'migrations' ./
# run the project's tests
docker-compose -f local.yml run django pytest

16
tox.ini
View File

@ -1,7 +1,19 @@
[tox]
skipsdist = true
envlist = py36
envlist = py37,flake8,black,black-template
[testenv]
deps = -rrequirements.txt
commands = pytest {posargs:./tests}
commands = pytest -m "not flake8" -m "not black" {posargs:./tests}
[testenv:flake8]
deps = -rrequirements.txt
commands = pytest -m flake8 {posargs:./tests}
[testenv:black]
deps = -rrequirements.txt
commands = pytest -m black {posargs:./tests}
[testenv:black-template]
deps = black
commands = black --check hooks tests setup.py docs

View File

@ -13,10 +13,16 @@ indent_style = space
indent_size = 4
[*.py]
line_length=120
known_first_party={{ cookiecutter.project_slug }}
multi_line_output=3
default_section=THIRDPARTY
line_length = 120
known_first_party = {{ cookiecutter.project_slug }}
multi_line_output = 3
default_section = THIRDPARTY
recursive = true
skip = venv/
skip_glob = **/migrations/*.py
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
[*.{html,css,scss,json,yml}]
indent_style = space

View File

@ -1,3 +0,0 @@
# Caddy
# ------------------------------------------------------------------------------
DOMAIN_NAME={{ cookiecutter.domain_name }}

View File

@ -16,13 +16,18 @@ DJANGO_SECURE_SSL_REDIRECT=False
MAILGUN_API_KEY=
DJANGO_SERVER_EMAIL=
MAILGUN_DOMAIN=
{% if cookiecutter.cloud_provider == 'AWS' %}
# AWS
# ------------------------------------------------------------------------------
DJANGO_AWS_ACCESS_KEY_ID=
DJANGO_AWS_SECRET_ACCESS_KEY=
DJANGO_AWS_STORAGE_BUCKET_NAME=
{% elif cookiecutter.cloud_provider == 'GCP' %}
# GCP
# ------------------------------------------------------------------------------
GOOGLE_APPLICATION_CREDENTIALS=
DJANGO_GCP_STORAGE_BUCKET_NAME=
{% endif %}
# django-allauth
# ------------------------------------------------------------------------------
DJANGO_ACCOUNT_ALLOW_REGISTRATION=True

View File

@ -325,7 +325,6 @@ tags
### VirtualEnv template
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib

View File

@ -0,0 +1,33 @@
stages:
- lint
- test
variables:
POSTGRES_USER: '{{ cookiecutter.project_slug }}'
POSTGRES_PASSWORD: ''
POSTGRES_DB: 'test_{{ cookiecutter.project_slug }}'
flake8:
stage: lint
image: python:3.7-alpine
before_script:
- pip install -q flake8
script:
- flake8
pytest:
stage: test
image: python:3.7
tags:
- docker
services:
- postgres:11
variables:
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
before_script:
- pip install -r requirements/local.txt
script:
- pytest

View File

@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
{%- if cookiecutter.use_celery == 'y' %}
<component name="DjangoConsoleOptions"
custom-start-script="import sys; print('Python %s on %s' % (sys.version, sys.platform))&#10;import django; print('Django %s' % django.get_version())&#10;import os&#10;os.environ.setdefault(&quot;DATABASE_URL&quot;,&quot;postgres://{}:{}@{}:{}/{}&quot;.format(os.environ['POSTGRES_USER'], os.environ['POSTGRES_PASSWORD'], os.environ['POSTGRES_HOST'], os.environ['POSTGRES_PORT'], os.environ['POSTGRES_DB']))&#10;os.environ.setdefault(&quot;CELERY_BROKER_URL&quot;, os.environ['REDIS_URL'])&#10;sys.path.extend([WORKING_DIR_AND_PYTHON_PATHS])&#10;if 'setup' in dir(django): django.setup()&#10;import django_manage_shell; django_manage_shell.run(PROJECT_ROOT)"
module-name="{{ cookiecutter.project_slug }}" is-module-sdk="true">
</component>
{%- else %}
<component name="DjangoConsoleOptions"
custom-start-script="import sys; print('Python %s on %s' % (sys.version, sys.platform))&#10;import django; print('Django %s' % django.get_version())&#10;import os&#10;os.environ.setdefault(&quot;DATABASE_URL&quot;,&quot;postgres://{}:{}@{}:{}/{}&quot;.format(os.environ['POSTGRES_USER'], os.environ['POSTGRES_PASSWORD'], os.environ['POSTGRES_HOST'], os.environ['POSTGRES_PORT'], os.environ['POSTGRES_DB']))&#10;sys.path.extend([WORKING_DIR_AND_PYTHON_PATHS])&#10;if 'setup' in dir(django): django.setup()&#10;import django_manage_shell; django_manage_shell.run(PROJECT_ROOT)"
module-name="{{ cookiecutter.project_slug }}" is-module-sdk="true">
</component>
{%- endif %}
</project>

View File

@ -41,7 +41,7 @@
</option>
</component>
<component name="TestRunnerService">
<option name="projectConfiguration" value="py.test" />
<option name="PROJECT_TEST_RUNNER" value="py.test" />
<option name="projectConfiguration" value="pytest" />
<option name="PROJECT_TEST_RUNNER" value="pytest" />
</component>
</module>

View File

@ -0,0 +1,19 @@
exclude: 'docs|node_modules|migrations|.git|.tox'
default_stages: [commit]
fail_fast: true
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: master
hooks:
- id: trailing-whitespace
files: (^|/)a/.+\.(py|html|sh|css|js)$
- repo: local
hooks:
- id: flake8
name: flake8
entry: flake8
language: python
types: [python]

View File

@ -1,5 +1,5 @@
[MASTER]
load-plugins=pylint_common, pylint_django{% if cookiecutter.use_celery == "y" %}, pylint_celery {% endif %}
load-plugins=pylint_django{% if cookiecutter.use_celery == "y" %}, pylint_celery {% endif %}
[FORMAT]
max-line-length=120

View File

@ -1,14 +1,16 @@
sudo: true
dist: xenial
services:
- postgresql
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq build-essential gettext python-dev zlib1g-dev libpq-dev xvfb
- sudo apt-get install -qq libtiff4-dev libjpeg8-dev libfreetype6-dev liblcms1-dev libwebp-dev
- sudo apt-get install -qq libjpeg8-dev libfreetype6-dev libwebp-dev
- sudo apt-get install -qq graphviz-dev python-setuptools python3-dev python-virtualenv python-pip
- sudo apt-get install -qq firefox automake libtool libreadline6 libreadline6-dev libreadline-dev
- sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm
language: python
python:
- "3.6"
- "3.7"
install:
- pip install -r requirements/local.txt
script:

View File

@ -1,5 +1,5 @@
release: python manage.py migrate
web: gunicorn config.wsgi:application
{% if cookiecutter.use_celery == "y" -%}
worker: celery worker --app={{cookiecutter.project_slug}}.taskapp --loglevel=info
worker: celery worker --app=config.celery_app --loglevel=info
{%- endif %}

View File

@ -6,6 +6,9 @@
.. image:: https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg
:target: https://github.com/pydanny/cookiecutter-django/
:alt: Built with Cookiecutter Django
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/ambv/black
:alt: Black code style
{% if cookiecutter.open_source_license != "Not open source" %}
:License: {{cookiecutter.open_source_license}}
@ -76,7 +79,7 @@ To run a celery worker:
.. code-block:: bash
cd {{cookiecutter.project_slug}}
celery -A {{cookiecutter.project_slug}}.taskapp worker -l info
celery -A config.celery_app worker -l info
Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the same folder as *manage.py*, you should be fine.
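As a quick illustration (the module and task below are hypothetical examples, not files shipped by the template), a function decorated with ``@shared_task`` in any of your apps can be queued once the worker above is running:

.. code-block:: python

    # {{cookiecutter.project_slug}}/users/tasks.py (hypothetical example)
    from celery import shared_task


    @shared_task
    def add(x, y):
        return x + y

Call ``add.delay(2, 3)`` from ``python manage.py shell`` and the worker started above should pick it up and log the result.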
@ -156,7 +159,7 @@ Custom Bootstrap Compilation
^^^^^^
The generated CSS is set up with automatic Bootstrap recompilation with variables of your choice.
Bootstrap v4.1.1 is installed using npm and customised by tweaking your variables in ``static/sass/custom_bootstrap_vars``.
Bootstrap v4 is installed using npm and customised by tweaking your variables in ``static/sass/custom_bootstrap_vars``.
You can find a list of available variables `in the bootstrap source`_, or get explanations on them in the `Bootstrap docs`_.

View File

@ -1,42 +1,40 @@
FROM python:3.6-alpine
FROM python:3.7-slim-buster
ENV PYTHONUNBUFFERED 1
RUN apk update \
RUN apt-get update \
# dependencies for building Python packages
&& apt-get install -y build-essential \
# psycopg2 dependencies
&& apk add --virtual build-deps gcc python3-dev musl-dev \
&& apk add postgresql-dev \
# Pillow dependencies
&& apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev \
# CFFI dependencies
&& apk add libffi-dev py-cffi \
&& apt-get install -y libpq-dev \
# Translations dependencies
&& apk add gettext \
# https://docs.djangoproject.com/en/dev/ref/django-admin/#dbshell
&& apk add postgresql-client
&& apt-get install -y gettext \
# cleaning up unused files
&& apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
&& rm -rf /var/lib/apt/lists/*
# Requirements are installed here to ensure they will be cached.
COPY ./requirements /requirements
RUN pip install -r /requirements/local.txt
COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r//' /entrypoint
RUN sed -i 's/\r$//g' /entrypoint
RUN chmod +x /entrypoint
COPY ./compose/local/django/start /start
RUN sed -i 's/\r//' /start
RUN sed -i 's/\r$//g' /start
RUN chmod +x /start
{% if cookiecutter.use_celery == "y" %}
COPY ./compose/local/django/celery/worker/start /start-celeryworker
RUN sed -i 's/\r//' /start-celeryworker
RUN sed -i 's/\r$//g' /start-celeryworker
RUN chmod +x /start-celeryworker
COPY ./compose/local/django/celery/beat/start /start-celerybeat
RUN sed -i 's/\r//' /start-celerybeat
RUN sed -i 's/\r$//g' /start-celerybeat
RUN chmod +x /start-celerybeat
COPY ./compose/local/django/celery/flower/start /start-flower
RUN sed -i 's/\r//' /start-flower
RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower
{% endif %}
WORKDIR /app

View File

@ -1,8 +1,8 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o nounset
rm -f './celerybeat.pid'
celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO
celery -A config.celery_app beat -l INFO

View File

@ -1,10 +1,10 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o nounset
celery flower \
--app={{cookiecutter.project_slug}}.taskapp \
--app=config.celery_app \
--broker="${CELERY_BROKER_URL}" \
--basic_auth="${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}"

View File

@ -1,7 +1,7 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o nounset
celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO
celery -A config.celery_app worker -l INFO

View File

@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o pipefail

View File

@ -0,0 +1,9 @@
FROM node:10-stretch-slim
WORKDIR /app
COPY ./package.json /app
RUN npm install && npm cache clean --force
ENV PATH ./node_modules/.bin/:$PATH

View File

@ -0,0 +1,9 @@
FROM garland/aws-cli-docker:1.15.47
COPY ./compose/production/aws/maintenance /usr/local/bin/maintenance
COPY ./compose/production/postgres/maintenance/_sourced /usr/local/bin/maintenance/_sourced
RUN chmod +x /usr/local/bin/maintenance/*
RUN mv /usr/local/bin/maintenance/* /usr/local/bin \
&& rmdir /usr/local/bin/maintenance

View File

@ -0,0 +1,24 @@
#!/bin/sh
### Download a file from your Amazon S3 bucket to the postgres /backups folder
###
### Usage:
### $ docker-compose -f production.yml run --rm awscli <1>
set -o errexit
set -o pipefail
set -o nounset
working_dir="$(dirname ${0})"
source "${working_dir}/_sourced/constants.sh"
source "${working_dir}/_sourced/messages.sh"
export AWS_ACCESS_KEY_ID="${DJANGO_AWS_ACCESS_KEY_ID}"
export AWS_SECRET_ACCESS_KEY="${DJANGO_AWS_SECRET_ACCESS_KEY}"
export AWS_STORAGE_BUCKET_NAME="${DJANGO_AWS_STORAGE_BUCKET_NAME}"
aws s3 cp s3://${AWS_STORAGE_BUCKET_NAME}${BACKUP_DIR_PATH}/${1} ${BACKUP_DIR_PATH}/${1}
message_success "Finished downloading ${1}."

View File

@ -0,0 +1,30 @@
#!/bin/sh
### Upload the /backups folder to Amazon S3
###
### Usage:
### $ docker-compose -f production.yml run --rm awscli upload
set -o errexit
set -o pipefail
set -o nounset
working_dir="$(dirname ${0})"
source "${working_dir}/_sourced/constants.sh"
source "${working_dir}/_sourced/messages.sh"
export AWS_ACCESS_KEY_ID="${DJANGO_AWS_ACCESS_KEY_ID}"
export AWS_SECRET_ACCESS_KEY="${DJANGO_AWS_SECRET_ACCESS_KEY}"
export AWS_STORAGE_BUCKET_NAME="${DJANGO_AWS_STORAGE_BUCKET_NAME}"
message_info "Upload the backups directory to S3 bucket {$AWS_STORAGE_BUCKET_NAME}"
aws s3 cp ${BACKUP_DIR_PATH} s3://${AWS_STORAGE_BUCKET_NAME}${BACKUP_DIR_PATH} --recursive
message_info "Cleaning the directory ${BACKUP_DIR_PATH}"
rm -rf ${BACKUP_DIR_PATH}/*
message_success "Finished uploading and cleaning."

View File

@ -1,15 +0,0 @@
www.{% raw %}{$DOMAIN_NAME}{% endraw %} {
redir https://{% raw %}{$DOMAIN_NAME}{% endraw %}
}
{% raw %}{$DOMAIN_NAME}{% endraw %} {
proxy / django:5000 {
header_upstream Host {host}
header_upstream X-Real-IP {remote}
header_upstream X-Forwarded-Proto {scheme}
header_upstream X-CSRFToken {~csrftoken}
}
log stdout
errors stdout
gzip
}

View File

@ -1,3 +0,0 @@
FROM abiosoft/caddy:0.11.0
COPY ./compose/production/caddy/Caddyfile /etc/Caddyfile

View File

@ -1,18 +1,31 @@
FROM python:3.6-alpine
{% if cookiecutter.js_task_runner == 'Gulp' -%}
FROM node:10-stretch-slim as client-builder
WORKDIR /app
COPY ./package.json /app
RUN npm install && npm cache clean --force
COPY . /app
RUN npm run build
# Python build stage
{%- endif %}
FROM python:3.7-slim-buster
ENV PYTHONUNBUFFERED 1
RUN apk update \
RUN apt-get update \
# dependencies for building Python packages
&& apt-get install -y build-essential \
# psycopg2 dependencies
&& apk add --virtual build-deps gcc python3-dev musl-dev \
&& apk add postgresql-dev \
# Pillow dependencies
&& apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev \
# CFFI dependencies
&& apk add libffi-dev py-cffi
&& apt-get install -y libpq-dev \
# Translations dependencies
&& apt-get install -y gettext \
# cleaning up unused files
&& apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
&& rm -rf /var/lib/apt/lists/*
RUN addgroup -S django \
&& adduser -S -G django django
RUN addgroup --system django \
&& adduser --system --ingroup django django
# Requirements are installed here to ensure they will be cached.
COPY ./requirements /requirements
@ -20,32 +33,36 @@ RUN pip install --no-cache-dir -r /requirements/production.txt \
&& rm -rf /requirements
COPY ./compose/production/django/entrypoint /entrypoint
RUN sed -i 's/\r//' /entrypoint
RUN sed -i 's/\r$//g' /entrypoint
RUN chmod +x /entrypoint
RUN chown django /entrypoint
COPY ./compose/production/django/start /start
RUN sed -i 's/\r//' /start
RUN sed -i 's/\r$//g' /start
RUN chmod +x /start
RUN chown django /start
{% if cookiecutter.use_celery == "y" %}
{%- if cookiecutter.use_celery == "y" %}
COPY ./compose/production/django/celery/worker/start /start-celeryworker
RUN sed -i 's/\r//' /start-celeryworker
RUN sed -i 's/\r$//g' /start-celeryworker
RUN chmod +x /start-celeryworker
RUN chown django /start-celeryworker
COPY ./compose/production/django/celery/beat/start /start-celerybeat
RUN sed -i 's/\r//' /start-celerybeat
RUN sed -i 's/\r$//g' /start-celerybeat
RUN chmod +x /start-celerybeat
RUN chown django /start-celerybeat
COPY ./compose/production/django/celery/flower/start /start-flower
RUN sed -i 's/\r//' /start-flower
RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower
{% endif %}
COPY . /app
{%- endif %}
RUN chown -R django /app
{%- if cookiecutter.js_task_runner == 'Gulp' %}
COPY --from=client-builder --chown=django:django /app /app
{% else %}
COPY --chown=django:django . /app
{%- endif %}
USER django

View File

@ -1,8 +1,8 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o pipefail
set -o nounset
celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO
celery -A config.celery_app beat -l INFO

View File

@ -1,10 +1,10 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o nounset
celery flower \
--app={{cookiecutter.project_slug}}.taskapp \
--app=config.celery_app \
--broker="${CELERY_BROKER_URL}" \
--basic_auth="${CELERY_FLOWER_USER}:${CELERY_FLOWER_PASSWORD}"

View File

@ -1,8 +1,8 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o pipefail
set -o nounset
celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO
celery -A config.celery_app worker -l INFO

View File

@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o pipefail

View File

@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/bash
set -o errexit
set -o pipefail

View File

@ -0,0 +1,5 @@
FROM traefik:v2.0
RUN mkdir -p /etc/traefik/acme
RUN touch /etc/traefik/acme/acme.json
RUN chmod 600 /etc/traefik/acme/acme.json
COPY ./compose/production/traefik/traefik.yml /etc/traefik

View File

@ -0,0 +1,67 @@
log:
level: INFO
entryPoints:
web:
# http
address: ":80"
web-secure:
# https
address: ":443"
certificatesResolvers:
letsencrypt:
# https://docs.traefik.io/master/https/acme/#lets-encrypt
acme:
email: "{{ cookiecutter.email }}"
storage: /etc/traefik/acme/acme.json
# https://docs.traefik.io/master/https/acme/#httpchallenge
httpChallenge:
entryPoint: web
http:
routers:
web-router:
rule: "Host(`{{ cookiecutter.domain_name }}`)"
entryPoints:
- web
middlewares:
- redirect
- csrf
service: django
web-secure-router:
rule: "Host(`{{ cookiecutter.domain_name }}`)"
entryPoints:
- web-secure
middlewares:
- csrf
service: django
tls:
# https://docs.traefik.io/master/routing/routers/#certresolver
certResolver: letsencrypt
middlewares:
redirect:
# https://docs.traefik.io/master/middlewares/redirectscheme/
redirectScheme:
scheme: https
permanent: true
csrf:
# https://docs.traefik.io/master/middlewares/headers/#hostsproxyheaders
# https://docs.djangoproject.com/en/dev/ref/csrf/#ajax
headers:
hostsProxyHeaders: ['X-CSRFToken']
services:
django:
loadBalancer:
servers:
- url: http://django:5000
providers:
# https://docs.traefik.io/master/providers/file/
file:
filename: /etc/traefik/traefik.yml
watch: true

View File

@ -0,0 +1,7 @@
{% if cookiecutter.use_celery == 'y' -%}
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_app import app as celery_app
__all__ = ("celery_app",)
{% endif -%}

View File

@ -0,0 +1,14 @@
from rest_framework.routers import DefaultRouter, SimpleRouter
from django.conf import settings
from {{ cookiecutter.project_slug }}.users.api.views import UserViewSet
if settings.DEBUG:
router = DefaultRouter()
else:
router = SimpleRouter()
router.register("users", UserViewSet)
app_name = "api"
urlpatterns = router.urls
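# Illustrative note (assumption, not part of the original file): this router is
# typically mounted in the root URLconf under an API prefix; the exact path and
# include below are a hypothetical sketch only:
#
#     # config/urls.py (hypothetical excerpt)
#     from django.urls import include, path
#
#     urlpatterns += [
#         path("api/", include("config.api_router")),
#     ]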

View File

@ -0,0 +1,16 @@
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
app = Celery("{{cookiecutter.project_slug}}")
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
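# Illustrative sketch (not part of the original file; the module path and task
# name are assumptions): thanks to app.autodiscover_tasks() above, a task
# declared in any installed app's tasks.py is registered automatically, e.g.
# in {{cookiecutter.project_slug}}/users/tasks.py:
#
#     from celery import shared_task
#     from django.contrib.auth import get_user_model
#
#     @shared_task
#     def get_users_count():
#         """A trivial task to demonstrate autodiscovery; call it with .delay()."""
#         return get_user_model().objects.count()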

View File

@ -4,27 +4,29 @@ Base settings to build other settings files upon.
import environ
ROOT_DIR = environ.Path(__file__) - 3 # ({{ cookiecutter.project_slug }}/config/settings/base.py - 3 = {{ cookiecutter.project_slug }}/)
APPS_DIR = ROOT_DIR.path('{{ cookiecutter.project_slug }}')
ROOT_DIR = (
environ.Path(__file__) - 3
) # ({{ cookiecutter.project_slug }}/config/settings/base.py - 3 = {{ cookiecutter.project_slug }}/)
APPS_DIR = ROOT_DIR.path("{{ cookiecutter.project_slug }}")
env = environ.Env()
READ_DOT_ENV_FILE = env.bool('DJANGO_READ_DOT_ENV_FILE', default=False)
READ_DOT_ENV_FILE = env.bool("DJANGO_READ_DOT_ENV_FILE", default=False)
if READ_DOT_ENV_FILE:
# OS environment variables take precedence over variables from .env
env.read_env(str(ROOT_DIR.path('.env')))
env.read_env(str(ROOT_DIR.path(".env")))
# GENERAL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = env.bool('DJANGO_DEBUG', False)
DEBUG = env.bool("DJANGO_DEBUG", False)
# Local time zone. Choices are
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# though not all of them may be available with every OS.
# In Windows, this must be set to your system time zone.
TIME_ZONE = '{{ cookiecutter.timezone }}'
TIME_ZONE = "{{ cookiecutter.timezone }}"
# https://docs.djangoproject.com/en/dev/ref/settings/#language-code
LANGUAGE_CODE = 'en-us'
LANGUAGE_CODE = "en-us"
# https://docs.djangoproject.com/en/dev/ref/settings/#site-id
SITE_ID = 1
# https://docs.djangoproject.com/en/dev/ref/settings/#use-i18n
@ -33,49 +35,54 @@ USE_I18N = True
USE_L10N = True
# https://docs.djangoproject.com/en/dev/ref/settings/#use-tz
USE_TZ = True
# https://docs.djangoproject.com/en/dev/ref/settings/#locale-paths
LOCALE_PATHS = [ROOT_DIR.path("locale")]
# DATABASES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#databases
{% if cookiecutter.use_docker == 'y' -%}
DATABASES = {
'default': env.db('DATABASE_URL'),
}
{% if cookiecutter.use_docker == "y" -%}
DATABASES = {"default": env.db("DATABASE_URL")}
{%- else %}
DATABASES = {
'default': env.db('DATABASE_URL', default='postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}'),
"default": env.db("DATABASE_URL", default="postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}")
}
{%- endif %}
DATABASES['default']['ATOMIC_REQUESTS'] = True
DATABASES["default"]["ATOMIC_REQUESTS"] = True
# URLS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#root-urlconf
ROOT_URLCONF = 'config.urls'
ROOT_URLCONF = "config.urls"
# https://docs.djangoproject.com/en/dev/ref/settings/#wsgi-application
WSGI_APPLICATION = 'config.wsgi.application'
WSGI_APPLICATION = "config.wsgi.application"
# APPS
# ------------------------------------------------------------------------------
DJANGO_APPS = [
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.sites',
'django.contrib.messages',
'django.contrib.staticfiles',
# 'django.contrib.humanize', # Handy template tags
'django.contrib.admin',
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.sites",
"django.contrib.messages",
"django.contrib.staticfiles",
# "django.contrib.humanize", # Handy template tags
"django.contrib.admin",
"django.forms",
]
THIRD_PARTY_APPS = [
'crispy_forms',
'allauth',
'allauth.account',
'allauth.socialaccount',
'rest_framework',
"crispy_forms",
"allauth",
"allauth.account",
"allauth.socialaccount",
"rest_framework",
{%- if cookiecutter.use_celery == 'y' %}
"django_celery_beat",
{%- endif %}
]
LOCAL_APPS = [
'{{ cookiecutter.project_slug }}.users.apps.UsersAppConfig',
"{{ cookiecutter.project_slug }}.users.apps.UsersConfig",
# Your stuff: custom apps go here
]
# https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
@ -84,86 +91,80 @@ INSTALLED_APPS = DJANGO_APPS + THIRD_PARTY_APPS + LOCAL_APPS
# MIGRATIONS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#migration-modules
MIGRATION_MODULES = {
'sites': '{{ cookiecutter.project_slug }}.contrib.sites.migrations'
}
MIGRATION_MODULES = {"sites": "{{ cookiecutter.project_slug }}.contrib.sites.migrations"}
# AUTHENTICATION
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#authentication-backends
AUTHENTICATION_BACKENDS = [
'django.contrib.auth.backends.ModelBackend',
'allauth.account.auth_backends.AuthenticationBackend',
"django.contrib.auth.backends.ModelBackend",
"allauth.account.auth_backends.AuthenticationBackend",
]
# https://docs.djangoproject.com/en/dev/ref/settings/#auth-user-model
AUTH_USER_MODEL = 'users.User'
AUTH_USER_MODEL = "users.User"
# https://docs.djangoproject.com/en/dev/ref/settings/#login-redirect-url
LOGIN_REDIRECT_URL = 'users:redirect'
LOGIN_REDIRECT_URL = "users:redirect"
# https://docs.djangoproject.com/en/dev/ref/settings/#login-url
LOGIN_URL = 'account_login'
LOGIN_URL = "account_login"
# PASSWORDS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#password-hashers
PASSWORD_HASHERS = [
# https://docs.djangoproject.com/en/dev/topics/auth/passwords/#using-argon2-with-django
'django.contrib.auth.hashers.Argon2PasswordHasher',
'django.contrib.auth.hashers.PBKDF2PasswordHasher',
'django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher',
'django.contrib.auth.hashers.BCryptSHA256PasswordHasher',
'django.contrib.auth.hashers.BCryptPasswordHasher',
"django.contrib.auth.hashers.Argon2PasswordHasher",
"django.contrib.auth.hashers.PBKDF2PasswordHasher",
"django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher",
"django.contrib.auth.hashers.BCryptSHA256PasswordHasher",
]
# https://docs.djangoproject.com/en/dev/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
"NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator"
},
{"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator"},
{"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator"},
{"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator"},
]
# MIDDLEWARE
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#middleware
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
"django.middleware.security.SecurityMiddleware",
{%- if cookiecutter.use_whitenoise == 'y' %}
"whitenoise.middleware.WhiteNoiseMiddleware",
{%- endif %}
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.locale.LocaleMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.common.BrokenLinkEmailsMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
]
# STATIC
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#static-root
STATIC_ROOT = str(ROOT_DIR('staticfiles'))
STATIC_ROOT = str(ROOT_DIR("staticfiles"))
# https://docs.djangoproject.com/en/dev/ref/settings/#static-url
STATIC_URL = '/static/'
STATIC_URL = "/static/"
# https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
STATICFILES_DIRS = [
str(APPS_DIR.path('static')),
]
STATICFILES_DIRS = [str(APPS_DIR.path("static"))]
# https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#staticfiles-finders
STATICFILES_FINDERS = [
'django.contrib.staticfiles.finders.FileSystemFinder',
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
"django.contrib.staticfiles.finders.FileSystemFinder",
"django.contrib.staticfiles.finders.AppDirectoriesFinder",
]
# MEDIA
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#media-root
MEDIA_ROOT = str(APPS_DIR('media'))
MEDIA_ROOT = str(APPS_DIR("media"))
# https://docs.djangoproject.com/en/dev/ref/settings/#media-url
MEDIA_URL = '/media/'
MEDIA_URL = "/media/"
# TEMPLATES
# ------------------------------------------------------------------------------
@ -171,43 +172,42 @@ MEDIA_URL = '/media/'
TEMPLATES = [
{
# https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-TEMPLATES-BACKEND
'BACKEND': 'django.template.backends.django.DjangoTemplates',
"BACKEND": "django.template.backends.django.DjangoTemplates",
# https://docs.djangoproject.com/en/dev/ref/settings/#template-dirs
'DIRS': [
str(APPS_DIR.path('templates')),
],
'OPTIONS': {
# https://docs.djangoproject.com/en/dev/ref/settings/#template-debug
'debug': DEBUG,
"DIRS": [str(APPS_DIR.path("templates"))],
"OPTIONS": {
# https://docs.djangoproject.com/en/dev/ref/settings/#template-loaders
# https://docs.djangoproject.com/en/dev/ref/templates/api/#loader-types
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
"loaders": [
"django.template.loaders.filesystem.Loader",
"django.template.loaders.app_directories.Loader",
],
# https://docs.djangoproject.com/en/dev/ref/settings/#template-context-processors
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.template.context_processors.i18n',
'django.template.context_processors.media',
'django.template.context_processors.static',
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
"context_processors": [
"django.template.context_processors.debug",
"django.template.context_processors.request",
"django.contrib.auth.context_processors.auth",
"django.template.context_processors.i18n",
"django.template.context_processors.media",
"django.template.context_processors.static",
"django.template.context_processors.tz",
"django.contrib.messages.context_processors.messages",
"{{ cookiecutter.project_slug }}.utils.context_processors.settings_context",
],
},
},
}
]
# https://docs.djangoproject.com/en/dev/ref/settings/#form-renderer
FORM_RENDERER = "django.forms.renderers.TemplatesSetting"
# http://django-crispy-forms.readthedocs.io/en/latest/install.html#template-packs
CRISPY_TEMPLATE_PACK = 'bootstrap4'
CRISPY_TEMPLATE_PACK = "bootstrap4"
# FIXTURES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#fixture-dirs
FIXTURE_DIRS = (
str(APPS_DIR.path('fixtures')),
)
FIXTURE_DIRS = (str(APPS_DIR.path("fixtures")),)
# SECURITY
# ------------------------------------------------------------------------------
@ -218,70 +218,107 @@ CSRF_COOKIE_HTTPONLY = True
# https://docs.djangoproject.com/en/dev/ref/settings/#secure-browser-xss-filter
SECURE_BROWSER_XSS_FILTER = True
# https://docs.djangoproject.com/en/dev/ref/settings/#x-frame-options
X_FRAME_OPTIONS = 'DENY'
X_FRAME_OPTIONS = "DENY"
# EMAIL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND', default='django.core.mail.backends.smtp.EmailBackend')
EMAIL_BACKEND = env(
"DJANGO_EMAIL_BACKEND", default="django.core.mail.backends.smtp.EmailBackend"
)
# https://docs.djangoproject.com/en/2.2/ref/settings/#email-timeout
EMAIL_TIMEOUT = 5
# ADMIN
# ------------------------------------------------------------------------------
# Django Admin URL.
ADMIN_URL = 'admin/'
ADMIN_URL = "admin/"
# https://docs.djangoproject.com/en/dev/ref/settings/#admins
ADMINS = [
("""{{cookiecutter.author_name}}""", '{{cookiecutter.email}}'),
]
ADMINS = [("""{{cookiecutter.author_name}}""", "{{cookiecutter.email}}")]
# https://docs.djangoproject.com/en/dev/ref/settings/#managers
MANAGERS = ADMINS
# LOGGING
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#logging
# See https://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"formatters": {
"verbose": {
"format": "%(levelname)s %(asctime)s %(module)s "
"%(process)d %(thread)d %(message)s"
}
},
"handlers": {
"console": {
"level": "DEBUG",
"class": "logging.StreamHandler",
"formatter": "verbose",
}
},
"root": {"level": "INFO", "handlers": ["console"]},
}
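# Illustrative note (usage assumption, not part of the original settings): with
# the configuration above, any module-level logger propagates to the root
# logger and prints to stdout via the "verbose" formatter, e.g.:
#
#     import logging
#
#     logger = logging.getLogger(__name__)
#     logger.info("handled by the console handler configured above")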
{% if cookiecutter.use_celery == 'y' -%}
# Celery
# ------------------------------------------------------------------------------
INSTALLED_APPS += ['{{cookiecutter.project_slug}}.taskapp.celery.CeleryAppConfig']
if USE_TZ:
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-timezone
CELERY_TIMEZONE = TIME_ZONE
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env('CELERY_BROKER_URL')
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ['json']
CELERY_ACCEPT_CONTENT = ["json"]
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERYD_TASK_TIME_LIMIT = 5 * 60
CELERY_TASK_TIME_LIMIT = 5 * 60
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-soft-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERYD_TASK_SOFT_TIME_LIMIT = 60
CELERY_TASK_SOFT_TIME_LIMIT = 60
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#beat-scheduler
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
{%- endif %}
# django-allauth
# ------------------------------------------------------------------------------
ACCOUNT_ALLOW_REGISTRATION = env.bool('DJANGO_ACCOUNT_ALLOW_REGISTRATION', True)
ACCOUNT_ALLOW_REGISTRATION = env.bool("DJANGO_ACCOUNT_ALLOW_REGISTRATION", True)
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_AUTHENTICATION_METHOD = 'username'
ACCOUNT_AUTHENTICATION_METHOD = "username"
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_EMAIL_REQUIRED = True
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_EMAIL_VERIFICATION = 'mandatory'
ACCOUNT_EMAIL_VERIFICATION = "mandatory"
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_ADAPTER = '{{cookiecutter.project_slug}}.users.adapters.AccountAdapter'
ACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.AccountAdapter"
# https://django-allauth.readthedocs.io/en/latest/configuration.html
SOCIALACCOUNT_ADAPTER = '{{cookiecutter.project_slug}}.users.adapters.SocialAccountAdapter'
SOCIALACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.SocialAccountAdapter"
{% if cookiecutter.use_compressor == 'y' -%}
# django-compressor
# ------------------------------------------------------------------------------
# https://django-compressor.readthedocs.io/en/latest/quickstart/#installation
INSTALLED_APPS += ['compressor']
STATICFILES_FINDERS += ['compressor.finders.CompressorFinder']
INSTALLED_APPS += ["compressor"]
STATICFILES_FINDERS += ["compressor.finders.CompressorFinder"]
{%- endif %}
{% if cookiecutter.use_drf == "y" -%}
# django-rest-framework
# -------------------------------------------------------------------------------
# django-rest-framework - https://www.django-rest-framework.org/api-guide/settings/
REST_FRAMEWORK = {
"DEFAULT_AUTHENTICATION_CLASSES": (
"rest_framework.authentication.SessionAuthentication",
"rest_framework.authentication.TokenAuthentication",
),
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
}
{%- endif %}
# Your stuff...
# ------------------------------------------------------------------------------

View File

@ -6,72 +6,75 @@ from .base import env
# https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = True
# https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
SECRET_KEY = env('DJANGO_SECRET_KEY', default='!!!SET DJANGO_SECRET_KEY!!!')
SECRET_KEY = env(
"DJANGO_SECRET_KEY",
default="!!!SET DJANGO_SECRET_KEY!!!",
)
# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
ALLOWED_HOSTS = [
"localhost",
"0.0.0.0",
"127.0.0.1",
]
ALLOWED_HOSTS = ["localhost", "0.0.0.0", "127.0.0.1"]
# CACHES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#caches
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
'LOCATION': ''
"default": {
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
"LOCATION": "",
}
}
# TEMPLATES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#templates
TEMPLATES[0]['OPTIONS']['debug'] = DEBUG # noqa F405
# EMAIL
# ------------------------------------------------------------------------------
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' -%}
# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
EMAIL_HOST = env('EMAIL_HOST', default='mailhog')
{%- elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' -%}
# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
EMAIL_HOST = 'localhost'
{%- else -%}
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND', default='django.core.mail.backends.console.EmailBackend')
# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
EMAIL_HOST = 'localhost'
{%- endif %}
EMAIL_HOST = env("EMAIL_HOST", default="mailhog")
# https://docs.djangoproject.com/en/dev/ref/settings/#email-port
EMAIL_PORT = 1025
{%- elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' -%}
# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
EMAIL_HOST = "localhost"
# https://docs.djangoproject.com/en/dev/ref/settings/#email-port
EMAIL_PORT = 1025
{%- else -%}
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = env(
"DJANGO_EMAIL_BACKEND", default="django.core.mail.backends.console.EmailBackend"
)
{%- endif %}
{%- if cookiecutter.use_whitenoise == 'y' %}
# WhiteNoise
# ------------------------------------------------------------------------------
# http://whitenoise.evans.io/en/latest/django.html#using-whitenoise-in-development
INSTALLED_APPS = ["whitenoise.runserver_nostatic"] + INSTALLED_APPS # noqa F405
{% endif %}
# django-debug-toolbar
# ------------------------------------------------------------------------------
# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#prerequisites
INSTALLED_APPS += ['debug_toolbar'] # noqa F405
INSTALLED_APPS += ["debug_toolbar"] # noqa F405
# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#middleware
MIDDLEWARE += ['debug_toolbar.middleware.DebugToolbarMiddleware'] # noqa F405
MIDDLEWARE += ["debug_toolbar.middleware.DebugToolbarMiddleware"] # noqa F405
# https://django-debug-toolbar.readthedocs.io/en/latest/configuration.html#debug-toolbar-config
DEBUG_TOOLBAR_CONFIG = {
'DISABLE_PANELS': [
'debug_toolbar.panels.redirects.RedirectsPanel',
],
'SHOW_TEMPLATE_CONTEXT': True,
"DISABLE_PANELS": ["debug_toolbar.panels.redirects.RedirectsPanel"],
"SHOW_TEMPLATE_CONTEXT": True,
}
# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#internal-ips
INTERNAL_IPS = ['127.0.0.1', '10.0.2.2']
INTERNAL_IPS = ["127.0.0.1", "10.0.2.2"]
{% if cookiecutter.use_docker == 'y' -%}
if env('USE_DOCKER') == 'yes':
if env("USE_DOCKER") == "yes":
import socket
hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
INTERNAL_IPS += [ip[:-1] + '1' for ip in ips]
INTERNAL_IPS += [ip[:-1] + "1" for ip in ips]
{%- endif %}
# django-extensions
# ------------------------------------------------------------------------------
# https://django-extensions.readthedocs.io/en/latest/installation_instructions.html#configuration
INSTALLED_APPS += ['django_extensions'] # noqa F405
INSTALLED_APPS += ["django_extensions"] # noqa F405
{% if cookiecutter.use_celery == 'y' -%}
# Celery

View File

@ -1,6 +1,14 @@
{% if cookiecutter.use_sentry == 'y' -%}
import logging
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.logging import LoggingIntegration
{%- if cookiecutter.use_celery == 'y' %}
from sentry_sdk.integrations.celery import CeleryIntegration
{% endif %}
{% endif -%}
from .base import * # noqa
from .base import env
@ -8,37 +16,37 @@ from .base import env
# GENERAL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
SECRET_KEY = env('DJANGO_SECRET_KEY')
SECRET_KEY = env("DJANGO_SECRET_KEY")
# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
ALLOWED_HOSTS = env.list('DJANGO_ALLOWED_HOSTS', default=['{{ cookiecutter.domain_name }}'])
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=["{{ cookiecutter.domain_name }}"])
# DATABASES
# ------------------------------------------------------------------------------
DATABASES['default'] = env.db('DATABASE_URL') # noqa F405
DATABASES['default']['ATOMIC_REQUESTS'] = True # noqa F405
DATABASES['default']['CONN_MAX_AGE'] = env.int('CONN_MAX_AGE', default=60) # noqa F405
DATABASES["default"] = env.db("DATABASE_URL") # noqa F405
DATABASES["default"]["ATOMIC_REQUESTS"] = True # noqa F405
DATABASES["default"]["CONN_MAX_AGE"] = env.int("CONN_MAX_AGE", default=60) # noqa F405
# CACHES
# ------------------------------------------------------------------------------
CACHES = {
'default': {
'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': env('REDIS_URL'),
'OPTIONS': {
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": env("REDIS_URL"),
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
# Mimicking memcache behavior.
# http://niwinz.github.io/django-redis/latest/#_memcached_exceptions_behavior
'IGNORE_EXCEPTIONS': True,
}
"IGNORE_EXCEPTIONS": True,
},
}
}
# SECURITY
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# https://docs.djangoproject.com/en/dev/ref/settings/#secure-ssl-redirect
SECURE_SSL_REDIRECT = env.bool('DJANGO_SECURE_SSL_REDIRECT', default=True)
SECURE_SSL_REDIRECT = env.bool("DJANGO_SECURE_SSL_REDIRECT", default=True)
# https://docs.djangoproject.com/en/dev/ref/settings/#session-cookie-secure
SESSION_COOKIE_SECURE = True
# https://docs.djangoproject.com/en/dev/ref/settings/#csrf-cookie-secure
@ -48,249 +56,257 @@ CSRF_COOKIE_SECURE = True
# TODO: set this to 60 seconds first and then to 518400 once you prove the former works
SECURE_HSTS_SECONDS = 60
# https://docs.djangoproject.com/en/dev/ref/settings/#secure-hsts-include-subdomains
SECURE_HSTS_INCLUDE_SUBDOMAINS = env.bool('DJANGO_SECURE_HSTS_INCLUDE_SUBDOMAINS', default=True)
SECURE_HSTS_INCLUDE_SUBDOMAINS = env.bool(
"DJANGO_SECURE_HSTS_INCLUDE_SUBDOMAINS", default=True
)
# https://docs.djangoproject.com/en/dev/ref/settings/#secure-hsts-preload
SECURE_HSTS_PRELOAD = env.bool('DJANGO_SECURE_HSTS_PRELOAD', default=True)
SECURE_HSTS_PRELOAD = env.bool("DJANGO_SECURE_HSTS_PRELOAD", default=True)
# https://docs.djangoproject.com/en/dev/ref/middleware/#x-content-type-options-nosniff
SECURE_CONTENT_TYPE_NOSNIFF = env.bool('DJANGO_SECURE_CONTENT_TYPE_NOSNIFF', default=True)
SECURE_CONTENT_TYPE_NOSNIFF = env.bool(
"DJANGO_SECURE_CONTENT_TYPE_NOSNIFF", default=True
)
{% if cookiecutter.cloud_provider != 'None' -%}
# STORAGES
# ------------------------------------------------------------------------------
# https://django-storages.readthedocs.io/en/latest/#installation
INSTALLED_APPS += ['storages'] # noqa F405
INSTALLED_APPS += ["storages"] # noqa F405
{%- endif -%}
{% if cookiecutter.cloud_provider == 'AWS' %}
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_ACCESS_KEY_ID = env('DJANGO_AWS_ACCESS_KEY_ID')
AWS_ACCESS_KEY_ID = env("DJANGO_AWS_ACCESS_KEY_ID")
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_SECRET_ACCESS_KEY = env('DJANGO_AWS_SECRET_ACCESS_KEY')
AWS_SECRET_ACCESS_KEY = env("DJANGO_AWS_SECRET_ACCESS_KEY")
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_STORAGE_BUCKET_NAME = env('DJANGO_AWS_STORAGE_BUCKET_NAME')
AWS_STORAGE_BUCKET_NAME = env("DJANGO_AWS_STORAGE_BUCKET_NAME")
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_QUERYSTRING_AUTH = False
# DO NOT change these unless you know what you're doing.
_AWS_EXPIRY = 60 * 60 * 24 * 7
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_S3_OBJECT_PARAMETERS = {
'CacheControl': f'max-age={_AWS_EXPIRY}, s-maxage={_AWS_EXPIRY}, must-revalidate',
"CacheControl": f"max-age={_AWS_EXPIRY}, s-maxage={_AWS_EXPIRY}, must-revalidate"
}
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_DEFAULT_ACL = None
# https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#settings
AWS_S3_REGION_NAME = env("DJANGO_AWS_S3_REGION_NAME", default=None)
{% elif cookiecutter.cloud_provider == 'GCP' %}
GS_BUCKET_NAME = env("DJANGO_GCP_STORAGE_BUCKET_NAME")
GS_DEFAULT_ACL = "publicRead"
{% endif -%}
{% if cookiecutter.cloud_provider != 'None' or cookiecutter.use_whitenoise == 'y' -%}
# STATIC
# ------------------------
{% endif -%}
{% if cookiecutter.use_whitenoise == 'y' -%}
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
{%- else %}
STATICFILES_STORAGE = 'config.settings.production.StaticRootS3Boto3Storage'
STATIC_URL = f'https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/static/'
{%- endif %}
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
{% elif cookiecutter.cloud_provider == 'AWS' -%}
STATICFILES_STORAGE = "config.settings.production.StaticRootS3Boto3Storage"
COLLECTFAST_STRATEGY = "collectfast.strategies.boto3.Boto3Strategy"
STATIC_URL = f"https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/static/"
{% elif cookiecutter.cloud_provider == 'GCP' -%}
STATICFILES_STORAGE = "config.settings.production.StaticRootGoogleCloudStorage"
COLLECTFAST_STRATEGY = "collectfast.strategies.gcloud.GoogleCloudStrategy"
STATIC_URL = f"https://storage.googleapis.com/{GS_BUCKET_NAME}/static/"
{% endif -%}
# MEDIA
# ------------------------------------------------------------------------------
{% if cookiecutter.use_whitenoise == 'y' -%}
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
MEDIA_URL = f'https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/'
{%- else %}
{%- if cookiecutter.cloud_provider == 'AWS' %}
# region http://stackoverflow.com/questions/10390244/
# Full-fledged class: https://stackoverflow.com/a/18046120/104731
from storages.backends.s3boto3 import S3Boto3Storage # noqa E402
class StaticRootS3Boto3Storage(S3Boto3Storage):
location = 'static'
location = "static"
default_acl = "public-read"
class MediaRootS3Boto3Storage(S3Boto3Storage):
location = 'media'
location = "media"
file_overwrite = False
# endregion
DEFAULT_FILE_STORAGE = 'config.settings.production.MediaRootS3Boto3Storage'
MEDIA_URL = f'https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/media/'
DEFAULT_FILE_STORAGE = "config.settings.production.MediaRootS3Boto3Storage"
MEDIA_URL = f"https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/media/"
{%- elif cookiecutter.cloud_provider == 'GCP' %}
from storages.backends.gcloud import GoogleCloudStorage # noqa E402
class StaticRootGoogleCloudStorage(GoogleCloudStorage):
location = "static"
default_acl = "publicRead"
class MediaRootGoogleCloudStorage(GoogleCloudStorage):
location = "media"
file_overwrite = False
DEFAULT_FILE_STORAGE = "config.settings.production.MediaRootGoogleCloudStorage"
MEDIA_URL = f"https://storage.googleapis.com/{GS_BUCKET_NAME}/media/"
{%- endif %}
# TEMPLATES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#templates
TEMPLATES[0]['OPTIONS']['loaders'] = [ # noqa F405
TEMPLATES[-1]["OPTIONS"]["loaders"] = [ # type: ignore[index] # noqa F405
(
'django.template.loaders.cached.Loader',
"django.template.loaders.cached.Loader",
[
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
]
),
"django.template.loaders.filesystem.Loader",
"django.template.loaders.app_directories.Loader",
],
)
]
# EMAIL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#default-from-email
DEFAULT_FROM_EMAIL = env(
'DJANGO_DEFAULT_FROM_EMAIL',
default='{{cookiecutter.project_name}} <noreply@{{cookiecutter.domain_name}}>'
"DJANGO_DEFAULT_FROM_EMAIL", default="{{cookiecutter.project_name}} <noreply@{{cookiecutter.domain_name}}>"
)
# https://docs.djangoproject.com/en/dev/ref/settings/#server-email
SERVER_EMAIL = env('DJANGO_SERVER_EMAIL', default=DEFAULT_FROM_EMAIL)
SERVER_EMAIL = env("DJANGO_SERVER_EMAIL", default=DEFAULT_FROM_EMAIL)
# https://docs.djangoproject.com/en/dev/ref/settings/#email-subject-prefix
EMAIL_SUBJECT_PREFIX = env('DJANGO_EMAIL_SUBJECT_PREFIX', default='[{{cookiecutter.project_name}}]')
EMAIL_SUBJECT_PREFIX = env(
"DJANGO_EMAIL_SUBJECT_PREFIX", default="[{{cookiecutter.project_name}}]"
)
# ADMIN
# ------------------------------------------------------------------------------
# Django Admin URL regex.
ADMIN_URL = env('DJANGO_ADMIN_URL')
ADMIN_URL = env("DJANGO_ADMIN_URL")
# Anymail (Mailgun)
# ------------------------------------------------------------------------------
# https://anymail.readthedocs.io/en/stable/installation/#installing-anymail
INSTALLED_APPS += ['anymail'] # noqa F405
EMAIL_BACKEND = 'anymail.backends.mailgun.EmailBackend'
INSTALLED_APPS += ["anymail"] # noqa F405
EMAIL_BACKEND = "anymail.backends.mailgun.EmailBackend"
# https://anymail.readthedocs.io/en/stable/installation/#anymail-settings-reference
ANYMAIL = {
'MAILGUN_API_KEY': env('MAILGUN_API_KEY'),
'MAILGUN_SENDER_DOMAIN': env('MAILGUN_DOMAIN')
"MAILGUN_API_KEY": env("MAILGUN_API_KEY"),
"MAILGUN_SENDER_DOMAIN": env("MAILGUN_DOMAIN"),
"MAILGUN_API_URL": env("MAILGUN_API_URL", default="https://api.mailgun.net/v3"),
}
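For illustration only (not part of the generated settings): with the Anymail backend configured above, mail sent through Django's standard API is delivered via Mailgun; the recipient address below is a placeholder.
from django.core.mail import send_mail

send_mail(
    subject="Deployment finished",
    message="The new release is live.",
    from_email=None,  # None falls back to DEFAULT_FROM_EMAIL
    recipient_list=["ops@example.com"],  # placeholder address
)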
# Gunicorn
# ------------------------------------------------------------------------------
INSTALLED_APPS += ['gunicorn'] # noqa F405
{% if cookiecutter.use_whitenoise == 'y' -%}
# WhiteNoise
# ------------------------------------------------------------------------------
# http://whitenoise.evans.io/en/latest/django.html#enable-whitenoise
MIDDLEWARE.insert(1, 'whitenoise.middleware.WhiteNoiseMiddleware') # noqa F405
{% endif %}
{%- if cookiecutter.use_compressor == 'y' -%}
{% if cookiecutter.use_compressor == 'y' -%}
# django-compressor
# ------------------------------------------------------------------------------
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_ENABLED
COMPRESS_ENABLED = env.bool('COMPRESS_ENABLED', default=True)
COMPRESS_ENABLED = env.bool("COMPRESS_ENABLED", default=True)
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_STORAGE
COMPRESS_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
COMPRESS_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
# https://django-compressor.readthedocs.io/en/latest/settings/#django.conf.settings.COMPRESS_URL
COMPRESS_URL = STATIC_URL
COMPRESS_URL = STATIC_URL{% if cookiecutter.use_whitenoise == 'y' or cookiecutter.cloud_provider == 'None' %} # noqa F405{% endif %}
{% endif %}
{%- if cookiecutter.use_whitenoise == 'n' -%}
# Collectfast
# ------------------------------------------------------------------------------
# https://github.com/antonagestam/collectfast#installation
INSTALLED_APPS = ['collectfast'] + INSTALLED_APPS # noqa F405
AWS_PRELOAD_METADATA = True
INSTALLED_APPS = ["collectfast"] + INSTALLED_APPS # noqa F405
{% endif %}
{%- if cookiecutter.use_sentry == 'y' -%}
# raven
# ------------------------------------------------------------------------------
# https://docs.sentry.io/clients/python/integrations/django/
INSTALLED_APPS += ['raven.contrib.django.raven_compat'] # noqa F405
MIDDLEWARE = ['raven.contrib.django.raven_compat.middleware.SentryResponseErrorIdMiddleware'] + MIDDLEWARE
# Sentry
# ------------------------------------------------------------------------------
SENTRY_DSN = env('SENTRY_DSN')
SENTRY_CLIENT = env('DJANGO_SENTRY_CLIENT', default='raven.contrib.django.raven_compat.DjangoClient')
LOGGING = {
'version': 1,
'disable_existing_loggers': True,
'root': {
'level': 'WARNING',
'handlers': ['sentry'],
},
'formatters': {
'verbose': {
'format': '%(levelname)s %(asctime)s %(module)s '
'%(process)d %(thread)d %(message)s'
},
},
'handlers': {
'sentry': {
'level': 'ERROR',
'class': 'raven.contrib.django.raven_compat.handlers.SentryHandler',
},
'console': {
'level': 'DEBUG',
'class': 'logging.StreamHandler',
'formatter': 'verbose'
}
},
'loggers': {
'django.db.backends': {
'level': 'ERROR',
'handlers': ['console'],
'propagate': False,
},
'raven': {
'level': 'DEBUG',
'handlers': ['console'],
'propagate': False,
},
'sentry.errors': {
'level': 'DEBUG',
'handlers': ['console'],
'propagate': False,
},
'django.security.DisallowedHost': {
'level': 'ERROR',
'handlers': ['console', 'sentry'],
'propagate': False,
},
},
}
SENTRY_CELERY_LOGLEVEL = env.int('DJANGO_SENTRY_LOG_LEVEL', logging.INFO)
RAVEN_CONFIG = {
'dsn': SENTRY_DSN
}
{%- else %}
# LOGGING
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#logging
# https://docs.djangoproject.com/en/dev/ref/settings/#logging
# See https://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
{% if cookiecutter.use_sentry == 'n' -%}
# A sample logging configuration. The only tangible logging
# performed by this configuration is to send an email to
# the site admins on every HTTP 500 error when DEBUG=False.
# See https://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
LOGGING = {
'version': 1,
'disable_existing_loggers': False,
'filters': {
'require_debug_false': {
'()': 'django.utils.log.RequireDebugFalse'
"version": 1,
"disable_existing_loggers": False,
"filters": {"require_debug_false": {"()": "django.utils.log.RequireDebugFalse"}},
"formatters": {
"verbose": {
"format": "%(levelname)s %(asctime)s %(module)s "
"%(process)d %(thread)d %(message)s"
}
},
'formatters': {
'verbose': {
'format': '%(levelname)s %(asctime)s %(module)s '
'%(process)d %(thread)d %(message)s'
"handlers": {
"mail_admins": {
"level": "ERROR",
"filters": ["require_debug_false"],
"class": "django.utils.log.AdminEmailHandler",
},
"console": {
"level": "DEBUG",
"class": "logging.StreamHandler",
"formatter": "verbose",
},
},
'handlers': {
'mail_admins': {
'level': 'ERROR',
'filters': ['require_debug_false'],
'class': 'django.utils.log.AdminEmailHandler'
"root": {"level": "INFO", "handlers": ["console"]},
"loggers": {
"django.request": {
"handlers": ["mail_admins"],
"level": "ERROR",
"propagate": True,
},
'console': {
'level': 'DEBUG',
'class': 'logging.StreamHandler',
'formatter': 'verbose',
"django.security.DisallowedHost": {
"level": "ERROR",
"handlers": ["console", "mail_admins"],
"propagate": True,
},
},
'loggers': {
'django.request': {
'handlers': ['mail_admins'],
'level': 'ERROR',
'propagate': True
},
'django.security.DisallowedHost': {
'level': 'ERROR',
'handlers': ['console', 'mail_admins'],
'propagate': True
}
{% else %}
LOGGING = {
"version": 1,
"disable_existing_loggers": True,
"formatters": {
"verbose": {
"format": "%(levelname)s %(asctime)s %(module)s "
"%(process)d %(thread)d %(message)s"
}
},
"handlers": {
"console": {
"level": "DEBUG",
"class": "logging.StreamHandler",
"formatter": "verbose",
}
},
"root": {"level": "INFO", "handlers": ["console"]},
"loggers": {
"django.db.backends": {
"level": "ERROR",
"handlers": ["console"],
"propagate": False,
},
# Errors logged by the SDK itself
"sentry_sdk": {"level": "ERROR", "handlers": ["console"], "propagate": False},
"django.security.DisallowedHost": {
"level": "ERROR",
"handlers": ["console"],
"propagate": False,
},
},
}
# Sentry
# ------------------------------------------------------------------------------
SENTRY_DSN = env("SENTRY_DSN")
SENTRY_LOG_LEVEL = env.int("DJANGO_SENTRY_LOG_LEVEL", logging.INFO)
sentry_logging = LoggingIntegration(
level=SENTRY_LOG_LEVEL, # Capture info and above as breadcrumbs
event_level=logging.ERROR, # Send errors as events
)
{%- if cookiecutter.use_celery == 'y' %}
sentry_sdk.init(
dsn=SENTRY_DSN,
integrations=[sentry_logging, DjangoIntegration(), CeleryIntegration()],
)
{% else %}
sentry_sdk.init(dsn=SENTRY_DSN, integrations=[sentry_logging, DjangoIntegration()])
{% endif -%}
{% endif %}
# Your stuff...
# ------------------------------------------------------------------------------
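A hedged sketch of what the Sentry logging wiring above does at runtime (logger name and messages are made up): records at or above SENTRY_LOG_LEVEL are kept as breadcrumbs, records at ERROR and above become Sentry events, and unhandled exceptions in views (and Celery tasks, when enabled) are reported automatically.
import logging

logger = logging.getLogger(__name__)

logger.info("Stored as a breadcrumb, attached to the next event")
logger.error("Sent to Sentry as its own event")

try:
    1 / 0
except ZeroDivisionError:
    logger.exception("Also sent as an event, with the traceback attached")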


@ -7,10 +7,11 @@ from .base import env
# GENERAL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = False
# https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
SECRET_KEY = env("DJANGO_SECRET_KEY", default="!!!SET DJANGO_SECRET_KEY!!!")
SECRET_KEY = env(
"DJANGO_SECRET_KEY",
default="!!!SET DJANGO_SECRET_KEY!!!",
)
# https://docs.djangoproject.com/en/dev/ref/settings/#test-runner
TEST_RUNNER = "django.test.runner.DiscoverRunner"
@ -19,7 +20,8 @@ TEST_RUNNER = "django.test.runner.DiscoverRunner"
# https://docs.djangoproject.com/en/dev/ref/settings/#caches
CACHES = {
"default": {
"BACKEND": "django.core.cache.backends.locmem.LocMemCache", "LOCATION": ""
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
"LOCATION": "",
}
}
@ -30,9 +32,7 @@ PASSWORD_HASHERS = ["django.contrib.auth.hashers.MD5PasswordHasher"]
# TEMPLATES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#templates
TEMPLATES[0]["OPTIONS"]["debug"] = DEBUG # noqa F405
TEMPLATES[0]["OPTIONS"]["loaders"] = [ # noqa F405
TEMPLATES[-1]["OPTIONS"]["loaders"] = [ # type: ignore[index] # noqa F405
(
"django.template.loaders.cached.Loader",
[
@ -46,10 +46,6 @@ TEMPLATES[0]["OPTIONS"]["loaders"] = [ # noqa F405
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = "django.core.mail.backends.locmem.EmailBackend"
# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
EMAIL_HOST = "localhost"
# https://docs.djangoproject.com/en/dev/ref/settings/#email-port
EMAIL_PORT = 1025
# Your stuff...
# ------------------------------------------------------------------------------


@ -4,26 +4,31 @@ from django.conf.urls.static import static
from django.contrib import admin
from django.views.generic import TemplateView
from django.views import defaults as default_views
{% if cookiecutter.use_drf == 'y' -%}
from rest_framework.authtoken.views import obtain_auth_token
{%- endif %}
urlpatterns = [
path("", TemplateView.as_view(template_name="pages/home.html"), name="home"),
path(
"about/",
TemplateView.as_view(template_name="pages/about.html"),
name="about",
"about/", TemplateView.as_view(template_name="pages/about.html"), name="about"
),
# Django Admin, use {% raw %}{% url 'admin:index' %}{% endraw %}
path(settings.ADMIN_URL, admin.site.urls),
# User management
path(
"users/",
include("{{ cookiecutter.project_slug }}.users.urls", namespace="users"),
),
path("users/", include("{{ cookiecutter.project_slug }}.users.urls", namespace="users")),
path("accounts/", include("allauth.urls")),
# Your stuff: custom urls includes go here
] + static(
settings.MEDIA_URL, document_root=settings.MEDIA_ROOT
)
] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
{% if cookiecutter.use_drf == 'y' -%}
# API URLS
urlpatterns += [
# API base url
path("api/", include("config.api_router")),
# DRF auth token
path("auth-token/", obtain_auth_token),
]
{%- endif %}
if settings.DEBUG:
# This allows the error pages to be debugged during development, just visit


@ -20,13 +20,10 @@ from django.core.wsgi import get_wsgi_application
# This allows easy placement of apps within the interior
# {{ cookiecutter.project_slug }} directory.
app_path = os.path.abspath(os.path.join(
os.path.dirname(os.path.abspath(__file__)), os.pardir))
sys.path.append(os.path.join(app_path, '{{ cookiecutter.project_slug }}'))
{% if cookiecutter.use_sentry == 'y' -%}
if os.environ.get('DJANGO_SETTINGS_MODULE') == 'config.settings.production':
from raven.contrib.django.raven_compat.middleware.wsgi import Sentry
{%- endif %}
app_path = os.path.abspath(
os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir)
)
sys.path.append(os.path.join(app_path, "{{ cookiecutter.project_slug }}"))
# We defer to a DJANGO_SETTINGS_MODULE already in the environment. This breaks
# if running multiple sites in the same mod_wsgi process. To fix this, use
# mod_wsgi daemon mode with each site in its own daemon process, or use
@ -37,10 +34,6 @@ os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.production")
# file. This includes Django's development server, if the WSGI_APPLICATION
# setting points here.
application = get_wsgi_application()
{% if cookiecutter.use_sentry == 'y' -%}
if os.environ.get('DJANGO_SETTINGS_MODULE') == 'config.settings.production':
application = Sentry(application)
{%- endif %}
# Apply WSGI middleware here.
# from helloworld.wsgi import HelloWorldApplication
# application = HelloWorldApplication(application)
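For readers unfamiliar with the pattern hinted at in the comments above, a minimal sketch of wrapping the WSGI callable; the middleware name and the /ping path are invented for illustration, not part of the template.
class HealthCheckMiddleware:
    """Answer /ping before the request ever reaches Django."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        if environ.get("PATH_INFO") == "/ping":
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"pong"]
        return self.app(environ, start_response)

# application = HealthCheckMiddleware(application)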


@ -1,153 +1,20 @@
# Makefile for Sphinx documentation
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
# Put it first so that "make" without argument is like "make help".
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
clean:
-rm -rf $(BUILDDIR)/*
.PHONY: help Makefile
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)


@ -1,255 +1,55 @@
# {{ cookiecutter.project_name }} documentation build configuration file, created by
# sphinx-quickstart.
# Configuration file for the Sphinx documentation builder.
#
# This file is execfile()d with the current directory set to its containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
import os
import sys
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
# sys.path.insert(0, os.path.abspath('.'))
#
import os
import sys
# -- General configuration -----------------------------------------------------
# import django
# sys.path.insert(0, os.path.abspath('..'))
# os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
# django.setup()
# If your documentation needs a minimal Sphinx version, state it here.
# needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
# -- Project information -----------------------------------------------------
project = "{{ cookiecutter.project_name }}"
copyright = """{% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}"""
author = "{{ cookiecutter.author_name }}"
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = []
# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
# The suffix of source filenames.
source_suffix = ".rst"
# The encoding of source files.
# source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = "index"
# General information about the project.
project = "{{ cookiecutter.project_name }}"
copyright = """{% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}"""
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = "0.1"
# The full version, including alpha/beta/rc tags.
release = "0.1"
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
# language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
# today = ''
# Else, today_fmt is used as the format for a strftime call.
# today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ["_build"]
# The reST default role (used for this markup: `text`) to use for all documents.
# default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
# add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
# add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "sphinx"
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
# -- Options for HTML output ---------------------------------------------------
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = "default"
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
# html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
# html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
# html_favicon = None
#
html_theme = "alabaster"
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
# html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
# html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {}
# If false, no module index is generated.
# html_domain_indices = True
# If false, no index is generated.
# html_use_index = True
# If true, the index is split into individual pages for each letter.
# html_split_index = False
# If true, links to the reST sources are added to the pages.
# html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
# html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = "{{ cookiecutter.project_slug }}doc"
# -- Options for LaTeX output --------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
# 'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
(
"index",
"{{ cookiecutter.project_slug }}.tex",
"{{ cookiecutter.project_name }} Documentation",
"""{{ cookiecutter.author_name }}""",
"manual",
)
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
# latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
# latex_use_parts = False
# If true, show page references after internal links.
# latex_show_pagerefs = False
# If true, show URL addresses after external links.
# latex_show_urls = False
# Documents to append as an appendix to all manuals.
# latex_appendices = []
# If false, no module index is generated.
# latex_domain_indices = True
# -- Options for manual page output --------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(
"index",
"{{ cookiecutter.project_slug }}",
"{{ cookiecutter.project_name }} Documentation",
["""{{ cookiecutter.author_name }}"""],
1,
)
]
# If true, show URL addresses after external links.
# man_show_urls = False
# -- Options for Texinfo output ------------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(
"index",
"{{ cookiecutter.project_slug }}",
"{{ cookiecutter.project_name }} Documentation",
"""{{ cookiecutter.author_name }}""",
"{{ cookiecutter.project_name }}",
"""{{ cookiecutter.description }}""",
"Miscellaneous",
)
]
# Documents to append as an appendix to all manuals.
# texinfo_appendices = []
# If false, no module index is generated.
# texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
# texinfo_show_urls = 'footnote'
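If API documentation is wanted later, the commented hints near the top of this conf.py can be activated; a hedged sketch, assuming sphinx.ext.autodoc and the local settings module (`os` and `sys` are already imported above):
import django

sys.path.insert(0, os.path.abspath(".."))
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
django.setup()

extensions = ["sphinx.ext.autodoc"]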


@ -3,17 +3,19 @@
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
{{ cookiecutter.project_name }} Project Documentation
====================================================================
Table of Contents:
Welcome to {{ cookiecutter.project_name }}'s documentation!
======================================================================
.. toctree::
:maxdepth: 2
:caption: Contents:
pycharm/configuration
Indices & Tables
================
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`


@ -1,190 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.ghc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd


@ -14,7 +14,7 @@ This repository comes with already prepared "Run/Debug Configurations" for docke
.. image:: images/2.png
But as you can see, at the beggining there is something wrong with them. They have red X on django icon, and they cannot be used, without configuring remote python interpteter. To do that, you have to go to *Settings > Build, Execution, Deployment* first.
But as you can see, at the beginning there is something wrong with them. They have a red X on the Django icon and cannot be used without configuring a remote Python interpreter. To do that, you have to go to *Settings > Build, Execution, Deployment* first.
Next, you have to add a new remote Python interpreter, based on the already tested deployment settings. Go to *Settings > Project > Project Interpreter*. Click on the cog icon, and click *Add Remote*.
@ -36,12 +36,18 @@ After few seconds, all *Run/Debug Configurations* should be ready to use.
**Things you can do with provided configuration**:
* run and debug python code
.. image:: images/f1.png
* run and debug tests
.. image:: images/f2.png
.. image:: images/f3.png
* run and debug migrations or different django management commands
.. image:: images/f4.png
* and many others..
Known issues


@ -127,7 +127,23 @@ function initBrowserSync() {
`${paths.js}/*.js`,
`${paths.templates}/*.html`
], {
proxy: "localhost:8000"
// https://www.browsersync.io/docs/options/#option-proxy
{%- if cookiecutter.use_docker == 'n' %}
proxy: 'localhost:8000'
{% else %}
proxy: {
target: 'django:8000',
proxyReq: [
function(proxyReq, req) {
// Assign proxy "host" header same as current request at Browsersync server
proxyReq.setHeader('Host', req.headers.host)
}
]
},
// https://www.browsersync.io/docs/options/#option-open
// Disable as it doesn't work from inside a container
open: false
{%- endif %}
}
)
}
@ -149,7 +165,9 @@ const generateAssets = parallel(
// Set up dev environment
const dev = parallel(
{%- if cookiecutter.use_docker == 'n' %}
runServer,
{%- endif %}
initBrowserSync,
watchPaths
)


@ -45,7 +45,7 @@ services:
{%- if cookiecutter.use_celery == 'y' %}
redis:
image: redis:3.2
image: redis:5.0
celeryworker:
<<: *django
@ -79,3 +79,23 @@ services:
command: /start-flower
{%- endif %}
{%- if cookiecutter.js_task_runner == 'Gulp' %}
node:
build:
context: .
dockerfile: ./compose/local/node/Dockerfile
image: {{ cookiecutter.project_slug }}_local_node
depends_on:
- django
volumes:
- .:/app
# http://jdlm.info/articles/2016/03/06/lessons-building-node-app-docker.html
- /app/node_modules
command: npm run dev
ports:
- "3000:3000"
# Expose browsersync UI: https://www.browsersync.io/docs/options/#option-ui
- "3001:3001"
{%- endif %}


@ -8,7 +8,6 @@ PRODUCTION_DOTENVS_DIR_PATH = os.path.join(ROOT_DIR_PATH, ".envs", ".production"
PRODUCTION_DOTENV_FILE_PATHS = [
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, ".django"),
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, ".postgres"),
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, ".caddy"),
]
DOTENV_FILE_PATH = os.path.join(ROOT_DIR_PATH, ".env")
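These paths presumably feed a merge step that concatenates them into DOTENV_FILE_PATH; a hedged sketch of what such a merge amounts to, not necessarily the project's exact implementation:
def merge(output_path, merge_paths):
    """Concatenate each production dotenv file into a single .env file."""
    with open(output_path, "w") as output:
        for path in merge_paths:
            with open(path) as source:
                output.write(source.read())
                output.write("\n")

# merge(DOTENV_FILE_PATH, PRODUCTION_DOTENV_FILE_PATHS)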


@ -5,7 +5,7 @@
"devDependencies": {
{% if cookiecutter.js_task_runner == 'Gulp' -%}
{% if cookiecutter.custom_bootstrap_compilation == 'y' -%}
"bootstrap": "4.1.1",
"bootstrap": "4.3.1",
"gulp-concat": "^2.6.1",
"jquery": "3.3.1",
"popper.js": "1.14.3",
@ -31,7 +31,8 @@
],
"scripts": {
{% if cookiecutter.js_task_runner == 'Gulp' -%}
"dev": "gulp"
"dev": "gulp",
"build": "gulp generate-assets"
{%- endif %}
}
}


@ -3,7 +3,7 @@ version: '3'
volumes:
production_postgres_data: {}
production_postgres_data_backups: {}
production_caddy: {}
production_traefik: {}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
@ -30,23 +30,21 @@ services:
env_file:
- ./.envs/.production/.postgres
caddy:
traefik:
build:
context: .
dockerfile: ./compose/production/caddy/Dockerfile
image: {{ cookiecutter.project_slug }}_production_caddy
dockerfile: ./compose/production/traefik/Dockerfile
image: {{ cookiecutter.project_slug }}_production_traefik
depends_on:
- django
volumes:
- production_caddy:/root/.caddy
env_file:
- ./.envs/.production/.caddy
- production_traefik:/etc/traefik/acme
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
redis:
image: redis:3.2
image: redis:5.0
{%- if cookiecutter.use_celery == 'y' %}
celeryworker:
@ -67,3 +65,14 @@ services:
command: /start-flower
{%- endif %}
{% if cookiecutter.cloud_provider == 'AWS' %}
awscli:
build:
context: .
dockerfile: ./compose/production/aws/Dockerfile
env_file:
- ./.envs/.production/.django
volumes:
- production_postgres_data_backups:/backups
{%- endif %}


@ -1,5 +1,6 @@
[pytest]
DJANGO_SETTINGS_MODULE=config.settings.test
addopts = --ds=config.settings.test --reuse-db
python_files = tests.py test_*.py
{%- if cookiecutter.js_task_runner != 'None' %}
norecursedirs = node_modules
{%- endif %}


@ -1,33 +1,34 @@
pytz==2018.9 # https://github.com/stub42/pytz
python-slugify==2.0.1 # https://github.com/un33k/python-slugify
Pillow==5.4.1 # https://github.com/python-pillow/Pillow
pytz==2019.3 # https://github.com/stub42/pytz
python-slugify==4.0.0 # https://github.com/un33k/python-slugify
Pillow==7.0.0 # https://github.com/python-pillow/Pillow
{%- if cookiecutter.use_compressor == "y" %}
rcssmin==1.0.6{% if cookiecutter.windows == 'y' %} --install-option="--without-c-extensions"{% endif %} # https://github.com/ndparker/rcssmin
rcssmin==1.0.6{% if cookiecutter.windows == 'y' and cookiecutter.use_docker == 'n' %} --install-option="--without-c-extensions"{% endif %} # https://github.com/ndparker/rcssmin
{%- endif %}
argon2-cffi==19.1.0 # https://github.com/hynek/argon2_cffi
argon2-cffi==19.2.0 # https://github.com/hynek/argon2_cffi
{%- if cookiecutter.use_whitenoise == 'y' %}
whitenoise==4.1.2 # https://github.com/evansd/whitenoise
whitenoise==5.0.1 # https://github.com/evansd/whitenoise
{%- endif %}
redis>=2.10.6, < 3 # pyup: < 3 # https://github.com/antirez/redis
redis==3.3.11 # pyup: != 3.4.0 # https://github.com/andymccurdy/redis-py
{%- if cookiecutter.use_celery == "y" %}
celery==4.2.1 # pyup: < 5.0 # https://github.com/celery/celery
celery==4.4.0 # pyup: < 5.0 # https://github.com/celery/celery
django-celery-beat==1.6.0 # https://github.com/celery/django-celery-beat
{%- if cookiecutter.use_docker == 'y' %}
flower==0.9.2 # https://github.com/mher/flower
flower==0.9.3 # https://github.com/mher/flower
{%- endif %}
{%- endif %}
# Django
# ------------------------------------------------------------------------------
django==2.0.13 # pyup: < 2.1 # https://www.djangoproject.com/
django==2.2.9 # pyup: < 3.0 # https://www.djangoproject.com/
django-environ==0.4.5 # https://github.com/joke2k/django-environ
django-model-utils==3.1.2 # https://github.com/jazzband/django-model-utils
django-allauth==0.38.0 # https://github.com/pennersr/django-allauth
django-crispy-forms==1.7.2 # https://github.com/django-crispy-forms/django-crispy-forms
django-model-utils==4.0.0 # https://github.com/jazzband/django-model-utils
django-allauth==0.41.0 # https://github.com/pennersr/django-allauth
django-crispy-forms==1.8.1 # https://github.com/django-crispy-forms/django-crispy-forms
{%- if cookiecutter.use_compressor == "y" %}
django-compressor==2.2 # https://github.com/django-compressor/django-compressor
django-compressor==2.4 # https://github.com/django-compressor/django-compressor
{%- endif %}
django-redis==4.10.0 # https://github.com/niwinz/django-redis
django-redis==4.11.0 # https://github.com/niwinz/django-redis
# Django REST Framework
djangorestframework==3.9.1 # https://github.com/encode/django-rest-framework
djangorestframework==3.11.0 # https://github.com/encode/django-rest-framework
coreapi==2.3.3 # https://github.com/core-api/python-client


@ -1,30 +1,37 @@
-r ./base.txt
Werkzeug==0.14.1 # https://github.com/pallets/werkzeug
ipdb==0.11 # https://github.com/gotcha/ipdb
Sphinx==1.8.4 # https://github.com/sphinx-doc/sphinx
Werkzeug==0.16.1 # https://github.com/pallets/werkzeug
ipdb==0.12.3 # https://github.com/gotcha/ipdb
Sphinx==2.3.1 # https://github.com/sphinx-doc/sphinx
{%- if cookiecutter.use_docker == 'y' %}
psycopg2==2.7.4 --no-binary psycopg2 # https://github.com/psycopg/psycopg2
psycopg2==2.8.4 --no-binary psycopg2 # https://github.com/psycopg/psycopg2
{%- else %}
psycopg2-binary==2.7.7 # https://github.com/psycopg/psycopg2
psycopg2-binary==2.8.4 # https://github.com/psycopg/psycopg2
{%- endif %}
# Testing
# ------------------------------------------------------------------------------
mypy==0.670 # https://github.com/python/mypy
pytest==4.2.0 # https://github.com/pytest-dev/pytest
mypy==0.761 # https://github.com/python/mypy
django-stubs==1.4.0 # https://github.com/typeddjango/django-stubs
pytest==5.3.5 # https://github.com/pytest-dev/pytest
pytest-sugar==0.9.2 # https://github.com/Frozenball/pytest-sugar
# Code quality
# ------------------------------------------------------------------------------
flake8==3.7.5 # https://github.com/PyCQA/flake8
coverage==4.5.2 # https://github.com/nedbat/coveragepy
flake8==3.7.9 # https://github.com/PyCQA/flake8
coverage==5.0.3 # https://github.com/nedbat/coveragepy
black==19.10b0 # https://github.com/ambv/black
pylint-django==2.0.13 # https://github.com/PyCQA/pylint-django
{%- if cookiecutter.use_celery == 'y' %}
pylint-celery==0.3 # https://github.com/PyCQA/pylint-celery
{%- endif %}
pre-commit==2.0.1 # https://github.com/pre-commit/pre-commit
# Django
# ------------------------------------------------------------------------------
factory-boy==2.11.1 # https://github.com/FactoryBoy/factory_boy
factory-boy==2.12.0 # https://github.com/FactoryBoy/factory_boy
django-debug-toolbar==1.11 # https://github.com/jazzband/django-debug-toolbar
django-extensions==2.1.5 # https://github.com/django-extensions/django-extensions
django-coverage-plugin==1.6.0 # https://github.com/nedbat/django_coverage_plugin
pytest-django==3.4.7 # https://github.com/pytest-dev/pytest-django
django-debug-toolbar==2.2 # https://github.com/jazzband/django-debug-toolbar
django-extensions==2.2.6 # https://github.com/django-extensions/django-extensions
django-coverage-plugin==1.8.0 # https://github.com/nedbat/django_coverage_plugin
pytest-django==3.8.0 # https://github.com/pytest-dev/pytest-django


@ -2,16 +2,20 @@
-r ./base.txt
gunicorn==19.9.0 # https://github.com/benoitc/gunicorn
psycopg2==2.7.4 --no-binary psycopg2 # https://github.com/psycopg/psycopg2
gunicorn==20.0.4 # https://github.com/benoitc/gunicorn
psycopg2==2.8.4 --no-binary psycopg2 # https://github.com/psycopg/psycopg2
{%- if cookiecutter.use_whitenoise == 'n' %}
Collectfast==0.6.2 # https://github.com/antonagestam/collectfast
Collectfast==1.3.1 # https://github.com/antonagestam/collectfast
{%- endif %}
{%- if cookiecutter.use_sentry == "y" %}
raven==6.10.0 # https://github.com/getsentry/raven-python
sentry-sdk==0.14.1 # https://github.com/getsentry/sentry-python
{%- endif %}
# Django
# ------------------------------------------------------------------------------
django-storages[boto3]==1.7.1 # https://github.com/jschneier/django-storages
django-anymail[mailgun]==5.0 # https://github.com/anymail/django-anymail
{%- if cookiecutter.cloud_provider == 'AWS' %}
django-storages[boto3]==1.8 # https://github.com/jschneier/django-storages
{%- elif cookiecutter.cloud_provider == 'GCP' %}
django-storages[google]==1.8 # https://github.com/jschneier/django-storages
{%- endif %}
django-anymail[mailgun]==7.0.0 # https://github.com/anymail/django-anymail


@ -1 +1 @@
python-3.6.8
python-3.7.6


@ -7,14 +7,16 @@ max-line-length = 120
exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
[mypy]
python_version = 3.6
python_version = 3.7
check_untyped_defs = True
ignore_errors = False
ignore_missing_imports = True
strict_optional = True
warn_unused_ignores = True
warn_redundant_casts = True
warn_unused_configs = True
plugins = mypy_django_plugin.main
[mypy.plugins.django-stubs]
django_settings_module = config.settings.test
[mypy-*.migrations.*]
# Django migrations should not produce any errors:
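With the django-stubs plugin configured above, mypy resolves Django ORM types against config.settings.test; a small hedged example of what that enables (the helper function is illustrative, not part of the template):
from {{ cookiecutter.project_slug }}.users.models import User

def deactivate(username: str) -> User:
    user = User.objects.get(username=username)  # inferred as User, not Any
    user.is_active = False
    user.save()
    return user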


@ -0,0 +1,23 @@
##basic build dependencies of various Django apps for Ubuntu Bionic 18.04
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg8-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
libgraphviz-dev


@ -0,0 +1,23 @@
##basic build dependencies of various Django apps for Debian Buster 10.x
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg62-turbo-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
libgraphviz-dev


@ -1,7 +1,7 @@
import pytest
from django.conf import settings
from django.test import RequestFactory
from {{ cookiecutter.project_slug }}.users.models import User
from {{ cookiecutter.project_slug }}.users.tests.factories import UserFactory
@ -11,7 +11,7 @@ def media_storage(settings, tmpdir):
@pytest.fixture
def user() -> settings.AUTH_USER_MODEL:
def user() -> User:
return UserFactory()
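Because the fixture above now returns a concrete User, tests can consume it directly; a hedged example (the test itself is illustrative, not part of the template):
import pytest

pytestmark = pytest.mark.django_db


def test_user_fixture_is_persisted(user):
    # UserFactory saves the instance, so the fixture yields a real row.
    assert user.pk is not None
    assert user.username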


@ -1,58 +0,0 @@
{% if cookiecutter.use_celery == 'y' %}
import os
from celery import Celery
from django.apps import apps, AppConfig
from django.conf import settings
if not settings.configured:
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local') # pragma: no cover
app = Celery('{{cookiecutter.project_slug}}')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
class CeleryAppConfig(AppConfig):
name = '{{cookiecutter.project_slug}}.taskapp'
verbose_name = 'Celery Config'
def ready(self):
installed_apps = [app_config.name for app_config in apps.get_app_configs()]
app.autodiscover_tasks(lambda: installed_apps, force=True)
{% if cookiecutter.use_sentry == 'y' -%}
if hasattr(settings, 'RAVEN_CONFIG'):
# Celery signal registration
{% if cookiecutter.use_pycharm == 'y' -%}
# Since raven is required in production only,
# imports might (most surely will) be wiped out
# during PyCharm code clean up started
# in other environments.
# @formatter:off
{%- endif %}
from raven import Client as RavenClient
from raven.contrib.celery import register_signal as raven_register_signal
from raven.contrib.celery import register_logger_signal as raven_register_logger_signal
{% if cookiecutter.use_pycharm == 'y' -%}
# @formatter:on
{%- endif %}
raven_client = RavenClient(dsn=settings.RAVEN_CONFIG['dsn'])
raven_register_logger_signal(raven_client)
raven_register_signal(raven_client)
{%- endif %}
@app.task(bind=True)
def debug_task(self):
print(f'Request: {self.request!r}') # pragma: no cover
{% else %}
# Use this as a starting point for your project with celery.
# If you are not using celery, you can remove this app
{% endif -%}
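With this taskapp bootstrap removed, an equivalent standalone Celery module looks roughly like the sketch below; the module location and default settings module are assumptions, not taken from this diff.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")

app = Celery("{{cookiecutter.project_slug}}")

# Read CELERY_*-prefixed keys from Django settings and discover tasks.py
# modules in installed apps.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()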


@ -15,7 +15,7 @@
<form method="POST" action=".">
{% csrf_token %}
{{ form|crispy }}
<input type="submit" name="action" value="{% trans 'change password' %}"/>
<input class="btn btn-primary" type="submit" name="action" value="{% trans 'change password' %}"/>
</form>
{% else %}
<p>{% trans 'Your password is now changed.' %}</p>


@ -11,7 +11,7 @@
<form method="POST" action="{% url 'account_set_password' %}" class="password_set">
{% csrf_token %}
{{ form|crispy }}
<input type="submit" name="action" value="{% trans 'Set Password' %}"/>
<input class="btn btn-primary" type="submit" name="action" value="{% trans 'Set Password' %}"/>
</form>
{% endblock %}
{% endraw %}


@ -17,8 +17,8 @@
{% block css %}
{% endraw %}{% if cookiecutter.custom_bootstrap_compilation == "n" %}{% raw %}
<!-- Latest compiled and minified Bootstrap 4.1.1 CSS -->
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.1.1/css/bootstrap.min.css" integrity="sha384-WskhaSGFgHYWDcbwN70/dfYBj47jz9qbsMId/iRN3ewGhXQFZCSftd1LZCfmhktB" crossorigin="anonymous">
<!-- Latest compiled and minified Bootstrap CSS -->
<link href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous">
{% endraw %}{% endif %}{% raw %}
<!-- Your stuff: Third-party CSS libraries go here -->
@ -80,7 +80,7 @@
{% if messages %}
{% for message in messages %}
<div class="alert {% if message.tags %}alert-{{ message.tags }}{% endif %}">{{ message }}</div>
<div class="alert {% if message.tags %}alert-{{ message.tags }}{% endif %}">{{ message }}<button type="button" class="close" data-dismiss="alert" aria-label="Close"><span aria-hidden="true">&times;</span></button></div>
{% endfor %}
{% endif %}
@ -102,10 +102,10 @@
<script src="{% static 'js/vendors.js' %}"></script>
{% endraw %}{% if cookiecutter.use_compressor == "y" %}{% raw %}{% endcompress %}{% endraw %}{% endif %}{% raw %}
{% endraw %}{% else %}{% raw %}
<!-- Required by Bootstrap v4.1.1 -->
<!-- Bootstrap JS and its dependencies-->
<script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.3/umd/popper.min.js" integrity="sha384-ZMP7rVo3mIykV+2+9J3UJ46jBk0WLaUAdn689aCwoqbBJiSnjAK/l8WvCWPIPm49" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.1.1/js/bootstrap.min.js" integrity="sha384-smHYKdLADwkXOn1EmN1qk/HfnUcbVRZyYmZ4qpPea6sjB/pTJ0euyQp0Mk8ck+5T" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/js/bootstrap.min.js" integrity="sha384-JjSmVgyd0p3pXB1rRibZUAYoIIy6OrQ6VrjIEaFf/nJGzIxFDsf4x0xIM+B07jRM" crossorigin="anonymous"></script>
<!-- Your stuff: Third-party javascript libraries go here -->
{% endraw %}{% endif %}{% raw %}


@ -10,7 +10,7 @@
{{ form|crispy }}
<div class="control-group">
<div class="controls">
<button type="submit" class="btn">Update</button>
<button type="submit" class="btn btn-primary">Update</button>
</div>
</div>
</form>


@ -1,17 +0,0 @@
{% raw %}{% extends "base.html" %}
{% load static i18n %}
{% block title %}Members{% endblock %}
{% block content %}
<div class="container">
<h2>Users</h2>
<div class="list-group">
{% for user in user_list %}
<a href="{% url 'users:detail' user.username %}" class="list-group-item">
<h4 class="list-group-item-heading">{{ user.username }}</h4>
</a>
{% endfor %}
</div>
</div>
{% endblock content %}{% endraw %}


@ -7,12 +7,10 @@ from django.http import HttpRequest
class AccountAdapter(DefaultAccountAdapter):
def is_open_for_signup(self, request: HttpRequest):
return getattr(settings, "ACCOUNT_ALLOW_REGISTRATION", True)
class SocialAccountAdapter(DefaultSocialAccountAdapter):
def is_open_for_signup(self, request: HttpRequest, sociallogin: Any):
return getattr(settings, "ACCOUNT_ALLOW_REGISTRATION", True)
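These adapters only take effect when allauth is pointed at them; a hedged sketch of the settings wiring involved (the setting names are allauth's own, the env-driven flag and module path are assumptions):
import environ

env = environ.Env()

ACCOUNT_ALLOW_REGISTRATION = env.bool("DJANGO_ACCOUNT_ALLOW_REGISTRATION", default=True)
ACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.AccountAdapter"
SOCIALACCOUNT_ADAPTER = "{{cookiecutter.project_slug}}.users.adapters.SocialAccountAdapter"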


@ -0,0 +1,12 @@
from rest_framework import serializers
from {{ cookiecutter.project_slug }}.users.models import User
class UserSerializer(serializers.ModelSerializer):
class Meta:
model = User
fields = ["username", "email", "name", "url"]
extra_kwargs = {
"url": {"view_name": "api:user-detail", "lookup_field": "username"}
}
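The view_name "api:user-detail" above implies a router-registered viewset in an "api"-namespaced URLconf; a hedged sketch of that wiring, where only the serializer comes from this diff and the viewset, router module, and import paths are illustrative assumptions:
from rest_framework import mixins, viewsets
from rest_framework.routers import DefaultRouter

from {{ cookiecutter.project_slug }}.users.api.serializers import UserSerializer
from {{ cookiecutter.project_slug }}.users.models import User


class UserViewSet(mixins.RetrieveModelMixin, mixins.ListModelMixin, viewsets.GenericViewSet):
    serializer_class = UserSerializer
    queryset = User.objects.all()
    lookup_field = "username"


router = DefaultRouter()
router.register("users", UserViewSet)

app_name = "api"
urlpatterns = router.urls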

Some files were not shown because too many files have changed in this diff.