Burhan Khalid 2016-10-27 14:01:16 +03:00
commit add8c9b9d0
232 changed files with 5024 additions and 2043 deletions

View File

@ -21,3 +21,7 @@ trim_trailing_whitespace = false
[Makefile]
indent_style = tab
[nginx.conf]
indent_style = space
indent_size = 2

.gitignore vendored
View File

@ -25,15 +25,19 @@ sftp-config.json
*.pyc
.idea
_build
*.egg-info/
# Project Specific Stuff
local_settings.py
repo_name
project_slug
my_test_project/*
# Generated when running py.test for the cookiecutter-django generation tests
# Generated when running py.test for the Cookiecutter Django generation tests
.cache/
# Generated when running celery beat
celerybeat-schedule.db
# Unit test / coverage reports
.coverage
.tox

View File

@ -1,14 +1,35 @@
# Config file for automatic testing at travis-ci.org
sudo: false
sudo: required
services:
- docker
language: python
python: 3.5
env:
- TOX_ENV=py27
- TOX_ENV=py34
- TOX_ENV=py35
script: tox -e $TOX_ENV
before_install:
- sudo sh -c 'echo "deb https://apt.dockerproject.org/repo ubuntu-precise main" > /etc/apt/sources.list.d/docker.list'
- sudo apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
- sudo apt-get update
- sudo apt-key update
- sudo apt-get --force-yes -qqy -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install docker-engine=1.11.1-0~precise
- sudo rm /usr/local/bin/docker-compose
- curl -L https://github.com/docker/compose/releases/download/1.7.0/docker-compose-`uname -s`-`uname -m` > docker-compose
- chmod +x docker-compose
- sudo mv docker-compose /usr/local/bin
- docker-compose -v
- docker -v
script:
- tox -e $TOX_ENV
- sh tests/test_docker.sh
install:
- pip install tox

View File

@ -1,17 +1,358 @@
# Change Log
All enhancements and patches to cookiecutter-django will be documented in this file.
All enhancements and patches to Cookiecutter Django will be documented in this file.
This project adheres to [Semantic Versioning](http://semver.org/).
## [2016-1-16]
### Add explanation for having `django.contrib.sites`. (@pydanny)
## [2015-10-08]
### Changed
- Elastic Beanstalk: Added --noinput to migrate command (@MightySCollins)
## [2015-10-07]
### Added
- Finished first pass at Elastic Beanstalk docs (@pydanny & @audreyr)
### Deleted
- Removed Heroku instant deploy button (@pydanny)
## [2016-1-13]
## [2016-09-29]
### Added
- Added default `AUTH_PASSWORD_VALIDATORS` configuration, generated by Django 1.10 startproject. See the [Password Validation docs](https://docs.djangoproject.com/en/1.10/topics/auth/passwords/#module-django.contrib.auth.password_validation) (@luzfcb)
- Renamed `MIDDLEWARE_CLASSES` to `MIDDLEWARE` to enable support for the [new-style middleware](https://github.com/django/deps/blob/master/final/0005-improved-middleware.rst) introduced in Django 1.10 (@luzfcb)
- New setting `MAILGUN_SENDER_DOMAIN` to allow sending mail from any domain other than those registered with Mailgun (@jangeador)
- Added `urlpatterns` configuration for django-debug-toolbar, because the automatic configuration of `urlpatterns` was removed from django-debug-toolbar (@luzfcb)
- Added temporary workaround in `requirements/local.txt` to fix django-debug-toolbar issue: https://github.com/pydanny/cookiecutter-django/issues/827 (@luzfcb)
### Changed
- Upgrade to Django 1.10.1 (@luzfcb)
- Upgrade django-model-utils to 2.6, django-redis to 4.5.0, redis to 2.10.5, Sphinx to 1.4.6, pytest-django to 3.0.0, django-anymail to 0.5, raven to 5.27.1, whitenoise to 3.2.2 (@luzfcb)
- Upgrade to Bootstrap 4 Alpha 4, jQuery to 3.1.1, tether.js to 1.3.7 (@luzfcb)
- Update `manage.py` to use the same code as `manage.py` from Django 1.10 (@luzfcb)
- Sync `sites` app migrations with Django 1.10, and fix additional migrations to the `sites` and `user` apps (@luzfcb)
- Changed 'admin' url in `config/urls.py` to stay the same as generated by Django 1.10 (@luzfcb)
- Make test_docker.sh tests pass by satisfying the new password auth rules (@ssteinerx)
### Removed
- Removed django-autoslug because it does not support Django 1.10 at this time (@luzfcb)
## [2016-09-10]
### Changed
- Use app registry instead of INSTALLED_APPS to discover celery tasks (@dhepper)
- PEP8 imports fix (@aleprovencio)
### Removed
- Removed django-floppyforms (@pydanny)
## [2016-09-08]
### Removed
- Webpack support, see #774 (@ssteinerx)
## [2016-08-10]
### Added
- PostgreSQL versions are now selectable, instead of defaulting to 9.5; the minimum version is 9.2, which is supported by [Heroku](https://devcenter.heroku.com/articles/heroku-postgresql#version-support-and-legacy-infrastructure) and Django (@burhan)
- Fixed minor issue in the README.rst (@burhan)
## [2016-08-03]
### Changed
- Upgrade to Bootstrap 4 Alpha 3 and its dependencies, including jQuery (@audreyr)
## [2016-06-25]
### Changed
- Use `https` instead of `ssh` to clone [cookiecutter-webpack](https://github.com/hzdg/cookiecutter-webpack) if `Webpack` is selected as `JS Task Runner` - fixes issue #647 (@luzfcb and @resakse)
## [2016-06-24]
### Added
- Settings file for running tests faster (@audreyr)
- Add GPLv3 licence support (@cgaspoz)
### Changed
- Makes the database backups compressed; restores compressed backups (@jangeador)
- Review and edit django-allauth templates (@kappataumu)
## [2016-06-19]
### Added
- Webpack as an option (@goldhand)
## [2016-06-17]
### Added
- django-compressor support (@andresgz)
- Debian Jessie OS Requirements (@ddiazpinto)
## [2016-06-14]
### Changed
- Move Docker backups to their own section (@pydanny)
## [2016-06-13]
### Changed
- Use latest redis image in Docker (@pydanny)
- Documentation cleanup and corrections (@audreyr)
## [2016-06-12]
### Changed
- Documentation cleanup and corrections (@kappataumu)
## [2016-06-11]
### Changed
- Enhancements to the developing locally docs (@antoniablair)
## [2016-06-06]
### Changed
- Pin Bootstrap CSS and JS to v4.0.0-alpha.2, use minified versions
## [2016-06-05]
### Added
- Configurable admin for users (@pydanny, @jayfk, @dezoito)
## [2016-06-04]
### Added
- Let's Encrypt automation and instruction (@mjsisley and @chrisdev)
## [2016-06-03]
### Added
- Documentation for debugging with Docker (@mjsisley)
- Apache 2 License option in `cookiecutter.json` (@dot2dotseurat)
- Removed unnecessary version check from `pre_gen_project.py` (@suledev)
- Add gulp alternative as a js task runner and fix navbar style issue (@viviangb and @xpostudio4)
### Deleted
- AngularJS (@pydanny)
- django-secure (@xpostudio4)
## [2016-06-02]
### Added
- Added better instructions for installing postgres on Mac OS X (@dot2dotseurat)
## [2016-05-22]
### Added
- Added instructions for copying backups from docker to host (@phiberjenz)
- Added mailhog docker container (@noisy)
## [2016-05-15]
### Added
- Added GitLab continuous integration article to README.rst (@dezoito)
## [2016-05-13]
### Changed
- Update version of pyflakes to 1.2.3, django-extensions to 1.6.7 and gunicorn to 19.5.0 (@luzfcb)
- Update version of AngularJS to 1.5.5 (@luzfcb)
### Removed
- Remove Raven 404 catch middleware. Fix #367 (@pydanny)
## [2016-05-09]
### Changed
- Improved mailhog usage documentation on `developing-locally.rst` (@shireenrao)
- Replaced all `readthedocs.org` references to point to the new domain `readthedocs.io` (@luzfcb)
- Update version of pyflakes (@luzfcb)
## [2016-05-08]
### Changed
- Updated whitenoise configuration to match changes in version 3.0 (@trungdong)
## [2016-05-07]
### Added
- Added Ubuntu 16.04 dependencies on a new dependency file `requirements.apt.xenial` (@raonyguimaraes)
### Changed
- Small improvements in ``install_os_dependencies.sh`` to support the new dependency file (@raonyguimaraes)
## [2016-05-06]
### Changed
- Update version of pyflakes (@pydanny)
## [2016-05-03]
### Changed
- Update version of Django, django-extensions, django-mailgun (@luzfcb)
## [2016-05-01]
### Changed
- Restored the Pycharm project configuration files, that was accidentally removed in [15f350f](https://github.com/pydanny/cookiecutter-django/commit/15f350f05e2b49b4bdff0bdaa2b2ff260606e0f6) (@luzfcb @Newton715)
## [2016-04-30]
### Changed
- Small fixes to utility scripts (@scast)
## [2016-04-26]
### Added
- Instructions on how to install PythonAnywhere. (@hjwp)
## [2016-04-25]
### Added
- Check to confirm that the user has a modern version of Cookiecutter. (@pydanny)
### Removed
- Removed hitch per #529 (@pydanny)
## [2016-04-20]
### Changed
- Default to today's date in cookiecutter.json. (@audreyr)
- Change repo_name to project_slug for clarity. (@audreyr)
- Transform project name to lowercase for slug. (@audreyr)
## [2016-04-19]
### Added
- "Got Questions?" section in our README.rst. Yes, there is now a cookiecutter-django tag on Stack Overflow! (@pydanny)
### Changed
- Update usage instructions with new prompts, minor cleanup (@audreyr)
## [2016-04-18]
### Added
- Removed duplication of `depends_on` in docker-compose.yml (@noisy)
## [2016-04-17]
### Added
- "Built with Cookiecutter Django" badge to generated project README (@audreyr)
- New introductory article (@krzysztofzuraw)
### Changed
- Quote consistency, single quotes everywhere! (@blopker)
## [2016-04-15]
### Changed
- Major project generation cleanup (@jayfk)
### Removed
- Deleting unnecessary .idea dir from MAIN directory (@noisy)
## [2016-04-14]
### Added
- Added typecheck in .pylintrc to fix pylint-django's "no-member" error (@solvire)
### Changed
- Downgrading python-dateutil to version 2.4.2 because pykwalify==1.5.0 (required by HitchTest) uses a [pinned version of python-dateutil](https://github.com/Grokzen/pykwalify/blob/1.5.0/setup.py#L31) (@noisy)
- Update Pillow version to 3.2.0 (security fix) (@luzfcb)
## [2016-04-12]
### Changed
- celeryworker and celerybeat missing the correct dockerfile (@jayfk)
## [2016-04-08]
### Changed
- Move to named docker volumes (@jayfk)
## [2016-04-07]
### Changed
- Pycharm Support (including debugging in Docker) @noisy
- Set the correct License @epileptic-fish
## [2016-03-23]
### Changed
- Fixed issue on LICENSE file generation (@romanosipenko)
- In the install_python_dependencies.sh file, fixed a wrong reference to python3 when use_python2 was set to y (@luzfcb @noisy)
## [2016-03-16]
### Changed
- Set the correct postgres username in dev.yml (@calculuscowboy)
## [2016-03-14]
### Changed
- Enforce `repo_name` as proper python module (@catherinedevlin)
## [2016-03-08]
### Changed
- Docker configuration now uses docker-compose format v2 (@aeikenberry)
- Make sure that STATIC_URL != MEDIA_URL (@cdvv7788)
- fix minor typos in project README (@menzenski)
- Updated docker docs (@jayfk)
### Added
- Added database controls for docker (@jayfk)
## [2016-03-05]
### Changed
- Update version of Django, celery, django-test-plus (@luzfcb)
- Update version of Hitch tests dependencies: jupyter_client (@luzfcb)
- Update 'now' date in cookiecutter.json (@luzfcb)
- Update the usage example in README (@luzfcb)
## [2016-03-01]
### Changed
- Update version of Django, flake8, pyflakes, pytest, factory_boy, ipdb, Werkzeug, gevent (@luzfcb)
- Update version of Hitch tests dependencies: click, hitchserve, hitchsystem, hitchtest, ipython, psutil, python-dateutil(@luzfcb)
- Update Tether (JS) version to 1.2.0 (@luzfcb)
## [2016-02-24]
### Added
- Beginning support for `py.test` (@pydanny)
### Changed
- Fixed missing div closing tag for "container" on user_list.html (@Eraldo)
## [2016-02-18]
### Changed
- The status of the registration (open or closed) is now read from the project environment instead of hardcoded in the common settings file. (@Eraldo)
- Renamed the adapter.py file to adapters.py to match the django naming convention. (@Eraldo)
## [2016-02-15]
### Changed
- In `users` app adapter, fix `is_open_for_signup` missing parameter (@oryx2)
- Fixes and improvements in Hitch tests, see [#485](https://github.com/pydanny/cookiecutter-django/pull/485) (@crdoconnor)
## [2016-02-12]
### Changed
- Fixed typo (@yunti)
## [2016-02-07]
### Changed
- In `users` app, use Django 1.9 `LoginRequiredMixin` instead of django-braces implementation (@yunti)
- Update native OS libraries of Hitch Test, because [unixpackage](https://github.com/unixpackage/unixpackage) now supports multiple versions of same Linux distribution (@crdoconnor)
- Update AngularJS version to 1.5.0 (@luzfcb)
- Update version of wheel, Pillow, django_coverage_plugin (@luzfcb)
- Update version of Hitch tests dependencies: decorator, hitchselenium, ipython, ptyprocess, selenium (@luzfcb)
- Provided options for FOSS license choices, or for private efforts, no written license (@pydanny)
## [2016-02-01]
### Changed
- Update version of Django and django-floppyforms (@luzfcb)
- Update version of Hitch tests dependencies: hitchpython and selenium (@luzfcb)
## [2016-01-30]
### Changed
- Update flake8 to 2.5.2 (@luzfcb)
## [2016-01-29]
### Changed
- Update AngularJS version to 1.4.9 (@luzfcb)
- Update jQuery version to 2.2.0 (@luzfcb)
- Update 'now' date in cookiecutter.json (@luzfcb)
- Update version of boto, celery, django_coverage_plugin, django-storages-redux, flake8, gevent, gunicorn, pep8, pytest, tox, Werkzeug (@luzfcb)
- Update version of Hitch tests dependencies: colorama, decorator, hitchpostgres, hitchpython, hitchredis, hitchselenium, hitchserve, hitchsystem, hitchtest, ipython, patool, pickleshare, psutil, python-build, requests, selenium, tblib, traitlets (@luzfcb)
## [2016-01-26]
### Changed
- Fixed NEW_RELIC_APP_NAME environment variable (@jayfk)
## [2016-1-18]
### Added
- Added .dockerignore file (@bogdal)
- Docker tests for travis (@jayfk)
### Changed
- Removed the $-sign from allowed chars to generate the secret key (@jayfk)
## [2016-01-17]
### Added
- Adding a section on third party articles referencing `cookiecutter-django` (@mjheo)
### Changed
- Add celerybeat db to gitignore (@originell)
## [2016-01-16]
### Added
- Adding an explanation for having `django.contrib.sites`. (@pydanny)
## [2016-01-13]
### Changed
- Update setup.py version to 1.9.1 to match Django version. (@Collederas)
- Require Wheel 0.26.0. Needed to install certain packages on CPython 3.5+ like Pillow and psycopg2 (@audreyr)
## [2016-1-9]
## [2016-01-09]
### Changed
- Upgraded django-extensions to 1.6.1 as it fixes a [JSONField bug](https://github.com/django-extensions/django-extensions/blob/master/CHANGELOG.md#161) (@burhan)
- Upgraded Pillow to version 3.1.0 ([upstream changelog](https://github.com/python-pillow/Pillow/blob/master/CHANGES.rst#310-2016-01-04)) (@burhan)
@ -19,18 +360,19 @@ This project adheres to [Semantic Versioning](http://semver.org/).
- Upgraded django-crispy-forms to 1.6 for [BS4 and django 1.9 compatibility fixes](https://github.com/maraujop/django-crispy-forms/blob/dev/CHANGELOG.md#160-201617) (@burhan)
- Upgraded django-model-utils to 2.4, to enable [support for django 1.9](https://github.com/carljm/django-model-utils/blob/master/CHANGES.rst#24-2015-12-03) (@burhan)
## [2016-1-8]
## [2016-01-08]
### Changed
- Fixed redis url on docker (@jayfk)
- Fixed docker on windows (@burhan)
## [2016-1-6]
## [2016-01-06]
### Added
- You can now enable or disable user registration using the ACCOUNT_ALLOW_REGISTRATION setting. (@ddiazpinto)
### Changed
- Use Postgres 9.5 on docker (@jayfk)
## [2016-1-4]
## [2016-01-04]
### Added
- Add Tether.js because it [is needed](http://v4-alpha.getbootstrap.com/components/tooltips/#overview) for proper positioning of Bootstrap tooltips (@EricZaporzan)
@ -333,7 +675,7 @@ This project adheres to [Semantic Versioning](http://semver.org/).
- Styles that already exist in Bootstrap 4 (or 3) (@audreyr)
### Changed
- Fix issue #296 - change login.html to use [get_providers](https://github.com/pennersr/django-allauth/blob/master/allauth/socialaccount/templatetags/socialaccount.py#L84-L93) templatetag because ``allauth.socialaccount`` context processor now is [deprecated](http://django-allauth.readthedocs.org/en/latest/changelog.html#from-0-21-0) (@luzfcb)
- Fix issue #296 - change login.html to use [get_providers](https://github.com/pennersr/django-allauth/blob/master/allauth/socialaccount/templatetags/socialaccount.py#L84-L93) templatetag because ``allauth.socialaccount`` context processor now is [deprecated](http://django-allauth.readthedocs.io/en/latest/changelog.html#from-0-21-0) (@luzfcb)
## [2015-09-09]
### Added

View File

@ -49,7 +49,7 @@ To run a particular test with tox for against your current Python version::
$ tox -e py -- -k test_default_configuration
.. _`pytest usage docs`: https://pytest.org/latest/usage.html#specifying-tests-selecting-tests
.. _`tox`: https://tox.readthedocs.org/en/latest/
.. _`tox`: https://tox.readthedocs.io/en/latest/
.. _`pip`: https://pypi.python.org/pypi/pip/
.. _`pytest-cookies`: https://pypi.python.org/pypi/pytest-cookies/
.. _`flake8`: https://pypi.python.org/pypi/flake8/

View File

@ -2,7 +2,7 @@ Contributors
============
Core Developers
----------------
---------------
These contributors have commit flags for the repository,
and are able to accept and merge pull requests.
@ -12,7 +12,7 @@ Name Github Twitter
=========================== ============= ===========
Daniel Roy Greenfeld `@pydanny`_ @pydanny
Audrey Roy Greenfeld* `@audreyr`_ @audreyr
Fábio C. Barrionuevo da Luz `@luzfcb`_
Fábio C. Barrionuevo da Luz `@luzfcb`_ @luzfcb
Saurabh Kumar `@theskumar`_ @_theskumar
Jannis Gebauer `@jayfk`_
Burhan Khalid `@burhan`_ @burhan
@ -28,14 +28,16 @@ Daniel are on the Cookiecutter core team.*
.. _@jayfk: https://github.com/jayfk
Other Contributors
-------------------
------------------
Listed in alphabetical order.
========================== ============================ ==============
Name Github Twitter
========================== ============================ ==============
18 `@dezoito`_
a7p `@a7p`_
Aaron Eikenberry `@aeikenberry`_
Adam Bogdał `@bogdal`_
Adam Dobrawy `@ad-m`_
Agam Dua
@ -43,26 +45,41 @@ Listed in alphabetical order.
Alex Tsai `@caffodian`_
Alvaro [Andor] `@andor-pierdelacabeza`_
Amjith Ramanujam `@amjith`_
Andreas Meistad `@ameistad`_
Andres Gonzalez `@andresgz`_
Andrew Mikhnevich `@zcho`_
Andy Rose
Anna Callahan `@jazztpt`_
Antonia Blair `@antoniablair`_ @antoniablairart
Areski Belaid `@areski`_
Ashley Camba
Barclay Gauld `@yunti`_
Ben Lopatin
Benjamin Abel
Bo Lopker `@blopker`_
Bouke Haarsma
Brent Payne `@brentpayne`_ @brentpayne
Burhan Khalid `@burhan`_ @burhan
Catherine Devlin `@catherinedevlin`_
Cédric Gaspoz `@cgaspoz`_
Chris Curvey `@ccurvey`_
Chris Franklin
Chris Franklin `@hairychris`_
Chris Pappalardo `@ChrisPappalardo`_
Collederas `@Collederas`
Christopher Clarke `@chrisdev`_
Collederas `@Collederas`_
Cristian Vargas `@cdvv7788`_
Cullen Rhodes `@c-rhodes`_
Dan Shultz `@shultz`_
Daniel Hepper `@dhepper`_ @danielhepper
Daniele Tricoli `@eriol`_
David Díaz `@ddiazpinto`_ @DavidDiazPinto
Davur Clementsen `@dsclementsen`_ @davur
Davur Clementsen `@dsclementsen`_ @davur
Delio Castillo `@jangeador`_ @jangeador
Dónal Adams `@epileptic-fish`_
Dong Huynh `@trungdong`_
Emanuel Calso `@bloodpet`_ @bloodpet
Eraldo Energy `@eraldo`_
Eyad Al Sibai `@eyadsibai`_
Felipe Arruda `@arruda`_
Garry Cairns `@garry-cairns`_
@ -71,78 +88,139 @@ Listed in alphabetical order.
Henrique G. G. Pereira `@ikkebr`_
Ian Lee `@IanLee1521`_
Jan Van Bruggen `@jvanbrug`_
Jens Nilsson `@phiberjenz`_
Julien Almarcha `@sladinji`_
Julio Castillo `@juliocc`_
Kaido Kert `@kaidokert`_
kappataumu `@kappataumu`_ @kappataumu
Kaveh `@ka7eh`_
Kevin A. Stone
Kevin Ndung'u `@kevgathuku`_
Keith Webber `@townie`_
Krzysztof Szumny `@noisy`_
Krzysztof Żuraw `@krzysztofzuraw`_
Leonardo Jimenez `@xpostudio4`_
Lin Xianyi `@iynaix`_
Luis Nell `@originell`_
Lukas Klein
Lyla Fischer
Martin Blech
Mathijs Hoogland `@MathijsHoogland`_
Matt Linares
Matt Menzenski `@menzenski`_
Matt Warren `@mfwarren`_
Matthew Sisley `@mjsisley`_
Meghan Heintz `@dot2dotseurat`_
mozillazg `@mozillazg`_
Pablo `@oubiga`_
Parbhat Puri `@parbhat`_
Peter Bittner `@bittner`_
Raphael Pierzina `@hackebrot`_
Raony Guimarães Corrêa `@raonyguimaraes`_
René Muhl `@rm--`_
Roman Afanaskin `@siauPatrick`_
Roman Osipenko `@romanosipenko`_
Russell Davies
Sam Collins `@MightySCollins`_
stepmr `@stepmr`_
Sławek Ehlert `@slafs`_
Srinivas Nyayapati `@shireenrao`_
Steve Steiner `@ssteinerX`_
Sule Marshall `@suledev`_
Taylor Baldwin
Théo Segonds `@show0k`_
Tom Atkins `@knitatoms`_
Tom Offermann
Travis McNeill `@Travistock`_ @tavistock_esq
Travis McNeill `@Travistock`_ @tavistock_esq
Vitaly Babiy
Vivian Guillen `@viviangb`_
Will Farley `@goldhand`_ @g01dhand
Yaroslav Halchenko
========================== ============================ ==============
.. _@areski: https://github.com/areski
.. _@a7p: https://github.com/a7p
.. _@bogdal: https://github.com/bogdal
.. _@ad-m: https://github.com/ad-m
.. _@aeikenberry: https://github.com/aeikenberry
.. _@alb3rto: https://github.com/alb3rto
.. _@caffodian: https://github.com/caffodian
.. _@andor-pierdelacabeza: https://github.com/andor-pierdelacabeza
.. _@ameistad: https://github.com/ameistad
.. _@amjith: https://github.com/amjith
.. _@zcho: https://github.com/zcho
.. _@jazztpt: https://github.com/jazztpt
.. _@yunti: https://github.com/yunti
.. _@andor-pierdelacabeza: https://github.com/andor-pierdelacabeza
.. _@antoniablair: https://github.com/antoniablair
.. _@areski: https://github.com/areski
.. _@arruda: https://github.com/arruda
.. _@bittner: https://github.com/bittner
.. _@bloodpet: https://github.com/bloodpet
.. _@blopker: https://github.com/blopker
.. _@bogdal: https://github.com/bogdal
.. _@burhan: https://github.com/burhan
.. _@ccurvey: https://github.com/ccurvey
.. _@hairychris: https://github.com/hairychris
.. _@ChrisPappalardo: https://github.com/ChrisPappalardo
.. _@cdvv7788: https://github.com/cdvv7788
.. _@c-rhodes: https://github.com/c-rhodes
.. _@caffodian: https://github.com/caffodian
.. _@catherinedevlin: https://github.com/catherinedevlin
.. _@ccurvey: https://github.com/ccurvey
.. _@cdvv7788: https://github.com/cdvv7788
.. _@cgaspoz: https://github.com/cgaspoz
.. _@chrisdev: https://github.com/chrisdev
.. _@ChrisPappalardo: https://github.com/ChrisPappalardo
.. _@Collederas: https://github.com/Collederas
.. _@ddiazpinto: https://github.com/ddiazpinto
.. _@dezoito: https://github.com/dezoito
.. _@dhepper: https://github.com/dhepper
.. _@dot2dotseurat: https://github.com/dot2dotseurat
.. _@dsclementsen: https://github.com/dsclementsen
.. _@epileptic-fish: https://github.com/epileptic-fish
.. _@eraldo: https://github.com/eraldo
.. _@eriol: https://github.com/eriol
.. _@eyadsibai: https://github.com/eyadsibai
.. _@arruda: https://github.com/arruda
.. _@garry-cairns: https://github.com/garry-cairns
.. _@garrypolley: https://github.com/garrypolley
.. _@hjwp: https://github.com/hjwp
.. _@ikkebr: https://github.com/ikkebr
.. _@IanLee1521: https://github.com/IanLee1521
.. _@juliocc: https://github.com/juliocc
.. _@kaidokert: https://github.com/kaidokert
.. _@ka7eh: https://github.com/ka7eh
.. _@kevgathuku: https://github.com/kevgathuku
.. _@iynaix: https://github.com/iynaix
.. _@MathijsHoogland: https://github.com/MathijsHoogland
.. _@mfwarren: https://github.com/mfwarren
.. _@mozillazg: https://github.com/mozillazg
.. _@oubiga: https://github.com/oubiga
.. _@goldhand: https://github.com/goldhand
.. _@hackebrot: https://github.com/hackebrot
.. _@siauPatrick: https://github.com/siauPatrick
.. _@stepmr: https://github.com/stepmr
.. _@slafs: https://github.com/slafs
.. _@show0k: https://github.com/show0k
.. _@knitatoms: https://github.com/knitatoms
.. _@Travistock: https://github.com/Tavistock
.. _@hairychris: https://github.com/hairychris
.. _@hjwp: https://github.com/hjwp
.. _@IanLee1521: https://github.com/IanLee1521
.. _@ikkebr: https://github.com/ikkebr
.. _@iynaix: https://github.com/iynaix
.. _@jazztpt: https://github.com/jazztpt
.. _@juliocc: https://github.com/juliocc
.. _@jvanbrug: https://github.com/jvanbrug
.. _@ddiazpinto: https://github.com/ddiazpinto
.. _@Collederas: https://github.com/Collederas
.. _@ka7eh: https://github.com/ka7eh
.. _@kaidokert: https://github.com/kaidokert
.. _@kappataumu: https://github.com/kappataumu
.. _@kevgathuku: https://github.com/kevgathuku
.. _@knitatoms: https://github.com/knitatoms
.. _@krzysztofzuraw: https://github.com/krzysztofzuraw
.. _@MathijsHoogland: https://github.com/MathijsHoogland
.. _@menzenski: https://github.com/menzenski
.. _@mfwarren: https://github.com/mfwarren
.. _@mjsisley: https://github.com/mjsisley
.. _@mozillazg: https://github.com/mozillazg
.. _@noisy: https://github.com/noisy
.. _@originell: https://github.com/originell
.. _@oubiga: https://github.com/oubiga
.. _@parbhat: https://github.com/parbhat
.. _@raonyguimaraes: https://github.com/raonyguimaraes
.. _@rm--: https://github.com/rm--
.. _@romanosipenko: https://github.com/romanosipenko
.. _@shireenrao: https://github.com/shireenrao
.. _@show0k: https://github.com/show0k
.. _@shultz: https://github.com/shultz
.. _@siauPatrick: https://github.com/siauPatrick
.. _@slafs: https://github.com/slafs
.. _@ssteinerX: https://github.com/ssteinerx
.. _@stepmr: https://github.com/stepmr
.. _@suledev: https://github.com/suledev
.. _@Travistock: https://github.com/Tavistock
.. _@trungdong: https://github.com/trungdong
.. _@viviangb: https://github.com/viviangb
.. _@xpostudio4: https://github.com/xpostudio4
.. _@yunti: https://github.com/yunti
.. _@zcho: https://github.com/zcho
.. _@phiberjenz: https://github.com/phiberjenz
.. _@sladinji: https://github.com/sladinji
.. _@andresgz: https://github.com/andresgz
.. _@jangeador: https://github.com/jangeador
.. _@townie: https://github.com/townie
.. _@MightySCollins: https://github.com/MightySCollins
Special Thanks
~~~~~~~~~~~~~~

View File

@ -1,4 +1,4 @@
Copyright (c) 2013, Daniel Greenfeld
Copyright (c) 2013-2016, Daniel Greenfeld
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
@ -11,7 +11,7 @@ are permitted provided that the following conditions are met:
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
* Neither the name of cookiecutter-django nor the names of its contributors may
* Neither the name of Cookiecutter Django nor the names of its contributors may
be used to endorse or promote products derived from this software without
specific prior written permission.

View File

@ -1,9 +1,9 @@
cookiecutter-django
Cookiecutter Django
=======================
.. image:: https://requires.io/github/pydanny/cookiecutter-django/requirements.svg?branch=master
:target: https://requires.io/github/pydanny/cookiecutter-django/requirements/?branch=master
:alt: Requirements Status
.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://travis-ci.org/pydanny/cookiecutter-django.svg?branch=master
:target: https://travis-ci.org/pydanny/cookiecutter-django?branch=master
@ -12,28 +12,40 @@ cookiecutter-django
.. image:: https://badges.gitter.im/Join Chat.svg
:target: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting production-ready Django projects quickly.
A Cookiecutter_ template for Django.
* Documentation: https://cookiecutter-django.readthedocs.io/en/latest/
* See Troubleshooting_ for common errors and obstacles
.. _cookiecutter: https://github.com/audreyr/cookiecutter
.. _Troubleshooting: https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html
.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
Features
---------
* For Django 1.9
* Renders Django projects with 100% test coverage
* Twitter Bootstrap_ v4.0.0 - alpha_
* End-to-end via Hitch_
* AngularJS_
* For Django 1.10
* Renders Django projects with 100% starting test coverage
* Twitter Bootstrap_ v4.0.0 - `alpha 4`_ (`maintained Foundation fork`_ also available)
* 12-Factor_ based settings via django-environ_
* Optimized development and production settings
* Registration via django-allauth_
* Comes with custom user model ready to go.
* Comes with custom user model ready to go
* Grunt build for compass and livereload
* Basic e-mail configurations for sending emails via Mailgun_
* Send emails via Anymail_ (using Mailgun_ by default, but switchable)
* Media storage using Amazon S3
* Docker support using docker-compose_ for development and production
* Procfile_ for deploying to Heroku
* Instructions for deploying to PythonAnywhere_
* Works with Python 2.7.x or 3.5.x
* Run tests with unittest or py.test
* Customizable PostgreSQL version
* Experimental support for Amazon Elastic Beanstalk
.. _`maintained Foundation fork`: https://github.com/Parbhat/cookiecutter-django-foundation
Optional Integrations
---------------------
@ -44,33 +56,31 @@ Optional Integrations
* Configuration for Celery_
* Integration with MailHog_ for local email testing
* Integration with Sentry_ for error logging
* Integration with NewRelic_ for performance monitoring
* Integration with Opbeat_ for performance monitoring
.. _alpha: http://blog.getbootstrap.com/2015/08/19/bootstrap-4-alpha/
.. _Hitch: https://github.com/hitchtest/hitchtest
.. _`alpha 4`: http://blog.getbootstrap.com/2016/09/05/bootstrap-4-alpha-4/
.. _Bootstrap: https://github.com/twbs/bootstrap
.. _AngularJS: https://github.com/angular/angular.js
.. _django-environ: https://github.com/joke2k/django-environ
.. _12-Factor: http://12factor.net/
.. _django-allauth: https://github.com/pennersr/django-allauth
.. _django-avatar: https://github.com/jezdez/django-avatar/
.. _django-avatar: https://github.com/grantmcconnaughey/django-avatar
.. _Procfile: https://devcenter.heroku.com/articles/procfile
.. _Mailgun: https://mailgun.com/
.. _Whitenoise: https://whitenoise.readthedocs.org/
.. _Mailgun: http://www.mailgun.com/
.. _Whitenoise: https://whitenoise.readthedocs.io/
.. _Celery: http://www.celeryproject.org/
.. _Anymail: https://github.com/anymail/django-anymail
.. _MailHog: https://github.com/mailhog/MailHog
.. _Sentry: https://getsentry.com
.. _NewRelic: https://newrelic.com
.. _docker-compose: https://www.github.com/docker/compose
.. _Sentry: https://getsentry.com/welcome/
.. _docker-compose: https://github.com/docker/compose
.. _Opbeat: https://opbeat.com/
.. _PythonAnywhere: https://www.pythonanywhere.com/
Constraints
-----------
* Only maintained 3rd party libraries are used.
* PostgreSQL everywhere (9.0+)
* Uses PostgreSQL everywhere (9.2+)
* Environment variables for configuration (This won't work with Apache/mod_wsgi).
@ -80,22 +90,19 @@ Usage
Let's pretend you want to create a Django project called "redditclone". Rather than using `startproject`
and then editing the results to include your name, email, and various configuration issues that always get forgotten until the worst possible moment, get cookiecutter_ to do all the work.
First, get cookiecutter. Trust me, it's awesome::
First, get Cookiecutter. Trust me, it's awesome::
$ pip install cookiecutter
$ pip install "cookiecutter>=1.4.0"
Now run it against this repo::
$ cookiecutter https://github.com/pydanny/cookiecutter-django.git
You'll be prompted for some questions, answer them, then it will create a Django project for you.
$ cookiecutter https://github.com/pydanny/cookiecutter-django
You'll be prompted for some values. Provide them, then a Django project will be created for you.
**Warning**: After this point, change 'Daniel Greenfeld', 'pydanny', etc to your own information.
**Warning**: repo_name must be a valid Python module name or you will have issues on imports.
It prompts you for questions. Answer them::
Answer the prompts with your own desired options_. For example::
Cloning into 'cookiecutter-django'...
remote: Counting objects: 550, done.
@ -103,32 +110,53 @@ It prompts you for questions. Answer them::
remote: Total 550 (delta 283), reused 479 (delta 222)
Receiving objects: 100% (550/550), 127.66 KiB | 58 KiB/s, done.
Resolving deltas: 100% (283/283), done.
project_name [project_name]: Reddit Clone
repo_name [Reddit_Clone]: reddit
author_name [Your Name]: Daniel Greenfeld
email [Your email]: pydanny@gmail.com
project_name [Project Name]: Reddit Clone
project_slug [reddit_clone]: reddit
author_name [Daniel Roy Greenfeld]: Daniel Greenfeld
email [you@example.com]: pydanny@gmail.com
description [A short description of the project.]: A reddit clone.
domain_name [example.com]: myreddit.com
version [0.1.0]: 0.0.1
timezone [UTC]:
now [2015/11/22]: 2015/11/22
year [2015]:
timezone [UTC]: America/Los_Angeles
use_whitenoise [y]: n
use_celery [n]: y
use_mailhog [n]: n
use_sentry [n]: y
use_newrelic [n]: y
use_sentry_for_error_reporting [y]: y
use_opbeat [n]: y
use_pycharm [n]: y
windows [n]: n
use_python2 [n]: y
use_python3 [y]: y
use_docker [y]: n
use_heroku [n]: y
use_compressor [n]: y
Select postgresql_version:
1 - 9.5
2 - 9.4
3 - 9.3
4 - 9.2
Choose from 1, 2, 3, 4 [1]: 1
Select js_task_runner:
1 - Gulp
2 - Grunt
3 - Webpack
4 - None
Choose from 1, 2, 3, 4 [1]: 1
use_lets_encrypt [n]: n
Select open_source_license:
1 - MIT
2 - BSD
3 - GPLv3
4 - Apache Software License 2.0
5 - Not open source
Choose from 1, 2, 3, 4, 5 [1]: 1
use_elasticbeanstalk_experimental: n
Enter the project and take a look around::
$ cd reddit/
$ ls
Create a GitHub repo and push it there::
Create a git repo and push it there::
$ git init
$ git add .
@ -138,24 +166,25 @@ Create a GitHub repo and push it there::
Now take a look at your repo. Don't forget to carefully look at the generated README. Awesome, right?
For development, see the following for local development:
For local development, see the following:
* `Developing locally`_
* `Developing locally using docker`_
.. _`Developing locally`: http://cookiecutter-django.readthedocs.org/en/latest/developing-locally.html
.. _`Developing locally using docker`: http://cookiecutter-django.readthedocs.org/en/latest/developing-locally-docker.html
.. _options: http://cookiecutter-django.readthedocs.io/en/latest/project-generation-options.html
.. _`Developing locally`: http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html
.. _`Developing locally using docker`: http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html
Support This Project
---------------------------
Community
-----------
This project is maintained by volunteers. Support their efforts by spreading the word about:
* Have questions? **Before you ask questions anywhere else**, please post your question on `Stack Overflow`_ under the *cookiecutter-django* tag. We check there periodically for questions.
* If you think you found a bug or want to request a feature, please open an issue_.
* For anything else, you can chat with us on `Gitter`_.
.. image:: https://s3.amazonaws.com/tsacademy/images/tsa-logo-250x60-transparent-01.png
:name: Two Scoops Academy
:align: center
:alt: Two Scoops Academy
:target: http://www.twoscoops.academy/
.. _`Stack Overflow`: http://stackoverflow.com/questions/tagged/cookiecutter-django
.. _`issue`: https://github.com/pydanny/cookiecutter-django/issues
.. _`Gitter`: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
For Readers of Two Scoops of Django 1.8
--------------------------------------------
@ -170,7 +199,7 @@ Scattered throughout the Python and HTML of this project are places marked with
Releases
--------
Want a stable release? You can find them at https://github.com/pydanny/cookiecutter-django/releases
Need a stable release? You can find them at https://github.com/pydanny/cookiecutter-django/releases
Not Exactly What You Want?
@ -193,8 +222,44 @@ If you do rename your fork, I encourage you to submit it to the following places
.. _cookiecutter: https://github.com/audreyr/cookiecutter
.. _grid: https://www.djangopackages.com/grids/g/cookiecutters/
Or Submit a Pull Request
~~~~~~~~~~~~~~~~~~~~~~~~~
Submit a Pull Request
~~~~~~~~~~~~~~~~~~~~~~
I also accept pull requests on this, if they're small, atomic, and if they make my own project development
We accept pull requests if they're small, atomic, and make our own project development
experience better.
Articles
---------
* `Development and Deployment of Cookiecutter-Django on Fedora`_ - Jan. 18, 2016
* `Development and Deployment of Cookiecutter-Django via Docker`_ - Dec. 29, 2015
* `How to create a Django Application using Cookiecutter and Django 1.8`_ - Sept. 12, 2015
* `Introduction to Cookiecutter-Django`_ - Feb. 19, 2016
* `Django and GitLab - Running Continuous Integration and tests with your FREE account`_ - May. 11, 2016
Have a blog or online publication? Write about your cookiecutter-django tips and tricks, then send us a pull request with the link.
.. _`Development and Deployment of Cookiecutter-Django via Docker`: https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-via-docker/
.. _`Development and Deployment of Cookiecutter-Django on Fedora`: https://realpython.com/blog/python/development-and-deployment-of-cookiecutter-django-on-fedora/
.. _`How to create a Django Application using Cookiecutter and Django 1.8`: https://www.swapps.io/blog/how-to-create-a-django-application-using-cookiecutter-and-django-1-8/
.. _`Introduction to Cookiecutter-Django`: http://krzysztofzuraw.com/blog/2016/django-cookiecutter.html
.. _`Django and GitLab - Running Continuous Integration and tests with your FREE account`: http://dezoito.github.io/2016/05/11/django-gitlab-continuous-integration-phantomjs.html
Code of Conduct
---------------
Everyone interacting in the Cookiecutter project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.
Support This Project
---------------------------
This project is maintained by volunteers. Support their efforts by spreading the word about:
.. image:: https://s3.amazonaws.com/tsacademy/images/tsa-logo-250x60-transparent-01.png
:name: Two Scoops Academy
:align: center
:alt: Two Scoops Academy
:target: https://twoscoops.academy/
.. _`PyPA Code of Conduct`: https://www.pypa.io/en/latest/code-of-conduct/

View File

@ -1,20 +1,26 @@
{
"project_name": "project_name",
"repo_name": "{{ cookiecutter.project_name|replace(' ', '_') }}",
"author_name": "Your Name",
"email": "Your email",
"project_name": "Project Name",
"project_slug": "{{ cookiecutter.project_name.lower()|replace(' ', '_')|replace('-', '_') }}",
"author_name": "Daniel Roy Greenfeld",
"email": "you@example.com",
"description": "A short description of the project.",
"domain_name": "example.com",
"version": "0.1.0",
"timezone": "UTC",
"now": "2016/01/07",
"year": "{{ cookiecutter.now[:4] }}",
"use_whitenoise": "y",
"use_celery": "n",
"use_mailhog": "n",
"use_sentry": "n",
"use_newrelic": "n",
"use_sentry_for_error_reporting": "y",
"use_opbeat": "n",
"use_pycharm": "n",
"windows": "n",
"use_python2": "n"
"use_python3": "y",
"use_docker": "y",
"use_heroku": "n",
"use_elasticbeanstalk_experimental": "n",
"use_compressor": "n",
"postgresql_version": ["9.5", "9.4", "9.3", "9.2"],
"js_task_runner": ["Gulp", "Grunt", "None"],
"use_lets_encrypt": "n",
"open_source_license": ["MIT", "BSD", "GPLv3", "Apache Software License 2.0", "Not open source"]
}

View File

@ -77,17 +77,17 @@ qthelp:
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/{{ cookiecutter.repo_name }}.qhcp"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/{{ cookiecutter.repo_name }}.qhc"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/{{ cookiecutter.repo_name }}"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/{{ cookiecutter.repo_name }}"
@echo "# mkdir -p $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# devhelp"
epub:

View File

@ -10,6 +10,8 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
from __future__ import unicode_literals
from datetime import datetime
import os
import sys
@ -43,8 +45,8 @@ source_suffix = '.rst'
master_doc = 'index'
# General information about the project.
project = u'cookiecutter-django'
copyright = u"2013-{}, Daniel Roy Greenfeld".format(now.year)
project = 'Cookiecutter Django'
copyright = "2013-2016, Daniel Roy Greenfeld"
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@ -188,8 +190,8 @@ latex_elements = {
latex_documents = [
('index',
'cookiecutter-django.tex',
u'cookiecutter-django Documentation',
u"cookiecutter-django", 'manual'),
'cookiecutter-django Documentation',
'cookiecutter-django', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
@ -218,8 +220,8 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'cookiecutter-django', u'cookiecutter-django documentation',
[u"Daniel Roy Greenfeld"], 1)
('index', 'Cookiecutter Django', 'Cookiecutter Django documentation',
['Daniel Roy Greenfeld'], 1)
]
# If true, show URL addresses after external links.
@ -232,8 +234,8 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'cookiecutter-django', u'cookiecutter-django documentation',
u"Daniel Roy Greenfeld", 'cookiecutter-django',
('index', 'Cookiecutter Django', 'Cookiecutter Django documentation',
'Daniel Roy Greenfeld', 'Cookiecutter Django',
'A Cookiecutter template for creating production-ready Django projects quickly.', 'Miscellaneous'),
]

View File

@ -16,8 +16,8 @@ You can either push the 'deploy' button in your generated README.rst or run thes
heroku addons:create heroku-redis:hobby-dev
heroku addons:create mailgun
heroku config:set DJANGO_ADMIN_URL=`openssl rand -base64 32`
heroku config:set DJANGO_SECRET_KEY=`openssl rand -base64 64`
heroku config:set DJANGO_ADMIN_URL="$(openssl rand -base64 32)"
heroku config:set DJANGO_SECRET_KEY="$(openssl rand -base64 64)"
heroku config:set DJANGO_SETTINGS_MODULE='config.settings.production'
heroku config:set DJANGO_ALLOWED_HOSTS='.herokuapp.com'
@ -27,9 +27,10 @@ You can either push the 'deploy' button in your generated README.rst or run thes
heroku config:set DJANGO_MAILGUN_SERVER_NAME=YOUR_MAILGUN_SERVER
heroku config:set DJANGO_MAILGUN_API_KEY=YOUR_MAILGUN_API_KEY
heroku config:set MAILGUN_SENDER_DOMAIN=YOUR_MAILGUN_SENDER_DOMAIN
heroku config:set PYTHONHASHSEED=random
heroku config:set DJANGO_ADMIN_URL=\^somelocation/
heroku config:set DJANGO_ADMIN_URL=\^somelocation/
git push heroku master
heroku run python manage.py migrate

View File

@ -0,0 +1,183 @@
Deployment on PythonAnywhere
============================
.. index:: PythonAnywhere
Overview
--------
Full instructions follow, but here's a high-level view.
**First time config**:
1. Pull your code down to PythonAnywhere using a *Bash console* and setup a virtualenv
2. Set your config variables in the *postactivate* script
3. Run the *manage.py* ``migrate`` and ``collectstatic`` commands
4. Add an entry to the PythonAnywhere *Web tab*
5. Set your config variables in the PythonAnywhere *WSGI config file*
Once you've been through this one-off config, future deployments are much simpler: just ``git pull`` and then hit the "Reload" button :)
Getting your code and dependencies installed on PythonAnywhere
--------------------------------------------------------------
Make sure your project is fully committed and pushed up to Bitbucket or GitHub or wherever it may be. Then, log into your PythonAnywhere account, open up a **Bash** console, clone your repo, and create a virtualenv:
.. code-block:: bash
git clone <my-repo-url> # you can also use hg
cd my-project-name
mkvirtualenv --python=/usr/bin/python3.5 my-project-name # or python2.7, etc
pip install -r requirements/production.txt # may take a few minutes
Setting environment variables in the console
--------------------------------------------
Generate a secret key for yourself, e.g. like this:
.. code-block:: bash
python -c 'import random; print("".join(random.SystemRandom().choice("abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)") for _ in range(50)))'
Make a note of it, since we'll need it here in the console and later on in the web app config tab.
Set environment variables via the virtualenv "postactivate" script (this will set them every time you use the virtualenv in a console):
.. code-block:: bash
vi $VIRTUAL_ENV/bin/postactivate
**TIP:** *If you don't like vi, you can also edit this file via the PythonAnywhere "Files" menu; look in the ".virtualenvs" folder*.
Add these exports
.. code-block:: bash
export DJANGO_SETTINGS_MODULE='config.settings.production'
export DJANGO_SECRET_KEY='<secret key goes here>'
export DJANGO_ALLOWED_HOSTS='<www.your-domain.com>'
export DJANGO_ADMIN_URL='<not admin/>'
export DJANGO_MAILGUN_API_KEY='<mailgun key>'
export DJANGO_MAILGUN_SERVER_NAME='<mailgun server name>'
export MAILGUN_SENDER_DOMAIN='<mailgun sender domain (e.g. mg.yourdomain.com)>'
export DJANGO_AWS_ACCESS_KEY_ID=
export DJANGO_AWS_SECRET_ACCESS_KEY=
export DJANGO_AWS_STORAGE_BUCKET_NAME=
export DATABASE_URL='<see below>'
**NOTE:** *The AWS details are not required if you're using whitenoise or the built-in pythonanywhere static files service, but you do need to set them to blank, as above.*
Database setup:
---------------
Go to the PythonAnywhere **Databases tab** and configure your database.
* For Postgres, set up your superuser password, then open a Postgres console and run a `CREATE DATABASE my-db-name`. You should probably also set up a specific role and permissions for your app, rather than using the superuser credentials; a sketch of this follows below this list. Make a note of the address and port of your postgres server.
* For MySQL, set the password and create a database. More info here: https://help.pythonanywhere.com/pages/UsingMySQL
* You can also use sqlite if you like! Not recommended for anything beyond toy projects though.
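As a rough sketch, the Postgres setup might look like this from a Bash console with ``psql`` (the superuser name, server address, port, role, password, and database name below are placeholders; substitute the values shown on your Databases tab, or type the same statements into the Postgres console there):

.. code-block:: bash

    # connect with the superuser credentials you configured on the Databases tab
    psql -h <postgres-address> -p <postgres-port> -U <postgres-superuser> postgres

    # then, at the psql prompt, create a dedicated role and database for your app:
    #   CREATE USER <my_app_user> WITH PASSWORD '<a-strong-password>';
    #   CREATE DATABASE <my_db_name> OWNER <my_app_user>;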
Now go back to the *postactivate* script and set the ``DATABASE_URL`` environment variable:
.. code-block:: bash
export DATABASE_URL='postgres://<postgres-username>:<postgres-password>@<postgres-address>:<postgres-port>/<database-name>'
# or
export DATABASE_URL='mysql://<pythonanywhere-username>:<mysql-password>@<mysql-address>/<database-name>'
# or
export DATABASE_URL='sqlite:////home/yourusername/path/to/db.sqlite'
If you're using MySQL, you may need to run ``pip install mysqlclient``, and maybe add ``mysqlclient`` to *requirements/production.txt* too.
Now run the migration, and collectstatic:
.. code-block:: bash
source $VIRTUAL_ENV/bin/postactivate
python manage.py migrate
python manage.py collectstatic
# and, optionally
python manage.py createsuperuser
Configure the PythonAnywhere Web Tab
------------------------------------
Go to the PythonAnywhere **Web tab**, hit **Add new web app**, and choose **Manual Config**, and then the version of Python you used for your virtualenv.
**NOTE:** *If you're using a custom domain (not on \*.pythonanywhere.com), then you'll need to set up a CNAME with your domain registrar.*
When you're redirected back to the web app config screen, set the **path to your virtualenv**. If you used virtualenvwrapper as above, you can just enter its name.
Click through to the **WSGI configuration file** link (near the top) and edit the wsgi file. Make it look something like this, repeating the environment variables you used earlier:
.. code-block:: python
import os
import sys
path = '/home/<your-username>/<your-project-directory>'
if path not in sys.path:
sys.path.append(path)
os.environ['DJANGO_SETTINGS_MODULE'] = 'config.settings.production'
os.environ['DJANGO_SECRET_KEY'] = '<as above>'
os.environ['DJANGO_ALLOWED_HOSTS'] = '<as above>'
os.environ['DJANGO_ADMIN_URL'] = '<as above>'
os.environ['DJANGO_MAILGUN_API_KEY'] = '<as above>'
os.environ['DJANGO_MAILGUN_SERVER_NAME'] = '<as above>'
os.environ['DJANGO_AWS_ACCESS_KEY_ID'] = ''
os.environ['DJANGO_AWS_SECRET_ACCESS_KEY'] = ''
os.environ['DJANGO_AWS_STORAGE_BUCKET_NAME'] = ''
os.environ['DATABASE_URL'] = '<as above>'
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
Back on the Web tab, hit **Reload**, and your app should be live!
**NOTE:** *you may see security warnings until you set up your SSL certificates. If you
want to suppress them temporarily, set DJANGO_SECURE_SSL_REDIRECT to blank. Follow
the instructions here to get SSL set up: https://help.pythonanywhere.com/pages/SSLOwnDomains/*
Optional: static files
----------------------
If you want to use the PythonAnywhere static files service instead of using whitenoise or S3, you'll find its configuration section on the Web tab. Essentially you'll need an entry to match your ``STATIC_URL`` and ``STATIC_ROOT`` settings. There's more info here: https://help.pythonanywhere.com/pages/DjangoStaticFiles
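For example, with the project defaults the entry might look like this (a sketch; the ``staticfiles`` directory is an assumption about your generated ``STATIC_ROOT``, so adjust it if you changed ``STATIC_URL`` or ``STATIC_ROOT``)::

    URL:        /static/
    Directory:  /home/<your-username>/<your-project-directory>/staticfiles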
Future deployments
------------------
For subsequent deployments, the procedure is much simpler. In a Bash console:
.. code-block:: bash
workon my-virtualenv-name
cd project-directory
git pull
python manage.py migrate
python manage.py collectstatic
And then go to the Web tab and hit **Reload**
**TIP:** *if you're really keen, you can set up git-push based deployments: https://blog.pythonanywhere.com/87/*

View File

@ -1,49 +1,104 @@
Deployment with Docker
=================================================
=======================
.. index:: Docker, deployment
TODO: Review and revise
Prerequisites
-------------
**Warning**
* Docker (at least 1.10)
* Docker Compose (at least 1.6)
Docker is evolving extremely fast, but it has still some rough edges here and there. Compose is currently (as of version 1.4)
not considered production ready. That means you won't be able to scale to multiple servers and you won't be able to run
zero downtime deployments out of the box. Consider all this as experimental until you understand all the implications
to run docker (with compose) on production.
**Run your app with docker-compose**
Prerequisites:
* docker (tested with 1.8)
* docker-compose (tested with 0.4)
Understand the Compose Setup
--------------------------------
Before you start, check out the `docker-compose.yml` file in the root of this project. This is where each component
of this application gets its configuration from. It consists of a `postgres` service that runs the database, `redis`
for caching, `nginx` as reverse proxy and last but not least the `django` application run by gunicorn.
{% if cookiecutter.use_celery == 'y' -%}
Since this application also runs Celery, there are two more services with a service called `celeryworker` that runs the
celery worker process and `celerybeat` that runs the celery beat process.
{% endif %}
of this application gets its configuration from. Notice how it provides configuration for these services:
* `postgres` service that runs the database
* `redis` for caching
* `nginx` as reverse proxy
* `django` is the Django project run by gunicorn
All of these services except `redis` rely on environment variables set by you. There is an `env.example` file in the
If you chose the `use_celery` option, there are two more services:
* `celeryworker` which runs the celery worker process
* `celerybeat` which runs the celery beat process
If you chose the `use_letsencrypt` option, you also have:
* `certbot` which keeps your certs from letsencrypt up-to-date
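If you want to double-check exactly which services your generated `docker-compose.yml` defines, you can ask Compose itself (a sketch; assumes a reasonably recent docker-compose)::

    docker-compose config --services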
Populate .env With Your Environment Variables
---------------------------------------------
Some of these services rely on environment variables set by you. There is an `env.example` file in the
root directory of this project as a starting point. Add your own variables to the file and rename it to `.env`. This
file won't be tracked by git by default so you'll have to make sure to use some other mechanism to copy your secret if
you are relying solely on git.
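As a rough sketch, a populated `.env` might contain lines like these (the variable names below are illustrative assumptions; treat `env.example` in your generated project as the authoritative list)::

    # .env - example values only, never commit real secrets
    DJANGO_SECRET_KEY=<long-random-string>
    DJANGO_ALLOWED_HOSTS=.example.com
    DJANGO_MAILGUN_API_KEY=<your mailgun api key>
    POSTGRES_USER=<postgres user>
    POSTGRES_PASSWORD=<postgres password>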
Optional: nginx-proxy Setup
---------------------------
By default, the application is configured to listen on all interfaces on port 80. If you want to change that, open the
`docker-compose.yml` file and replace `0.0.0.0` with your own ip. If you are using `nginx-proxy`_ to run multiple
application stacks on one host, remove the port setting entirely and add `VIRTUAL_HOST={{cookiecutter.domain_name}}` to your env file.
`docker-compose.yml` file and replace `0.0.0.0` with your own ip.
If you are using `nginx-proxy`_ to run multiple application stacks on one host, remove the port setting entirely and add `VIRTUAL_HOST=example.com` to your env file. Here, replace example.com with the value you entered for `domain_name`.
This passes all incoming requests on `nginx-proxy`_ through to the nginx service your application is using.
.. _nginx-proxy: https://github.com/jwilder/nginx-proxy
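For example, assuming you answered `example.com` for `domain_name`, appending that entry from a shell could look like this (a sketch)::

    echo "VIRTUAL_HOST=example.com" >> .env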
Postgres is saving its database files to `/data/{{cookiecutter.repo_name}}/postgres` by default. Change that if you wan't
Optional: Postgres Data Volume Modifications
---------------------------------------------
Postgres is saving its database files to the `postgres_data` volume by default. Change that if you want
something else and make sure to make backups since this is not done automatically.
Optional: Certbot and Let's Encrypt Setup
------------------------------------------
If you chose `use_letsencrypt` and will be using certbot for HTTPS, you must do the following before running anything with docker-compose:
Replace `dhparam.pem.example` with a generated `dhparams.pem` file. You can generate this on Ubuntu or OS X by running the following in the project root:
::
$ openssl dhparam -out /path/to/project/compose/nginx/dhparams.pem 2048
If you would like to add additional subdomains to your certificate, you must add additional parameters to the certbot command in the `docker-compose.yml` file:
Replace:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email mjsisley@relawgo.com --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
With:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} -d www.{{ cookiecutter.domain_name }} -d etc.{{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
Please be cognizant of Certbot/Let's Encrypt certificate request limits when getting this set up. They provide a test server that does not count against the limits while you are getting set up.
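For example, while experimenting you can run the same one-off certbot command against the staging endpoint so failed attempts do not count against the production limits (a sketch; the staging URL below is the ACME v1 staging server and is an assumption, so double-check it against the Let's Encrypt documentation)::

    docker-compose run --rm --name certbot certbot bash -c "sleep 6 && certbot certonly --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-staging.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"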
The certbot certificates expire after 3 months.
If you would like to set up autorenewal of your certificates, the following commands can be put into a bash script:
::
#!/bin/bash
cd <project directory>
docker-compose run --rm --name certbot certbot bash -c "sleep 6 && certbot certonly --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
docker exec {{ cookiecutter.project_slug }}_nginx_1 nginx -s reload
And then set a cronjob by running `crontab -e` and placing in it (period can be adjusted as desired)::
0 4 * * 1 /path/to/bashscript/renew_certbot.sh
Run your app with docker-compose
--------------------------------
To get started, pull your code from source control (don't forget the `.env` file) and change to your project's root
directory.
@ -55,7 +110,6 @@ Once this is ready, you can run it with::
docker-compose up
To run a migration, open up a second terminal and run::
docker-compose run django python manage.py migrate
@ -64,10 +118,9 @@ To create a superuser, run::
docker-compose run django python manage.py createsuperuser
If you need a shell, run::
docker-compose run django python manage.py shell_plus
docker-compose run django python manage.py shell
To get an output of all running containers.
@ -80,8 +133,15 @@ If you want to scale your application, run::
docker-compose scale django=4
docker-compose scale celeryworker=2
.. warning:: Don't run the scale command on postgres, celerybeat, certbot, or nginx.
**Don't run the scale command on postgres or celerybeat**
If you have errors, you can always check your stack with `docker-compose`. Switch to your project's root directory and run::
docker-compose ps
Supervisor Example
-------------------
Once you are ready with your initial setup, you want to make sure that your application is run by a process manager so it
survives reboots and restarts automatically in case of an error. You can use the process manager you are most familiar with. All
@ -89,24 +149,19 @@ it needs to do is to run `docker-compose up` in your projects root directory.
If you are using `supervisor`, you can use this file as a starting point::
[program:{{cookiecutter.repo_name}}]
[program:{{cookiecutter.project_slug}}]
command=docker-compose up
directory=/path/to/{{cookiecutter.repo_name}}
directory=/path/to/{{cookiecutter.project_slug}}
redirect_stderr=true
autostart=true
autorestart=true
priority=10
Place it in `/etc/supervisor/conf.d/{{cookiecutter.repo_name}}.conf` and run::
Place it in `/etc/supervisor/conf.d/{{cookiecutter.project_slug}}.conf` and run::
supervisorctl reread
supervisorctl start {{cookiecutter.repo_name}}
supervisorctl start {{cookiecutter.project_slug}}
To get the status, run::
supervisorctl status
If you have errors, you can always check your stack with `docker-compose`. Switch to your project's root directory and run::
docker-compose ps

View File

@ -0,0 +1,72 @@
Deployment with Elastic Beanstalk
==========================================
.. index:: Elastic Beanstalk
Warning: Experimental
---------------------
This is experimental. For the time being there will be bugs and issues. If you've never used Elastic Beanstalk before, please hold off before trying this option.
On the other hand, we need help cleaning this up. If you do have knowledge of Elastic Beanstalk, we would appreciate the help. :)
Prerequisites
-------------
* awsebcli
Instructions
-------------
If you haven't done so, create a directory of environments::
eb init -p python3.4 MY_PROJECT_SLUG
Replace `MY_PROJECT_SLUG` with the value you entered for `project_slug`.
Once that is done, create the environment (server) where the app will run::
eb create MY_PROJECT_SLUG
# Note: This will eventually fail on a postgres error, because postgres doesn't exist yet
Now make sure you are in the right environment::
eb list
If you are not in the right environment, then put yourself in the correct one::
eb use MY_PROJECT_SLUG
Set the environment variables. Note: You will be prompted if the `.env` file is missing. The script will ignore any PostgreSQL values, as RDS uses its own system::
# Set the environment variables
python ebsetenv.py
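If you are curious what such a script does: one plausible sketch (illustrative only, not the actual ``ebsetenv.py`` shipped with the template) reads the ``.env`` file, skips the PostgreSQL-related keys, and hands the rest to ``eb setenv``:

.. code-block:: python

    # Illustrative sketch only -- not the actual ebsetenv.py.
    import subprocess

    IGNORED_PREFIXES = ('POSTGRES_', 'DATABASE_URL')

    pairs = []
    with open('.env') as env_file:
        for line in env_file:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            key, _, value = line.partition('=')
            if key.startswith(IGNORED_PREFIXES):
                continue
            pairs.append('%s=%s' % (key, value))

    # `eb setenv KEY=VALUE ...` pushes the variables to the current environment.
    subprocess.check_call(['eb', 'setenv'] + pairs)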
Speaking of PostgreSQL, go to the Elastic Beanstalk configuration panel for RDS. Create a new RDS database with these attributes:
* PostgreSQL
* Version 9.4.9
* Size db.t2.micro (You can upgrade later)
(Get some coffee, this is going to take a while)
Once you have a database specified, deploy again so your instance can pick up the new PostgreSQL values::
eb deploy
Take a look::
eb open
FAQ
-----
Why Not Use Docker on Elastic Beanstalk?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Because I didn't want to add an abstraction (Docker) on top of an abstraction (Elastic Beanstalk) on top of an abstraction (Cookiecutter Django).
Why Can't I Use Both Docker/Heroku with Elastic Beanstalk?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Because the environment variables that our Docker and Heroku setups use for PostgreSQL access are different from how Amazon RDS handles this access. At this time we're just trying to get things to work reliably with Elastic Beanstalk, and full integration will come later.

View File

@ -1,5 +1,5 @@
Getting Up and Running with Docker
==================================
Getting Up and Running Locally With Docker
==========================================
.. index:: Docker
@ -7,61 +7,18 @@ The steps below will get you up and running with a local development environment
All of these commands assume you are in the root of your generated project.
Prerequisites
--------------
-------------
If you don't already have these installed, get them all by installing `Docker Toolbox`_.
You'll need at least Docker 1.10.
* docker
* docker-machine
* docker-compose
* virtualbox
If you don't already have it installed, follow the instructions for your OS:
.. _`Docker Toolbox`: https://github.com/docker/toolbox/releases
Create the Machine (Optional)
-------------------------------
On Ubuntu you have native Docker, so you don't need to create a VM with
docker-machine to use it.
However, on Mac/Windows/other systems without native Docker, you'll want to
start by creating a VM with docker-machine::
$ docker-machine create --driver virtualbox dev1
**Note:** If you want to have more than one docker development environment, then
name them accordingly. Instead of 'dev1' you might have 'dev2', 'myproject',
'djangopackages', et al.
Get the IP Address
--------------------
Once your machine is up and running, run this::
$ docker-machine ip dev1
123.456.789.012
This is also the IP address where the Django project will be served from.
Saving changes
--------------
If you are using OS X or Windows, you need to create a /data partition inside the
virtual machine that runs the docker daemon in order to make all changes persistent.
If you don't do that, your /data directory will get wiped out on every reboot.
To create a persistent folder, log into the virtual machine by running::
$ docker-machine ssh dev1
$ sudo su
$ mkdir /data
$ echo 'ln -sfn /mnt/sda1/data /data' >> /var/lib/boot2docker/bootlocal.sh
In case you are wondering why you can't use a host volume to keep the files on
your Mac: as of `boot2docker` 1.7 you'll run into permission problems with mounted
host volumes if the container creates its own user and chowns the directories
on the volume. Postgres does that, so we need this quick fix to ensure that
all development data persists.
- On Mac OS X, you'll need `Docker for Mac`_
- On Windows, you'll need `Docker for Windows`_
- On Linux, you'll need `docker-engine`_
.. _`Docker for Mac`: https://docs.docker.com/engine/installation/mac/
.. _`Docker for Windows`: https://docs.docker.com/engine/installation/windows/
.. _`docker-engine`: https://docs.docker.com/engine/installation/
Build the Stack
---------------
@ -70,15 +27,15 @@ This can take a while, especially the first time you run this particular command
on your development system::
$ docker-compose -f dev.yml build
If you want to build the production environment you don't have to pass the ``-f`` argument; it will automatically use docker-compose.yml.
Boot the System
---------------
This brings up both Django and PostgreSQL.
The first time it is run it might take a while to get started, but subsequent
runs will occur quickly.
Open a terminal at the project root and run the following for local development::
@ -92,12 +49,12 @@ You can also set the environment variable ``COMPOSE_FILE`` pointing to ``dev.yml
And then run::
$ docker-compose up
Running management commands
~~~~~~~~~~~~~~~~~~~~~~~~~~~
As with any shell command that we wish to run in our container, this is done
using the ``docker-compose run`` command.
To migrate your app and to create a superuser, run::
@ -107,17 +64,17 @@ To migrate your app and to create a superuser, run::
Here we specify the ``django`` container as the location to run our management commands.
Production Mode
~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~
Instead of using `dev.yml`, you would use `docker-compose.yml`.
Other Useful Tips
------------------
-----------------
Make a machine the active unit
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This tells our computer that all future commands are specifically for the dev1 machine.
Using the ``eval`` command we can switch machines as needed.
::
@ -132,3 +89,54 @@ If you want to run the stack in detached mode (in the background), use the ``-d`
::
$ docker-compose -f dev.yml up -d
Debugging
~~~~~~~~~~~~~
ipdb
"""""
If you are using the following within your code to debug:
::
import ipdb; ipdb.set_trace()
Then you may need to run the following for it to work as desired:
::
$ docker-compose -f dev.yml run --service-ports django
django-debug-toolbar
""""""""""""""""""""
In order for django-debug-toolbar to work with Docker you need to add your docker-machine IP address (the output of `Get the IP Address`_) to ``INTERNAL_IPS`` in ``local.py``.
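A rough sketch of that addition (``INTERNAL_IPS`` is a standard Django setting, but the exact address and settings layout depend on your machine; the IP below is only an example):

.. code-block:: python

    # config/settings/local.py -- illustrative; use the IP printed by
    # `docker-machine ip <your-machine-name>` for your own setup.
    INTERNAL_IPS = ['127.0.0.1', '192.168.99.100']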
.. May be a better place to put this, as it is not Docker specific.
You may need to add the following to your CSS in order for the django-debug-toolbar to be visible (this applies whether Docker is being used or not):
.. code-block:: css
/* Override Bootstrap 4 styling on Django Debug Toolbar */
#djDebug[hidden], #djDebug [hidden] {
display: block !important;
}
#djDebug [hidden][style='display: none;'] {
display: none !important;
}
Using the MailHog Docker Container
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In development you can (optionally) use MailHog_ for email testing. If you selected `use_docker`, MailHog is added as a Docker container. To use MailHog:
1. Make sure that the ``mailhog`` docker container is up and running
2. Open your browser and go to ``http://127.0.0.1:8025``
.. _Mailhog: https://github.com/mailhog/MailHog/

View File

@ -9,44 +9,56 @@ The steps below will get you up and running with a local development environment
* virtualenv
* PostgreSQL
First make sure to create and activate a virtualenv_, then open a terminal at the project root and install the os dependencies::
First make sure to create and activate a virtualenv_.
$ sudo ./install_os_dependencies.sh install
.. _virtualenv: http://docs.python-guide.org/en/latest/dev/virtualenvs/
Then install the requirements for your local development::
$ pip install -r requirements/local.txt
.. _virtualenv: http://docs.python-guide.org/en/latest/dev/virtualenvs/
Then, create a PostgreSQL database with the following command, where `[project_slug]` is the value you entered for your project's `project_slug`::
Then, create a PostgreSQL database with the following command, where `[repo_name]` is the value you entered for your project's `repo_name`::
$ createdb [repo_name]
`cookiecutter-django` uses the excellent `django-environ`_ package with its ``DATABASE_URL`` environment variable to simplify database configuration in your Django settings. Now all you have to do is compose a definition for ``DATABASE_URL``:
.. parsed-literal::
$ export DATABASE_URL="postgres://*<pg_user_name>*:*<pg_user_password>*\ @127.0.0.1:\ *<pg_port>*/*<pg_database_name>*"
.. _django-environ: http://django-environ.readthedocs.org
$ createdb [project_slug]
You can now run the usual Django ``migrate`` and ``runserver`` commands::
$ python manage.py migrate
$ python manage.py runserver
**Setup your email backend**
At this point you can take a break from setup and start getting to know the files in the project.
But if you want to go further with setup, read on.
(Note: the following sections still need to be revised)
Setting Up Env Vars for Production
-----------------------------------
`Cookiecutter Django` uses the excellent `django-environ`_ package, which reads the ``DATABASE_URL`` environment variable to simplify database configuration in your Django settings.
Rename env.example to .env to begin updating the file with your own environment variables. To add your database, define ``DATABASE_URL`` and add it to the .env file, as shown below:
.. parsed-literal::
DATABASE_URL="postgres://*<pg_user_name>*:*<pg_user_password>*\ @127.0.0.1:\ *<pg_port>*/*<pg_database_name>*"
.. _django-environ: http://django-environ.readthedocs.io
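For reference, a minimal sketch of how a settings module can consume that variable with `django-environ`_ (names and defaults here are illustrative, not the exact generated settings):

.. code-block:: python

    # Illustrative sketch -- the generated settings may differ in detail.
    import environ

    env = environ.Env()

    DATABASES = {
        # Parses the DATABASE_URL environment variable into Django's format.
        'default': env.db('DATABASE_URL'),
    }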
Set up your email backend
-------------------------
django-allauth sends an email to verify users (and superusers) after signup and login (if they are still not verified). To send email, you need to `configure your email backend`_.
.. _configure your email backend: http://docs.djangoproject.com/en/1.9/topics/email/#smtp-backend
In development you can (optionally) use MailHog_ for email testing. MailHog is built with Go so there are no dependencies. To use MailHog::
In development you can (optionally) use MailHog_ for email testing. MailHog is built with Go so there are no dependencies. To use MailHog:
1. `Download the latest release`_ for your operating system
2. Rename the executable to ``mailhog`` and copy it to the root of your project directory
3. Make sure it is executable (e.g. ``chmod +x mailhog``)
4. Execute mailhog from the root of your project in a new terminal window (e.g. ``./mailhog``)
5. All emails generated from your django app can be seen on http://127.0.0.1:8025/
.. _Mailhog: https://github.com/mailhog/MailHog/
.. _Download the latest release: https://github.com/mailhog/MailHog/releases
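For the Django side of local email testing, a minimal sketch that routes outgoing mail to MailHog could look like this (MailHog's default SMTP port of 1025 is assumed; this is illustrative, not the exact generated configuration):

.. code-block:: python

    # Local-development sketch: send all outgoing mail to a locally running MailHog.
    EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
    EMAIL_HOST = '127.0.0.1'
    EMAIL_PORT = 1025  # MailHog's default SMTP port; the web UI runs on 8025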
@ -59,22 +71,6 @@ In production basic email configuration is setup to send emails with Mailgun_
**Live reloading and Sass CSS compilation**
If you'd like to take advantage of live reloading and Sass / Compass CSS compilation you can do so with the included Grunt task.
If you'd like to take advantage of live reloading and Sass / Compass CSS compilation you can do so with a little bit of `prep work`_.
Make sure that nodejs_ is installed. Then in the project root run::
$ npm install
.. _nodejs: http://nodejs.org/download/
Now you just need::
$ grunt serve
The base app will now run as it would with the usual ``manage.py runserver`` but with live reloading and Sass compilation enabled.
To get live reloading to work you'll probably need to install an `appropriate browser extension`_
.. _appropriate browser extension: http://feedback.livereload.com/knowledgebase/articles/86242-how-do-i-install-and-use-the-browser-extensions-
It's time to write the code!!!
.. _prep work: https://cookiecutter-django.readthedocs.io/en/latest/live-reloading-and-sass-compilation.html

View File

@ -0,0 +1,40 @@
============================
Database Backups with Docker
============================
The database has to be running to create/restore a backup. These are local examples; if you want to use them on a remote server, remove ``-f dev.yml`` from each command.
Running Backups
================
Run the app with `docker-compose -f dev.yml up`.
To create a backup, run::
docker-compose -f dev.yml run postgres backup
To list backups, run::
docker-compose -f dev.yml run postgres list-backups
To restore a backup, run::
docker-compose -f dev.yml run postgres restore filename.sql
Where <containerId> is the ID of the Postgres container. To get it, run::
docker ps
To copy the files from the running Postgres container to the host system::
docker cp <containerId>:/backups /host/path/target
Restoring From Backups
======================
To restore the production database to a local PostgreSQL database::
createdb NAME_OF_DATABASE
psql NAME_OF_DATABASE < NAME_OF_BACKUP_FILE

View File

@ -3,14 +3,14 @@ FAQ
.. index:: FAQ, 12-Factor App
Why is there a django.contrib.sites directory in cookiecutter-django?
Why is there a django.contrib.sites directory in Cookiecutter Django?
---------------------------------------------------------------------
It is there to add a migration so you don't have to manually change the ``sites.Site`` record from ``example.com`` to whatever your domain is. Instead, your ``{{cookiecutter.domain_name}}`` and ``{{cookiecutter.project_name}}`` values are placed by **Cookiecutter** in the domain and name fields respectively.
See `0002_set_site_domain_and_name.py`_.
See `0003_set_site_domain_and_name.py`_.
.. _`0002_set_site_domain_and_name.py`: https://github.com/pydanny/cookiecutter-django/blob/master/%7B%7Bcookiecutter.repo_name%7D%7D/%7B%7Bcookiecutter.repo_name%7D%7D/contrib/sites/migrations/0002_set_site_domain_and_name.py
.. _`0003_set_site_domain_and_name.py`: https://github.com/pydanny/cookiecutter-django/blob/master/%7B%7Bcookiecutter.project_slug%7D%7D/%7B%7Bcookiecutter.project_slug%7D%7D/contrib/sites/migrations/0003_set_site_domain_and_name.py
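For illustration, a data migration of this kind generally looks something like the sketch below (the concrete domain and name come from your cookiecutter answers; the record id and migration dependency shown here are assumptions):

.. code-block:: python

    # Illustrative sketch of a sites data migration.
    from django.db import migrations


    def set_site(apps, schema_editor):
        Site = apps.get_model('sites', 'Site')
        Site.objects.update_or_create(
            id=1,
            defaults={'domain': 'example.com', 'name': 'My Project'},
        )


    class Migration(migrations.Migration):

        dependencies = [('sites', '0001_initial')]

        operations = [migrations.RunPython(set_site, migrations.RunPython.noop)]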
Why aren't you using just one configuration file (12-Factor App)

View File

@ -1,29 +1,31 @@
.. cookiecutter-django documentation master file.
Welcome to cookiecutter-django's documentation!
Welcome to Cookiecutter Django's documentation!
====================================================================
A Cookiecutter_ template for Django.
.. _cookiecutter: https://github.com/audreyr/cookiecutter
.. note:: This is an in-progress documentation reorganization. Locations of files may change dramatically over the course of the next few days. See https://github.com/pydanny/cookiecutter-django/issues/335
Contents:
.. toctree::
:maxdepth: 2
project-generation-options
developing-locally
developing-locally-docker
settings
linters
live-reloading-and-sass-compilation
deployment-on-pythonanywhere
deployment-on-heroku
deployment-with-docker
docker-postgres-backups
faq
troubleshooting
my-favorite-cookie
deployment-with-elastic-beanstalk
Indices and tables
==================

View File

@ -0,0 +1,17 @@
PostgreSQL Installation Basics
==============================
.. index:: pip, virtualenv, PostgreSQL
The steps below will get you up and running with PostgreSQL. This assumes you have pip and virtualenv_ installed.
.. _virtualenv: http://docs.python-guide.org/en/latest/dev/virtualenvs/
On Mac
Install PostgreSQLapp_ from the browser and move it into your Applications folder. Then install PostgreSQL from HomeBrew_::
$ brew install postgres
.. _PostgreSQLapp: http://postgresapp.com/
.. _HomeBrew: http://brew.sh/

View File

@ -99,9 +99,9 @@ if "%1" == "qthelp" (
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\{{ cookiecutter.repo_name }}.qhcp
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\{{ cookiecutter.repo_name }}.ghc
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.ghc
goto end
)

100
docs/my-favorite-cookie.rst Normal file
View File

@ -0,0 +1,100 @@
************************************************
Creating your first app with Cookiecutter-Django
************************************************
This tutorial will show you how to build a simple app using the `Cookiecutter Django <https://github.com/pydanny/cookiecutter-django>`_ templating system. We'll be building a cookie polling app to determine the most popular flavor of cookie.
Developers who have never used Django will learn the basics of creating a Django app; developers who are experienced with Django will learn how to set up a project within the Cookiecutter system. While many Django tutorials use the default SQLite database, Cookiecutter Django uses PostgreSQL only, so we'll have you install and use that.
Dependencies
============
This tutorial was written on Windows 10 using `git bash <https://git-for-windows.github.io/>`_; alternate instructions for Mac OS and Linux will be provided when needed. Any Linux-style shell should work for the following commands.
You should have your preferred versions of `Python <https://www.python.org/downloads/>`_
and `Django <https://www.djangoproject.com/download/>`_ installed. Use the latest stable versions if you have no preference.
You should have `Virtualenv <https://virtualenv.pypa.io/en/stable/>`_ and `Cookiecutter <https://github.com/pydanny/cookiecutter-django/>`_ installed:
.. code-block:: bash
$ pip install virtualenv
$ pip install cookiecutter
You should also have `PostgreSQL <https://www.postgresql.org/download/>`_ installed on your machine--just download and run the installer for your OS. The install menu will prompt you for a password, which you'll use when creating the project's database.
Instructions
============
1. **Setup** -- how to set up a virtual environment
2. **Cookiecutter** -- use Cookiecutter to initialize a project with your own customized information.
3. **Building the App** -- creating the My Favorite Cookie application.
============
1. Setup
============
Virtual Environment
"""""""""""""""""""
Create a virtual environment for your project. Cookiecutter will install a bunch of dependencies for you automatically; using a virtualenv will prevent this from interfering with your other work.
.. code-block:: bash
$ virtualenv c:/.virtualenvs/cookie_polls
Replace ``c:/.virtualenvs`` with the path to your own ``.virtualenvs`` folder.
Activate the virtual environment by calling ``source`` on the ``activate`` shell script. On Windows you'll call this from the virtualenv's ``scripts`` folder:
.. code-block:: bash
$ source /path/to/.virtualenvs/cookie_polls/scripts/activate
On other operating systems, it'll be found in the ``bin`` folder.
.. code-block:: bash
$ source /path/to/.virtualenvs/cookie_polls/bin/activate
You'll know the virtual environment is active because its name will appear in parentheses before the command prompt. When you're done with this project, you can leave the virtual environment with the ``deactivate`` command.
.. code-block:: bash
(cookie_polls)
$ deactivate
Now you're ready to create your project using Cookiecutter.
===============
2. Cookiecutter
===============
Django developers may be familiar with the ``startproject`` command, which initializes the directory structure and required files for a bare-bones Django project. While this is fine when you're just learning Django for the first time, it's not great for a real production app. Cookiecutter takes care of a lot of standard tasks for you, including installing software dependencies, setting up testing files, and including and organizing common libraries like Bootstrap and AngularJS. It also generates a software license and a README.
Change directories into the folder where you want your project to live, and run ``cookiecutter`` followed by the URL of the Cookiecutter Django GitHub repo.
.. code-block:: bash
$ cd /my/project/folder
(cookie_polls)
my/project/folder
$ cookiecutter https://github.com/pydanny/cookiecutter-django
This will prompt you for a bunch of values specific to your project. Press "enter" without typing anything to use the default values, which are shown in [brackets] after the question. You can learn about all the different options `here <http://cookiecutter-django.readthedocs.io/en/latest/project-generation-options.html>`_, but for now we'll use the defaults for everything but your name, your email, the project's name, and the project's description.
.. code-block:: text
project_name [project_name]: My Favorite Cookie
project_slug [My_Favorite_Cookie]:
author_name [Your Name]: Emily Cain
email [Your email]: contact@emcain.net
description [A short description of the project.]: Poll your friends to determine the most popular cookie.
Then hit "enter" to use the default values for everything else.

View File

@ -4,7 +4,7 @@ Project Generation Options
project_name [project_name]:
Your human-readable project name, including any capitalization or spaces.
repo_name [project_name]:
project_slug [project_name]:
The slug of your project, without dashes or spaces. Used to name your repo
and in other places where a Python-importable version of your project name
is needed.
@ -37,19 +37,75 @@ use_celery [n]
use_mailhog [n]
Whether to use MailHog_. MailHog is a tool that simulates email receiving
for development purposes. It runs a simple SMTP server which catches
any message sent to it. Messages are displayed in a web interface which runs at ``http://localhost:8025/`` You need to download the MailHog executable for your operating system, see the 'Developing Locally' docs for instructions.
any message sent to it. Messages are displayed in a web interface which
runs at ``http://localhost:8025/`` You need to download the MailHog
executable for your operating system, see the 'Developing Locally' docs
for instructions.
use_sentry [n]
use_sentry_for_error_reporting [n]
Whether to use Sentry_ to log errors from your project.
use_opbeat [n]
Whether to use Opbeat_ for performance monitoring and code optimization.
use_pycharm [n]
Adds support for developing in PyCharm_ with a preconfigured .idea directory.
windows [n]
Whether you'll be developing on Windows.
use_python2 [n]
use_python3 [y]
By default, the Python code generated will be for Python 3.x. But if you
answer `y` here, it will be legacy Python 2.7 code.
answer `n` here, it will be legacy Python 2.7 code.
use_docker [y]
Whether to use Docker_, separating the app and database into separate
containers.
use_heroku [n]
Add configuration to deploy the application to a Heroku_ instance.
use_compressor [n]
Use `Django Compressor`_ to minify and combine rendered JavaScript and CSS
into cacheable static resources.
js_task_runner [1]
Select a JavaScript task runner. The choices are:
1. Gulp_
2. Grunt_
3. Webpack_
4. None
use_lets_encrypt [n]
Use `Let's Encrypt`_ as the certificate authority for this project.
open_source_license [1]
Select a software license for the project. The choices are:
1. MIT_
2. BSD_
3. GPLv3_
4. `Apache Software License 2.0`_
5. Not open source
**NOTE:** *If you choose to use Docker, selecting a JavaScript task runner is
not supported out of the box.*
.. _WhiteNoise: https://github.com/evansd/whitenoise
.. _Celery: https://github.com/celery/celery
.. _MailHog: https://github.com/mailhog/MailHog
.. _Sentry: https://github.com/getsentry/sentry
.. _Opbeat: https://github.com/opbeat/opbeat_python
.. _PyCharm: https://www.jetbrains.com/pycharm/
.. _Docker: https://github.com/docker/docker
.. _Heroku: https://github.com/heroku/heroku-buildpack-python
.. _Django Compressor: https://github.com/django-compressor/django-compressor
.. _Gulp: https://github.com/gulpjs/gulp
.. _Grunt: https://github.com/gruntjs/grunt
.. _Webpack: https://github.com/webpack/webpack
.. _Let's Encrypt: https://github.com/certbot/certbot
.. _MIT: https://opensource.org/licenses/MIT
.. _BSD: https://opensource.org/licenses/BSD-3-Clause
.. _GPLv3: https://www.gnu.org/licenses/gpl.html
.. _Apache Software License 2.0: http://www.apache.org/licenses/LICENSE-2.0

View File

@ -27,7 +27,7 @@ DJANGO_EMAIL_SUBJECT_PREFIX EMAIL_SUBJECT_PREFIX n/a
DJANGO_ALLOWED_HOSTS ALLOWED_HOSTS ['*'] ['your_domain_name']
======================================= =========================== ============================================== ======================================================================
The following table lists settings and their defaults for third-party applications, which may or may be part of your project:
The following table lists settings and their defaults for third-party applications, which may or may not be part of your project:
======================================= =========================== ============================================== ======================================================================
Environment Variable Django Setting Development Default Production Default
@ -40,15 +40,17 @@ DJANGO_SENTRY_CLIENT SENTRY_CLIENT n/a
DJANGO_SENTRY_LOG_LEVEL SENTRY_LOG_LEVEL n/a logging.INFO
DJANGO_MAILGUN_API_KEY MAILGUN_ACCESS_KEY n/a raises error
DJANGO_MAILGUN_SERVER_NAME MAILGUN_SERVER_NAME n/a raises error
MAILGUN_SENDER_DOMAIN MAILGUN_SENDER_DOMAIN n/a raises error
NEW_RELIC_APP_NAME NEW_RELIC_APP_NAME n/a raises error
NEW_RELIC_LICENSE_KEY NEW_RELIC_LICENSE_KEY n/a raises error
DJANGO_OPBEAT_APP_ID OPBEAT['APP_ID'] n/a raises error
DJANGO_OPBEAT_SECRET_TOKEN OPBEAT['SECRET_TOKEN'] n/a raises error
DJANGO_OPBEAT_ORGANIZATION_ID OPBEAT['ORGANIZATION_ID'] n/a raises error
======================================= =========================== ============================================== ======================================================================
--------------
Other Settings
--------------
--------------------------
Other Environment Settings
--------------------------
ACCOUNT_ALLOW_REGISTRATION (=True)
Allow enable or disable user registration through `django-allauth` without disabling other characteristics like authentication and account management.
DJANGO_ACCOUNT_ALLOW_REGISTRATION (=True)
Allows enabling or disabling user registration through `django-allauth` without disabling other characteristics like authentication and account management. (Django Setting: ACCOUNT_ALLOW_REGISTRATION)
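For example, a flag like this is typically read from the environment roughly as follows (a sketch using `django-environ`, assuming an ``env = environ.Env()`` helper as in the generated settings; the default shown is an assumption):

.. code-block:: python

    # Sketch: read the registration flag from the environment, defaulting to True.
    ACCOUNT_ALLOW_REGISTRATION = env.bool('DJANGO_ACCOUNT_ALLOW_REGISTRATION', default=True)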

9
docs/troubleshooting.rst Normal file
View File

@ -0,0 +1,9 @@
Troubleshooting
=====================================
This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications.
#. If you get the error ``jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'now'.``, please upgrade your cookiecutter version to >= 1.4 (see issue 528_)
#. ``project_slug`` must be a valid Python module name or you will have issues on imports.
.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373

View File

@ -3,7 +3,8 @@ Does the following:
1. Generates and saves random secret key
2. Removes the taskapp if celery isn't going to be used
3. Copy files from /docs/ to {{ cookiecutter.repo_name }}/docs/
3. Removes the .idea directory if PyCharm isn't going to be used
4. Copy files from /docs/ to {{ cookiecutter.project_slug }}/docs/
TODO: this might have to be moved to a pre_gen_hook
@ -11,13 +12,11 @@ A portion of this code was adopted from Django's standard crypto functions and
utilities, specifically:
https://github.com/django/django/blob/master/django/utils/crypto.py
"""
import hashlib
from __future__ import print_function
import os
import random
import shutil
from cookiecutter.config import DEFAULT_CONFIG
# Get the root project directory
PROJECT_DIRECTORY = os.path.realpath(os.path.curdir)
@ -26,34 +25,26 @@ try:
random = random.SystemRandom()
using_sysrandom = True
except NotImplementedError:
# import warnings
# warnings.warn('A secure pseudo-random number generator is not available '
# 'on your system. Falling back to Mersenne Twister.')
using_sysrandom = False
def get_random_string(
length=50,
allowed_chars='abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)'):
allowed_chars='abcdefghijklmnopqrstuvwxyz0123456789!@#%^&*(-_=+)'):
"""
Returns a securely generated random string.
The default length of 50 with the a-z, 0-9 and symbol character set returns
roughly a 280-bit value. log_2(49^50) =~ 280 bits
"""
if not using_sysrandom:
# This is ugly, and a hack, but it makes things better than
# the alternative of predictability. This re-seeds the PRNG
# using a value that is hard for an attacker to predict, every
# time a random string is required. This may change the
# properties of the chosen random sequence slightly, but this
# is better than absolute predictability.
random.seed(
hashlib.sha256(
("%s%s%s" % (
random.getstate(),
time.time(),
settings.SECRET_KEY)).encode('utf-8')
).digest())
return ''.join(random.choice(allowed_chars) for i in range(length))
if using_sysrandom:
return ''.join(random.choice(allowed_chars) for i in range(length))
print(
"Cookiecutter Django couldn't find a secure pseudo-random number generator on your system."
" Please change change your SECRET_KEY variables in conf/settings/local.py and env.example"
" manually."
)
return "CHANGEME!!"
def set_secret_key(setting_file_location):
# Open locals.py
@ -91,15 +82,122 @@ def make_secret_key(project_directory):
set_secret_key(env_file)
def remove_file(file_name):
if os.path.exists(file_name):
os.remove(file_name)
def remove_task_app(project_directory):
"""Removes the taskapp if celery isn't going to be used"""
# Determine the local_setting_file_location
task_app_location = os.path.join(
PROJECT_DIRECTORY,
'{{ cookiecutter.repo_name }}/taskapp'
'{{ cookiecutter.project_slug }}/taskapp'
)
shutil.rmtree(task_app_location)
def remove_pycharm_dir(project_directory):
"""
Removes directories related to PyCharm
if it isn't going to be used
"""
idea_dir_location = os.path.join(PROJECT_DIRECTORY, '.idea/')
if os.path.exists(idea_dir_location):
shutil.rmtree(idea_dir_location)
docs_dir_location = os.path.join(PROJECT_DIRECTORY, 'docs/pycharm/')
if os.path.exists(docs_dir_location):
shutil.rmtree(docs_dir_location)
def remove_heroku_files():
"""
Removes files needed for heroku if it isn't going to be used
"""
filenames = ["Procfile", "runtime.txt"]
if '{{ cookiecutter.use_elasticbeanstalk_experimental }}'.lower() != 'y':
filenames.append("requirements.txt")
for filename in ["Procfile", "runtime.txt"]:
file_name = os.path.join(PROJECT_DIRECTORY, filename)
remove_file(file_name)
def remove_docker_files():
"""
Removes files needed for docker if it isn't going to be used
"""
for filename in ["dev.yml", "docker-compose.yml", ".dockerignore"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
shutil.rmtree(os.path.join(
PROJECT_DIRECTORY, "compose"
))
def remove_grunt_files():
"""
Removes files needed for grunt if it isn't going to be used
"""
for filename in ["Gruntfile.js"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
def remove_gulp_files():
"""
Removes files needed for gulp if it isn't going to be used
"""
for filename in ["gulpfile.js"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
def remove_packageJSON_file():
"""
Removes the package.json file if no JS task runner is going to be used
"""
for filename in ["package.json"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
def remove_certbot_files():
"""
Removes files needed for certbot if it isn't going to be used
"""
nginx_dir_location = os.path.join(PROJECT_DIRECTORY, 'compose/nginx')
for filename in ["nginx-secure.conf", "start.sh", "dhparams.example.pem"]:
file_name = os.path.join(nginx_dir_location, filename)
remove_file(file_name)
def remove_copying_files():
"""
Removes files needed for the GPLv3 licence if it isn't going to be used
"""
for filename in ["COPYING"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
def remove_elasticbeanstalk():
"""
Removes elastic beanstalk components
"""
docs_dir_location = os.path.join(PROJECT_DIRECTORY, '.ebextensions')
if os.path.exists(docs_dir_location):
shutil.rmtree(docs_dir_location)
filenames = ["ebsetenv.py", ]
if '{{ cookiecutter.use_heroku }}'.lower() != 'y':
filenames.append("requirements.txt")
for filename in filenames:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
# IN PROGRESS
# def copy_doc_files(project_directory):
# cookiecutters_dir = DEFAULT_CONFIG['cookiecutters_dir']
@ -125,5 +223,60 @@ make_secret_key(PROJECT_DIRECTORY)
if '{{ cookiecutter.use_celery }}'.lower() == 'n':
remove_task_app(PROJECT_DIRECTORY)
# 3. Copy files from /docs/ to {{ cookiecutter.repo_name }}/docs/
# copy_doc_files(PROJECT_DIRECTORY)
# 3. Removes the .idea directory if PyCharm isn't going to be used
if '{{ cookiecutter.use_pycharm }}'.lower() != 'y':
remove_pycharm_dir(PROJECT_DIRECTORY)
# 4. Removes all heroku files if it isn't going to be used
if '{{ cookiecutter.use_heroku }}'.lower() != 'y':
remove_heroku_files()
# 5. Removes all docker files if it isn't going to be used
if '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_docker_files()
# 6. Removes all JS task manager files if it isn't going to be used
if '{{ cookiecutter.js_task_runner}}'.lower() == 'gulp':
remove_grunt_files()
elif '{{ cookiecutter.js_task_runner}}'.lower() == 'grunt':
remove_gulp_files()
else:
remove_gulp_files()
remove_grunt_files()
remove_packageJSON_file()
# 7. Removes all certbot/letsencrypt files if it isn't going to be used
if '{{ cookiecutter.use_lets_encrypt }}'.lower() != 'y':
remove_certbot_files()
# 8. Display a warning if use_docker and a JS task runner are selected. JS task
# runners aren't supported by our docker config atm.
if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
"You selected to use docker and a JS task runner. This is NOT supported out of the box for now. You "
"can continue to use the project like you normally would, but you will need to add a "
"js task runner service to your docker configuration manually."
)
# 9. Removes the certbot/letsencrypt files and display a warning if use_lets_encrypt is selected and use_docker isn't.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_certbot_files()
print(
"You selected to use Let's Encrypt and didn't select to use docker. This is NOT supported out of the box for now. You "
"can continue to use the project like you normally would, but Let's Encrypt files have been included."
)
# 10. Directs the user to the documentation if certbot and docker are selected.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
"You selected to use Let's Encrypt, please see the documentation for instructions on how to use this in production. "
"You must generate a dhparams.pem file before running docker-compose in a production environment."
)
# 11. Removes files needed for the GPLv3 licence if it isn't going to be used.
if '{{ cookiecutter.open_source_license}}' != 'GPLv3':
remove_copying_files()
# 12. Remove Elastic Beanstalk files
if '{{ cookiecutter.use_elasticbeanstalk_experimental }}'.lower() != 'y':
remove_elasticbeanstalk()

11
hooks/pre_gen_project.py Normal file
View File

@ -0,0 +1,11 @@
project_slug = '{{ cookiecutter.project_slug }}'
if hasattr(project_slug, 'isidentifier'):
assert project_slug.isidentifier(), 'Project slug should be valid Python identifier!'
elasticbeanstalk = '{{ cookiecutter.use_elasticbeanstalk_experimental }}'.lower()
heroku = '{{ cookiecutter.use_heroku }}'.lower()
docker = '{{ cookiecutter.use_docker }}'.lower()
if elasticbeanstalk == 'y' and (heroku == 'y' or docker == 'y'):
raise Exception("Cookiecutter Django's EXPERIMENTAL Elastic Beanstalk support is incompatible with Heroku and Docker setups.")

View File

@ -1,10 +1,11 @@
cookiecutter==1.3.0
flake8==2.5.0
cookiecutter==1.4.0
flake8==3.0.4 # pyup: != 2.6.0
sh==1.11
binaryornot==0.4.0
# Testing
pytest==2.8.3
pep8==1.6.2
pyflakes==1.0.0
tox==2.2.1
pytest==3.0.3
pep8==1.7.0
pyflakes==1.3.0
tox==2.4.1
pytest-cookies==0.2.0

View File

@ -0,0 +1,4 @@
# These requirements prevented an upgrade to Django 1.10.
django-coverage-plugin==1.3.1
django-autoslug==1.9.3

View File

@ -1,3 +1,3 @@
[pytest]
[tool:pytest]
python_paths = .
norecursedirs = .tox .git */migrations/* */static/* docs venv */{{cookiecutter.repo_name}}/*
norecursedirs = .tox .git */migrations/* */static/* docs venv */{{cookiecutter.project_slug}}/*

View File

@ -1,7 +1,6 @@
#!/usr/bin/env python
import os
import platform
import sys
try:
@ -11,11 +10,11 @@ except ImportError:
# Our version ALWAYS matches the version of Django we support
# If Django has a new release, we branch, tag, then update this setting after the tag.
version = "1.9.1"
version = '1.10.1'
if sys.argv[-1] == 'tag':
os.system("git tag -a %s -m 'version %s'" % (version, version))
os.system("git push --tags")
os.system('git tag -a %s -m "version %s"' % (version, version))
os.system('git push --tags')
sys.exit()
with open('README.rst') as readme_file:
@ -35,7 +34,7 @@ setup(
classifiers=[
'Development Status :: 4 - Beta',
'Environment :: Console',
'Framework :: Django :: 1.9',
'Framework :: Django :: 1.10',
'Intended Audience :: Developers',
'Natural Language :: English',
'License :: OSI Approved :: BSD License',

26
tests/test_cookiecutter_generation.py Normal file → Executable file
View File

@ -7,23 +7,21 @@ import sh
import pytest
from binaryornot.check import is_binary
PATTERN = "{{(\s?cookiecutter)[.](.*?)}}"
PATTERN = '{{(\s?cookiecutter)[.](.*?)}}'
RE_OBJ = re.compile(PATTERN)
@pytest.fixture
def context():
return {
"project_name": "My Test Project",
"repo_name": "my_test_project",
"author_name": "Test Author",
"email": "test@example.com",
"description": "A short description of the project.",
"domain_name": "example.com",
"version": "0.1.0",
"timezone": "UTC",
"now": "2015/01/13",
"year": "2015"
'project_name': 'My Test Project',
'project_slug': 'my_test_project',
'author_name': 'Test Author',
'email': 'test@example.com',
'description': 'A short description of the project.',
'domain_name': 'example.com',
'version': '0.1.0',
'timezone': 'UTC',
}
@ -46,7 +44,7 @@ def check_paths(paths):
continue
for line in open(path, 'r'):
match = RE_OBJ.search(line)
msg = "cookiecutter variable not replaced in {}"
msg = 'cookiecutter variable not replaced in {}'
assert match is None, msg.format(path)
@ -54,7 +52,7 @@ def test_default_configuration(cookies, context):
result = cookies.bake(extra_context=context)
assert result.exit_code == 0
assert result.exception is None
assert result.project.basename == context['repo_name']
assert result.project.basename == context['project_slug']
assert result.project.isdir()
paths = build_files_list(str(result.project))
@ -72,7 +70,7 @@ def test_enabled_features(cookies, feature_context):
result = cookies.bake(extra_context=feature_context)
assert result.exit_code == 0
assert result.exception is None
assert result.project.basename == feature_context['repo_name']
assert result.project.basename == feature_context['project_slug']
assert result.project.isdir()
paths = build_files_list(str(result.project))

21
tests/test_docker.sh Executable file
View File

@ -0,0 +1,21 @@
#!/bin/sh
# this is a very simple script that tests the docker configuration for cookiecutter-django
# it is meant to be run from the root directory of the repository, eg:
# sh tests/test_docker.sh
# install test requirements
pip install -r requirements.txt
# create a cache directory
mkdir -p .cache/docker
cd .cache/docker
# create the project using the default settings in cookiecutter.json
cookiecutter ../../ --no-input --overwrite-if-exists
cd project_name
# run the project's tests
docker-compose -f dev.yml run django python manage.py test
# return non-zero status code if there are migrations that have not been created
docker-compose -f dev.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but the migrations listed above have not been created and are not saved in version control"; exit 1; }

View File

@ -6,7 +6,7 @@ envlist = py27,py34,py35
passenv = LC_ALL, LANG, HOME
deps =
binaryornot
flake8
flake8==2.5.5
pytest-cookies
sh
commands = py.test {posargs:tests}

View File

@ -1,5 +1,5 @@
[run]
include = {{cookiecutter.repo_name}}/*
include = {{cookiecutter.project_slug}}/*
omit = *migrations*, *tests*
plugins =
django_coverage_plugin

View File

@ -0,0 +1,4 @@
.*
!.coveragerc
!.env
!.pylintrc

View File

@ -0,0 +1,5 @@
packages:
yum:
git: []
postgresql94-devel: []
libjpeg-turbo-devel: []

View File

@ -0,0 +1,46 @@
# This sample requires you to create a separate configuration file that defines the custom
# option settings for CacheCluster properties.
Resources:
MyCacheSecurityGroup:
Type: "AWS::EC2::SecurityGroup"
Properties:
GroupDescription: "Lock cache down to webserver access only"
SecurityGroupIngress :
- IpProtocol : "tcp"
FromPort :
Fn::GetOptionSetting:
OptionName : "CachePort"
DefaultValue: "6379"
ToPort :
Fn::GetOptionSetting:
OptionName : "CachePort"
DefaultValue: "6379"
SourceSecurityGroupName:
Ref: "AWSEBSecurityGroup"
MyElastiCache:
Type: "AWS::ElastiCache::CacheCluster"
Properties:
CacheNodeType:
Fn::GetOptionSetting:
OptionName : "CacheNodeType"
DefaultValue : "cache.t1.micro"
NumCacheNodes:
Fn::GetOptionSetting:
OptionName : "NumCacheNodes"
DefaultValue : "1"
Engine:
Fn::GetOptionSetting:
OptionName : "Engine"
DefaultValue : "redis"
VpcSecurityGroupIds:
-
Fn::GetAtt:
- MyCacheSecurityGroup
- GroupId
Outputs:
ElastiCache:
Description : "ID of ElastiCache Cache Cluster with Redis Engine"
Value :
Ref : "MyElastiCache"

View File

@ -0,0 +1,6 @@
option_settings:
"aws:elasticbeanstalk:customoption":
CacheNodeType : cache.t1.micro
NumCacheNodes : 1
Engine : redis
CachePort : 6379

View File

@ -0,0 +1,17 @@
container_commands:
01_migrate:
command: "source /opt/python/run/venv/bin/activate && python manage.py migrate --noinput"
leader_only: True
02_collectstatic:
command: "source /opt/python/run/venv/bin/activate && python manage.py collectstatic --noinput"
option_settings:
"aws:elasticbeanstalk:application:environment":
DJANGO_SETTINGS_MODULE: "config.settings.production"
REDIS_ENDPOINT_ADDRESS: '`{ "Fn::GetAtt" : [ "MyElastiCache", "RedisEndpoint.Address"]}`'
REDIS_PORT: '`{ "Fn::GetAtt" : [ "MyElastiCache", "RedisEndpoint.Port"]}`'
"aws:elasticbeanstalk:container:python":
WSGIPath: "config/wsgi.py"
NumProcesses: 3
NumThreads: 20
"aws:elasticbeanstalk:container:python:staticfiles":
"/static/": "www/static/"

View File

@ -14,7 +14,7 @@ indent_size = 4
[*.py]
line_length=120
known_first_party={{ cookiecutter.repo_name }}
known_first_party={{ cookiecutter.project_slug }}
multi_line_output=3
default_section=THIRDPARTY
@ -27,3 +27,7 @@ trim_trailing_whitespace = false
[Makefile]
indent_style = tab
[nginx.conf]
indent_style = space
indent_size = 2

View File

@ -24,8 +24,10 @@ sftp-config.json
__pycache__
# Logs
logs
*.log
pip-log.txt
npm-debug.log*
# Unit test / coverage reports
.coverage
@ -38,7 +40,17 @@ htmlcov
*.pot
# Pycharm
.idea
.idea/*
{% if cookiecutter.use_pycharm == 'y' %}
# Provided default Pycharm Run/Debug Configurations should be tracked by git
# In case of local modifications made by Pycharm, use update-index command
# for each changed file, like this:
# git update-index --assume-unchanged .idea/{{cookiecutter.project_slug}}.iml
!.idea/runConfigurations/
!.idea/{{cookiecutter.project_slug}}.iml
!.idea/vcs.xml
!.idea/webResources.xml
{% endif %}
# Vim
@ -56,10 +68,11 @@ node_modules/
.env
# User-uploaded media
{{ cookiecutter.repo_name }}/media/
# Hitch directory
tests/.hitch
{{ cookiecutter.project_slug }}/media/
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
# MailHog binary
mailhog
{% endif %}
staticfiles/

View File

@ -0,0 +1,32 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: migrate" type="Python.DjangoServer" factoryName="Django server" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="launchJavascriptDebuger" value="false" />
<option name="host" value="" />
<option name="additionalOptions" value="" />
<option name="browserUrl" value="" />
<option name="runTestServer" value="false" />
<option name="runNoReload" value="false" />
<option name="useCustomRunCommand" value="true" />
<option name="customRunCommand" value="migrate" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,33 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: runserver" type="Python.DjangoServer" factoryName="Django server" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="launchJavascriptDebuger" value="false" />
<option name="port" value="8000" />
<option name="host" value="0.0.0.0" />
<option name="additionalOptions" value="" />
<option name="browserUrl" value="" />
<option name="runTestServer" value="false" />
<option name="runNoReload" value="false" />
<option name="useCustomRunCommand" value="false" />
<option name="customRunCommand" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,33 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: runserver_plus" type="Python.DjangoServer" factoryName="Django server" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="launchJavascriptDebuger" value="false" />
<option name="port" value="8000" />
<option name="host" value="0.0.0.0" />
<option name="additionalOptions" value="" />
<option name="browserUrl" value="" />
<option name="runTestServer" value="false" />
<option name="runNoReload" value="false" />
<option name="useCustomRunCommand" value="true" />
<option name="customRunCommand" value="runserver_plus" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,30 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: tests - all" type="DjangoTestsConfigurationType" factoryName="Django tests" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="TARGET" value="." />
<option name="SETTINGS_FILE" value="" />
<option name="CUSTOM_SETTINGS" value="false" />
<option name="USE_OPTIONS" value="false" />
<option name="OPTIONS" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,30 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: tests - class: TestUser" type="DjangoTestsConfigurationType" factoryName="Django tests" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="TARGET" value="{{ cookiecutter.project_slug }}.users.tests.test_models.TestUser" />
<option name="SETTINGS_FILE" value="" />
<option name="CUSTOM_SETTINGS" value="false" />
<option name="USE_OPTIONS" value="false" />
<option name="OPTIONS" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,30 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: tests - file: test_models" type="DjangoTestsConfigurationType" factoryName="Django tests" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="TARGET" value="{{ cookiecutter.project_slug }}.users.tests.test_models" />
<option name="SETTINGS_FILE" value="" />
<option name="CUSTOM_SETTINGS" value="false" />
<option name="USE_OPTIONS" value="false" />
<option name="OPTIONS" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,30 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: tests - module: users" type="DjangoTestsConfigurationType" factoryName="Django tests" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="TARGET" value="{{ cookiecutter.project_slug }}.users" />
<option name="SETTINGS_FILE" value="" />
<option name="CUSTOM_SETTINGS" value="false" />
<option name="USE_OPTIONS" value="false" />
<option name="OPTIONS" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,30 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="Docker: tests - specific: test_get_absolute_url" type="DjangoTestsConfigurationType" factoryName="Django tests" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<PathMappingSettings>
<option name="pathMappings">
<list>
<mapping local-root="$PROJECT_DIR$" remote-root="/app" />
</list>
</option>
</PathMappingSettings>
<option name="TARGET" value="{{ cookiecutter.project_slug }}.users.tests.test_models.TestUser.test_get_absolute_url" />
<option name="SETTINGS_FILE" value="" />
<option name="CUSTOM_SETTINGS" value="false" />
<option name="USE_OPTIONS" value="false" />
<option name="OPTIONS" value="" />
<method />
</configuration>
</component>

View File

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>

View File

@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="WebResourcesPaths">
<contentEntries>
<entry url="file://$PROJECT_DIR$">
<entryData>
<resourceRoots>
<path value="file://$PROJECT_DIR$/{{ cookiecutter.project_slug }}/static" />
</resourceRoots>
</entryData>
</entry>
</contentEntries>
</component>
</project>

View File

@ -0,0 +1,34 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="PYTHON_MODULE" version="4">
<component name="FacetManager">
<facet type="django" name="Django">
<configuration>
<option name="rootFolder" value="$MODULE_DIR$" />
<option name="settingsModule" value="config/settings/local.py" />
<option name="manageScript" value="manage.py" />
<option name="environment" value="&lt;map/&gt;" />
</configuration>
</facet>
</component>
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/node_modules" />
</content>
<orderEntry type="sourceFolder" forTests="false" />
</component>
<component name="PackageRequirementsSettings">
<option name="requirementsPath" value="$MODULE_DIR$/requirements/local.txt" />
</component>
<component name="TemplatesService">
<option name="TEMPLATE_CONFIGURATION" value="Django" />
<option name="TEMPLATE_FOLDERS">
<list>
<option value="$MODULE_DIR$/{{ cookiecutter.project_slug }}/templates" />
</list>
</option>
</component>
<component name="TestRunnerService">
<option name="projectConfiguration" value="py.test" />
<option name="PROJECT_TEST_RUNNER" value="py.test" />
</component>
</module>

View File

@ -8,4 +8,7 @@ max-line-length=120
disable=missing-docstring,invalid-name
[DESIGN]
max-parents=13
max-parents=13
[TYPECHECK]
generated-members=REQUEST,acl_users,aq_parent,"[a-zA-Z]+_set{1,2}",save,delete

View File

@ -8,14 +8,8 @@ before_install:
- sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm
language: python
python:
{% if cookiecutter.use_python2 == 'n' -%}
{% if cookiecutter.use_python3 == 'y' -%}
- "3.5"
{% else %}
- "2.7"
{%- endif %}
install:
- "pip install hitch"
- "cd tests"
- "hitch init"
script:
- "hitch test . --extra '{\"xvfb\":true, \"pause_on_failure\":false}'"

View File

@ -0,0 +1 @@
{{ cookiecutter.author_name }}

View File

@ -0,0 +1,674 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<http://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<http://www.gnu.org/philosophy/why-not-lgpl.html>.

View File

@ -112,14 +112,14 @@ module.exports = function (grunt) {
runDjango: {
cmd: 'python <%= paths.manageScript %> runserver'
},
{% if cookiecutter.use_mailhog == "y" -%}runMailHog: {
{% if cookiecutter.use_mailhog == "y" and cookiecutter.use_docker == 'n' -%}runMailHog: {
cmd: './mailhog'
},{%- endif %}
}
});
grunt.registerTask('serve', [
{% if cookiecutter.use_mailhog == "y" -%}
{% if cookiecutter.use_mailhog == "y" and cookiecutter.use_docker == 'n' -%}
'bgShell:runMailHog',
{%- endif %}
'bgShell:runDjango',

View File

@ -0,0 +1,53 @@
{% if cookiecutter.open_source_license == 'MIT' %}
The MIT License (MIT)
Copyright (c) {% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
{% elif cookiecutter.open_source_license == 'BSD' %}
Copyright (c) {% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
* Neither the name of {{ cookiecutter.project_name }} nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
OF THE POSSIBILITY OF SUCH DAMAGE.
{% elif cookiecutter.open_source_license == 'GPLv3' %}
Copyright (c) {% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
{% endif %}

View File

@ -0,0 +1,4 @@
web: gunicorn config.wsgi:application
{% if cookiecutter.use_celery == "y" -%}
worker: celery worker --app={{cookiecutter.project_slug}}.taskapp --loglevel=info
{%- endif %}

View File

@ -0,0 +1,147 @@
{{cookiecutter.project_name}}
{{ '=' * cookiecutter.project_name|length }}
{{cookiecutter.description}}
.. image:: https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg
:target: https://github.com/pydanny/cookiecutter-django/
:alt: Built with Cookiecutter Django
{% if cookiecutter.open_source_license != "Not open source" %}
:License: {{cookiecutter.open_source_license}}
{% endif %}
Settings
--------
Moved to settings_.
.. _settings: http://cookiecutter-django.readthedocs.io/en/latest/settings.html
Basic Commands
--------------
Setting Up Your Users
^^^^^^^^^^^^^^^^^^^^^
* To create a **normal user account**, just go to Sign Up and fill out the form. Once you submit it, you'll see a "Verify Your E-mail Address" page. Go to your console to see a simulated email verification message. Copy the link into your browser. Now the user's email should be verified and ready to go.
* To create a **superuser account**, use this command::
$ python manage.py createsuperuser
For convenience, you can keep your normal user logged in on Chrome and your superuser logged in on Firefox (or similar), so that you can see how the site behaves for both kinds of users.
Test coverage
^^^^^^^^^^^^^
To run the tests, check your test coverage, and generate an HTML coverage report::
$ coverage run manage.py test
$ coverage html
$ open htmlcov/index.html
Running tests with py.test
~~~~~~~~~~~~~~~~~~~~~~~~~~
::
$ py.test
Live reloading and Sass CSS compilation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Moved to `Live reloading and SASS compilation`_.
.. _`Live reloading and SASS compilation`: http://cookiecutter-django.readthedocs.io/en/latest/live-reloading-and-sass-compilation.html
{% if cookiecutter.use_celery == "y" %}
Celery
^^^^^^
This app comes with Celery.
To run a celery worker:
.. code-block:: bash
cd {{cookiecutter.project_slug}}
celery -A {{cookiecutter.project_slug}}.taskapp worker -l info
Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the same folder as *manage.py*, you should be fine.
{% endif %}
{% if cookiecutter.use_mailhog == "y" %}
Email Server
^^^^^^^^^^^^
{% if cookiecutter.use_docker == 'y' %}
In development, it is often nice to be able to see emails that are being sent from your application. For that reason, the local SMTP server `MailHog`_, which comes with a web interface, is available as a Docker container.
.. _mailhog: https://github.com/mailhog/MailHog
The MailHog container will start automatically when you run all of the Docker containers.
Please check the `cookiecutter-django Docker documentation`_ for more details on how to start all containers.
With MailHog running, to view messages that are sent by your application, open your browser and go to ``http://127.0.0.1:8025``
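For example, assuming the development compose file is named ``dev.yml`` (the name used elsewhere in this template), the whole stack, MailHog included, can be brought up with::

    $ docker-compose -f dev.yml up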
{% else %}
In development, it is often nice to be able to see emails that are being sent from your application. If you choose to use `MailHog`_ when generating the project, a local SMTP server with a web interface will be available.
.. _mailhog: https://github.com/mailhog/MailHog
To start the service, make sure you have nodejs installed, and then type the following::
$ npm install
$ grunt serve
(After the first run you only need to type ``grunt serve``.) This will start an email server that listens on ``127.0.0.1:1025``, in addition to starting your Django project and a watch task for live reload.
To view messages that are sent by your application, open your browser and go to ``http://127.0.0.1:8025``
The email server will exit when you exit the Grunt task on the CLI with Ctrl+C.
{% endif %}
{% endif %}
{% if cookiecutter.use_sentry_for_error_reporting == "y" %}
Sentry
^^^^^^
Sentry is an error logging aggregator service. You can sign up for a free account at https://getsentry.com/signup/?code=cookiecutter or download and host it yourself.
The system is set up with reasonable defaults, including 404 logging and integration with the WSGI application.
You must set the DSN url in production.
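A minimal sketch of providing it via an environment variable, assuming your production settings read the DSN from a variable such as ``DJANGO_SENTRY_DSN`` (adjust the name to whatever your settings module actually uses)::

    $ export DJANGO_SENTRY_DSN="https://<public_key>:<secret_key>@sentry.example.com/<project_id>"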
{% endif %}
Deployment
----------
The following details how to deploy this application.
{% if cookiecutter.use_heroku.lower() == "y" %}
Heroku
^^^^^^
See detailed `cookiecutter-django Heroku documentation`_.
.. _`cookiecutter-django Heroku documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-on-heroku.html
{% endif %}
{% if cookiecutter.use_docker.lower() == "y" %}
Docker
^^^^^^
See detailed `cookiecutter-django Docker documentation`_.
.. _`cookiecutter-django Docker documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-docker.html
{% endif %}
{% if cookiecutter.use_elasticbeanstalk_experimental.lower() == 'y' %}
Elastic Beanstalk
^^^^^^^^^^^^^^^^^
See detailed `cookiecutter-django Elastic Beanstalk documentation`_.
.. _`cookiecutter-django Elastic Beanstalk documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-elastic-beanstalk.html
{% endif %}

View File

@ -1,4 +1,4 @@
{% if cookiecutter.use_python2 == 'n' -%}
{% if cookiecutter.use_python3 == 'y' -%}
FROM python:3.5
{% else %}
FROM python:2.7
@ -8,18 +8,21 @@ ENV PYTHONUNBUFFERED 1
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install -r /requirements/production.txt
RUN pip install -r /requirements/production.txt \
&& groupadd -r django \
&& useradd -r -g django django
RUN groupadd -r django && useradd -r -g django django
COPY . /app
RUN chown -R django /app
COPY ./compose/django/gunicorn.sh /gunicorn.sh
COPY ./compose/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh
RUN sed -i 's/\r//' /gunicorn.sh
RUN chmod +x /entrypoint.sh && chown django /entrypoint.sh
RUN chmod +x /gunicorn.sh && chown django /gunicorn.sh
RUN sed -i 's/\r//' /entrypoint.sh \
&& sed -i 's/\r//' /gunicorn.sh \
&& chmod +x /entrypoint.sh \
&& chown django /entrypoint.sh \
&& chmod +x /gunicorn.sh \
&& chown django /gunicorn.sh
WORKDIR /app

View File

@ -1,4 +1,4 @@
{% if cookiecutter.use_python2 == 'n' -%}
{% if cookiecutter.use_python3 == 'y' -%}
FROM python:3.5
{% else %}
FROM python:2.7
@ -13,6 +13,10 @@ COPY ./compose/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh
RUN chmod +x /entrypoint.sh
COPY ./compose/django/start-dev.sh /start-dev.sh
RUN sed -i 's/\r//' /start-dev.sh
RUN chmod +x /start-dev.sh
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -0,0 +1,39 @@
#!/bin/bash
set -e
cmd="$@"
# This entrypoint is used to play nicely with the current cookiecutter configuration.
# Since docker-compose relies heavily on environment variables itself for configuration, we'd have to define multiple
# environment variables just to support cookiecutter out of the box. That makes no sense, so this little entrypoint
# does all this for us.
export REDIS_URL=redis://redis:6379
# the official postgres image uses 'postgres' as the default user if not set explicitly.
if [ -z "$POSTGRES_USER" ]; then
export POSTGRES_USER=postgres
fi
export DATABASE_URL=postgres://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres:5432/$POSTGRES_USER
{% if cookiecutter.use_celery == 'y' %}
export CELERY_BROKER_URL=$REDIS_URL/0
{% endif %}
function postgres_ready(){
python << END
import sys
import psycopg2
try:
conn = psycopg2.connect(dbname="$POSTGRES_USER", user="$POSTGRES_USER", password="$POSTGRES_PASSWORD", host="postgres")
except psycopg2.OperationalError:
sys.exit(-1)
sys.exit(0)
END
}
until postgres_ready; do
>&2 echo "Postgres is unavailable - sleeping"
sleep 1
done
>&2 echo "Postgres is up - continuing..."
exec $cmd

View File

@ -0,0 +1,3 @@
#!/bin/sh
python manage.py migrate
python manage.py runserver_plus 0.0.0.0:8000

View File

@ -0,0 +1,9 @@
FROM nginx:latest
ADD nginx.conf /etc/nginx/nginx.conf
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
ADD start.sh /start.sh
ADD nginx-secure.conf /etc/nginx/nginx-secure.conf
ADD dhparams.pem /etc/ssl/private/dhparams.pem
CMD /start.sh
{% endif %}

View File

@ -0,0 +1,3 @@
-----BEGIN DH PARAMETERS-----
EXAMPLE_FILE
-----END DH PARAMETERS-----

View File

@ -0,0 +1,96 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
proxy_headers_hash_bucket_size 52;
gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
server_name ___my.example.com___ www.___my.example.com___;
location /.well-known/acme-challenge {
# Since the certbot container isn't up constantly, need to resolve ip dynamically using docker's dns
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:80;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
}
location / {
return 301 https://$server_name$request_uri;
}
}
server {
listen 443;
server_name ___my.example.com___ www.___my.example.com___;
ssl on;
ssl_certificate /etc/letsencrypt/live/___my.example.com___/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/___my.example.com___/privkey.pem;
ssl_session_timeout 5m;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH';
ssl_prefer_server_ciphers on;
ssl_session_cache shared:SSL:10m;
ssl_dhparam /etc/ssl/private/dhparams.pem;
location /.well-known/acme-challenge {
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:443;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -0,0 +1,61 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
#gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
charset utf-8;
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
server_name ___my.example.com___ ;
location /.well-known/acme-challenge {
proxy_pass http://certbot:80;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
{% endif %}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -0,0 +1,62 @@
echo sleep 5
sleep 5
echo build starting nginx config
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx.conf
cat /etc/nginx/nginx.conf
echo .
echo Firing up nginx in the background.
nginx
# # Check user has specified domain name
if [ -z "$MY_DOMAIN_NAME" ]; then
echo "Need to set MY_DOMAIN_NAME (to a letsencrypt-registered name)."
exit 1
fi
# This bit waits until the letsencrypt container has done its thing.
# We see the changes here because there's a docker volume mapped.
echo Waiting for folder /etc/letsencrypt/live/$MY_DOMAIN_NAME to exist
while [ ! -d /etc/letsencrypt/live/$MY_DOMAIN_NAME ] ;
do
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/fullchain.pem ] ;
do
echo Waiting for file fullchain.pem to exist
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/privkey.pem ] ;
do
echo Waiting for file privkey.pem to exist
sleep 2
done
# This is added so that when the certificate is being renewed or is already in place, nginx waits for everything to be good.
sleep 15
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx-secure.conf
# Add the system's nameserver (the docker network dns) so we can resolve container names in nginx
NAMESERVER=`cat /etc/resolv.conf | grep "nameserver" | awk '{print $2}' | tr '\n' ' '`
echo replacing ___NAMESERVER___/$NAMESERVER
sed -i "s/___NAMESERVER___/$NAMESERVER/g" /etc/nginx/nginx-secure.conf
#go!
kill $(ps aux | grep 'nginx' | grep -v 'grep' | awk '{print $2}')
cp /etc/nginx/nginx-secure.conf /etc/nginx/nginx.conf
nginx -g 'daemon off;'

View File

@ -0,0 +1,11 @@
FROM postgres:{{ cookiecutter.postgresql_version }}
# add backup scripts
ADD backup.sh /usr/local/bin/backup
ADD restore.sh /usr/local/bin/restore
ADD list-backups.sh /usr/local/bin/list-backups
# make them executable
RUN chmod +x /usr/local/bin/restore
RUN chmod +x /usr/local/bin/list-backups
RUN chmod +x /usr/local/bin/backup

View File

@ -0,0 +1,22 @@
#!/bin/bash
# stop on errors
set -e
# we might run into trouble when using the default `postgres` user, e.g. when dropping the postgres
# database in restore.sh. Check that something else is used here
if [ "$POSTGRES_USER" == "postgres" ]
then
echo "creating a backup as the postgres user is not supported, make sure to set the POSTGRES_USER environment variable"
exit 1
fi
# export the postgres password so that subsequent commands don't ask for it
export PGPASSWORD=$POSTGRES_PASSWORD
echo "creating backup"
echo "---------------"
FILENAME=backup_$(date +'%Y_%m_%dT%H_%M_%S').sql.gz
pg_dump -h postgres -U $POSTGRES_USER | gzip > /backups/$FILENAME
echo "successfully created backup $FILENAME"

View File

@ -0,0 +1,4 @@
#!/bin/bash
echo "listing available backups"
echo "-------------------------"
ls /backups/

View File

@ -0,0 +1,56 @@
#!/bin/bash
# stop on errors
set -e
# we might run into trouble when using the default `postgres` user, e.g. when dropping the postgres
# database in restore.sh. Check that something else is used here
if [ "$POSTGRES_USER" == "postgres" ]
then
echo "restoring as the postgres user is not supported, make sure to set the POSTGRES_USER environment variable"
exit 1
fi
# export the postgres password so that subsequent commands don't ask for it
export PGPASSWORD=$POSTGRES_PASSWORD
# check that we have an argument for a filename candidate
if [[ $# -eq 0 ]] ; then
echo 'usage:'
echo ' docker-compose run postgres restore <backup-file>'
echo ''
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
exit 1
fi
# set the backupfile variable
BACKUPFILE=/backups/$1
# check that the file exists
if ! [ -f $BACKUPFILE ]; then
echo "backup file not found"
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
exit 1
fi
echo "beginning restore from $1"
echo "-------------------------"
# delete the db
# dropping the database can fail; print a note if that happens, but continue, since the
# database is re-created in the next step
echo "deleting old database $POSTGRES_USER"
if dropdb -h postgres -U $POSTGRES_USER $POSTGRES_USER
then echo "deleted $POSTGRES_USER database"
else echo "database $POSTGRES_USER does not exist, continue"
fi
# create a new database
echo "creating new database $POSTGRES_USER"
createdb -h postgres -U $POSTGRES_USER $POSTGRES_USER -O $POSTGRES_USER
# restore the database
echo "restoring database $POSTGRES_USER"
gunzip -c $BACKUPFILE | psql -h postgres -U $POSTGRES_USER

View File

@ -12,8 +12,8 @@ from __future__ import absolute_import, unicode_literals
import environ
ROOT_DIR = environ.Path(__file__) - 3 # (/a/b/myfile.py - 3 = /)
APPS_DIR = ROOT_DIR.path('{{ cookiecutter.repo_name }}')
ROOT_DIR = environ.Path(__file__) - 3 # ({{ cookiecutter.project_slug }}/config/settings/common.py - 3 = {{ cookiecutter.project_slug }}/)
APPS_DIR = ROOT_DIR.path('{{ cookiecutter.project_slug }}')
env = environ.Env()
@ -43,7 +43,8 @@ THIRD_PARTY_APPS = (
# Apps specific for this project go here.
LOCAL_APPS = (
'{{ cookiecutter.repo_name }}.users', # custom users app
# custom users app
'{{ cookiecutter.project_slug }}.users.apps.UsersConfig',
# Your stuff: custom apps go here
)
@ -52,8 +53,8 @@ INSTALLED_APPS = DJANGO_APPS + THIRD_PARTY_APPS + LOCAL_APPS
# MIDDLEWARE CONFIGURATION
# ------------------------------------------------------------------------------
MIDDLEWARE_CLASSES = (
# Make sure djangosecure.middleware.SecurityMiddleware is listed first
MIDDLEWARE = (
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
@ -65,13 +66,13 @@ MIDDLEWARE_CLASSES = (
# MIGRATIONS CONFIGURATION
# ------------------------------------------------------------------------------
MIGRATION_MODULES = {
'sites': '{{ cookiecutter.repo_name }}.contrib.sites.migrations'
'sites': '{{ cookiecutter.project_slug }}.contrib.sites.migrations'
}
# DEBUG
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = env.bool("DJANGO_DEBUG", False)
DEBUG = env.bool('DJANGO_DEBUG', False)
# FIXTURE CONFIGURATION
# ------------------------------------------------------------------------------
@ -98,8 +99,7 @@ MANAGERS = ADMINS
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#databases
DATABASES = {
# Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
'default': env.db("DATABASE_URL", default="postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.repo_name}}"),
'default': env.db('DATABASE_URL', default='postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}'),
}
DATABASES['default']['ATOMIC_REQUESTS'] = True
@ -163,8 +163,8 @@ TEMPLATES = [
},
]
# See: http://django-crispy-forms.readthedocs.org/en/latest/install.html#template-packs
CRISPY_TEMPLATE_PACK = 'bootstrap3'
# See: http://django-crispy-forms.readthedocs.io/en/latest/install.html#template-packs
CRISPY_TEMPLATE_PACK = 'bootstrap4'
# STATIC FILE CONFIGURATION
# ------------------------------------------------------------------------------
@ -200,6 +200,26 @@ ROOT_URLCONF = 'config.urls'
# See: https://docs.djangoproject.com/en/dev/ref/settings/#wsgi-application
WSGI_APPLICATION = 'config.wsgi.application'
# PASSWORD VALIDATION
# https://docs.djangoproject.com/en/dev/ref/settings/#auth-password-validators
# ------------------------------------------------------------------------------
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# AUTHENTICATION CONFIGURATION
# ------------------------------------------------------------------------------
AUTHENTICATION_BACKENDS = (
@ -211,9 +231,10 @@ AUTHENTICATION_BACKENDS = (
ACCOUNT_AUTHENTICATION_METHOD = 'username'
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_EMAIL_VERIFICATION = 'mandatory'
ACCOUNT_ADAPTER = '{{cookiecutter.repo_name}}.users.adapter.AccountAdapter'
SOCIALACCOUNT_ADAPTER = '{{cookiecutter.repo_name}}.users.adapter.SocialAccountAdapter'
ACCOUNT_ALLOW_REGISTRATION = True
ACCOUNT_ALLOW_REGISTRATION = env.bool('DJANGO_ACCOUNT_ALLOW_REGISTRATION', True)
ACCOUNT_ADAPTER = '{{cookiecutter.project_slug}}.users.adapters.AccountAdapter'
SOCIALACCOUNT_ADAPTER = '{{cookiecutter.project_slug}}.users.adapters.SocialAccountAdapter'
# Custom user app defaults
# Select the correct user model
@ -223,16 +244,28 @@ LOGIN_URL = 'account_login'
# SLUGLIFIER
AUTOSLUG_SLUGIFY_FUNCTION = 'slugify.slugify'
{% if cookiecutter.use_celery == "y" %}
{% if cookiecutter.use_celery == 'y' %}
########## CELERY
INSTALLED_APPS += ('{{cookiecutter.repo_name}}.taskapp.celery.CeleryConfig',)
INSTALLED_APPS += ('{{cookiecutter.project_slug}}.taskapp.celery.CeleryConfig',)
# If you are not using the Django database as your broker (e.g. you use RabbitMQ, Redis or Memcached instead), you can remove the next line.
INSTALLED_APPS += ('kombu.transport.django',)
BROKER_URL = env("CELERY_BROKER_URL", default='django://')
BROKER_URL = env('CELERY_BROKER_URL', default='django://')
if BROKER_URL == 'django://':
CELERY_RESULT_BACKEND = 'redis://'
else:
CELERY_RESULT_BACKEND = BROKER_URL
########## END CELERY
{% endif %}
{%- if cookiecutter.use_compressor == 'y'-%}
# django-compressor
# ------------------------------------------------------------------------------
INSTALLED_APPS += ("compressor", )
STATICFILES_FINDERS += ("compressor.finders.CompressorFinder", )
{%- endif %}
# Location of root django.contrib.admin URL, use {% raw %}{% url 'admin:index' %}{% endraw %}
ADMIN_URL = r'^admin/'
# Your common stuff: Below this line define 3rd party library settings
# ------------------------------------------------------------------------------

View File

@ -1,13 +1,19 @@
# -*- coding: utf-8 -*-
'''
"""
Local settings
- Run in Debug mode
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
- Use mailhog for emails
{% else %}
- Use console backend for emails
{% endif %}
- Add Django Debug Toolbar
- Add django-extensions as app
'''
"""
import socket
import os
from .common import * # noqa
# DEBUG
@ -19,16 +25,19 @@ TEMPLATES[0]['OPTIONS']['debug'] = DEBUG
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
# Note: This key is only used for development and testing.
SECRET_KEY = env("DJANGO_SECRET_KEY", default='CHANGEME!!!')
SECRET_KEY = env('DJANGO_SECRET_KEY', default='CHANGEME!!!')
# Mail settings
# ------------------------------------------------------------------------------
EMAIL_HOST = 'localhost'
EMAIL_PORT = 1025
{%if cookiecutter.use_mailhog == "n" -%}
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
EMAIL_HOST = env("EMAIL_HOST", default='mailhog')
{% else %}
EMAIL_HOST = 'localhost'
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND',
default='django.core.mail.backends.console.EmailBackend')
{%- endif %}
{% endif %}
# CACHING
# ------------------------------------------------------------------------------
@ -41,10 +50,14 @@ CACHES = {
# django-debug-toolbar
# ------------------------------------------------------------------------------
MIDDLEWARE_CLASSES += ('debug_toolbar.middleware.DebugToolbarMiddleware',)
MIDDLEWARE += ('debug_toolbar.middleware.DebugToolbarMiddleware',)
INSTALLED_APPS += ('debug_toolbar', )
INTERNAL_IPS = ('127.0.0.1', '10.0.2.2',)
INTERNAL_IPS = ['127.0.0.1', '10.0.2.2', ]
# Trick to get the Django Debug Toolbar working when developing with Docker
if os.environ.get('USE_DOCKER') == 'yes':
ip = socket.gethostbyname(socket.gethostname())
INTERNAL_IPS += [ip[:-1] + "1"]
DEBUG_TOOLBAR_CONFIG = {
'DISABLE_PANELS': [
@ -60,10 +73,11 @@ INSTALLED_APPS += ('django_extensions', )
# TESTING
# ------------------------------------------------------------------------------
TEST_RUNNER = 'django.test.runner.DiscoverRunner'
{% if cookiecutter.use_celery == "y" %}
{% if cookiecutter.use_celery == 'y' %}
########## CELERY
# In development, all tasks will be executed locally by blocking until the task returns
CELERY_ALWAYS_EAGER = True
########## END CELERY
{% endif %}
# Your local stuff: Below this line define 3rd party library settings
# ------------------------------------------------------------------------------

View File

@ -1,23 +1,22 @@
# -*- coding: utf-8 -*-
'''
"""
Production Configurations
- Use djangosecure
- Use Amazon's S3 for storing static files and uploaded media
- Use mailgun to send emails
- Use Redis on Heroku
{% if cookiecutter.use_sentry == "y" %}
- Use Redis for cache
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
- Use sentry for error logging
{% endif %}
{% if cookiecutter.use_opbeat == "y" %}
{% if cookiecutter.use_opbeat == 'y' %}
- Use opbeat for error reporting
{% endif %}
'''
"""
from __future__ import absolute_import, unicode_literals
from boto.s3.connection import OrdinaryCallingFormat
from django.utils import six
{% if cookiecutter.use_sentry == "y" %}
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
import logging
{% endif %}
@ -27,35 +26,29 @@ from .common import * # noqa
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
# Raises ImproperlyConfigured exception if DJANGO_SECRET_KEY not in os.environ
SECRET_KEY = env("DJANGO_SECRET_KEY")
SECRET_KEY = env('DJANGO_SECRET_KEY')
# This ensures that Django will be able to detect a secure connection
# properly on Heroku.
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
# django-secure
# ------------------------------------------------------------------------------
INSTALLED_APPS += ("djangosecure", )
{% if cookiecutter.use_sentry == "y" -%}
{%- if cookiecutter.use_sentry_for_error_reporting == 'y' %}
# raven sentry client
# See https://docs.getsentry.com/hosted/clients/python/integrations/django/
INSTALLED_APPS += ('raven.contrib.django.raven_compat', )
{%- endif %}
SECURITY_MIDDLEWARE = (
'djangosecure.middleware.SecurityMiddleware',
)
{% if cookiecutter.use_sentry == "y" -%}
RAVEN_MIDDLEWARE = ('raven.contrib.django.raven_compat.middleware.Sentry404CatchMiddleware',
'raven.contrib.django.raven_compat.middleware.SentryResponseErrorIdMiddleware',)
MIDDLEWARE_CLASSES = SECURITY_MIDDLEWARE + \
RAVEN_MIDDLEWARE + MIDDLEWARE_CLASSES
{% else %}
# Make sure djangosecure.middleware.SecurityMiddleware is listed first
MIDDLEWARE_CLASSES = SECURITY_MIDDLEWARE + MIDDLEWARE_CLASSES
{%- endif %}
{% if cookiecutter.use_opbeat == "y" -%}
{% endif %}
{%- if cookiecutter.use_whitenoise == 'y' %}
# Use Whitenoise to serve static files
# See: https://whitenoise.readthedocs.io/
WHITENOISE_MIDDLEWARE = ('whitenoise.middleware.WhiteNoiseMiddleware', )
MIDDLEWARE = WHITENOISE_MIDDLEWARE + MIDDLEWARE
{% endif %}
{%- if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
RAVEN_MIDDLEWARE = ('raven.contrib.django.raven_compat.middleware.SentryResponseErrorIdMiddleware', )
MIDDLEWARE = RAVEN_MIDDLEWARE + MIDDLEWARE
{% endif %}
{%- if cookiecutter.use_opbeat == 'y' -%}
# opbeat integration
# See https://opbeat.com/languages/django/
INSTALLED_APPS += ('opbeat.contrib.django',)
@ -64,21 +57,29 @@ OPBEAT = {
'APP_ID': env('DJANGO_OPBEAT_APP_ID'),
'SECRET_TOKEN': env('DJANGO_OPBEAT_SECRET_TOKEN')
}
MIDDLEWARE_CLASSES = (
MIDDLEWARE = (
'opbeat.contrib.django.middleware.OpbeatAPMMiddleware',
) + MIDDLEWARE_CLASSES
{%- endif %}
) + MIDDLEWARE
{% endif %}
# SECURITY CONFIGURATION
# ------------------------------------------------------------------------------
# See https://docs.djangoproject.com/en/1.9/ref/middleware/#module-django.middleware.security
# and https://docs.djangoproject.com/ja/1.9/howto/deployment/checklist/#run-manage-py-check-deploy
# set this to 60 seconds and then to 518400 when you can prove it works
SECURE_HSTS_SECONDS = 60
SECURE_HSTS_INCLUDE_SUBDOMAINS = env.bool(
"DJANGO_SECURE_HSTS_INCLUDE_SUBDOMAINS", default=True)
SECURE_FRAME_DENY = env.bool("DJANGO_SECURE_FRAME_DENY", default=True)
'DJANGO_SECURE_HSTS_INCLUDE_SUBDOMAINS', default=True)
SECURE_CONTENT_TYPE_NOSNIFF = env.bool(
"DJANGO_SECURE_CONTENT_TYPE_NOSNIFF", default=True)
'DJANGO_SECURE_CONTENT_TYPE_NOSNIFF', default=True)
SECURE_BROWSER_XSS_FILTER = True
SESSION_COOKIE_SECURE = False
SESSION_COOKIE_SECURE = True
SESSION_COOKIE_HTTPONLY = True
SECURE_SSL_REDIRECT = env.bool("DJANGO_SECURE_SSL_REDIRECT", default=True)
SECURE_SSL_REDIRECT = env.bool('DJANGO_SECURE_SSL_REDIRECT', default=True)
CSRF_COOKIE_SECURE = True
CSRF_COOKIE_HTTPONLY = True
X_FRAME_OPTIONS = 'DENY'
# SITE CONFIGURATION
# ------------------------------------------------------------------------------
@ -87,17 +88,17 @@ SECURE_SSL_REDIRECT = env.bool("DJANGO_SECURE_SSL_REDIRECT", default=True)
ALLOWED_HOSTS = env.list('DJANGO_ALLOWED_HOSTS', default=['{{cookiecutter.domain_name}}'])
# END SITE CONFIGURATION
INSTALLED_APPS += ("gunicorn", )
INSTALLED_APPS += ('gunicorn', )
# STORAGE CONFIGURATION
# ------------------------------------------------------------------------------
# Uploaded Media Files
# ------------------------
# See: http://django-storages.readthedocs.org/en/latest/index.html
# See: http://django-storages.readthedocs.io/en/latest/index.html
INSTALLED_APPS += (
'storages',
)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = env('DJANGO_AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = env('DJANGO_AWS_SECRET_ACCESS_KEY')
@ -119,36 +120,52 @@ AWS_HEADERS = {
# URL that handles the media served from MEDIA_ROOT, used for managing
# stored files.
{% if cookiecutter.use_whitenoise == 'y' -%}
MEDIA_URL = 'https://s3.amazonaws.com/%s/' % AWS_STORAGE_BUCKET_NAME
{% else %}
# See: http://stackoverflow.com/questions/10390244/
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
DEFAULT_FILE_STORAGE = 'config.settings.production.MediaRootS3BotoStorage'
MEDIA_URL = 'https://s3.amazonaws.com/%s/media/' % AWS_STORAGE_BUCKET_NAME
{%- endif %}
# Static Assets
# ------------------------
{% if cookiecutter.use_whitenoise == 'y' -%}
STATICFILES_STORAGE = 'whitenoise.django.GzipManifestStaticFilesStorage'
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
{% else %}
STATICFILES_STORAGE = DEFAULT_FILE_STORAGE
STATIC_URL = MEDIA_URL
STATIC_URL = 'https://s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
STATICFILES_STORAGE = 'config.settings.production.StaticRootS3BotoStorage'
# See: https://github.com/antonagestam/collectfast
# For Django 1.7+, 'collectfast' should come before
# 'django.contrib.staticfiles'
AWS_PRELOAD_METADATA = True
INSTALLED_APPS = ('collectfast', ) + INSTALLED_APPS
{%- endif %}
{% if cookiecutter.use_compressor == 'y'-%}
# COMPRESSOR
# ------------------------------------------------------------------------------
COMPRESS_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
COMPRESS_URL = STATIC_URL
COMPRESS_ENABLED = env.bool('COMPRESS_ENABLED', default=True)
{%- endif %}
# EMAIL
# ------------------------------------------------------------------------------
DEFAULT_FROM_EMAIL = env('DJANGO_DEFAULT_FROM_EMAIL',
default='{{cookiecutter.project_name}} <noreply@{{cookiecutter.domain_name}}>')
EMAIL_BACKEND = 'django_mailgun.MailgunBackend'
MAILGUN_ACCESS_KEY = env('DJANGO_MAILGUN_API_KEY')
MAILGUN_SERVER_NAME = env('DJANGO_MAILGUN_SERVER_NAME')
EMAIL_SUBJECT_PREFIX = env("DJANGO_EMAIL_SUBJECT_PREFIX", default='[{{cookiecutter.project_name}}] ')
EMAIL_SUBJECT_PREFIX = env('DJANGO_EMAIL_SUBJECT_PREFIX', default='[{{cookiecutter.project_name}}] ')
SERVER_EMAIL = env('DJANGO_SERVER_EMAIL', default=DEFAULT_FROM_EMAIL)
{% if cookiecutter.use_newrelic == 'y'-%}
NEW_RELIC_LICENSE_KEY = env('NEW_RELIC_LICENSE_KEY')
NEW_RELIC_APP_NAME = '{{cookiecutter.project_name}}'
{%- endif %}
# Anymail with Mailgun
INSTALLED_APPS += ("anymail", )
ANYMAIL = {
"MAILGUN_API_KEY": env('DJANGO_MAILGUN_API_KEY'),
"MAILGUN_SENDER_DOMAIN": env('MAILGUN_SENDER_DOMAIN')
}
EMAIL_BACKEND = "anymail.backends.mailgun.MailgunBackend"
# TEMPLATE CONFIGURATION
# ------------------------------------------------------------------------------
@ -161,25 +178,48 @@ TEMPLATES[0]['OPTIONS']['loaders'] = [
# DATABASE CONFIGURATION
# ------------------------------------------------------------------------------
{% if cookiecutter.use_elasticbeanstalk_experimental.lower() == 'y' -%}
# Uses Amazon RDS for database hosting, which doesn't follow the Heroku-style spec
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': env('RDS_DB_NAME'),
'USER': env('RDS_USERNAME'),
'PASSWORD': env('RDS_PASSWORD'),
'HOST': env('RDS_HOSTNAME'),
'PORT': env('RDS_PORT'),
}
}
{% else %}
# Use the Heroku-style specification
# Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
DATABASES['default'] = env.db("DATABASE_URL")
DATABASES['default'] = env.db('DATABASE_URL')
{%- endif %}
# CACHING
# ------------------------------------------------------------------------------
{% if cookiecutter.use_elasticbeanstalk_experimental.lower() == 'y' -%}
REDIS_LOCATION = "redis://{}:{}/0".format(
env('REDIS_ENDPOINT_ADDRESS'),
env('REDIS_PORT')
)
{% else %}
REDIS_LOCATION = '{0}/{1}'.format(env('REDIS_URL', default='redis://127.0.0.1:6379'), 0)
{%- endif %}
# Heroku URL does not pass the DB number, so we parse it in
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": "{0}/{1}".format(env('REDIS_URL', default="redis://127.0.0.1:6379"), 0),
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
"IGNORE_EXCEPTIONS": True, # mimics memcache behavior.
'default': {
'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': REDIS_LOCATION,
'OPTIONS': {
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
'IGNORE_EXCEPTIONS': True, # mimics memcache behavior.
# http://niwinz.github.io/django-redis/latest/#_memcached_exceptions_behavior
}
}
}
{% if cookiecutter.use_sentry == "y" %}
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
# Sentry Configuration
SENTRY_DSN = env('DJANGO_SENTRY_DSN')
SENTRY_CLIENT = env('DJANGO_SENTRY_CLIENT', default='raven.contrib.django.raven_compat.DjangoClient')
@ -235,7 +275,7 @@ RAVEN_CONFIG = {
'CELERY_LOGLEVEL': env.int('DJANGO_SENTRY_LOG_LEVEL', logging.INFO),
'DSN': SENTRY_DSN
}
{% elif cookiecutter.use_sentry == "n" %}
{% elif cookiecutter.use_sentry_for_error_reporting == 'n' %}
# LOGGING CONFIGURATION
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#logging
@ -288,3 +328,4 @@ LOGGING = {
ADMIN_URL = env('DJANGO_ADMIN_URL')
# Your production stuff: Below this line define 3rd party library settings
# ------------------------------------------------------------------------------

View File

@ -0,0 +1,62 @@
# -*- coding: utf-8 -*-
'''
Test settings
- Used to run tests fast on the continuous integration server and locally
'''
from .common import * # noqa
# DEBUG
# ------------------------------------------------------------------------------
# Turn debug off so tests run faster
DEBUG = False
TEMPLATES[0]['OPTIONS']['debug'] = False
# SECRET CONFIGURATION
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
# Note: This key is only used for development and testing.
SECRET_KEY = env('DJANGO_SECRET_KEY', default='CHANGEME!!!')
# Mail settings
# ------------------------------------------------------------------------------
EMAIL_HOST = 'localhost'
EMAIL_PORT = 1025
# In-memory email backend stores messages in django.core.mail.outbox
# for unit testing purposes
EMAIL_BACKEND = 'django.core.mail.backends.locmem.EmailBackend'
# CACHING
# ------------------------------------------------------------------------------
# Speed advantages of in-memory caching without having to run Memcached
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
'LOCATION': ''
}
}
# TESTING
# ------------------------------------------------------------------------------
TEST_RUNNER = 'django.test.runner.DiscoverRunner'
# PASSWORD HASHING
# ------------------------------------------------------------------------------
# Use fast password hasher so tests run faster
PASSWORD_HASHERS = (
'django.contrib.auth.hashers.MD5PasswordHasher',
)
# TEMPLATE LOADERS
# ------------------------------------------------------------------------------
# Keep templates in memory so tests run faster
TEMPLATES[0]['OPTIONS']['loaders'] = [
('django.template.loaders.cached.Loader', [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
]),
]

View File

@ -9,14 +9,14 @@ from django.views.generic import TemplateView
from django.views import defaults as default_views
urlpatterns = [
url(r'^$', TemplateView.as_view(template_name='pages/home.html'), name="home"),
url(r'^about/$', TemplateView.as_view(template_name='pages/about.html'), name="about"),
url(r'^$', TemplateView.as_view(template_name='pages/home.html'), name='home'),
url(r'^about/$', TemplateView.as_view(template_name='pages/about.html'), name='about'),
# Django Admin, use {% raw %}{% url 'admin:index' %}{% endraw %}
url(settings.ADMIN_URL, include(admin.site.urls)),
url(settings.ADMIN_URL, admin.site.urls),
# User management
url(r'^users/', include("{{ cookiecutter.repo_name }}.users.urls", namespace="users")),
url(r'^users/', include('{{ cookiecutter.project_slug }}.users.urls', namespace='users')),
url(r'^accounts/', include('allauth.urls')),
# Your stuff: custom urls includes go here
@ -28,8 +28,14 @@ if settings.DEBUG:
# This allows the error pages to be debugged during development; just visit
# these URLs in the browser to see what the error pages look like.
urlpatterns += [
url(r'^400/$', default_views.bad_request, kwargs={'exception': Exception("Bad Request!")}),
url(r'^403/$', default_views.permission_denied, kwargs={'exception': Exception("Permissin Denied")}),
url(r'^404/$', default_views.page_not_found, kwargs={'exception': Exception("Page not Found")}),
url(r'^400/$', default_views.bad_request, kwargs={'exception': Exception('Bad Request!')}),
url(r'^403/$', default_views.permission_denied, kwargs={'exception': Exception('Permission Denied')}),
url(r'^404/$', default_views.page_not_found, kwargs={'exception': Exception('Page not Found')}),
url(r'^500/$', default_views.server_error),
]
if 'debug_toolbar' in settings.INSTALLED_APPS:
import debug_toolbar
urlpatterns += [
url(r'^__debug__/', include(debug_toolbar.urls)),
]

View File

@ -15,17 +15,9 @@ framework.
"""
import os
{% if cookiecutter.use_newrelic == "y" -%}
if os.environ.get("DJANGO_SETTINGS_MODULE") == "config.settings.production":
import newrelic.agent
newrelic.agent.initialize()
{%- endif %}
from django.core.wsgi import get_wsgi_application
{% if cookiecutter.use_whitenoise == 'y' -%}
from whitenoise.django import DjangoWhiteNoise
{%- endif %}
{% if cookiecutter.use_sentry == "y" -%}
if os.environ.get("DJANGO_SETTINGS_MODULE") == "config.settings.production":
{% if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
if os.environ.get('DJANGO_SETTINGS_MODULE') == 'config.settings.production':
from raven.contrib.django.raven_compat.middleware.wsgi import Sentry
{%- endif %}
@ -39,20 +31,10 @@ os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.production")
# file. This includes Django's development server, if the WSGI_APPLICATION
# setting points here.
application = get_wsgi_application()
{% if cookiecutter.use_whitenoise == 'y' -%}
# Use Whitenoise to serve static files
# See: https://whitenoise.readthedocs.org/
application = DjangoWhiteNoise(application)
{%- endif %}
{% if cookiecutter.use_sentry == "y" -%}
if os.environ.get("DJANGO_SETTINGS_MODULE") == "config.settings.production":
{% if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
if os.environ.get('DJANGO_SETTINGS_MODULE') == 'config.settings.production':
application = Sentry(application)
{%- endif %}
{% if cookiecutter.use_newrelic == "y" -%}
if os.environ.get("DJANGO_SETTINGS_MODULE") == "config.settings.production":
application = newrelic.agent.WSGIApplicationWrapper(application)
{%- endif %}
# Apply WSGI middleware here.
# from helloworld.wsgi import HelloWorldApplication
# application = HelloWorldApplication(application)

View File

@ -0,0 +1,56 @@
version: '2'
volumes:
postgres_data_dev: {}
postgres_backup_dev: {}
services:
postgres:
build: ./compose/postgres
volumes:
- postgres_data_dev:/var/lib/postgresql/data
- postgres_backup_dev:/backups
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
command: /start-dev.sh
depends_on:
- postgres
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
- USE_DOCKER=yes
volumes:
- .:/app
ports:
- "8000:8000"
links:
- postgres
{% if cookiecutter.use_mailhog == 'y' %}
- mailhog
{% endif %}
{% if cookiecutter.use_pycharm == 'y' %}
pycharm:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
depends_on:
- postgres
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
volumes:
- .:/app
links:
- postgres
{% endif %}
{% if cookiecutter.use_mailhog == 'y' %}
mailhog:
image: mailhog/mailhog
ports:
- "8025:8025"
{% endif %}

View File

@ -0,0 +1,83 @@
version: '2'
volumes:
postgres_data: {}
postgres_backup: {}
services:
postgres:
build: ./compose/postgres
volumes:
- postgres_data:/var/lib/postgresql/data
- postgres_backup:/backups
env_file: .env
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
depends_on:
- postgres
- redis
command: /gunicorn.sh
env_file: .env
nginx:
build: ./compose/nginx
depends_on:
- django
{% if cookiecutter.use_lets_encrypt == 'y' %}
- certbot
{% endif %}
ports:
- "0.0.0.0:80:80"
{% if cookiecutter.use_lets_encrypt == 'y' %}
environment:
- MY_DOMAIN_NAME={{ cookiecutter.domain_name }}
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
certbot:
image: quay.io/letsencrypt/letsencrypt
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
entrypoint: ""
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
ports:
- "80"
- "443"
environment:
- TERM=xterm
{% endif %}
redis:
image: redis:latest
{% if cookiecutter.use_celery == 'y' %}
celeryworker:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
env_file: .env
depends_on:
- postgres
- redis
command: celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO
celerybeat:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
env_file: .env
depends_on:
- postgres
- redis
command: celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO
{% endif %}

View File

@ -77,17 +77,17 @@ qthelp:
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/{{ cookiecutter.repo_name }}.qhcp"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/{{ cookiecutter.repo_name }}.qhc"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/{{ cookiecutter.project_slug }}.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/{{ cookiecutter.repo_name }}"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/{{ cookiecutter.repo_name }}"
@echo "# mkdir -p $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/{{ cookiecutter.project_slug }}"
@echo "# devhelp"
epub:

View File

@ -11,6 +11,8 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
from __future__ import unicode_literals
import os
import sys
@ -41,8 +43,8 @@ source_suffix = '.rst'
master_doc = 'index'
# General information about the project.
project = u'{{ cookiecutter.project_name }}'
copyright = u"{{ cookiecutter.year }}, {{ cookiecutter.author_name }}"
project = '{{ cookiecutter.project_name }}'
copyright = """{% now 'utc', '%Y' %}, {{ cookiecutter.author_name }}"""
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@ -165,7 +167,7 @@ html_static_path = ['_static']
# html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = '{{ cookiecutter.repo_name }}doc'
htmlhelp_basename = '{{ cookiecutter.project_slug }}doc'
# -- Options for LaTeX output --------------------------------------------------
@ -185,9 +187,9 @@ latex_elements = {
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
('index',
'{{ cookiecutter.repo_name }}.tex',
u'{{ cookiecutter.project_name }} Documentation',
u"{{ cookiecutter.author_name }}", 'manual'),
'{{ cookiecutter.project_slug }}.tex',
'{{ cookiecutter.project_name }} Documentation',
"""{{ cookiecutter.author_name }}""", 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
@ -216,8 +218,8 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', '{{ cookiecutter.repo_name }}', u'{{ cookiecutter.project_name }} Documentation',
[u"{{ cookiecutter.author_name }}"], 1)
('index', '{{ cookiecutter.project_slug }}', '{{ cookiecutter.project_name }} Documentation',
["""{{ cookiecutter.author_name }}"""], 1)
]
# If true, show URL addresses after external links.
@ -230,9 +232,9 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', '{{ cookiecutter.repo_name }}', u'{{ cookiecutter.project_name }} Documentation',
u"{{ cookiecutter.author_name }}", '{{ cookiecutter.project_name }}',
'{{ cookiecutter.description }}', 'Miscellaneous'),
('index', '{{ cookiecutter.project_slug }}', '{{ cookiecutter.project_name }} Documentation',
"""{{ cookiecutter.author_name }}""", '{{ cookiecutter.project_name }}',
"""{{ cookiecutter.description }}""", 'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.

View File

@ -14,7 +14,7 @@ Docker encourages running one container for each process. This might mean one co
.. _Redis: http://redis.io/
The Docker compose tool (previously known as `fig`_) makes linking these containers easy. An example set up for your cookiecutter-django project might look like this:
The Docker Compose tool (previously known as `fig`_) makes linking these containers easy. An example setup for your Cookiecutter Django project might look like this:
.. _fig: http://www.fig.sh/
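A minimal sketch of such a setup, assembled from the `docker-compose.yml` this commit adds (trimmed for brevity; env files and the Celery and Let's Encrypt options are omitted)::

    version: '2'

    volumes:
      postgres_data: {}
      postgres_backup: {}

    services:
      postgres:
        build: ./compose/postgres
        volumes:
          - postgres_data:/var/lib/postgresql/data
          - postgres_backup:/backups

      django:
        build:
          context: .
          dockerfile: ./compose/django/Dockerfile
        command: /gunicorn.sh
        depends_on:
          - postgres
          - redis

      nginx:
        build: ./compose/nginx
        depends_on:
          - django
        ports:
          - "0.0.0.0:80:80"

      redis:
        image: redis:latest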

View File

@ -99,9 +99,9 @@ if "%1" == "qthelp" (
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\{{ cookiecutter.repo_name }}.qhcp
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\{{ cookiecutter.repo_name }}.ghc
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\{{ cookiecutter.project_slug }}.ghc
goto end
)

View File

@ -0,0 +1,72 @@
Docker Remote Debugging
=======================
To connect to the remote Python interpreter inside Docker, you first have to make sure that PyCharm is aware of your Docker setup.
Go to *Settings > Build, Execution, Deployment > Docker*. If you are on Linux, you can use Docker directly through its socket `unix:///var/run/docker.sock`. If you are on Windows or Mac, make sure docker-machine is installed; then you can simply *Import credentials from Docker Machine*.
.. image:: images/1.png
Configure Remote Python Interpreter
-----------------------------------
This repository already comes with prepared "Run/Debug Configurations" for Docker.
.. image:: images/2.png
But as you can see, at the beginning there is something wrong with them: they have a red X on the Django icon and cannot be used without configuring a remote Python interpreter. To do that, you first have to go to *Settings > Build, Execution, Deployment*.
Next, you have to add a new remote Python interpreter, based on the already tested deployment settings. Go to *Settings > Project > Project Interpreter*, click on the cog icon, and click *Add Remote*.
.. image:: images/3.png
Switch to *Docker Compose*, select the `dev.yml` file from your project directory, then set *Service name* to `django`.
.. image:: images/4.png
Because PyCharm restarts the container every time you use a Run Configuration, we defined a second service in the `dev.yml` file, called `pycharm`, so that the dev server is not restarted while running tests. To use it, you have to add an interpreter for that second service as well, as sketched below.
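For reference, a minimal sketch of that second service, mirroring the `pycharm` entry this commit adds to `dev.yml` (it sits under `services:` there)::

    pycharm:
      build:
        context: .
        dockerfile: ./compose/django/Dockerfile-dev
      depends_on:
        - postgres
      environment:
        - POSTGRES_USER={{cookiecutter.project_slug}}
      volumes:
        - .:/app
      links:
        - postgres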
.. image:: images/5.png
The final result should be:
.. image:: images/6.png
Once that is done, click *OK*, close the *Settings* panel, and wait a few seconds...
.. image:: images/7.png
After a few seconds, all *Run/Debug Configurations* should be ready to use.
.. image:: images/8.png
**Things you can do with the provided configuration**:
* run and debug python code
.. image:: images/f1.png
* run and debug tests
.. image:: images/f2.png
.. image:: images/f3.png
* run and debug migrations or other Django management commands
.. image:: images/f4.png
* and many others.
Known issues
------------
* PyCharm hangs on "Connecting to Debugger"
.. image:: images/issue1.png
This might be your firewall's fault. Take a look at this ticket: https://youtrack.jetbrains.com/issue/PY-18913
* Modified files in `.idea` directory
Most of the files from `.idea/` were added to `.gitignore`, with a few exceptions made to provide a "ready to go" configuration. After adding the remote interpreter, some of these files are altered by PyCharm:
.. image:: images/issue2.png
In theory you can remove them from the repository, but then other people will lose the ability to initialize the project from the provided configurations as you did. To get rid of this annoying state, you can run the command::
$ git update-index --assume-unchanged {{cookiecutter.project_slug}}.iml

Binary files not shown (4 image files added: 66 KiB, 15 KiB, 177 KiB, 110 KiB)
Some files were not shown because too many files have changed in this diff.