Merge remote-tracking branch 'upstream/master'

* upstream/master: (127 commits)
  Update tox from 2.8.2 to 2.9.0
  Update sphinx from 1.6.3 to 1.6.4
  Update whitenoise from 3.3.0 to 3.3.1
  Update raven from 6.2.0 to 6.2.1 (#1341)
  Update raven from 6.1.0 to 6.2.0 (#1340)
  Fix formatting in README.rst
  Update EMAIL_BACKEND to "anymail.backends.espname.EmailBackend" to match anymail 1.0 (#1335)
  Removed references and old CSS specific to Bootstrap alpha. (#1333)
  Update django-anymail from 0.11.1 to 1.0 (#1334)
  Change pep 8 for pycodestyle in docs and project requirements (#1332)
  Update django-extensions from 1.9.0 to 1.9.1 (#1330)
  Update tox from 2.8.1 to 2.8.2 (#1328)
  Update wheel from 0.29.0 to 0.30.0 (#1329)
  Update pytest from 3.2.1 to 3.2.2 (#1326)
  Require django-coverage-plugin==1.5.0 for testing
  Prettify production postgres service Dockerfile entries
  Added Whitelist for /compose/local/ (#1322)
  Update django from 1.10.7 to 1.10.8 (#1320)
  Update bootstrap to 4.0.0-beta (#1319)
  Re-organize compose/ into environment-specific file groups (#1317)
  ...
This commit is contained in:
Delio Castillo 2017-09-29 14:45:18 -07:00
commit 9391608324
No known key found for this signature in database
GPG Key ID: ACD687FF72F7B469
72 changed files with 1172 additions and 827 deletions

32
.github/ISSUE_TEMPLATE.md vendored Normal file
View File

@ -0,0 +1,32 @@
**Note: for support questions, please use the `cookiecutter-django` tag on stackoverflow**. This repository's issues are reserved for feature requests and bug reports. If you need quick professional paid support for your project, contact support@cookiecutter.io.
* **I'm submitting a ...**
- [ ] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here; see the note at the top of this template.
* **Do you want to request a *feature* or report a *bug*?**
* **What is the current behavior?**
* **If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem**
* **What is the expected behavior?**
* **What is the motivation / use case for changing the behavior?**
* **Please tell us about your environment:**
* **Other information** (e.g. detailed explanation, stacktraces, related issues, suggestions on how to fix, links for us to have context, e.g. stackoverflow, gitter, etc.)

242
.gitignore vendored
View File

@ -1,44 +1,228 @@
### OSX ###
.DS_Store
.AppleDouble
.LSOverride
### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
### SublimeText ###
# cache files for sublime text
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage.*
.cache/
nosetests.xml
coverage.xml
*.cover
.hypothesis/
# Translations
*.mo
*.pot
# Django stuff:
*.log
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# Environments
.env
.venv
env/
venv/
ENV/
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
### Linux template
*~
# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*
# KDE directory preferences
.directory
# Linux trash folder which might appear on any partition or disk
.Trash-*
# .nfs files are created when an open file is removed but is still being accessed
.nfs*
### VisualStudioCode template
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
### Windows template
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
Desktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msm
*.msp
# Windows shortcuts
*.lnk
### SublimeText template
# Cache files for Sublime Text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# workspace files are user-specific
# Workspace files are user-specific
*.sublime-workspace
# project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using SublimeText
# Project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using Sublime Text
# *.sublime-project
# sftp configuration file
# SFTP configuration file
sftp-config.json
# Generated files
*.log
*.pot
*.pyc
.idea
_build
*.egg-info/
# Package control specific files
Package Control.last-run
Package Control.ca-list
Package Control.ca-bundle
Package Control.system-ca-bundle
Package Control.cache/
Package Control.ca-certs/
Package Control.merged-ca-bundle
Package Control.user-ca-bundle
oscrypto-ca-bundle.crt
bh_unicode_properties.cache
# Project Specific Stuff
local_settings.py
project_slug
my_test_project/*
# Sublime-github package stores a github token in this file
# https://packagecontrol.io/packages/sublime-github
GitHub.sublime-settings
# Generated when running py.test for the Cookiecutter Django generation tests
.cache/
# Generated when running celery beat
celerybeat-schedule.db
### macOS template
# General
*.DS_Store
.AppleDouble
.LSOverride
# Unit test / coverage reports
.coverage
.tox
.cache
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
### Vim template
# Swap
[._]*.s[a-v][a-z]
[._]*.sw[a-p]
[._]s[a-v][a-z]
[._]sw[a-p]
# Session
Session.vim
# Temporary
.netrwhist
# Auto-generated tag files
tags
### VirtualEnv template
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Ss]cripts
pyvenv.cfg
pip-selfcheck.json
# Even though the project might be opened and edited
# in any of the JetBrains IDEs, it makes no sense whatsoever
# to 'run' anything within it since any particular cookiecutter
# is declarative by nature.
.idea/

View File

@ -53,15 +53,19 @@ Listed in alphabetical order.
Andy Rose
Anna Callahan `@jazztpt`_
Antonia Blair `@antoniablair`_ @antoniablairart
Arcuri Davide `@dadokkio`_
Areski Belaid `@areski`_
Ashley Camba
Barclay Gauld `@yunti`_
Ben Warren `@bwarren2`
Ben Lopatin
Benjamin Abel
Bert de Miranda `@bertdemiranda`_
Bo Lopker `@blopker`_
Bouke Haarsma
Brent Payne `@brentpayne`_ @brentpayne
Burhan Khalid `@burhan`_ @burhan
Bruno Alla               `@browniebroke`_ @_BrunoAlla
Burhan Khalid            `@burhan`_                   @burhan
Catherine Devlin `@catherinedevlin`_
Cédric Gaspoz `@cgaspoz`_
Chris Curvey `@ccurvey`_
@ -86,6 +90,7 @@ Listed in alphabetical order.
Felipe Arruda `@arruda`_
Garry Cairns `@garry-cairns`_
Garry Polley `@garrypolley`_
Hamish Durkin `@durkode`_
Harry Percival `@hjwp`_
Henrique G. G. Pereira `@ikkebr`_
Ian Lee `@IanLee1521`_
@ -123,6 +128,7 @@ Listed in alphabetical order.
Peter Bittner `@bittner`_
Raphael Pierzina `@hackebrot`_
Raony Guimarães Corrêa `@raonyguimaraes`_
Reggie Riser `@reggieriser`_
René Muhl `@rm--`_
Roman Afanaskin `@siauPatrick`_
Roman Osipenko `@romanosipenko`_
@ -142,6 +148,7 @@ Listed in alphabetical order.
Travis McNeill `@Travistock`_ @tavistock_esq
Vitaly Babiy
Vivian Guillen `@viviangb`_
Wan Liuyang `@sfdye`_ @sfdye
Will Farley `@goldhand`_ @g01dhand
William Archinal `@archinal`_
Yaroslav Halchenko
@ -162,6 +169,7 @@ Listed in alphabetical order.
.. _@bloodpet: https://github.com/bloodpet
.. _@blopker: https://github.com/blopker
.. _@bogdal: https://github.com/bogdal
.. _@browniebroke: https://github.com/browniebroke
.. _@burhan: https://github.com/burhan
.. _@c-rhodes: https://github.com/c-rhodes
.. _@caffodian: https://github.com/caffodian
@ -177,6 +185,7 @@ Listed in alphabetical order.
.. _@dhepper: https://github.com/dhepper
.. _@dot2dotseurat: https://github.com/dot2dotseurat
.. _@dsclementsen: https://github.com/dsclementsen
.. _@durkode: https://github.com/durkode
.. _@epileptic-fish: https://github.com/epileptic-fish
.. _@eraldo: https://github.com/eraldo
.. _@eriol: https://github.com/eriol
@ -212,11 +221,13 @@ Listed in alphabetical order.
.. _@oubiga: https://github.com/oubiga
.. _@parbhat: https://github.com/parbhat
.. _@raonyguimaraes: https://github.com/raonyguimaraes
.. _@reggieriser: https://github.com/reggieriser
.. _@rm--: https://github.com/rm--
.. _@romanosipenko: https://github.com/romanosipenko
.. _@shireenrao: https://github.com/shireenrao
.. _@webyneter: https://github.com/webyneter
.. _@show0k: https://github.com/show0k
.. _@sfdye: https://github.com/sfdye
.. _@shultz: https://github.com/shultz
.. _@siauPatrick: https://github.com/siauPatrick
.. _@slafs: https://github.com/slafs

View File

@ -1,22 +1,27 @@
Cookiecutter Django
=======================
.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://travis-ci.org/pydanny/cookiecutter-django.svg?branch=master
:target: https://travis-ci.org/pydanny/cookiecutter-django?branch=master
:alt: Build Status
.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://badges.gitter.im/Join Chat.svg
:target: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting production-ready Django projects quickly.
Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting
production-ready Django projects quickly.
* Documentation: https://cookiecutter-django.readthedocs.io/en/latest/
* See Troubleshooting_ for common errors and obstacles
* If you have problems with Cookiecutter Django, please open issues_ before sending emails to the maintainers. You will get a much, MUCH faster response.
* If you have problems with Cookiecutter Django, please open issues_ rather than sending emails to the maintainers.
* Need quick professional paid support? Contact `support@cookiecutter.io`_.
This includes configuring your servers, fixing bugs, reviewing your code and
everything in between.
.. _cookiecutter: https://github.com/audreyr/cookiecutter
@ -24,6 +29,7 @@ Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting pr
.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
.. _issues: https://github.com/pydanny/cookiecutter-django/issues/new
.. _support@cookiecutter.io: support@cookiecutter.io
Features
---------
@ -31,7 +37,7 @@ Features
* For Django 1.10
* Works with Python 3.4.x or 3.5.x. Python 3.6 is experimental
* Renders Django projects with 100% starting test coverage
* Twitter Bootstrap_ v4.0.0 - alpha 6 (`maintained Foundation fork`_ also available)
* Twitter Bootstrap_ v4.0.0 - beta 1 (`maintained Foundation fork`_ also available)
* 12-Factor_ based settings via django-environ_
* Secure by default. We believe in SSL.
* Optimized development and production settings
@ -40,7 +46,7 @@ Features
* Grunt build for compass and livereload
* Send emails via Anymail_ (using Mailgun_ by default, but switchable)
* Media storage using Amazon S3
* Docker support using docker-compose_ for development and production
* Docker support using docker-compose_ for development and production (using Caddy_ with LetsEncrypt_ support)
* Procfile_ for deploying to Heroku
* Instructions for deploying to PythonAnywhere_
* Run tests with unittest or py.test
@ -76,7 +82,8 @@ Optional Integrations
.. _docker-compose: https://github.com/docker/compose
.. _Opbeat: https://opbeat.com/
.. _PythonAnywhere: https://www.pythonanywhere.com/
.. _Caddy: https://caddyserver.com/
.. _LetsEncrypt: https://letsencrypt.org/
Constraints
-----------
@ -85,6 +92,36 @@ Constraints
* Uses PostgreSQL everywhere (9.2+)
* Environment variables for configuration (This won't work with Apache/mod_wsgi except on AWS ELB).
Support this Project!
----------------------
This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
* https://www.patreon.com/danielroygreenfeld: Project lead. Expertise in AWS ELB and Django.
Projects that provide financial support to the maintainers:
Two Scoops of Django 1.11
~~~~~~~~~~~~~~~~~~~~~~~~~
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/products/tsd-111-alpha_medium.jpg?v=1499531513
:name: Two Scoops of Django 1.11 Cover
:align: center
:alt: Two Scoops of Django
:target: http://twoscoopspress.org/products/two-scoops-of-django-1-11
Two Scoops of Django is the best dairy-themed Django reference in the universe
pyup
~~~~~~~~~~~~~~~~~~
.. image:: https://pyup.io/static/images/logo.png
:name: pyup
:align: center
:alt: pyup
:target: https://pyup.io/
Pyup brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
Usage
------
@ -141,7 +178,6 @@ Answer the prompts with your own desired options_. For example::
2 - Grunt
3 - None
Choose from 1, 2, 3, 4 [1]: 1
use_lets_encrypt [n]: n
Select open_source_license:
1 - MIT
2 - BSD
@ -258,31 +294,5 @@ Code of Conduct
Everyone interacting in the Cookiecutter project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.
Support This Project
---------------------------
This project is maintained by volunteers. Support their efforts by spreading the word about:
Two Scoops of Django 1.11
~~~~~~~~~~~~~~~~~~~~~~~~~
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/files/tsd-111-alpha-470x235.jpg?2934688328290951771
:name: Two Scoops of Django 1.11 Cover
:align: center
:alt: Two Scoops of Django
:target: http://twoscoopspress.org/products/two-scoops-of-django-1-11
Two Scoops of Django is the best dairy-themed Django reference in the universe
pyup
~~~~~~~~~~~~~~~~~~
.. image:: https://pyup.io/static/images/logo.png
:name: pyup
:align: center
:alt: pyup
:target: https://pyup.io/
Pyup brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
.. _`PyPA Code of Conduct`: https://www.pypa.io/en/latest/code-of-conduct/

View File

@ -20,6 +20,6 @@
"use_compressor": "n",
"postgresql_version": ["9.6", "9.5", "9.4", "9.3", "9.2"],
"js_task_runner": ["Gulp", "Grunt", "None"],
"use_lets_encrypt": "n",
"custom_bootstrap_compilation": "n",
"open_source_license": ["MIT", "BSD", "GPLv3", "Apache Software License 2.0", "Not open source"]
}

View File

@ -12,12 +12,12 @@ Prerequisites
Understand the Compose Setup
--------------------------------
Before you start, check out the `docker-compose.yml` file in the root of this project. This is where each component
Before you start, check out the `production.yml` file in the root of this project. This is where each component
of this application gets its configuration from. Notice how it provides configuration for these services:
* `postgres` service that runs the database
* `redis` for caching
* `nginx` as reverse proxy
* `caddy` as webserver
* `django` is the Django project run by gunicorn
If you chose the `use_celery` option, there are two more services:
@ -25,10 +25,6 @@ If you chose the `use_celery` option, there are two more services:
* `celeryworker` which runs the celery worker process
* `celerybeat` which runs the celery beat process
If you chose the `use_letsencrypt` option, you also have:
* `certbot` which keeps your certs from letsencrypt up-to-date
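Whichever options you chose, one way to double-check exactly which services your generated `production.yml` defines is to ask Compose for the list directly (a quick sanity check, not a required step)::

docker-compose -f production.yml config --services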
Populate .env With Your Environment Variables
---------------------------------------------
@ -37,6 +33,8 @@ root directory of this project as a starting point. Add your own variables to th
file won't be tracked by git by default so you'll have to make sure to use some other mechanism to copy your secret if
you are relying solely on git.
It is **highly recommended** that before you build your production application, you set your POSTGRES_USER value here. This will create a non-default user for the postgres image. If you do not set this user before building the application, the default user 'postgres' will be created, and this user will not be able to create or restore backups.
To obtain logs and information about crashes in a production setup, make sure that you have access to an external Sentry instance (e.g. by creating an account with `sentry.io`_), and set the `DJANGO_SENTRY_DSN` variable. This should be enough to report crashes to Sentry.
You will probably also need to set up the Mail backend, for example by adding a `Mailgun`_ API key and a `Mailgun`_ sender domain; otherwise, the account creation view will crash and result in a 500 error when the backend attempts to send an email to the account owner.
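For illustration only, the relevant entries could look like the following; `POSTGRES_USER` and `DJANGO_SENTRY_DSN` come straight from the paragraphs above, while the Mailgun variable names are assumptions you should verify against the generated `.env.example`::

POSTGRES_USER=mydbuser
DJANGO_SENTRY_DSN=https://<public_key>@sentry.example.com/1
# The exact Mailgun variable names depend on the generated project (check .env.example)
DJANGO_MAILGUN_API_KEY=key-0123456789abcdef
MAILGUN_SENDER_DOMAIN=mg.example.com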
@ -53,68 +51,26 @@ It is always better to deploy a site behind HTTPS and will become crucial as the
* In the `.env.example`, we have made it simpler for you to change the default `Django Admin` into a custom name through an environmental variable. This should make it harder to guess the access to the admin panel.
* If you are not using a subdomain of the domain name set in the project, then remember to put your staging/production IP address in the ``ALLOWED_HOSTS``_ environment variable before you deploy your website. Failure to do this will mean you will not have access to your website through the HTTP protocol.
* If you are not using a subdomain of the domain name set in the project, then remember to put your staging/production IP address in the :code:`DJANGO_ALLOWED_HOSTS` environment variable (see :ref:`settings`) before you deploy your website. Failure to do this will mean you will not have access to your website through the HTTP protocol.
* Access to the Django admin is set up by default to require HTTPS in production or once *live*. We recommend that you look into setting up the *Certbot and Let's Encrypt Setup* mentioned below or another HTTPS certification service.
* Access to the Django admin is set up by default to require HTTPS in production or once *live*.
Optional: nginx-proxy Setup
---------------------------
By default, the application is configured to listen on all interfaces on port 80. If you want to change that, open the
`docker-compose.yml` file and replace `0.0.0.0` with your own ip.
HTTPS is configured by default
------------------------------
If you are using `nginx-proxy`_ to run multiple application stacks on one host, remove the port setting entirely and add `VIRTUAL_HOST=example.com` to your env file. Here, replace example.com with the value you entered for `domain_name`.
The Caddy webserver used in the default configuration will get you a valid certificate from Lets Encrypt and update it automatically. All you need to do to enable this is to make sure that your DNS records are pointing to the server Caddy runs on.
This passes all incoming requests on `nginx-proxy`_ to the nginx service your application is using.
You can read more about this here at `Automatic HTTPS`_ in the Caddy docs.
.. _Automatic HTTPS: https://caddyserver.com/docs/automatic-https
.. _nginx-proxy: https://github.com/jwilder/nginx-proxy
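For example, a quick way to verify your DNS setup before bringing the stack up (replace `example.com` with your actual domain)::

dig +short example.com

This should print the public IP address of the host Caddy will run on; until it does, the automatic Let's Encrypt certificate request is likely to fail.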
Optional: Postgres Data Volume Modifications
---------------------------------------------
Postgres is saving its database files to the `postgres_data` volume by default. Change that if you want something else and make sure to make backups since this is not done automatically.
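A minimal, illustrative backup sketch for that volume (the volume name `myproject_postgres_data` is an assumption, since Compose usually prefixes volume names with the project name; check `docker volume ls`)::

docker run --rm \
  -v myproject_postgres_data:/source:ro \
  -v $(pwd):/backups \
  alpine tar czf /backups/postgres_data.tar.gz -C /source .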
Optional: Certbot and Let's Encrypt Setup
------------------------------------------
If you chose `use_letsencrypt` and will be using certbot for https, you must do the following before running anything with docker-compose:
Replace dhparam.pem.example with a generated dhparams.pem file before running anything with docker-compose. You can generate this on ubuntu or OS X by running the following in the project root:
::
$ openssl dhparam -out /path/to/project/compose/nginx/dhparams.pem 2048
If you would like to add additional subdomains to your certificate, you must add additional parameters to the certbot command in the `docker-compose.yml` file:
Replace:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email mjsisley@relawgo.com --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
With:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} -d www.{{ cookiecutter.domain_name }} -d etc.{{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
Please be cognizant of Certbot/Letsencrypt certificate request limits when getting this set up. They provide a test server that does not count against the limit while you are getting set up.
The certbot certificates expire after 3 months.
If you would like to set up autorenewal of your certificates, the following commands can be put into a bash script:
::
#!/bin/bash
cd <project directory>
docker-compose run --rm --name certbot certbot bash -c "sleep 6 && certbot certonly --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
docker exec {{ cookiecutter.project_name }}_nginx_1 nginx -s reload
And then set a cronjob by running `crontab -e` and placing in it (period can be adjusted as desired)::
0 4 * * 1 /path/to/bashscript/renew_certbot.sh
Run your app with docker-compose
--------------------------------
@ -123,40 +79,40 @@ directory.
You'll need to build the stack first. To do that, run::
docker-compose build
docker-compose -f production.yml build
Once this is ready, you can run it with::
docker-compose up
docker-compose -f production.yml up
To run a migration, open up a second terminal and run::
docker-compose run django python manage.py migrate
docker-compose -f production.yml run django python manage.py migrate
To create a superuser, run::
docker-compose run django python manage.py createsuperuser
docker-compose -f production.yml run django python manage.py createsuperuser
If you need a shell, run::
docker-compose run django python manage.py shell
docker-compose -f production.yml run django python manage.py shell
To get a list of all running containers, see the ``ps`` command below.
To check your logs, run::
docker-compose logs
docker-compose -f production.yml logs
If you want to scale your application, run::
docker-compose scale django=4
docker-compose scale celeryworker=2
docker-compose -f production.yml scale django=4
docker-compose -f production.yml scale celeryworker=2
.. warning:: Don't run the scale command on postgres, celerybeat, certbot, or nginx.
.. warning:: Don't run the scale command on postgres, celerybeat, or caddy.
If you have errors, you can always check your stack with `docker-compose`. Switch to your project's root directory and run::
docker-compose ps
docker-compose -f production.yml ps
Supervisor Example
@ -164,12 +120,12 @@ Supervisor Example
Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to
survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. All
it needs to do is to run `docker-compose up` in your project's root directory.
it needs to do is to run `docker-compose -f production.yml up` in your project's root directory.
If you are using `supervisor`, you can use this file as a starting point::
[program:{{cookiecutter.project_slug}}]
command=docker-compose up
command=docker-compose -f production.yml up
directory=/path/to/{{cookiecutter.project_slug}}
redirect_stderr=true
autostart=true
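Once a configuration along those lines is saved (the program name and paths above are examples), you would typically tell supervisor to pick it up and then verify the program's state::

supervisorctl reread
supervisorctl update
supervisorctl status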

View File

@ -16,12 +16,13 @@ If you don't already have it installed, follow the instructions for your OS:
- On Mac OS X, you'll need `Docker for Mac`_
- On Windows, you'll need `Docker for Windows`_
- On Linux, you'll need `docker-engine`_
.. _`Docker for Mac`: https://docs.docker.com/engine/installation/mac/
.. _`Docker for Windows`: https://docs.docker.com/engine/installation/windows/
.. _`docker-engine`: https://docs.docker.com/engine/installation/
Attention Windows users
-------------
-----------------------
Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docker containers for Windows users, while it is required by the generated Django project. To fix this, add ``psycopg2`` to the list of requirements inside ``requirements/base.txt``::
@ -36,9 +37,9 @@ Build the Stack
This can take a while, especially the first time you run this particular command
on your development system::
$ docker-compose -f dev.yml build
$ docker-compose -f local.yml build
If you want to build the production environment you don't have to pass an argument -f, it will automatically use docker-compose.yml.
If you want to build the production environment, use ``production.yml`` as the ``-f`` argument (``docker-compose.yml`` or ``docker-compose.yaml`` are the defaults).
Boot the System
---------------
@ -50,11 +51,11 @@ runs will occur quickly.
Open a terminal at the project root and run the following for local development::
$ docker-compose -f dev.yml up
$ docker-compose -f local.yml up
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``dev.yml`` like this::
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this::
$ export COMPOSE_FILE=dev.yml
$ export COMPOSE_FILE=local.yml
And then run::
@ -64,24 +65,24 @@ Running management commands
~~~~~~~~~~~~~~~~~~~~~~~~~~~
As with any shell command that we wish to run in our container, this is done
using the ``docker-compose run`` command.
using the ``docker-compose -f local.yml run`` command.
To migrate your app and to create a superuser, run::
$ docker-compose -f dev.yml run django python manage.py migrate
$ docker-compose -f dev.yml run django python manage.py createsuperuser
$ docker-compose -f local.yml run django python manage.py migrate
$ docker-compose -f local.yml run django python manage.py createsuperuser
Here we specify the ``django`` container as the location to run our management commands.
Add your Docker development server IP
------------------------------------
-------------------------------------
When ``DEBUG`` is set to `True`, the host is validated against ``['localhost', '127.0.0.1', '[::1]']``. This is adequate when running a ``virtualenv``. For Docker, in ``config.settings.local``, add your host development server IP to ``INTERNAL_IPS`` or ``ALLOWED_HOSTS`` if the variable exists.
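For instance, if you are using ``docker-machine``, the IP address to add can usually be obtained like this (the machine name ``default`` is an assumption)::

$ docker-machine ip default
192.168.99.100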
Production Mode
~~~~~~~~~~~~~~~
Instead of using `dev.yml`, you would use `docker-compose.yml`.
Instead of using `local.yml`, you would use `production.yml`.
Other Useful Tips
-----------------
@ -103,7 +104,7 @@ If you want to run the stack in detached mode (in the background), use the ``-d`
::
$ docker-compose -f dev.yml up -d
$ docker-compose -f local.yml up -d
Debugging
~~~~~~~~~~~~~
@ -121,13 +122,13 @@ Then you may need to run the following for it to work as desired:
::
$ docker-compose run -f dev.yml --service-ports django
$ docker-compose -f local.yml run --service-ports django
django-debug-toolbar
""""""""""""""""""""
In order for django-debug-toolbar to work with docker you need to add your docker-machine ip address (the output of `Get the IP ADDRESS`_) to INTERNAL_IPS in local.py
In order for django-debug-toolbar to work with docker you need to add your docker-machine ip address to ``INTERNAL_IPS`` in ``local.py``
.. May be a better place to put this, as it is not Docker specific.

View File

@ -2,26 +2,26 @@
Database Backups with Docker
============================
The database has to be running to create/restore a backup. These examples are run locally. If you want to use them on a remote server, remove ``-f dev.yml`` from each example.
The database has to be running to create/restore a backup. These examples are run locally. If you want to use them on a remote server, remove ``-f local.yml`` from each example.
Running Backups
================
Run the app with `docker-compose -f dev.yml up`.
Run the app with `docker-compose -f local.yml up`.
To create a backup, run::
docker-compose -f dev.yml run postgres backup
docker-compose -f local.yml run postgres backup
To list backups, run::
docker-compose -f dev.yml run postgres list-backups
docker-compose -f local.yml run postgres list-backups
To restore a backup, run::
docker-compose -f dev.yml run postgres restore filename.sql
docker-compose -f local.yml run postgres restore filename.sql
Where <containerId> is the ID of the Postgres container. To get it, run::

View File

@ -33,5 +33,4 @@ Indices and tables
* :ref:`genindex`
* :ref:`search`
.. At some point it would be good to have a module index of the high level things
we are doing. Then we can * :ref:`modindex` back in.
.. At some point it would be good to have a module index of the high level things we are doing. Then we can * :ref:`modindex` back in.

View File

@ -14,7 +14,7 @@ To run flake8:
The config for flake8 is located in setup.cfg. It specifies:
* Set max line length to 120 chars
* Exclude .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
* Exclude ``.tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules``
pylint
------
@ -30,14 +30,14 @@ The config for pylint is located in .pylintrc. It specifies:
* Disable linting messages for missing docstring and invalid name
* max-parents=13
pep8
pycodestyle
-----
This is included in flake8's checks, but you can also run it separately to see a more detailed report:
$ pep8 <python files that you wish to lint>
$ pycodestyle <python files that you wish to lint>
The config for pep8 is located in setup.cfg. It specifies:
The config for pycodestyle is located in setup.cfg. It specifies:
* Set max line length to 120 chars
* Exclude .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
* Exclude ``.tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules``

View File

@ -21,4 +21,4 @@ The base app will now run as it would with the usual ``manage.py runserver`` but
To get live reloading to work you'll probably need to install an `appropriate browser extension`_
.. _appropriate browser extension: http://feedback.livereload.com/knowledgebase/articles/86242-how-do-i-install-and-use-the-browser-extensions-
.. _appropriate browser extension: http://livereload.com/extensions/

View File

@ -72,8 +72,8 @@ js_task_runner [1]
2. Grunt_
3. None
use_lets_encrypt [n]
Use `Let's Encrypt`_ as the certificate authority for this project.
custom_bootstrap_compilation [n]
If you use Grunt, scaffold out recompiling Bootstrap as a task. (Useful for letting you change Bootstrap variables in real time.) Consult the project README for more details.
open_source_license [1]
Select a software license for the project. The choices are:

View File

@ -1,3 +1,5 @@
.. _settings:
Settings
==========

View File

@ -130,7 +130,7 @@ def remove_docker_files():
"""
Removes files needed for docker if it isn't going to be used
"""
for filename in ["dev.yml", "docker-compose.yml", ".dockerignore"]:
for filename in ["local.yml", "production.yml", ".dockerignore"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
@ -167,14 +167,6 @@ def remove_packageJSON_file():
PROJECT_DIRECTORY, filename
))
def remove_certbot_files():
"""
Removes files needed for certbot if it isn't going to be used
"""
nginx_dir_location = os.path.join(PROJECT_DIRECTORY, 'compose/nginx')
for filename in ["nginx-secure.conf", "start.sh", "dhparams.example.pem"]:
file_name = os.path.join(nginx_dir_location, filename)
remove_file(file_name)
def remove_copying_files():
"""
@ -229,26 +221,26 @@ def remove_open_source_files():
# dst = os.path.join(target_dir, name)
# shutil.copyfile(src, dst)
# 1. Generates and saves random secret key
# Generates and saves random secret key
make_secret_key(PROJECT_DIRECTORY)
# 2. Removes the taskapp if celery isn't going to be used
# Removes the taskapp if celery isn't going to be used
if '{{ cookiecutter.use_celery }}'.lower() == 'n':
remove_task_app(PROJECT_DIRECTORY)
# 3. Removes the .idea directory if PyCharm isn't going to be used
# Removes the .idea directory if PyCharm isn't going to be used
if '{{ cookiecutter.use_pycharm }}'.lower() != 'y':
remove_pycharm_dir(PROJECT_DIRECTORY)
# 4. Removes all heroku files if it isn't going to be used
# Removes all heroku files if it isn't going to be used
if '{{ cookiecutter.use_heroku }}'.lower() != 'y':
remove_heroku_files()
# 5. Removes all docker files if it isn't going to be used
# Removes all docker files if it isn't going to be used
if '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_docker_files()
# 6. Removes all JS task manager files if it isn't going to be used
# Removes all JS task manager files if it isn't going to be used
if '{{ cookiecutter.js_task_runner}}'.lower() == 'gulp':
remove_grunt_files()
elif '{{ cookiecutter.js_task_runner}}'.lower() == 'grunt':
@ -258,11 +250,7 @@ else:
remove_grunt_files()
remove_packageJSON_file()
# 7. Removes all certbot/letsencrypt files if it isn't going to be used
if '{{ cookiecutter.use_lets_encrypt }}'.lower() != 'y':
remove_certbot_files()
# 8. Display a warning if use_docker and use_grunt are selected. Grunt isn't
# Display a warning if use_docker and use_grunt are selected. Grunt isn't
# supported by our docker config atm.
if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
@ -271,29 +259,15 @@ if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] and '{{ cook
"js task runner service to your docker configuration manually."
)
# 9. Removes the certbot/letsencrypt files and display a warning if use_lets_encrypt is selected and use_docker isn't.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_certbot_files()
print(
"You selected to use Let's Encrypt and didn't select to use docker. This is NOT supported out of the box for now. You "
"can continue to use the project like you normally would, but Let's Encrypt files have been included."
)
# 10. Directs the user to the documentation if certbot and docker are selected.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
"You selected to use Let's Encrypt, please see the documentation for instructions on how to use this in production. "
"You must generate a dhparams.pem file before running docker-compose in a production environment."
)
# 11. Removes files needed for the GPLv3 licence if it isn't going to be used.
# Removes files needed for the GPLv3 licence if it isn't going to be used.
if '{{ cookiecutter.open_source_license}}' != 'GPLv3':
remove_copying_files()
# 12. Remove Elastic Beanstalk files
# Remove Elastic Beanstalk files
if '{{ cookiecutter.use_elasticbeanstalk_experimental }}'.lower() != 'y':
remove_elasticbeanstalk()
# 13. Remove files conventional to opensource projects only.
# Remove files conventional to opensource projects only.
if '{{ cookiecutter.open_source_license }}' == 'Not open source':
remove_open_source_files()

View File

@ -1,11 +1,11 @@
cookiecutter==1.5.1
flake8==3.3.0 # pyup: != 2.6.0
sh==1.12.13
binaryornot==0.4.3
flake8==3.4.1 # pyup: != 2.6.0
sh==1.12.14
binaryornot==0.4.4
# Testing
pytest==3.0.7
pep8==1.7.0
pyflakes==1.5.0
tox==2.7.0
pytest==3.2.2
pycodestyle==2.3.1
pyflakes==1.6.0
tox==2.9.0
pytest-cookies==0.2.0

View File

@ -15,7 +15,7 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y js_task_runner
cd project_name
# run the project's tests
docker-compose -f dev.yml run django python manage.py test
docker-compose -f local.yml run django python manage.py test
# return non-zero status code if there are migrations that have not been created
docker-compose -f dev.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; }
docker-compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; }

View File

@ -1,81 +1,369 @@
### OSX ###
.DS_Store
.AppleDouble
.LSOverride
### SublimeText ###
# cache files for sublime text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# workspace files are user-specific
*.sublime-workspace
# project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using SublimeText
# *.sublime-project
# sftp configuration file
sftp-config.json
# Basics
### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
__pycache__
*$py.class
# Logs
logs
*.log
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
npm-debug.log*
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.tox
.coverage.*
.cache
nosetests.xml
htmlcov
coverage.xml
*.cover
.hypothesis/
# Translations
*.mo
*.pot
# Pycharm
.idea/*
{% if cookiecutter.use_pycharm == 'y' %}
# Django stuff:
staticfiles/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
### Node template
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
# nyc test coverage
.nyc_output
# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Typescript v1 declaration files
typings/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
### Linux template
*~
# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*
# KDE directory preferences
.directory
# Linux trash folder which might appear on any partition or disk
.Trash-*
# .nfs files are created when an open file is removed but is still being accessed
.nfs*
### VisualStudioCode template
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
{% if cookiecutter.use_pycharm == 'y' -%}
# Provided default Pycharm Run/Debug Configurations should be tracked by git
# In case of local modifications made by Pycharm, use update-index command
# for each changed file, like this:
# git update-index --assume-unchanged .idea/{{cookiecutter.project_slug}}.iml
!.idea/runConfigurations/
!.idea/{{cookiecutter.project_slug}}.iml
!.idea/vcs.xml
!.idea/webResources.xml
### JetBrains template
# Covers JetBrains IDEs: IntelliJ, RubyMine, PhpStorm, AppCode, PyCharm, CLion, Android Studio and Webstorm
# Reference: https://intellij-support.jetbrains.com/hc/en-us/articles/206544839
# User-specific stuff:
.idea/**/workspace.xml
.idea/**/tasks.xml
.idea/dictionaries
# Sensitive or high-churn files:
.idea/**/dataSources/
.idea/**/dataSources.ids
.idea/**/dataSources.xml
.idea/**/dataSources.local.xml
.idea/**/sqlDataSources.xml
.idea/**/dynamic.xml
.idea/**/uiDesigner.xml
# Gradle:
.idea/**/gradle.xml
.idea/**/libraries
# CMake
cmake-build-debug/
# Mongo Explorer plugin:
.idea/**/mongoSettings.xml
## File-based project format:
*.iws
## Plugin-specific files:
# IntelliJ
out/
# mpeltonen/sbt-idea plugin
.idea_modules/
# JIRA plugin
atlassian-ide-plugin.xml
# Cursive Clojure plugin
.idea/replstate.xml
# Crashlytics plugin (for Android Studio and IntelliJ)
com_crashlytics_export_strings.xml
crashlytics.properties
crashlytics-build.properties
fabric.properties
{% endif %}
# Vim
*~
*.swp
*.swo
### Windows template
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# npm
node_modules/
# Dump file
*.stackdump
# Compass
.sass-cache
# Folder config file
Desktop.ini
# virtual environments
.env
# Recycle Bin used on file shares
$RECYCLE.BIN/
# User-uploaded media
{{ cookiecutter.project_slug }}/media/
# Windows Installer files
*.cab
*.msi
*.msm
*.msp
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
# MailHog binary
# Windows shortcuts
*.lnk
### macOS template
# General
*.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
### SublimeText template
# Cache files for Sublime Text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# Workspace files are user-specific
*.sublime-workspace
# Project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using Sublime Text
# *.sublime-project
# SFTP configuration file
sftp-config.json
# Package control specific files
Package Control.last-run
Package Control.ca-list
Package Control.ca-bundle
Package Control.system-ca-bundle
Package Control.cache/
Package Control.ca-certs/
Package Control.merged-ca-bundle
Package Control.user-ca-bundle
oscrypto-ca-bundle.crt
bh_unicode_properties.cache
# Sublime-github package stores a github token in this file
# https://packagecontrol.io/packages/sublime-github
GitHub.sublime-settings
### Vim template
# Swap
[._]*.s[a-v][a-z]
[._]*.sw[a-p]
[._]s[a-v][a-z]
[._]sw[a-p]
# Session
Session.vim
# Temporary
.netrwhist
# Auto-generated tag files
tags
### VirtualEnv template
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Ss]cripts
pyvenv.cfg
pip-selfcheck.json
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' -%}
mailhog
{% endif %}
staticfiles/
.cache/
{{ cookiecutter.project_slug }}/media/
{% if cookiecutter.use_docker == 'y' -%}
# Added to maintain local compose files which are ignored by something above.
# See issue https://github.com/pydanny/cookiecutter-django/issues/1321
!/compose/local/
{% endif %}

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -60,6 +60,9 @@ module.exports = function (grunt) {
dev: {
options: {
outputStyle: 'nested',
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
includePaths: ['bower_components/bootstrap-sass/assets/stylesheets/bootstrap/'],
{% endif %}
sourceMap: false,
precision: 10
},
@ -70,6 +73,9 @@ module.exports = function (grunt) {
dist: {
options: {
outputStyle: 'compressed',
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
includePaths: ['bower_components/bootstrap-sass/assets/stylesheets/bootstrap/'],
{% endif %}
sourceMap: false,
precision: 10
},

View File

@ -145,3 +145,12 @@ See detailed `cookiecutter-django Elastic Beanstalk documentation`_.
.. _`cookiecutter-django Docker documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-elastic-beanstalk.html
{% endif %}
{% if cookiecutter.custom_bootstrap_compilation == "y" %}
Custom Bootstrap Compilation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To get automatic Bootstrap recompilation with variables of your choice, install bootstrap sass (`bower install bootstrap-sass`) and tweak your variables in `static/sass/custom_bootstrap_vars`.
(You can find a list of available variables in the `bootstrap-sass source <https://github.com/twbs/bootstrap-sass/blob/master/assets/stylesheets/bootstrap/_variables.scss>`_, or get explanations on them in the `Bootstrap docs <https://getbootstrap.com/customize/>`_.)
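A rough sketch of that workflow (whether the default `grunt` task picks up the Bootstrap recompilation is an assumption; check the generated `Gruntfile.js` for the exact task name)::

bower install bootstrap-sass
grunt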
{% endif %}

View File

@ -1,27 +0,0 @@
FROM python:3.5
ENV PYTHONUNBUFFERED 1
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install -r /requirements/production.txt \
&& groupadd -r django \
&& useradd -r -g django django
COPY ./compose/django/gunicorn.sh /gunicorn.sh
COPY ./compose/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh \
&& sed -i 's/\r//' /gunicorn.sh \
&& chmod +x /entrypoint.sh \
&& chown django /entrypoint.sh \
&& chmod +x /gunicorn.sh \
&& chown django /gunicorn.sh
COPY . /app
RUN chown -R django /app
USER django
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -1,18 +0,0 @@
FROM python:3.5
ENV PYTHONUNBUFFERED 1
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install -r /requirements/local.txt
COPY ./compose/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh
RUN chmod +x /entrypoint.sh
COPY ./compose/django/start-dev.sh /start-dev.sh
RUN sed -i 's/\r//' /start-dev.sh
RUN chmod +x /start-dev.sh
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -1,3 +0,0 @@
#!/bin/sh
python manage.py migrate
python manage.py runserver_plus 0.0.0.0:8000

View File

@ -0,0 +1,27 @@
FROM python:3.5
ENV PYTHONUNBUFFERED 1
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install -r /requirements/local.txt
COPY ./compose/production/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh
RUN chmod +x /entrypoint.sh
COPY ./compose/local/django/start.sh /start.sh
RUN sed -i 's/\r//' /start.sh
RUN chmod +x /start.sh
COPY ./compose/local/django/celery/worker/start.sh /start-celeryworker.sh
RUN sed -i 's/\r//' /start-celeryworker.sh
RUN chmod +x /start-celeryworker.sh
COPY ./compose/local/django/celery/beat/start.sh /start-celerybeat.sh
RUN sed -i 's/\r//' /start-celerybeat.sh
RUN chmod +x /start-celerybeat.sh
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -0,0 +1,10 @@
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
set -o xtrace
rm -f './celerybeat.pid'
celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO

View File

@ -0,0 +1,9 @@
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
set -o xtrace
celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO

View File

@ -0,0 +1,10 @@
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
set -o xtrace
python manage.py migrate
python manage.py runserver_plus 0.0.0.0:8000

View File

@ -1,9 +0,0 @@
FROM nginx:latest
ADD nginx.conf /etc/nginx/nginx.conf
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
ADD start.sh /start.sh
ADD nginx-secure.conf /etc/nginx/nginx-secure.conf
ADD dhparams.pem /etc/ssl/private/dhparams.pem
CMD /start.sh
{% endif %}

View File

@ -1,3 +0,0 @@
-----BEGIN DH PARAMETERS-----
EXAMPLE_FILE
-----END DH PARAMETERS-----

View File

@ -1,96 +0,0 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
proxy_headers_hash_bucket_size 52;
gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
server_name ___my.example.com___ www.___my.example.com___;
location /.well-known/acme-challenge {
# Since the certbot container isn't up constantly, need to resolve ip dynamically using docker's dns
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:80;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
}
location / {
return 301 https://$server_name$request_uri;
}
}
server {
listen 443;
server_name ___my.example.com___ www.___my.example.com___;
ssl on;
ssl_certificate /etc/letsencrypt/live/___my.example.com___/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/___my.example.com___/privkey.pem;
ssl_session_timeout 5m;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH';
ssl_prefer_server_ciphers on;
ssl_session_cache shared:SSL:10m;
ssl_dhparam /etc/ssl/private/dhparams.pem;
location /.well-known/acme-challenge {
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:443;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -1,61 +0,0 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
#gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
charset utf-8;
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
server_name ___my.example.com___ ;
location /.well-known/acme-challenge {
proxy_pass http://certbot:80;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
{% endif %}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -1,62 +0,0 @@
echo sleep 5
sleep 5
echo build starting nginx config
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx.conf
cat /etc/nginx/nginx.conf
echo .
echo Firing up nginx in the background.
nginx
# # Check user has specified domain name
if [ -z "$MY_DOMAIN_NAME" ]; then
echo "Need to set MY_DOMAIN_NAME (to a letsencrypt-registered name)."
exit 1
fi
# This bit waits until the letsencrypt container has done its thing.
# We see the changes here because there's a docker volume mapped.
echo Waiting for folder /etc/letsencrypt/live/$MY_DOMAIN_NAME to exist
while [ ! -d /etc/letsencrypt/live/$MY_DOMAIN_NAME ] ;
do
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/fullchain.pem ] ;
do
echo Waiting for file fullchain.pem to exist
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/privkey.pem ] ;
do
echo Waiting for file privkey.pem to exist
sleep 2
done
# This is added so that when the certificate is being renewed or is already in place, nginx waits for everything to be good.
sleep 15
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx-secure.conf
# Add the system's nameserver (the docker network dns) so we can resolve container names in nginx
NAMESERVER=`cat /etc/resolv.conf | grep "nameserver" | awk '{print $2}' | tr '\n' ' '`
echo replacing ___NAMESERVER___/$NAMESERVER
sed -i "s/___NAMESERVER___/$NAMESERVER/g" /etc/nginx/nginx-secure.conf
# go!
kill $(ps aux | grep 'nginx' | grep -v 'grep' | awk '{print $2}')
cp /etc/nginx/nginx-secure.conf /etc/nginx/nginx.conf
nginx -g 'daemon off;'

View File

@ -1,11 +0,0 @@
FROM postgres:{{ cookiecutter.postgresql_version }}
# add backup scripts
ADD backup.sh /usr/local/bin/backup
ADD restore.sh /usr/local/bin/restore
ADD list-backups.sh /usr/local/bin/list-backups
# make them executable
RUN chmod +x /usr/local/bin/restore
RUN chmod +x /usr/local/bin/list-backups
RUN chmod +x /usr/local/bin/backup

View File

@ -0,0 +1,14 @@
www.{% raw %}{$DOMAIN_NAME}{% endraw %} {
redir https://{{cookiecutter.domain_name}}
}
{% raw %}{$DOMAIN_NAME}{% endraw %} {
proxy / django:5000 {
header_upstream Host {host}
header_upstream X-Real-IP {remote}
header_upstream X-Forwarded-Proto {scheme}
}
log stdout
errors stdout
gzip
}

View File

@ -0,0 +1,3 @@
FROM abiosoft/caddy:0.10.6
COPY ./compose/production/caddy/Caddyfile /etc/Caddyfile

View File

@ -0,0 +1,39 @@
FROM python:3.5
ENV PYTHONUNBUFFERED 1
RUN groupadd -r django \
&& useradd -r -g django django
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install --no-cache-dir -r /requirements/production.txt \
&& rm -rf /requirements
COPY ./compose/production/django/gunicorn.sh /gunicorn.sh
RUN sed -i 's/\r//' /gunicorn.sh
RUN chmod +x /gunicorn.sh
RUN chown django /gunicorn.sh
COPY ./compose/production/django/entrypoint.sh /entrypoint.sh
RUN sed -i 's/\r//' /entrypoint.sh
RUN chmod +x /entrypoint.sh
RUN chown django /entrypoint.sh
COPY ./compose/production/django/celery/worker/start.sh /start-celeryworker.sh
RUN sed -i 's/\r//' /start-celeryworker.sh
RUN chmod +x /start-celeryworker.sh
COPY ./compose/production/django/celery/beat/start.sh /start-celerybeat.sh
RUN sed -i 's/\r//' /start-celerybeat.sh
RUN chmod +x /start-celerybeat.sh
COPY . /app
RUN chown -R django /app
USER django
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -0,0 +1,8 @@
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO

View File

@ -0,0 +1,8 @@
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO

View File

@ -1,5 +1,12 @@
#!/bin/bash
set -e
#!/usr/bin/env bash
set -o errexit
set -o pipefail
# todo: turn on after #1295
# set -o nounset
cmd="$@"
# This entrypoint is used to play nicely with the current cookiecutter configuration.

View File

@ -1,3 +1,9 @@
#!/bin/sh
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
python /app/manage.py collectstatic --noinput
/usr/local/bin/gunicorn config.wsgi -w 4 -b 0.0.0.0:5000 --chdir=/app

View File

@ -0,0 +1,10 @@
FROM postgres:{{ cookiecutter.postgresql_version }}
COPY ./compose/production/postgres/backup.sh /usr/local/bin/backup
RUN chmod +x /usr/local/bin/backup
COPY ./compose/production/postgres/restore.sh /usr/local/bin/restore
RUN chmod +x /usr/local/bin/restore
COPY ./compose/production/postgres/list-backups.sh /usr/local/bin/list-backups
RUN chmod +x /usr/local/bin/list-backups

View File

@ -1,6 +1,9 @@
#!/bin/bash
# stop on errors
set -e
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
# we might run into trouble when using the default `postgres` user, e.g. when dropping the postgres
# database in restore.sh. Check that something else is used here

View File

@ -1,4 +1,10 @@
#!/bin/bash
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
echo "listing available backups"
echo "-------------------------"
ls /backups/

View File

@ -1,7 +1,9 @@
#!/bin/bash
#!/usr/bin/env bash
set -o errexit
set -o pipefail
set -o nounset
# stop on errors
set -e
# we might run into trouble when using the default `postgres` user, e.g. when dropping the postgres
# database in restore.sh. Check that something else is used here
@ -17,10 +19,10 @@ export PGPASSWORD=$POSTGRES_PASSWORD
# check that we have an argument for a filename candidate
if [[ $# -eq 0 ]] ; then
echo 'usage:'
echo ' docker-compose run postgres restore <backup-file>'
echo ' docker-compose -f production.yml run postgres restore <backup-file>'
echo ''
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
echo ' docker-compose -f production.yml run postgres list-backups'
exit 1
fi
@ -31,7 +33,7 @@ BACKUPFILE=/backups/$1
if ! [ -f $BACKUPFILE ]; then
echo "backup file not found"
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
echo ' docker-compose -f production.yml run postgres list-backups'
exit 1
fi

View File

@ -4,6 +4,8 @@ Local settings
- Run in Debug mode
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
- Use mailhog for emails
{% elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
- Use mailhog for emails
{% else %}
- Use console backend for emails
{% endif %}
@ -30,6 +32,8 @@ SECRET_KEY = env('DJANGO_SECRET_KEY', default='CHANGEME!!!')
EMAIL_PORT = 1025
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
EMAIL_HOST = env('EMAIL_HOST', default='mailhog')
{% elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
EMAIL_HOST = 'localhost'
{% else %}
EMAIL_HOST = 'localhost'
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND',

View File

@ -1,7 +1,9 @@
"""
Production Configurations
- Use Amazon's S3 for storing static files and uploaded media
{% if cookiecutter.use_whitenoise == 'y' -%}
- Use WhiteNoise for serving static files{% endif %}
- Use Amazon's S3 for storing {% if cookiecutter.use_whitenoise == 'n' -%}static files and {% endif %}uploaded media
- Use mailgun to send emails
- Use Redis for cache
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
@ -12,7 +14,6 @@ Production Configurations
{% endif %}
"""
from boto.s3.connection import OrdinaryCallingFormat
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
import logging
{% endif %}
@ -98,7 +99,6 @@ AWS_SECRET_ACCESS_KEY = env('DJANGO_AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = env('DJANGO_AWS_STORAGE_BUCKET_NAME')
AWS_AUTO_CREATE_BUCKET = True
AWS_QUERYSTRING_AUTH = False
AWS_S3_CALLING_FORMAT = OrdinaryCallingFormat()
# AWS cache settings, don't change unless you know what you're doing:
AWS_EXPIRY = 60 * 60 * 24 * 7
@ -115,11 +115,12 @@ AWS_HEADERS = {
# stored files.
{% if cookiecutter.use_whitenoise == 'y' -%}
MEDIA_URL = 'https://s3.amazonaws.com/%s/' % AWS_STORAGE_BUCKET_NAME
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
{% else %}
# See:http://stackoverflow.com/questions/10390244/
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
from storages.backends.s3boto3 import S3Boto3Storage
StaticRootS3BotoStorage = lambda: S3Boto3Storage(location='static')
MediaRootS3BotoStorage = lambda: S3Boto3Storage(location='media')
DEFAULT_FILE_STORAGE = 'config.settings.production.MediaRootS3BotoStorage'
MEDIA_URL = 'https://s3.amazonaws.com/%s/media/' % AWS_STORAGE_BUCKET_NAME
@ -141,7 +142,7 @@ INSTALLED_APPS = ['collectfast', ] + INSTALLED_APPS
{% if cookiecutter.use_compressor == 'y'-%}
# COMPRESSOR
# ------------------------------------------------------------------------------
COMPRESS_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
COMPRESS_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
COMPRESS_URL = STATIC_URL
COMPRESS_ENABLED = env.bool('COMPRESS_ENABLED', default=True)
{%- endif %}
@ -158,7 +159,7 @@ ANYMAIL = {
'MAILGUN_API_KEY': env('DJANGO_MAILGUN_API_KEY'),
'MAILGUN_SENDER_DOMAIN': env('MAILGUN_SENDER_DOMAIN')
}
EMAIL_BACKEND = 'anymail.backends.mailgun.MailgunBackend'
EMAIL_BACKEND = 'anymail.backends.mailgun.EmailBackend'
# TEMPLATE CONFIGURATION
# ------------------------------------------------------------------------------

View File

@ -1,50 +0,0 @@
version: '2'
volumes:
postgres_data_dev: {}
postgres_backup_dev: {}
services:
postgres:
build: ./compose/postgres
volumes:
- postgres_data_dev:/var/lib/postgresql/data
- postgres_backup_dev:/backups
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
command: /start-dev.sh
depends_on:
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
- USE_DOCKER=yes
volumes:
- .:/app
ports:
- "8000:8000"
{% if cookiecutter.use_pycharm == 'y' %}
pycharm:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
depends_on:
- postgres
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
volumes:
- .:/app
{% endif %}
{% if cookiecutter.use_mailhog == 'y' %}
mailhog:
image: mailhog/mailhog
ports:
- "8025:8025"
{% endif %}

View File

@ -1,80 +0,0 @@
version: '2'
volumes:
postgres_data: {}
postgres_backup: {}
services:
postgres:
build: ./compose/postgres
volumes:
- postgres_data:/var/lib/postgresql/data
- postgres_backup:/backups
env_file: .env
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile
depends_on:
- postgres
- redis
command: /gunicorn.sh
env_file: .env
nginx:
build: ./compose/nginx
depends_on:
- django
{% if cookiecutter.use_lets_encrypt == 'y' %}
- certbot
{% endif %}
ports:
- "0.0.0.0:80:80"
{% if cookiecutter.use_lets_encrypt == 'y' %}
environment:
- MY_DOMAIN_NAME={{ cookiecutter.domain_name }}
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
certbot:
image: quay.io/letsencrypt/letsencrypt
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
entrypoint: ""
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
ports:
- "80"
- "443"
environment:
- TERM=xterm
{% endif %}
redis:
image: redis:3.0
{% if cookiecutter.use_celery == 'y' %}
celeryworker:
build:
context: .
dockerfile: ./compose/django/Dockerfile
env_file: .env
depends_on:
- postgres
- redis
command: celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO
celerybeat:
build:
context: .
dockerfile: ./compose/django/Dockerfile
env_file: .env
depends_on:
- postgres
- redis
command: celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO
{% endif %}

View File

@ -29,7 +29,7 @@ The Docker compose tool (previously known as `fig`_) makes linking these contain
webserver/
Dockerfile
...
docker-compose.yml
production.yml
Each component of your application would get its own `Dockerfile`_. The rest of this example assumes you are using the `base postgres image`_ for your database. Your database settings in `config/base.py` might then look something like:
@ -48,7 +48,7 @@ Each component of your application would get its own `Dockerfile`_. The rest of
}
}
The `Docker compose documentation`_ explains in detail what you can accomplish in the `docker-compose.yml` file, but an example configuration might look like this:
The `Docker compose documentation`_ explains in detail what you can accomplish in the `production.yml` file, but an example configuration might look like this:
.. _Docker compose documentation: https://docs.docker.com/compose/#compose-documentation
@ -107,9 +107,9 @@ We'll ignore the webserver for now (you'll want to comment that part out while w
# uncomment the line below to use container as a non-root user
USER python:python
Running `sudo docker-compose build` will follow the instructions in your `docker-compose.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
Running `sudo docker-compose -f production.yml build` will follow the instructions in your `production.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose up` and browse to `localhost:8000` to see your application running.
Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose -f production.yml up` and browse to `localhost:8000` to see your application running.
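A minimal sketch of that build-and-run cycle; the `curl` probe is an editorial addition here, just one way to confirm the app answers on port 8000::

    sudo docker-compose -f production.yml build
    sudo docker-compose -f production.yml up
    # from another shell, check that the app responds
    curl -I http://localhost:8000/
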
Deployment
^^^^^^^^^^
@ -155,7 +155,7 @@ That Dockerfile assumes you have an Nginx conf file named `site.conf` in the sam
}
}
Running `sudo docker-compose build webserver` will build your server container. Running `sudo docker-compose up` will now expose your application directly on `localhost` (no need to specify the port number).
Running `sudo docker-compose -f production.yml build webserver` will build your server container. Running `sudo docker-compose -f production.yml up` will now expose your application directly on `localhost` (no need to specify the port number).
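A rough sketch of the same sequence for the webserver container; again, the `curl` probe is only an illustrative check against the proxied port 80::

    sudo docker-compose -f production.yml build webserver
    sudo docker-compose -f production.yml up -d
    # nginx now fronts the app on the default HTTP port
    curl -I http://localhost/
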
Building and running your app on EC2
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@ -166,9 +166,9 @@ All you now need to do to run your app in production is:
* Install your preferred source control solution, Docker, and Docker Compose on the new instance.
* Pull in your code from source control. The root directory should be the one with your `docker-compose.yml` file in it.
* Pull in your code from source control. The root directory should be the one with your `production.yml` file in it.
* Run `sudo docker-compose build` and `sudo docker-compose up`.
* Run `sudo docker-compose -f production.yml build` and `sudo docker-compose -f production.yml up`.
* Assign an `Elastic IP address`_ to your new machine.
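Condensed into a shell sketch (the package names and repository URL below are placeholders; adapt them to your distribution and project)::

    sudo apt-get update && sudo apt-get install -y git docker.io docker-compose
    git clone <your-repo-url> myproject && cd myproject   # root directory contains production.yml
    sudo docker-compose -f production.yml build
    sudo docker-compose -f production.yml up -d
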

View File

@ -21,18 +21,10 @@ Next, you have to add new remote python interpreter, based on already tested dep
.. image:: images/3.png
Switch to *Docker Compose* and select the `dev.yml` file from your project directory, then set *Service name* to `django`
Switch to *Docker Compose* and select the `local.yml` file from your project directory, then set *Service name* to `django`
.. image:: images/4.png
Because PyCharm restarts the container every time you use a Run Configuration, we defined a second service, called `pycharm`, in the `dev.yml` file so that the server is not restarted while tests are running. To use it, you have to add an interpreter for the second service as well.
.. image:: images/5.png
The final result should be:
.. image:: images/6.png
Having done that, click *OK*. Close the *Settings* panel and wait a few seconds...
.. image:: images/7.png

(Two binary image files removed, 110 KiB and 67 KiB; contents not shown.)

View File

@ -3,6 +3,9 @@
POSTGRES_PASSWORD=mysecretpass
POSTGRES_USER=postgresuser
# Domain name, used by caddy
DOMAIN_NAME={{ cookiecutter.domain_name }}
# General settings
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_ADMIN_URL=
@ -30,9 +33,9 @@ DJANGO_ACCOUNT_ALLOW_REGISTRATION=True
DJANGO_SENTRY_DSN=
{% endif %}
{% if cookiecutter.use_opbeat == 'y' -%}
DJANGO_OPBEAT_ORGANIZATION_ID
DJANGO_OPBEAT_APP_ID
DJANGO_OPBEAT_SECRET_TOKEN
DJANGO_OPBEAT_ORGANIZATION_ID=
DJANGO_OPBEAT_APP_ID=
DJANGO_OPBEAT_SECRET_TOKEN=
{% endif %}
{% if cookiecutter.use_compressor == 'y' -%}
COMPRESS_ENABLED=

View File

@ -16,7 +16,7 @@ var gulp = require('gulp'),
pixrem = require('gulp-pixrem'),
uglify = require('gulp-uglify'),
imagemin = require('gulp-imagemin'),
exec = require('child_process').exec,
spawn = require('child_process').spawn,
runSequence = require('run-sequence'),
browserSync = require('browser-sync').create(),
reload = browserSync.reload;
@ -45,7 +45,7 @@ var paths = pathsConfig();
// Styles autoprefixing and minification
gulp.task('styles', function() {
return gulp.src(paths.sass + '/project.scss')
return gulp.src(paths.sass + '/*.scss')
.pipe(sass().on('error', sass.logError))
.pipe(plumber()) // Checks for errors
.pipe(autoprefixer({browsers: ['last 2 versions']})) // Adds vendor prefixes
@ -73,10 +73,11 @@ gulp.task('imgCompression', function(){
});
// Run django server
gulp.task('runServer', function() {
exec('python manage.py runserver', function (err, stdout, stderr) {
console.log(stdout);
console.log(stderr);
gulp.task('runServer', function(cb) {
var cmd = spawn('python', ['manage.py', 'runserver'], {stdio: 'inherit'});
cmd.on('close', function(code) {
console.log('runServer exited with code ' + code);
cb(code);
});
});
@ -100,5 +101,5 @@ gulp.task('watch', function() {
// Default task
gulp.task('default', function() {
runSequence(['styles', 'scripts', 'imgCompression'], 'runServer', 'browserSync', 'watch');
runSequence(['styles', 'scripts', 'imgCompression'], ['runServer', 'browserSync', 'watch']);
});

View File

@ -0,0 +1,62 @@
version: '2'
volumes:
postgres_data_local: {}
postgres_backup_local: {}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
build:
context: .
dockerfile: ./compose/local/django/Dockerfile
depends_on:
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
volumes:
- .:/app
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
- USE_DOCKER=yes
ports:
- "8000:8000"
command: /start.sh
postgres:
build:
context: .
dockerfile: ./compose/production/postgres/Dockerfile
volumes:
- postgres_data_local:/var/lib/postgresql/data
- postgres_backup_local:/backups
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
{% if cookiecutter.use_mailhog == 'y' %}
mailhog:
image: mailhog/mailhog:v1.0.0
ports:
- "8025:8025"
{% endif %}
{% if cookiecutter.use_celery == 'y' %}
redis:
image: redis:3.0
celeryworker:
# https://github.com/docker/compose/issues/3220
<<: *django
depends_on:
- redis
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
ports: []
command: /start-celeryworker.sh
celerybeat:
# https://github.com/docker/compose/issues/3220
<<: *django
depends_on:
- redis
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
ports: []
command: /start-celerybeat.sh
{% endif %}

View File

@ -0,0 +1,57 @@
version: '2'
volumes:
postgres_data: {}
postgres_backup: {}
caddy: {}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
build:
context: .
dockerfile: ./compose/production/django/Dockerfile
depends_on:
- postgres
- redis
env_file: .env
command: /gunicorn.sh
postgres:
build:
context: .
dockerfile: ./compose/production/postgres/Dockerfile
volumes:
- postgres_data:/var/lib/postgresql/data
- postgres_backup:/backups
env_file: .env
caddy:
build:
context: .
dockerfile: ./compose/production/caddy/Dockerfile
depends_on:
- django
volumes:
- caddy:/root/.caddy
env_file: .env
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
redis:
image: redis:3.0
{% if cookiecutter.use_celery == 'y' %}
celeryworker:
<<: *django
depends_on:
- postgres
- redis
command: /start-celeryworker.sh
celerybeat:
<<: *django
depends_on:
- postgres
- redis
command: /start-celerybeat.sh
{% endif %}

View File

@ -2,42 +2,41 @@
# like Pillow and psycopg2
# See http://bitly.com/wheel-building-fails-CPython-35
# Verified bug on Python 3.5.1
wheel==0.29.0
wheel==0.30.0
# Bleeding edge Django
django==1.10.7 # pyup: >=1.10,<1.11
django==1.10.8 # pyup: >=1.10,<1.11
# Configuration
django-environ==0.4.3
django-environ==0.4.4
{% if cookiecutter.use_whitenoise == 'y' -%}
whitenoise==3.3.0
whitenoise==3.3.1
{%- endif %}
# Forms
django-braces==1.11.0
django-crispy-forms==1.6.1
# Models
django-model-utils==3.0.0
# Images
Pillow==4.1.1
Pillow==4.2.1
# Password storage
argon2-cffi==16.3.0
# For user registration, either via email or social
# Well-built with regular release cycles!
django-allauth==0.32.0
django-allauth==0.33.0
{% if cookiecutter.windows == 'y' -%}
# On Windows, you must download/install psycopg2 manually
# from http://www.lfd.uci.edu/~gohlke/pythonlibs/#psycopg
{% else %}
# Python-PostgreSQL Database Adapter
psycopg2==2.7.1
psycopg2==2.7.3.1
{%- endif %}
# Unicode slugification
@ -51,12 +50,12 @@ django-redis==4.8.0
redis>=2.10.5
{% if cookiecutter.use_celery == "y" %}
celery==3.1.24
celery==3.1.25
{% endif %}
{% if cookiecutter.use_compressor == "y" %}
rcssmin==1.0.6 {% if cookiecutter.windows == 'y' %}--install-option="--without-c-extensions"{% endif %}
django-compressor==2.1.1
django-compressor==2.2
{% endif %}
# Your custom requirements go here

View File

@ -1,19 +1,19 @@
# Local development dependencies go here
-r base.txt
coverage==4.3.4
coverage==4.4.1
django-coverage-plugin==1.5.0
Sphinx==1.5.5
django-extensions==1.7.8
Werkzeug==0.12.1
django-test-plus==1.0.17
factory-boy==2.8.1
Sphinx==1.6.4
django-extensions==1.9.1
Werkzeug==0.12.2
django-test-plus==1.0.18
factory-boy==2.9.2
django-debug-toolbar==1.7
django-debug-toolbar==1.8
# improved REPL
ipdb==0.10.3
pytest-django==3.1.2
pytest-sugar==0.8.0
pytest-sugar==0.9.0

View File

@ -4,32 +4,32 @@
{% if cookiecutter.windows == 'y' -%}
# Python-PostgreSQL Database Adapter
# If using Win for dev, this assumes Unix in prod
# ------------------------------------------------
psycopg2==2.7.1
# Assuming Windows is used locally, and *nix -- in production.
# ------------------------------------------------------------
psycopg2==2.7.3.1
{%- endif %}
# WSGI Handler
# ------------------------------------------------
gevent==1.2.1
gevent==1.2.2
gunicorn==19.7.1
# Static and Media Storage
# ------------------------------------------------
boto==2.46.1
django-storages-redux==1.3.2
boto3==1.4.7
django-storages==1.6.5
{% if cookiecutter.use_whitenoise != 'y' -%}
Collectfast==0.5.2
{%- endif %}
# Email backends for Mailgun, Postmark, SendGrid and more
# -------------------------------------------------------
django-anymail==0.9
django-anymail==1.0
{% if cookiecutter.use_sentry_for_error_reporting == "y" -%}
# Raven is the Sentry client
# --------------------------
raven==6.0.0
raven==6.2.1
{%- endif %}
{% if cookiecutter.use_opbeat == "y" -%}

View File

@ -4,14 +4,15 @@
{% if cookiecutter.windows == 'y' -%}
# Python-PostgreSQL Database Adapter
# If using Win for dev, this assumes Unix in test/prod
psycopg2==2.7.1
psycopg2==2.7.3.1
{%- endif %}
coverage==4.3.4
flake8==3.3.0 # pyup: != 2.6.0
django-test-plus==1.0.17
factory-boy==2.8.1
coverage==4.4.1
flake8==3.4.1 # pyup: != 2.6.0
django-test-plus==1.0.18
factory-boy==2.9.2
django-coverage-plugin==1.5.0
# pytest
pytest-django==3.1.2
pytest-sugar==0.8.0
pytest-sugar==0.9.0

View File

@ -1 +1 @@
python-3.5.3
python-3.6.2

View File

@ -12,23 +12,6 @@
border-color: #eed3d7;
}
/* This is a fix for the bootstrap4 alpha release */
@media (max-width: 47.9em) {
.navbar-nav .nav-item {
float: none;
width: 100%;
display: inline-block;
}
.navbar-nav .nav-item + .nav-item {
margin-left: 0;
}
.nav.navbar-nav.pull-xs-right {
float: none !important;
}
}
/* Display django-debug-toolbar.
See https://github.com/django-debug-toolbar/django-debug-toolbar/issues/742
and https://github.com/pydanny/cookiecutter-django/issues/317

View File

@ -1,3 +1,57 @@
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
@import "variables";
@import "custom_bootstrap_vars";
@import "mixins";
// Reset and dependencies
@import "normalize";
@import "print";
@import "glyphicons";
// Core CSS
@import "scaffolding";
@import "type";
@import "code";
@import "grid";
@import "tables";
@import "forms";
@import "buttons";
// Components
@import "component-animations";
@import "dropdowns";
@import "button-groups";
@import "input-groups";
@import "navs";
@import "navbar";
@import "breadcrumbs";
@import "pagination";
@import "pager";
@import "labels";
@import "badges";
@import "jumbotron";
@import "thumbnails";
@import "alerts";
@import "progress-bars";
@import "media";
@import "list-group";
@import "panels";
@import "responsive-embed";
@import "wells";
@import "close";
// Components w/ JavaScript
@import "modals";
@import "tooltip";
@import "popovers";
@import "carousel";
// Utility classes
@import "utilities";
@import "responsive-utilities";
{% endif %}
// project specific CSS goes here
@ -33,32 +87,6 @@ $red: #b94a48;
color: $red;
}
////////////////////////////////
//Navbar//
////////////////////////////////
// This is a fix for the bootstrap4 alpha release
.navbar {
border-radius: 0px;
}
@media (max-width: 47.9em) {
.navbar-nav .nav-item {
display: inline-block;
float: none;
width: 100%;
}
.navbar-nav .nav-item + .nav-item {
margin-left: 0;
}
.nav.navbar-nav.pull-xs-right {
float: none !important;
}
}
////////////////////////////////
//Django Toolbar//
////////////////////////////////

View File

@ -1,4 +1,4 @@
{% raw %}{% load staticfiles i18n {% endraw %}{% if cookiecutter.use_compressor == "y" %}compress{% endif %}{% raw %}%}<!DOCTYPE html>
{% raw %}{% load static i18n {% endraw %}{% if cookiecutter.use_compressor == "y" %}compress{% endif %}{% raw %}%}<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
@ -14,8 +14,8 @@
<![endif]-->
{% block css %}
<!-- Latest compiled and minified Bootstrap 4 Alpha 4 CSS -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.6/css/bootstrap.min.css" integrity="sha384-rwoIResjU2yc3z8GV/NPeZWAv56rSmLldC3R/AZzGRnGxQQKnKkoFVhFQhNUwEyJ" crossorigin="anonymous">
<!-- Latest compiled and minified Bootstrap 4 beta CSS -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-beta/css/bootstrap.min.css" integrity="sha384-/Y6pD6FV/Vv2HJnA6t+vslU6fwYXjCFtcEpHbNJ0lyAFsXTsjBbfaDjzALeQsN6M" crossorigin="anonymous">
<!-- Your stuff: Third-party CSS libraries go here -->
{% endraw %}{% if cookiecutter.use_compressor == "y" %}{% raw %}{% compress css %}{% endraw %}{% endif %}{% raw %}
@ -29,7 +29,7 @@
<body>
<div class="m-b-1">
<nav class="navbar navbar-toggleable-md navbar-light bg-faded">
<nav class="navbar navbar-expand-md navbar-light bg-light">
<button class="navbar-toggler navbar-toggler-right" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation">
<span class="navbar-toggler-icon"></span>
</button>
@ -88,10 +88,10 @@
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
{% block javascript %}
<!-- Required by Bootstrap v4 Alpha 4 -->
<script src="https://code.jquery.com/jquery-3.1.1.slim.min.js" integrity="sha384-A7FZj7v+d/sdmMqp/nOQwliLvUsJfDHW+k9Omg/a/EheAdgtzNs3hpfag6Ed950n" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/tether/1.4.0/js/tether.min.js" integrity="sha384-DztdAPBWPRXSA/3eYEEUWrWCy7G5KFbe8fFjk5JAIxUYHKkDx6Qin1DkWx51bBrb" crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.6/js/bootstrap.min.js" integrity="sha384-vBWWzlZJ8ea9aCX4pEW3rVHjgjt7zpkNpZk+02D9phzyeVkE+jo0ieGizqPLForn" crossorigin="anonymous"></script>
<!-- Required by Bootstrap v4 beta -->
<script src="https://code.jquery.com/jquery-3.2.1.slim.min.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.11.0/umd/popper.min.js" integrity="sha384-b/U6ypiBEHpOf/4+1nzFpr53nxSS+GLCkfwBdFNTxtclqqenISfwAzpKaMNFNmj4" crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-beta/js/bootstrap.min.js" integrity="sha384-h0AbiXch4ZDo7tp9hKZ4TsHbi047NrKGLO3SEJAg45jXxnGIfYzk4Si90RDIqNm1" crossorigin="anonymous"></script>
<!-- Your stuff: Third-party javascript libraries go here -->