Merge remote-tracking branch 'upstream/master'

Trung Dong Huynh 2017-08-31 15:46:34 +01:00
commit b5d72e62dc
91 changed files with 1094 additions and 788 deletions

.github/ISSUE_TEMPLATE.md
View File

@ -0,0 +1,32 @@
**Note: for support questions, please use the `cookiecutter-django` tag on stackoverflow**. This repository's issues are reserved for feature requests and bug reports. If you need quick professional paid support for your project, contact support@cookiecutter.io.
* **I'm submitting a ... **
- [ ] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here; see the note at the top of this template.
* **Do you want to request a *feature* or report a *bug*?**
* **What is the current behavior?**
* **If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem**
* **What is the expected behavior?**
* **What is the motivation / use case for changing the behavior?**
* **Please tell us about your environment:**
* **Other information** (e.g. detailed explanation, stacktraces, related issues, suggestions on how to fix, links for us to have context, e.g. stackoverflow, gitter, etc.)

.gitignore
View File

@ -1,44 +1,228 @@
### OSX ###
.DS_Store
.AppleDouble
.LSOverride
### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
### SublimeText ###
# cache files for sublime text
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage.*
.cache/
nosetests.xml
coverage.xml
*.cover
.hypothesis/
# Translations
*.mo
*.pot
# Django stuff:
*.log
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# Environments
.env
.venv
env/
venv/
ENV/
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
### Linux template
*~
# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*
# KDE directory preferences
.directory
# Linux trash folder which might appear on any partition or disk
.Trash-*
# .nfs files are created when an open file is removed but is still being accessed
.nfs*
### VisualStudioCode template
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
### Windows template
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
Desktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msm
*.msp
# Windows shortcuts
*.lnk
### SublimeText template
# Cache files for Sublime Text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# workspace files are user-specific
# Workspace files are user-specific
*.sublime-workspace
# project files should be checked into the repository, unless a significant
# Project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using Sublime Text
# *.sublime-project
# sftp configuration file
# SFTP configuration file
sftp-config.json
# Generated files
*.log
*.pot
*.pyc
.idea
_build
*.egg-info/
# Package control specific files
Package Control.last-run
Package Control.ca-list
Package Control.ca-bundle
Package Control.system-ca-bundle
Package Control.cache/
Package Control.ca-certs/
Package Control.merged-ca-bundle
Package Control.user-ca-bundle
oscrypto-ca-bundle.crt
bh_unicode_properties.cache
# Project Specific Stuff
local_settings.py
project_slug
my_test_project/*
# Sublime-github package stores a github token in this file
# https://packagecontrol.io/packages/sublime-github
GitHub.sublime-settings
# Generated when running py.test for the Cookiecutter Django generation tests
.cache/
# Generated when running celery beat
celerybeat-schedule.db
### macOS template
# General
*.DS_Store
.AppleDouble
.LSOverride
# Unit test / coverage reports
.coverage
.tox
.cache
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
### Vim template
# Swap
[._]*.s[a-v][a-z]
[._]*.sw[a-p]
[._]s[a-v][a-z]
[._]sw[a-p]
# Session
Session.vim
# Temporary
.netrwhist
# Auto-generated tag files
tags
### VirtualEnv template
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Ss]cripts
pyvenv.cfg
pip-selfcheck.json
# Even though the project might be opened and edited
# in any of the JetBrains IDEs, it makes no sense whatsoever
# to 'run' anything within it since any particular cookiecutter
# is declarative by nature.
.idea/

View File

@ -10,7 +10,6 @@ language: python
python: 3.5
env:
- TOX_ENV=py27
- TOX_ENV=py34
- TOX_ENV=py35

View File

@ -39,9 +39,9 @@ To run all tests using various versions of python in virtualenvs defined in tox.
It is possible to test with only some versions of Python. To do this, the command
is::
$ tox -e py27,py34
$ tox -e py34,py35
Will run py.test with the python2.7, and python3.4 interpreters, for
Will run py.test with the python3.4 and python3.5 interpreters, for
example.
To run a particular test with tox against your current Python version::

View File

@ -16,6 +16,7 @@ Fábio C. Barrionuevo da Luz `@luzfcb`_ @luzfcb
Saurabh Kumar `@theskumar`_ @_theskumar
Jannis Gebauer `@jayfk`_
Burhan Khalid `@burhan`_ @burhan
Shupeyko Nikita `@webyneter`_ @webyneter
=========================== ============= ===========
*Audrey is also the creator of Cookiecutter. Audrey and
@ -26,6 +27,7 @@ Daniel are on the Cookiecutter core team.*
.. _@theskumar: https://github.com/theskumar
.. _@audreyr: https://github.com/audreyr
.. _@jayfk: https://github.com/jayfk
.. _@webyneter: https://github.com/webyneter
Other Contributors
------------------
@ -54,12 +56,15 @@ Listed in alphabetical order.
Areski Belaid `@areski`_
Ashley Camba
Barclay Gauld `@yunti`_
Ben Warren `@bwarren2`
Ben Lopatin
Benjamin Abel
Bert de Miranda `@bertdemiranda`_
Bo Lopker `@blopker`_
Bouke Haarsma
Brent Payne `@brentpayne`_ @brentpayne
Burhan Khalid `@burhan`_ @burhan
Bruno Alla               `@browniebroke`_ @_BrunoAlla
Burhan Khalid            `@burhan`_                   @burhan
Catherine Devlin `@catherinedevlin`_
Cédric Gaspoz `@cgaspoz`_
Chris Curvey `@ccurvey`_
@ -84,6 +89,7 @@ Listed in alphabetical order.
Felipe Arruda `@arruda`_
Garry Cairns `@garry-cairns`_
Garry Polley `@garrypolley`_
Hamish Durkin `@durkode`_
Harry Percival `@hjwp`_
Henrique G. G. Pereira `@ikkebr`_
Ian Lee `@IanLee1521`_
@ -140,6 +146,7 @@ Listed in alphabetical order.
Travis McNeill `@Travistock`_ @tavistock_esq
Vitaly Babiy
Vivian Guillen `@viviangb`_
Wan Liuyang `@sfdye`_ @sfdye
Will Farley `@goldhand`_ @g01dhand
William Archinal `@archinal`_
Yaroslav Halchenko
@ -160,6 +167,7 @@ Listed in alphabetical order.
.. _@bloodpet: https://github.com/bloodpet
.. _@blopker: https://github.com/blopker
.. _@bogdal: https://github.com/bogdal
.. _@browniebroke: https://github.com/browniebroke
.. _@burhan: https://github.com/burhan
.. _@c-rhodes: https://github.com/c-rhodes
.. _@caffodian: https://github.com/caffodian
@ -175,6 +183,7 @@ Listed in alphabetical order.
.. _@dhepper: https://github.com/dhepper
.. _@dot2dotseurat: https://github.com/dot2dotseurat
.. _@dsclementsen: https://github.com/dsclementsen
.. _@durkode: https://github.com/durkode
.. _@epileptic-fish: https://github.com/epileptic-fish
.. _@eraldo: https://github.com/eraldo
.. _@eriol: https://github.com/eriol
@ -215,6 +224,7 @@ Listed in alphabetical order.
.. _@shireenrao: https://github.com/shireenrao
.. _@webyneter: https://github.com/webyneter
.. _@show0k: https://github.com/show0k
.. _@sfdye: https://github.com/sfdye
.. _@shultz: https://github.com/shultz
.. _@siauPatrick: https://github.com/siauPatrick
.. _@slafs: https://github.com/slafs

View File

@ -1,14 +1,14 @@
Cookiecutter Django
=======================
.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://travis-ci.org/pydanny/cookiecutter-django.svg?branch=master
:target: https://travis-ci.org/pydanny/cookiecutter-django?branch=master
:alt: Build Status
.. image:: https://pyup.io/repos/github/pydanny/cookiecutter-django/shield.svg
:target: https://pyup.io/repos/github/pydanny/cookiecutter-django/
:alt: Updates
.. image:: https://badges.gitter.im/Join Chat.svg
:target: https://gitter.im/pydanny/cookiecutter-django?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
@ -16,7 +16,9 @@ Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting pr
* Documentation: https://cookiecutter-django.readthedocs.io/en/latest/
* See Troubleshooting_ for common errors and obstacles
* If you have problems with Cookiecutter Django, please open issues_ before sending emails to the maintainers. You will get a much, MUCH faster response.
* If you have problems with Cookiecutter Django, please open issues_ instead of sending emails to the maintainers.
* Need quick professional paid support? Contact `support@cookiecutter.io`_. This includes configuring your servers,
fixing bugs, reviewing your code and everything in between.
.. _cookiecutter: https://github.com/audreyr/cookiecutter
@ -24,13 +26,15 @@ Powered by Cookiecutter_, Cookiecutter Django is a framework for jumpstarting pr
.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373
.. _issues: https://github.com/pydanny/cookiecutter-django/issues/new
.. _support@cookiecutter.io: support@cookiecutter.io
Features
---------
* For Django 1.10
* Works with Python 3.4.x or 3.5.x. Python 3.6 is experimental
* Renders Django projects with 100% starting test coverage
* Twitter Bootstrap_ v4.0.0 - `alpha 4`_ (`maintained Foundation fork`_ also available)
* Twitter Bootstrap_ v4.0.0 - alpha 6 (`maintained Foundation fork`_ also available)
* 12-Factor_ based settings via django-environ_
* Secure by default. We believe in SSL.
* Optimized development and production settings
@ -39,10 +43,9 @@ Features
* Grunt build for compass and livereload
* Send emails via Anymail_ (using Mailgun_ by default, but switchable)
* Media storage using Amazon S3
* Docker support using docker-compose_ for development and production
* Docker support using docker-compose_ for development and production (using Caddy_ with LetsEncrypt_ support)
* Procfile_ for deploying to Heroku
* Instructions for deploying to PythonAnywhere_
* Works with Python 2.7.x or 3.5.x
* Run tests with unittest or py.test
* Customizable PostgreSQL version
* Experimental support for Amazon Elastic Beanstalk
@ -61,7 +64,6 @@ Optional Integrations
* Integration with Sentry_ for error logging
* Integration with Opbeat_ for performance monitoring
.. _`alpha 4`: http://blog.getbootstrap.com/2016/09/05/bootstrap-4-alpha-4/
.. _Bootstrap: https://github.com/twbs/bootstrap
.. _django-environ: https://github.com/joke2k/django-environ
.. _12-Factor: http://12factor.net/
@ -77,15 +79,46 @@ Optional Integrations
.. _docker-compose: https://github.com/docker/compose
.. _Opbeat: https://opbeat.com/
.. _PythonAnywhere: https://www.pythonanywhere.com/
.. _Caddy: https://caddyserver.com/
.. _LetsEncrypt: https://letsencrypt.org/
Constraints
-----------
* Only maintained 3rd party libraries are used.
* Uses PostgreSQL everywhere (9.2+)
* Environment variables for configuration (This won't work with Apache/mod_wsgi).
* Environment variables for configuration (This won't work with Apache/mod_wsgi except on AWS ELB).
Support this Project!
----------------------
This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
* https://www.patreon.com/danielroygreenfeld: Project lead. Expertise in AWS ELB and Django.
Projects that provide financial support to the maintainers:
Two Scoops of Django 1.11
~~~~~~~~~~~~~~~~~~~~~~~~~
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/products/tsd-111-alpha_medium.jpg?v=1499531513
:name: Two Scoops of Django 1.11 Cover
:align: center
:alt: Two Scoops of Django
:target: http://twoscoopspress.org/products/two-scoops-of-django-1-11
Two Scoops of Django is the best dairy-themed Django reference in the universe
pyup
~~~~~~~~~~~~~~~~~~
.. image:: https://pyup.io/static/images/logo.png
:name: pyup
:align: center
:alt: pyup
:target: https://pyup.io/
Pyup brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
Usage
------
@ -128,7 +161,6 @@ Answer the prompts with your own desired options_. For example::
use_opbeat [n]: y
use_pycharm [n]: y
windows [n]: n
use_python3 [y]: y
use_docker [y]: n
use_heroku [n]: y
use_compressor [n]: y
@ -143,7 +175,6 @@ Answer the prompts with your own desired options_. For example::
2 - Grunt
3 - None
Choose from 1, 2, 3, 4 [1]: 1
use_lets_encrypt [n]: n
Select open_source_license:
1 - MIT
2 - BSD
@ -260,31 +291,5 @@ Code of Conduct
Everyone interacting in the Cookiecutter project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.
Support This Project
---------------------------
This project is maintained by volunteers. Support their efforts by spreading the word about:
Two Scoops Press
~~~~~~~~~~~~~~~~~~
.. image:: https://cdn.shopify.com/s/files/1/0304/6901/t/2/assets/logo.png?11985289740589874793
:name: Two Scoops Press
:align: center
:alt: Two Scoops Press
:target: https://twoscoopspress.com
Two Scoops Press brings you the best dairy-themed Django references in the universe
pyup
~~~~~~~~~~~~~~~~~~
.. image:: https://pyup.io/static/images/logo.png
:name: pyup
:align: center
:alt: pyup
:target: https://pyup.io/
Pyup brings you automated security and dependency updates used by Google and other organizations. Free for open source projects!
.. _`PyPA Code of Conduct`: https://www.pypa.io/en/latest/code-of-conduct/

View File

@ -14,13 +14,12 @@
"use_opbeat": "n",
"use_pycharm": "n",
"windows": "n",
"use_python3": "y",
"use_docker": "n",
"use_heroku": "n",
"use_elasticbeanstalk_experimental": "n",
"use_compressor": "n",
"postgresql_version": ["9.6", "9.5", "9.4", "9.3", "9.2"],
"js_task_runner": ["Gulp", "Grunt", "None"],
"use_lets_encrypt": "n",
"custom_bootstrap_compilation": "n",
"open_source_license": ["MIT", "BSD", "GPLv3", "Apache Software License 2.0", "Not open source"]
}

View File

@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
#
# cookiecutter-django documentation build configuration file.
#
# This file is execfile()d with the current directory set to its containing dir.
@ -10,8 +8,6 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
from __future__ import unicode_literals
from datetime import datetime
import os
import sys

View File

@ -35,7 +35,7 @@ Make sure your project is fully committed and pushed up to Bitbucket or GitHub or
git clone <my-repo-url> # you can also use hg
cd my-project-name
mkvirtualenv --python=/usr/bin/python3.5 my-project-name # or python2.7, etc
mkvirtualenv --python=/usr/bin/python3.5 my-project-name
pip install -r requirements/production.txt # may take a few minutes

View File

@ -12,12 +12,12 @@ Prerequisites
Understand the Compose Setup
--------------------------------
Before you start, check out the `docker-compose.yml` file in the root of this project. This is where each component
Before you start, check out the `production.yml` file in the root of this project. This is where each component
of this application gets its configuration from. Notice how it provides configuration for these services:
* `postgres` service that runs the database
* `redis` for caching
* `nginx` as reverse proxy
* `caddy` as webserver
* `django` is the Django project run by gunicorn
If you chose the `use_celery` option, there are two more services:
@ -25,10 +25,6 @@ If you chose the `use_celery` option, there are two more services:
* `celeryworker` which runs the celery worker process
* `celerybeat` which runs the celery beat process
If you chose the `use_letsencrypt` option, you also have:
* `certbot` which keeps your certs from letsencrypt up-to-date
Populate .env With Your Environment Variables
---------------------------------------------
@ -37,6 +33,15 @@ root directory of this project as a starting point. Add your own variables to th
file won't be tracked by git by default so you'll have to make sure to use some other mechanism to copy your secret if
you are relying solely on git.
It is **highly recommended** that before you build your production application, you set your POSTGRES_USER value here. This will create a non-default user for the postgres image. If you do not set this user before building the application, the default user 'postgres' will be created, and this user will not be able to create or restore backups.
To obtain logs and information about crashes in a production setup, make sure that you have access to an external Sentry instance (e.g. by creating an account with `sentry.io`_), and set the `DJANGO_SENTRY_DSN` variable. This should be enough to report crashes to Sentry.
You will probably also need to set up the mail backend, for example by adding a `Mailgun`_ API key and a `Mailgun`_ sender domain; otherwise, the account creation view will crash with a 500 error when the backend attempts to send an email to the account owner.
.. _sentry.io: https://sentry.io/welcome
.. _Mailgun: https://mailgun.com
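For illustration, a minimal production `.env` fragment might look like the following. This is only a sketch: the exact variable names (especially the Mailgun ones) come from the generated `.env.example`, so treat the names below as assumptions and verify them against your own file::

    # Non-default database user so that backups/restores work
    POSTGRES_USER=myprojectuser
    POSTGRES_PASSWORD=change-me
    # Error reporting (DSN copied from your Sentry project settings)
    DJANGO_SENTRY_DSN=https://<key>@sentry.io/<project-id>
    # Mail backend (assumed names, check .env.example)
    DJANGO_MAILGUN_API_KEY=<your-mailgun-api-key>
    MAILGUN_SENDER_DOMAIN=mg.example.com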
HTTPS is on by default
----------------------
@ -46,68 +51,26 @@ It is always better to deploy a site behind HTTPS and will become crucial as the
* In the `.env.example`, we have made it simpler for you to change the default `Django Admin` into a custom name through an environment variable. This should make it harder to guess access to the admin panel.
* If you are not using a subdomain of the domain name set in the project, then remember to put the your staging/production IP address in the ``ALLOWED_HOSTS``_ environment variable before you deploy your website. Failure to do this will mean you will not have access to your website through the HTTP protocol.
* If you are not using a subdomain of the domain name set in the project, then remember to put your staging/production IP address in the :code:`DJANGO_ALLOWED_HOSTS` environment variable (see :ref:`settings`) before you deploy your website. Failure to do this will mean you will not have access to your website through the HTTP protocol.
* Access to the Django admin is set up by default to require HTTPS in production or once *live*. We recommend that you look into setting up the *Certbot and Let's Encrypt Setup* mentioned below or another HTTPS certification service.
* Access to the Django admin is set up by default to require HTTPS in production or once *live*.
Optional: nginx-proxy Setup
---------------------------
By default, the application is configured to listen on all interfaces on port 80. If you want to change that, open the
`docker-compose.yml` file and replace `0.0.0.0` with your own ip.
HTTPS is configured by default
------------------------------
If you are using `nginx-proxy`_ to run multiple application stacks on one host, remove the port setting entirely and add `VIRTUAL_HOST=example.com` to your env file. Here, replace example.com with the value you entered for `domain_name`.
The Caddy webserver used in the default configuration will get you a valid certificate from Let's Encrypt and renew it automatically. All you need to do to enable this is to make sure that your DNS records are pointing to the server Caddy runs on.
This pass all incoming requests on `nginx-proxy`_ to the nginx service your application is using.
You can read more about this at `Automatic HTTPS`_ in the Caddy docs.
.. _Automatic HTTPS: https://caddyserver.com/docs/automatic-https
.. _nginx-proxy: https://github.com/jwilder/nginx-proxy
Optional: Postgres Data Volume Modifications
---------------------------------------------
Postgres saves its database files to the `postgres_data` volume by default. Change that if you want something else, and make sure to make backups since this is not done automatically.
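If you want to take a manual backup, a sketch (assuming the `postgres` image in `production.yml` ships the same `backup`/`list-backups` scripts as the local setup documented elsewhere)::

    docker-compose -f production.yml run postgres backup
    docker-compose -f production.yml run postgres list-backups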
Optional: Certbot and Let's Encrypt Setup
------------------------------------------
If you chose `use_letsencrypt` and will be using certbot for https, you must do the following before running anything with docker-compose:
Replace dhparam.pem.example with a generated dhparams.pem file before running anything with docker-compose. You can generate this on ubuntu or OS X by running the following in the project root:
::
$ openssl dhparam -out /path/to/project/compose/nginx/dhparams.pem 2048
If you would like to add additional subdomains to your certificate, you must add additional parameters to the certbot command in the `docker-compose.yml` file:
Replace:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email mjsisley@relawgo.com --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
With:
::
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} -d www.{{ cookiecutter.domain_name }} -d etc.{{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
Please be cognizant of Certbot/Letsencrypt certificate request limits when getting this set up. They provide a test server that does not count against the limit while you are getting set up.
The certbot certificates expire after 3 months.
If you would like to set up autorenewal of your certificates, the following commands can be put into a bash script:
::
#!/bin/bash
cd <project directory>
docker-compose run --rm --name certbot certbot bash -c "sleep 6 && certbot certonly --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
docker exec pearl_nginx_1 nginx -s reload
And then set a cronjob by running `crontab -e` and placing in it (period can be adjusted as desired)::
0 4 * * 1 /path/to/bashscript/renew_certbot.sh
Run your app with docker-compose
--------------------------------
@ -116,40 +79,40 @@ directory.
You'll need to build the stack first. To do that, run::
docker-compose build
docker-compose -f production.yml build
Once this is ready, you can run it with::
docker-compose up
docker-compose -f production.yml up
To run a migration, open up a second terminal and run::
docker-compose run django python manage.py migrate
docker-compose -f production.yml run django python manage.py migrate
To create a superuser, run::
docker-compose run django python manage.py createsuperuser
docker-compose -f production.yml run django python manage.py createsuperuser
If you need a shell, run::
docker-compose run django python manage.py shell
docker-compose -f production.yml run django python manage.py shell
To get an output of all running containers.
To check your logs, run::
docker-compose logs
docker-compose -f production.yml logs
If you want to scale your application, run::
docker-compose scale django=4
docker-compose scale celeryworker=2
docker-compose -f production.yml scale django=4
docker-compose -f production.yml scale celeryworker=2
.. warning:: Don't run the scale command on postgres, celerybeat, certbot, or nginx.
.. warning:: Don't run the scale command on postgres, celerybeat, or caddy.
If you have errors, you can always check your stack with `docker-compose`. Switch to your project's root directory and run::
docker-compose ps
docker-compose -f production.yml ps
Supervisor Example
@ -157,12 +120,12 @@ Supervisor Example
Once you are ready with your initial setup, you want to make sure that your application is run by a process manager to
survive reboots and auto restarts in case of an error. You can use the process manager you are most familiar with. All
it needs to do is to run `docker-compose up` in your projects root directory.
it needs to do is to run `docker-compose -f production.yml up` in your project's root directory.
If you are using `supervisor`, you can use this file as a starting point::
[program:{{cookiecutter.project_slug}}]
command=docker-compose up
command=docker-compose -f production.yml up
directory=/path/to/{{cookiecutter.project_slug}}
redirect_stderr=true
autostart=true

View File

@ -16,12 +16,13 @@ If you don't already have it installed, follow the instructions for your OS:
- On Mac OS X, you'll need `Docker for Mac`_
- On Windows, you'll need `Docker for Windows`_
- On Linux, you'll need `docker-engine`_
.. _`Docker for Mac`: https://docs.docker.com/engine/installation/mac/
.. _`Docker for Windows`: https://docs.docker.com/engine/installation/windows/
.. _`docker-engine`: https://docs.docker.com/engine/installation/
Attention Windows users
-------------
-----------------------
Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docker containers for Windows users, while it is required by the generated Django project. To fix this, add ``psycopg2`` to the list of requirements inside ``requirements/base.txt``::
@ -36,9 +37,9 @@ Build the Stack
This can take a while, especially the first time you run this particular command
on your development system::
$ docker-compose -f dev.yml build
$ docker-compose -f local.yml build
If you want to build the production environment you don't have to pass an argument -f, it will automatically use docker-compose.yml.
If you want to build the production environment, use ``production.yml`` as the ``-f`` argument (``docker-compose.yml`` or ``docker-compose.yaml`` are the defaults).
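For example, as in the deployment documentation, the production stack is built and started with::

    $ docker-compose -f production.yml build
    $ docker-compose -f production.yml up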
Boot the System
---------------
@ -50,11 +51,11 @@ runs will occur quickly.
Open a terminal at the project root and run the following for local development::
$ docker-compose -f dev.yml up
$ docker-compose -f local.yml up
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``dev.yml`` like this::
You can also set the environment variable ``COMPOSE_FILE`` pointing to ``local.yml`` like this::
$ export COMPOSE_FILE=dev.yml
$ export COMPOSE_FILE=local.yml
And then run::
@ -64,24 +65,24 @@ Running management commands
~~~~~~~~~~~~~~~~~~~~~~~~~~~
As with any shell command that we wish to run in our container, this is done
using the ``docker-compose run`` command.
using the ``docker-compose -f local.yml run`` command.
To migrate your app and to create a superuser, run::
$ docker-compose -f dev.yml run django python manage.py migrate
$ docker-compose -f dev.yml run django python manage.py createsuperuser
$ docker-compose -f local.yml run django python manage.py migrate
$ docker-compose -f local.yml run django python manage.py createsuperuser
Here we specify the ``django`` container as the location to run our management commands.
Add your Docker development server IP
------------------------------------
-------------------------------------
When ``DEBUG`` is set to `True`, the host is validated against ``['localhost', '127.0.0.1', '[::1]']``. This is adequate when running a ``virtualenv``. For Docker, in ``config.settings.local``, add your host development server IP to ``INTERNAL_IPS`` or ``ALLOWED_HOSTS`` if the variable exists.
Production Mode
~~~~~~~~~~~~~~~
Instead of using `dev.yml`, you would use `docker-compose.yml`.
Instead of using `local.yml`, you would use `production.yml`.
Other Useful Tips
-----------------
@ -103,7 +104,7 @@ If you want to run the stack in detached mode (in the background), use the ``-d`
::
$ docker-compose -f dev.yml up -d
$ docker-compose -f local.yml up -d
Debugging
~~~~~~~~~~~~~
@ -121,13 +122,13 @@ Then you may need to run the following for it to work as desired:
::
$ docker-compose run -f dev.yml --service-ports django
$ docker-compose -f local.yml run --service-ports django
django-debug-toolbar
""""""""""""""""""""
In order for django-debug-toolbar to work with docker you need to add your docker-machine ip address (the output of `Get the IP ADDRESS`_) to INTERNAL_IPS in local.py
In order for django-debug-toolbar to work with docker you need to add your docker-machine ip address to ``INTERNAL_IPS`` in ``local.py``
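To find that address, you can ask ``docker-machine`` directly (a sketch; the machine name ``default`` is an assumption, substitute your own)::

    $ docker-machine ip default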
.. May be a better place to put this, as it is not Docker specific.

View File

@ -2,26 +2,26 @@
Database Backups with Docker
============================
The database has to be running to create/restore a backup. These examples show local examples. If you want to use it on a remote server, remove ``-f dev.yml`` from each example.
The database has to be running to create/restore a backup. These examples are for local development. If you want to use them on a remote server, remove ``-f local.yml`` from each command.
Running Backups
================
Run the app with `docker-compose -f dev.yml up`.
Run the app with `docker-compose -f local.yml up`.
To create a backup, run::
docker-compose -f dev.yml run postgres backup
docker-compose -f local.yml run postgres backup
To list backups, run::
docker-compose -f dev.yml run postgres list-backups
docker-compose -f local.yml run postgres list-backups
To restore a backup, run::
docker-compose -f dev.yml run postgres restore filename.sql
docker-compose -f local.yml run postgres restore filename.sql
Where <containerId> is the ID of the Postgres container. To get it, run::
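A sketch of that lookup using the standard Docker CLI, which lists running containers together with their IDs::

    $ docker ps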

View File

@ -33,5 +33,4 @@ Indices and tables
* :ref:`genindex`
* :ref:`search`
.. At some point it would be good to have a module index of the high level things
we are doing. Then we can * :ref:`modindex` back in.
.. At some point it would be good to have a module index of the high level things we are doing. Then we can add * :ref:`modindex` back in.

View File

@ -14,7 +14,7 @@ To run flake8:
The config for flake8 is located in setup.cfg. It specifies:
* Set max line length to 120 chars
* Exclude .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
* Exclude ``.tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules``
pylint
------
@ -40,4 +40,4 @@ This is included in flake8's checks, but you can also run it separately to see a
The config for pep8 is located in setup.cfg. It specifies:
* Set max line length to 120 chars
* Exclude .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
* Exclude ``.tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules``

View File

@ -21,4 +21,4 @@ The base app will now run as it would with the usual ``manage.py runserver`` but
To get live reloading to work you'll probably need to install an `appropriate browser extension`_
.. _appropriate browser extension: http://feedback.livereload.com/knowledgebase/articles/86242-how-do-i-install-and-use-the-browser-extensions-
.. _appropriate browser extension: http://livereload.com/extensions/

View File

@ -54,10 +54,6 @@ use_pycharm [n]
windows [n]
Whether you'll be developing on Windows.
use_python3 [y]
By default, the Python code generated will be for Python 3.x. But if you
answer `n` here, it will be legacy Python 2.7 code.
use_docker [y]
Whether to use Docker_, separating the app and database into separate
containers.
@ -76,8 +72,8 @@ js_task_runner [1]
2. Grunt_
3. None
use_lets_encrypt [n]
Use `Let's Encrypt`_ as the certificate authority for this project.
custom_bootstrap_compilation [n]
If you use Grunt, scaffold out recompiling Bootstrap as a task. (Useful for letting you change Bootstrap variables in real time.) Consult the project README for more details.
open_source_license [1]
Select a software license for the project. The choices are:

View File

@ -1,3 +1,5 @@
.. _settings:
Settings
==========

View File

@ -12,7 +12,6 @@ A portion of this code was adopted from Django's standard crypto functions and
utilities, specifically:
https://github.com/django/django/blob/master/django/utils/crypto.py
"""
from __future__ import print_function
import os
import random
import shutil
@ -131,7 +130,7 @@ def remove_docker_files():
"""
Removes files needed for docker if it isn't going to be used
"""
for filename in ["dev.yml", "docker-compose.yml", ".dockerignore"]:
for filename in ["local.yml", "production.yml", ".dockerignore"]:
os.remove(os.path.join(
PROJECT_DIRECTORY, filename
))
@ -168,14 +167,6 @@ def remove_packageJSON_file():
PROJECT_DIRECTORY, filename
))
def remove_certbot_files():
"""
Removes files needed for certbot if it isn't going to be used
"""
nginx_dir_location = os.path.join(PROJECT_DIRECTORY, 'compose/nginx')
for filename in ["nginx-secure.conf", "start.sh", "dhparams.example.pem"]:
file_name = os.path.join(nginx_dir_location, filename)
remove_file(file_name)
def remove_copying_files():
"""
@ -202,6 +193,16 @@ def remove_elasticbeanstalk():
PROJECT_DIRECTORY, filename
))
def remove_open_source_files():
    """
    Removes files conventional to opensource projects only.
    """
    for filename in ["CONTRIBUTORS.txt"]:
        os.remove(os.path.join(
            PROJECT_DIRECTORY, filename
        ))
# IN PROGRESS
# def copy_doc_files(project_directory):
# cookiecutters_dir = DEFAULT_CONFIG['cookiecutters_dir']
@ -220,26 +221,26 @@ def remove_elasticbeanstalk():
# dst = os.path.join(target_dir, name)
# shutil.copyfile(src, dst)
# 1. Generates and saves random secret key
# Generates and saves random secret key
make_secret_key(PROJECT_DIRECTORY)
# 2. Removes the taskapp if celery isn't going to be used
# Removes the taskapp if celery isn't going to be used
if '{{ cookiecutter.use_celery }}'.lower() == 'n':
remove_task_app(PROJECT_DIRECTORY)
# 3. Removes the .idea directory if PyCharm isn't going to be used
# Removes the .idea directory if PyCharm isn't going to be used
if '{{ cookiecutter.use_pycharm }}'.lower() != 'y':
remove_pycharm_dir(PROJECT_DIRECTORY)
# 4. Removes all heroku files if it isn't going to be used
# Removes all heroku files if it isn't going to be used
if '{{ cookiecutter.use_heroku }}'.lower() != 'y':
remove_heroku_files()
# 5. Removes all docker files if it isn't going to be used
# Removes all docker files if it isn't going to be used
if '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_docker_files()
# 6. Removes all JS task manager files if it isn't going to be used
# Removes all JS task manager files if it isn't going to be used
if '{{ cookiecutter.js_task_runner}}'.lower() == 'gulp':
remove_grunt_files()
elif '{{ cookiecutter.js_task_runner}}'.lower() == 'grunt':
@ -249,11 +250,7 @@ else:
remove_grunt_files()
remove_packageJSON_file()
# 7. Removes all certbot/letsencrypt files if it isn't going to be used
if '{{ cookiecutter.use_lets_encrypt }}'.lower() != 'y':
remove_certbot_files()
# 8. Display a warning if use_docker and use_grunt are selected. Grunt isn't
# Display a warning if use_docker and use_grunt are selected. Grunt isn't
# supported by our docker config atm.
if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
@ -262,25 +259,15 @@ if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] and '{{ cook
"js task runner service to your docker configuration manually."
)
# 9. Removes the certbot/letsencrypt files and display a warning if use_lets_encrypt is selected and use_docker isn't.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() != 'y':
remove_certbot_files()
print(
"You selected to use Let's Encrypt and didn't select to use docker. This is NOT supported out of the box for now. You "
"can continue to use the project like you normally would, but Let's Encrypt files have been included."
)
# 10. Directs the user to the documentation if certbot and docker are selected.
if '{{ cookiecutter.use_lets_encrypt }}'.lower() == 'y' and '{{ cookiecutter.use_docker }}'.lower() == 'y':
print(
"You selected to use Let's Encrypt, please see the documentation for instructions on how to use this in production. "
"You must generate a dhparams.pem file before running docker-compose in a production environment."
)
# 11. Removes files needed for the GPLv3 licence if it isn't going to be used.
# Removes files needed for the GPLv3 licence if it isn't going to be used.
if '{{ cookiecutter.open_source_license}}' != 'GPLv3':
remove_copying_files()
# 12. Remove Elastic Beanstalk files
# Remove Elastic Beanstalk files
if '{{ cookiecutter.use_elasticbeanstalk_experimental }}'.lower() != 'y':
remove_elasticbeanstalk()
# Remove files conventional to opensource projects only.
if '{{ cookiecutter.open_source_license }}' == 'Not open source':
remove_open_source_files()

View File

@ -9,3 +9,23 @@ docker = '{{ cookiecutter.use_docker }}'.lower()
if elasticbeanstalk == 'y' and (heroku == 'y' or docker == 'y'):
raise Exception("Cookiecutter Django's EXPERIMENTAL Elastic Beanstalk support is incompatible with Heroku and Docker setups.")
if docker == 'n':
    import sys

    python_major_version = sys.version_info[0]

    if python_major_version == 2:
        sys.stdout.write("WARNING: Cookiecutter Django does not support Python 2! Stability is guaranteed with Python 3.4+ only. Are you sure you want to proceed? (y/n)")

        yes_options = set(['y'])
        no_options = set(['n', ''])
        choice = raw_input().lower()
        if choice in no_options:
            sys.exit(1)
        elif choice in yes_options:
            pass
        else:
            sys.stdout.write("Please respond with %s or %s"
                             % (', '.join([o for o in yes_options if not o == ''])
                                , ', '.join([o for o in no_options if not o == ''])))

View File

@ -1,11 +1,11 @@
cookiecutter==1.5.1
flake8==3.3.0 # pyup: != 2.6.0
sh==1.12.13
binaryornot==0.4.0
flake8==3.4.1 # pyup: != 2.6.0
sh==1.12.14
binaryornot==0.4.4
# Testing
pytest==3.0.7
pytest==3.2.1
pep8==1.7.0
pyflakes==1.5.0
tox==2.6.0
pyflakes==1.6.0
tox==2.7.0
pytest-cookies==0.2.0

View File

@ -10,7 +10,7 @@ except ImportError:
# Our version ALWAYS matches the version of Django we support
# If Django has a new release, we branch, tag, then update this setting after the tag.
version = '1.10.1'
version = '1.10.7'
if sys.argv[-1] == 'tag':
os.system('git tag -a %s -m "version %s"' % (version, version))
@ -39,8 +39,6 @@ setup(
'Natural Language :: English',
'License :: OSI Approved :: BSD License',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',

View File

@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
import os
import re
import sh

View File

@ -15,7 +15,7 @@ cookiecutter ../../ --no-input --overwrite-if-exists use_docker=y js_task_runner
cd project_name
# run the project's tests
docker-compose -f dev.yml run django python manage.py test
docker-compose -f local.yml run django python manage.py test
# return non-zero status code if there are migrations that have not been created
docker-compose -f dev.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migration listed above have not been created and are not saved in version control"; exit 1; }
docker-compose -f local.yml run django python manage.py makemigrations --dry-run --check || { echo "ERROR: there were changes in the models, but migrations listed above have not been created and are not saved in version control"; exit 1; }

View File

@ -1,6 +1,6 @@
[tox]
skipsdist = true
envlist = py27,py34,py35
envlist = py34,py35
[testenv]
passenv = LC_ALL, LANG, HOME

View File

@ -1,78 +1,363 @@
### OSX ###
.DS_Store
.AppleDouble
.LSOverride
### SublimeText ###
# cache files for sublime text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# workspace files are user-specific
*.sublime-workspace
# project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using SublimeText
# *.sublime-project
# sftp configuration file
sftp-config.json
# Basics
### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
__pycache__
*$py.class
# Logs
logs
*.log
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
npm-debug.log*
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.tox
.coverage.*
.cache
nosetests.xml
htmlcov
coverage.xml
*.cover
.hypothesis/
# Translations
*.mo
*.pot
# Pycharm
.idea/*
{% if cookiecutter.use_pycharm == 'y' %}
# Django stuff:
staticfiles/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
### Node template
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
# nyc test coverage
.nyc_output
# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Typescript v1 declaration files
typings/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
### Linux template
*~
# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*
# KDE directory preferences
.directory
# Linux trash folder which might appear on any partition or disk
.Trash-*
# .nfs files are created when an open file is removed but is still being accessed
.nfs*
### VisualStudioCode template
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
{% if cookiecutter.use_pycharm == 'y' -%}
# Provided default Pycharm Run/Debug Configurations should be tracked by git
# In case of local modifications made by Pycharm, use update-index command
# for each changed file, like this:
# git update-index --assume-unchanged .idea/{{cookiecutter.project_slug}}.iml
!.idea/runConfigurations/
!.idea/{{cookiecutter.project_slug}}.iml
!.idea/vcs.xml
!.idea/webResources.xml
### JetBrains template
# Covers JetBrains IDEs: IntelliJ, RubyMine, PhpStorm, AppCode, PyCharm, CLion, Android Studio and Webstorm
# Reference: https://intellij-support.jetbrains.com/hc/en-us/articles/206544839
# User-specific stuff:
.idea/**/workspace.xml
.idea/**/tasks.xml
.idea/dictionaries
# Sensitive or high-churn files:
.idea/**/dataSources/
.idea/**/dataSources.ids
.idea/**/dataSources.xml
.idea/**/dataSources.local.xml
.idea/**/sqlDataSources.xml
.idea/**/dynamic.xml
.idea/**/uiDesigner.xml
# Gradle:
.idea/**/gradle.xml
.idea/**/libraries
# CMake
cmake-build-debug/
# Mongo Explorer plugin:
.idea/**/mongoSettings.xml
## File-based project format:
*.iws
## Plugin-specific files:
# IntelliJ
out/
# mpeltonen/sbt-idea plugin
.idea_modules/
# JIRA plugin
atlassian-ide-plugin.xml
# Cursive Clojure plugin
.idea/replstate.xml
# Crashlytics plugin (for Android Studio and IntelliJ)
com_crashlytics_export_strings.xml
crashlytics.properties
crashlytics-build.properties
fabric.properties
{% endif %}
# Vim
*~
*.swp
*.swo
### Windows template
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# npm
node_modules/
# Dump file
*.stackdump
# Compass
.sass-cache
# Folder config file
Desktop.ini
# virtual environments
.env
# Recycle Bin used on file shares
$RECYCLE.BIN/
# User-uploaded media
{{ cookiecutter.project_slug }}/media/
# Windows Installer files
*.cab
*.msi
*.msm
*.msp
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
# MailHog binary
# Windows shortcuts
*.lnk
### macOS template
# General
*.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
### SublimeText template
# Cache files for Sublime Text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache
# Workspace files are user-specific
*.sublime-workspace
# Project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using Sublime Text
# *.sublime-project
# SFTP configuration file
sftp-config.json
# Package control specific files
Package Control.last-run
Package Control.ca-list
Package Control.ca-bundle
Package Control.system-ca-bundle
Package Control.cache/
Package Control.ca-certs/
Package Control.merged-ca-bundle
Package Control.user-ca-bundle
oscrypto-ca-bundle.crt
bh_unicode_properties.cache
# Sublime-github package stores a github token in this file
# https://packagecontrol.io/packages/sublime-github
GitHub.sublime-settings
### Vim template
# Swap
[._]*.s[a-v][a-z]
[._]*.sw[a-p]
[._]s[a-v][a-z]
[._]sw[a-p]
# Session
Session.vim
# Temporary
.netrwhist
# Auto-generated tag files
tags
### VirtualEnv template
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Ss]cripts
pyvenv.cfg
pip-selfcheck.json
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' -%}
mailhog
{% endif %}
staticfiles/
{{ cookiecutter.project_slug }}/media/

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.local" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:django/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:django/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="false" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -6,7 +6,7 @@
<env name="PYTHONUNBUFFERED" value="1" />
<env name="DJANGO_SETTINGS_MODULE" value="config.settings.test" />
</envs>
<option name="SDK_HOME" value="docker-compose://$PROJECT_DIR$/dev.yml:pycharm/python" />
<option name="SDK_HOME" value="docker-compose://[$PROJECT_DIR$/dev.yml]:pycharm/python" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />

View File

@ -8,8 +8,4 @@ before_install:
- sudo apt-get install -qq libsqlite3-dev libxml2 libxml2-dev libssl-dev libbz2-dev wget curl llvm
language: python
python:
{% if cookiecutter.use_python3 == 'y' -%}
- "3.5"
{% else %}
- "2.7"
{%- endif %}

View File

@ -60,6 +60,9 @@ module.exports = function (grunt) {
dev: {
options: {
outputStyle: 'nested',
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
includePaths: ['bower_components/bootstrap-sass/assets/stylesheets/bootstrap/'],
{% endif %}
sourceMap: false,
precision: 10
},
@ -70,6 +73,9 @@ module.exports = function (grunt) {
dist: {
options: {
outputStyle: 'compressed',
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
includePaths: ['bower_components/bootstrap-sass/assets/stylesheets/bootstrap/'],
{% endif %}
sourceMap: false,
precision: 10
},

View File

@ -145,3 +145,12 @@ See detailed `cookiecutter-django Elastic Beanstalk documentation`_.
.. _`cookiecutter-django Docker documentation`: http://cookiecutter-django.readthedocs.io/en/latest/deployment-with-elastic-beanstalk.html
{% endif %}
{% if cookiecutter.custom_bootstrap_compilation == "y" %}
Custom Bootstrap Compilation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To get automatic Bootstrap recompilation with variables of your choice, install bootstrap-sass (`bower install bootstrap-sass`) and tweak your variables in `static/sass/custom_bootstrap_vars`.
(You can find a list of available variables in the `bootstrap-sass source`_, or get explanations on them in the `Bootstrap docs`_.)

.. _`bootstrap-sass source`: https://github.com/twbs/bootstrap-sass/blob/master/assets/stylesheets/bootstrap/_variables.scss
.. _`Bootstrap docs`: https://getbootstrap.com/customize/
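A possible workflow, as a sketch (the `sass:dev` and `sass:dist` targets are the ones visible in the generated Gruntfile; adjust if yours differ)::

    $ bower install bootstrap-sass
    $ grunt sass:dev    # recompile after editing custom_bootstrap_vars; sass:dist for the compressed build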
{% endif %}

View File

@ -0,0 +1,14 @@
www.{% raw %}{%DOMAIN_NAME%}{% endraw %} {
    redir https://{{cookiecutter.domain_name}}
}

{% raw %}{%DOMAIN_NAME%}{% endraw %} {
    proxy / django:5000 {
        header_upstream Host {host}
        header_upstream X-Real-IP {remote}
        header_upstream X-Forwarded-Proto {scheme}
    }
    log stdout
    errors stdout
    gzip
}

View File

@ -0,0 +1,2 @@
FROM abiosoft/caddy:0.10.6
COPY Caddyfile /etc/Caddyfile

View File

@ -1,22 +1,16 @@
{% if cookiecutter.use_python3 == 'y' -%}
FROM python:3.5
{% else %}
FROM python:2.7
{%- endif %}
ENV PYTHONUNBUFFERED 1
RUN groupadd -r django \
&& useradd -r -g django django
# Requirements have to be pulled and installed here, otherwise caching won't work
COPY ./requirements /requirements
RUN pip install --no-cache-dir -r /requirements/production.txt \
&& rm -rf /requirements
RUN pip install -r /requirements/production.txt \
&& groupadd -r django \
&& useradd -r -g django django
COPY . /app
RUN chown -R django /app
COPY ./compose/django/gunicorn.sh /gunicorn.sh
COPY ./compose/django/entrypoint.sh /entrypoint.sh
COPY ./compose/django/gunicorn.sh ./compose/django/entrypoint.sh /
RUN sed -i 's/\r//' /entrypoint.sh \
&& sed -i 's/\r//' /gunicorn.sh \
&& chmod +x /entrypoint.sh \
@ -24,6 +18,12 @@ RUN sed -i 's/\r//' /entrypoint.sh \
&& chmod +x /gunicorn.sh \
&& chown django /gunicorn.sh
COPY . /app
RUN chown -R django /app
USER django
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -1,8 +1,4 @@
{% if cookiecutter.use_python3 == 'y' -%}
FROM python:3.5
{% else %}
FROM python:2.7
{%- endif %}
ENV PYTHONUNBUFFERED 1
# Requirements have to be pulled and installed here, otherwise caching won't work
@ -17,6 +13,14 @@ COPY ./compose/django/start-dev.sh /start-dev.sh
RUN sed -i 's/\r//' /start-dev.sh
RUN chmod +x /start-dev.sh
COPY ./compose/django/celery/worker/start-dev.sh /start-celeryworker-dev.sh
RUN sed -i 's/\r//' /start-celeryworker-dev.sh
RUN chmod +x /start-celeryworker-dev.sh
COPY ./compose/django/celery/beat/start-dev.sh /start-celerybeat-dev.sh
RUN sed -i 's/\r//' /start-celerybeat-dev.sh
RUN chmod +x /start-celerybeat-dev.sh
WORKDIR /app
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -0,0 +1,8 @@
#!/bin/sh
set -o errexit
set -o nounset
set -o xtrace
rm -f './celerybeat.pid'
celery -A {{cookiecutter.project_slug}}.taskapp beat -l INFO

View File

@ -0,0 +1,7 @@
#!/bin/sh
set -o errexit
set -o nounset
set -o xtrace
celery -A {{cookiecutter.project_slug}}.taskapp worker -l INFO

View File

@ -1,9 +0,0 @@
FROM nginx:latest
ADD nginx.conf /etc/nginx/nginx.conf
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
ADD start.sh /start.sh
ADD nginx-secure.conf /etc/nginx/nginx-secure.conf
ADD dhparams.pem /etc/ssl/private/dhparams.pem
CMD /start.sh
{% endif %}

View File

@ -1,3 +0,0 @@
-----BEGIN DH PARAMETERS-----
EXAMPLE_FILE
-----END DH PARAMETERS-----

View File

@ -1,96 +0,0 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
proxy_headers_hash_bucket_size 52;
gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
server_name ___my.example.com___ www.___my.example.com___;
location /.well-known/acme-challenge {
# Since the certbot container isn't up constantly, need to resolve ip dynamically using docker's dns
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:80;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
}
location / {
return 301 https://$server_name$request_uri;
}
}
server {
listen 443;
server_name ___my.example.com___ www.___my.example.com___;
ssl on;
ssl_certificate /etc/letsencrypt/live/___my.example.com___/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/___my.example.com___/privkey.pem;
ssl_session_timeout 5m;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH';
ssl_prefer_server_ciphers on;
ssl_session_cache shared:SSL:10m;
ssl_dhparam /etc/ssl/private/dhparams.pem;
location /.well-known/acme-challenge {
resolver ___NAMESERVER___;
set $certbot_addr_port certbot:443;
proxy_pass http://$certbot_addr_port;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -1,61 +0,0 @@
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';
access_log /var/log/nginx/access.log main;
sendfile on;
#tcp_nopush on;
keepalive_timeout 65;
#gzip on;
upstream app {
server django:5000;
}
server {
listen 80;
charset utf-8;
{% if cookiecutter.use_lets_encrypt == 'y' and cookiecutter.use_docker == 'y' %}
server_name ___my.example.com___ ;
location /.well-known/acme-challenge {
proxy_pass http://certbot:80;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto https;
}
{% endif %}
location / {
# checks for static file, if not found proxy to app
try_files $uri @proxy_to_app;
}
# cookiecutter-django app
location @proxy_to_app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://app;
}
}
}

View File

@ -1,62 +0,0 @@
echo sleep 5
sleep 5
echo build starting nginx config
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx.conf
cat /etc/nginx/nginx.conf
echo .
echo Firing up nginx in the background.
nginx
# # Check user has specified domain name
if [ -z "$MY_DOMAIN_NAME" ]; then
echo "Need to set MY_DOMAIN_NAME (to a letsencrypt-registered name)."
exit 1
fi
# This bit waits until the letsencrypt container has done its thing.
# We see the changes here because there's a docker volume mapped.
echo Waiting for folder /etc/letsencrypt/live/$MY_DOMAIN_NAME to exist
while [ ! -d /etc/letsencrypt/live/$MY_DOMAIN_NAME ] ;
do
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/fullchain.pem ] ;
do
echo Waiting for file fullchain.pem to exist
sleep 2
done
while [ ! -f /etc/letsencrypt/live/$MY_DOMAIN_NAME/privkey.pem ] ;
do
echo Waiting for file privkey.pem to exist
sleep 2
done
# This is added so that when the certificate is being renewed or is already in place, nginx waits for everything to be good.
sleep 15
echo replacing ___my.example.com___/$MY_DOMAIN_NAME
# Put your domain name into the nginx reverse proxy config.
sed -i "s/___my.example.com___/$MY_DOMAIN_NAME/g" /etc/nginx/nginx-secure.conf
# Add the system's nameserver (the docker network dns) so we can resolve container names in nginx
NAMESERVER=`cat /etc/resolv.conf | grep "nameserver" | awk '{print $2}' | tr '\n' ' '`
echo replacing ___NAMESERVER___/$NAMESERVER
sed -i "s/___NAMESERVER___/$NAMESERVER/g" /etc/nginx/nginx-secure.conf
#go!
kill $(ps aux | grep 'nginx' | grep -v 'grep' | awk '{print $2}')
cp /etc/nginx/nginx-secure.conf /etc/nginx/nginx.conf
nginx -g 'daemon off;'

View File

@ -17,10 +17,10 @@ export PGPASSWORD=$POSTGRES_PASSWORD
# check that we have an argument for a filename candidate
if [[ $# -eq 0 ]] ; then
echo 'usage:'
echo ' docker-compose run postgres restore <backup-file>'
echo ' docker-compose -f production.yml run postgres restore <backup-file>'
echo ''
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
echo ' docker-compose -f production.yml run postgres list-backups'
exit 1
fi
@ -31,7 +31,7 @@ BACKUPFILE=/backups/$1
if ! [ -f $BACKUPFILE ]; then
echo "backup file not found"
echo 'to get a list of available backups, run:'
echo ' docker-compose run postgres list-backups'
echo ' docker-compose -f production.yml run postgres list-backups'
exit 1
fi
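The usage messages above translate directly into the following invocations; the backup filename is a placeholder:

# list available backups, then restore one by name (filename is hypothetical)
docker-compose -f production.yml run postgres list-backups
docker-compose -f production.yml run postgres restore backup.sql.gz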

View File

@ -1 +0,0 @@
# -*- coding: utf-8 -*-

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
Django settings for {{cookiecutter.project_name}} project.
@ -8,8 +7,6 @@ https://docs.djangoproject.com/en/dev/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/dev/ref/settings/
"""
from __future__ import absolute_import, unicode_literals
import environ
ROOT_DIR = environ.Path(__file__) - 3 # ({{ cookiecutter.project_slug }}/config/settings/base.py - 3 = {{ cookiecutter.project_slug }}/)
@ -270,11 +267,11 @@ AUTOSLUG_SLUGIFY_FUNCTION = 'slugify.slugify'
{% if cookiecutter.use_celery == 'y' %}
########## CELERY
INSTALLED_APPS += ['{{cookiecutter.project_slug}}.taskapp.celery.CeleryConfig']
BROKER_URL = env('CELERY_BROKER_URL', default='django://')
if BROKER_URL == 'django://':
CELERY_BROKER_URL = env('CELERY_BROKER_URL', default='django://')
if CELERY_BROKER_URL == 'django://':
CELERY_RESULT_BACKEND = 'redis://'
else:
CELERY_RESULT_BACKEND = BROKER_URL
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
########## END CELERY
{% endif %}
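Only the Python-level setting name changes here; both versions read the broker address from the CELERY_BROKER_URL environment variable. A sketch of pointing it at the redis service defined in the compose files (the exact URL is an assumption):

# e.g. in the container environment or an .env file; the value shown is an assumption
export CELERY_BROKER_URL=redis://redis:6379/0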

View File

@ -1,10 +1,11 @@
# -*- coding: utf-8 -*-
"""
Local settings
- Run in Debug mode
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
- Use mailhog for emails
{% elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
- Use mailhog for emails
{% else %}
- Use console backend for emails
{% endif %}
@ -12,8 +13,6 @@ Local settings
- Add django-extensions as app
"""
import socket
import os
from .base import * # noqa
# DEBUG
@ -33,6 +32,8 @@ SECRET_KEY = env('DJANGO_SECRET_KEY', default='CHANGEME!!!')
EMAIL_PORT = 1025
{% if cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'y' %}
EMAIL_HOST = env('EMAIL_HOST', default='mailhog')
{% elif cookiecutter.use_mailhog == 'y' and cookiecutter.use_docker == 'n' %}
EMAIL_HOST = 'localhost'
{% else %}
EMAIL_HOST = 'localhost'
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND',
@ -54,11 +55,15 @@ MIDDLEWARE += ['debug_toolbar.middleware.DebugToolbarMiddleware', ]
INSTALLED_APPS += ['debug_toolbar', ]
INTERNAL_IPS = ['127.0.0.1', '10.0.2.2', ]
{% if cookiecutter.use_docker == 'y' %}
{# [cookiecutter-django] This is a workaround to flake8 "imported but unused" errors #}
import socket
import os
# tricks to have debug toolbar when developing with docker
if os.environ.get('USE_DOCKER') == 'yes':
ip = socket.gethostbyname(socket.gethostname())
INTERNAL_IPS += [ip[:-1] + '1']
{% endif %}
DEBUG_TOOLBAR_CONFIG = {
'DISABLE_PANELS': [
'debug_toolbar.panels.redirects.RedirectsPanel',

View File

@ -1,8 +1,9 @@
# -*- coding: utf-8 -*-
"""
Production Configurations
- Use Amazon's S3 for storing static files and uploaded media
{% if cookiecutter.use_whitenoise == 'y' -%}
- Use WhiteNoise for serving static files{% endif %}
- Use Amazon's S3 for storing {% if cookiecutter.use_whitenoise == 'n' -%}static files and {% endif %}uploaded media
- Use mailgun to send emails
- Use Redis for cache
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
@ -12,10 +13,7 @@ Production Configurations
- Use opbeat for error reporting
{% endif %}
"""
from __future__ import absolute_import, unicode_literals
from boto.s3.connection import OrdinaryCallingFormat
from django.utils import six
{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
import logging
{% endif %}
@ -101,7 +99,6 @@ AWS_SECRET_ACCESS_KEY = env('DJANGO_AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = env('DJANGO_AWS_STORAGE_BUCKET_NAME')
AWS_AUTO_CREATE_BUCKET = True
AWS_QUERYSTRING_AUTH = False
AWS_S3_CALLING_FORMAT = OrdinaryCallingFormat()
# AWS cache settings, don't change unless you know what you're doing:
AWS_EXPIRY = 60 * 60 * 24 * 7
@ -109,20 +106,21 @@ AWS_EXPIRY = 60 * 60 * 24 * 7
# TODO See: https://github.com/jschneier/django-storages/issues/47
# Revert the following and use str after the above-mentioned bug is fixed in
# either django-storage-redux or boto
control = 'max-age=%d, s-maxage=%d, must-revalidate' % (AWS_EXPIRY, AWS_EXPIRY)
AWS_HEADERS = {
'Cache-Control': six.b('max-age=%d, s-maxage=%d, must-revalidate' % (
AWS_EXPIRY, AWS_EXPIRY))
'Cache-Control': bytes(control, encoding='latin-1')
}
# URL that handles the media served from MEDIA_ROOT, used for managing
# stored files.
{% if cookiecutter.use_whitenoise == 'y' -%}
MEDIA_URL = 'https://s3.amazonaws.com/%s/' % AWS_STORAGE_BUCKET_NAME
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
{% else %}
# See:http://stackoverflow.com/questions/10390244/
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
from storages.backends.s3boto3 import S3Boto3Storage
StaticRootS3BotoStorage = lambda: S3Boto3Storage(location='static')
MediaRootS3BotoStorage = lambda: S3Boto3Storage(location='media')
DEFAULT_FILE_STORAGE = 'config.settings.production.MediaRootS3BotoStorage'
MEDIA_URL = 'https://s3.amazonaws.com/%s/media/' % AWS_STORAGE_BUCKET_NAME
@ -144,7 +142,7 @@ INSTALLED_APPS = ['collectfast', ] + INSTALLED_APPS
{% if cookiecutter.use_compressor == 'y'-%}
# COMPRESSOR
# ------------------------------------------------------------------------------
COMPRESS_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
COMPRESS_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
COMPRESS_URL = STATIC_URL
COMPRESS_ENABLED = env.bool('COMPRESS_ENABLED', default=True)
{%- endif %}

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
'''
Test settings

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls.static import static
@ -35,7 +32,6 @@ if settings.DEBUG:
]
if 'debug_toolbar' in settings.INSTALLED_APPS:
import debug_toolbar
urlpatterns += [
urlpatterns = [
url(r'^__debug__/', include(debug_toolbar.urls)),
]
] + urlpatterns

View File

@ -14,8 +14,15 @@ framework.
"""
import os
import sys
from django.core.wsgi import get_wsgi_application
# This allows easy placement of apps within the interior
# {{ cookiecutter.project_slug }} directory.
app_path = os.path.dirname(os.path.abspath(__file__)).replace('/config', '')
sys.path.append(os.path.join(app_path, '{{ cookiecutter.project_slug }}'))
{% if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
if os.environ.get('DJANGO_SETTINGS_MODULE') == 'config.settings.production':
from raven.contrib.django.raven_compat.middleware.wsgi import Sentry

View File

@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
#
# {{ cookiecutter.project_name }} documentation build configuration file, created by
# sphinx-quickstart.
#
@ -11,8 +9,6 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
from __future__ import unicode_literals
import os
import sys

View File

@ -29,7 +29,7 @@ The Docker compose tool (previously known as `fig`_) makes linking these contain
webserver/
Dockerfile
...
docker-compose.yml
production.yml
Each component of your application would get its own `Dockerfile`_. The rest of this example assumes you are using the `base postgres image`_ for your database. Your database settings in `config/base.py` might then look something like:
@ -48,7 +48,7 @@ Each component of your application would get its own `Dockerfile`_. The rest of
}
}
The `Docker compose documentation`_ explains in detail what you can accomplish in the `docker-compose.yml` file, but an example configuration might look like this:
The `Docker compose documentation`_ explains in detail what you can accomplish in the `production.yml` file, but an example configuration might look like this:
.. _Docker compose documentation: https://docs.docker.com/compose/#compose-documentation
@ -107,9 +107,9 @@ We'll ignore the webserver for now (you'll want to comment that part out while w
# uncomment the line below to use container as a non-root user
USER python:python
Running `sudo docker-compose build` will follow the instructions in your `docker-compose.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
Running `sudo docker-compose -f production.yml build` will follow the instructions in your `production.yml` file and build the database container, then your webapp, before mounting your cookiecutter project files as a volume in the webapp container and linking to the database. Our example yaml file runs in development mode but changing it to production mode is as simple as commenting out the line using `runserver` and uncommenting the line using `gunicorn`.
Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose up` and browse to `localhost:8000` to see your application running.
Both are set to run on port `0.0.0.0:8000`, which is where the Docker daemon will discover it. You can now run `sudo docker-compose -f production.yml up` and browse to `localhost:8000` to see your application running.
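Put together, the build-and-run cycle described above is just two commands (sudo may be unnecessary depending on how Docker is set up on your machine):

sudo docker-compose -f production.yml build
sudo docker-compose -f production.yml up
# then browse to localhost:8000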
Deployment
^^^^^^^^^^
@ -155,7 +155,7 @@ That Dockerfile assumes you have an Nginx conf file named `site.conf` in the sam
}
}
Running `sudo docker-compose build webserver` will build your server container. Running `sudo docker-compose up` will now expose your application directly on `localhost` (no need to specify the port number).
Running `sudo docker-compose -f production.yml build webserver` will build your server container. Running `sudo docker-compose -f production.yml up` will now expose your application directly on `localhost` (no need to specify the port number).
Building and running your app on EC2
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@ -166,9 +166,9 @@ All you now need to do to run your app in production is:
* Install your preferred source control solution, Docker and Docker compose on the new instance.
* Pull in your code from source control. The root directory should be the one with your `docker-compose.yml` file in it.
* Pull in your code from source control. The root directory should be the one with your `production.yml` file in it.
* Run `sudo docker-compose build` and `sudo docker-compose up`.
* Run `sudo docker-compose -f production.yml build` and `sudo docker-compose -f production.yml up`.
* Assign an `Elastic IP address`_ to your new machine.
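Condensed into shell commands, the pull/build/run steps above might look like this on the new instance; the repository URL and directory name are placeholders:

git clone https://example.com/yourname/yourproject.git
cd yourproject
sudo docker-compose -f production.yml build
sudo docker-compose -f production.yml up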

View File

@ -21,11 +21,11 @@ Next, you have to add new remote python interpreter, based on already tested dep
.. image:: images/3.png
Switch to *Docker Compose* and select `dev.yml` file from directory of your project, next set *Service name* to `django`
Switch to *Docker Compose* and select the `local.yml` file from the directory of your project, then set *Service name* to `django`
.. image:: images/4.png
Because Pycharm restarts container every time you use Configuration Run, to not have server restarted during running tests, we defined second service in `dev.yml` file called pycharm. To use it, you have to add interpreter of second service as well.
Because PyCharm restarts the container every time you use a run configuration, we defined a second service called `pycharm` in the `local.yml` file so that the server is not restarted while tests are running. To use it, you have to add the interpreter of the second service as well.
.. image:: images/5.png

View File

@ -3,6 +3,9 @@
POSTGRES_PASSWORD=mysecretpass
POSTGRES_USER=postgresuser
# Domain name, used by caddy
DOMAIN_NAME={{ cookiecutter.domain_name }}
# General settings
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_ADMIN_URL=
@ -30,9 +33,9 @@ DJANGO_ACCOUNT_ALLOW_REGISTRATION=True
DJANGO_SENTRY_DSN=
{% endif %}
{% if cookiecutter.use_opbeat == 'y' -%}
DJANGO_OPBEAT_ORGANIZATION_ID
DJANGO_OPBEAT_APP_ID
DJANGO_OPBEAT_SECRET_TOKEN
DJANGO_OPBEAT_ORGANIZATION_ID=
DJANGO_OPBEAT_APP_ID=
DJANGO_OPBEAT_SECRET_TOKEN=
{% endif %}
{% if cookiecutter.use_compressor == 'y' -%}
COMPRESS_ENABLED=

View File

@ -16,7 +16,7 @@ var gulp = require('gulp'),
pixrem = require('gulp-pixrem'),
uglify = require('gulp-uglify'),
imagemin = require('gulp-imagemin'),
exec = require('child_process').exec,
spawn = require('child_process').spawn,
runSequence = require('run-sequence'),
browserSync = require('browser-sync').create(),
reload = browserSync.reload;
@ -45,7 +45,7 @@ var paths = pathsConfig();
// Styles autoprefixing and minification
gulp.task('styles', function() {
return gulp.src(paths.sass + '/project.scss')
return gulp.src(paths.sass + '/*.scss')
.pipe(sass().on('error', sass.logError))
.pipe(plumber()) // Checks for errors
.pipe(autoprefixer({browsers: ['last 2 versions']})) // Adds vendor prefixes
@ -73,10 +73,11 @@ gulp.task('imgCompression', function(){
});
// Run django server
gulp.task('runServer', function() {
exec('python manage.py runserver', function (err, stdout, stderr) {
console.log(stdout);
console.log(stderr);
gulp.task('runServer', function(cb) {
var cmd = spawn('python', ['manage.py', 'runserver'], {stdio: 'inherit'});
cmd.on('close', function(code) {
console.log('runServer exited with code ' + code);
cb(code);
});
});
@ -100,5 +101,5 @@ gulp.task('watch', function() {
// Default task
gulp.task('default', function() {
runSequence(['styles', 'scripts', 'imgCompression'], 'runServer', 'browserSync', 'watch');
runSequence(['styles', 'scripts', 'imgCompression'], ['runServer', 'browserSync', 'watch']);
});
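With the default task now running the server, browser-sync and the watcher in parallel, starting front-end development is a single command; this assumes the generated project's package.json is present and its dependencies installed:

npm install
node_modules/.bin/gulp   # or just `gulp` if installed globally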

View File

@ -5,6 +5,22 @@ volumes:
postgres_backup_dev: {}
services:
django: &django
build:
context: .
dockerfile: ./compose/django/Dockerfile-local
depends_on:
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
volumes:
- .:/app
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
- USE_DOCKER=yes
ports:
- "8000:8000"
command: /start-dev.sh
postgres:
build: ./compose/postgres
volumes:
@ -12,28 +28,11 @@ services:
- postgres_backup_dev:/backups
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
command: /start-dev.sh
depends_on:
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
environment:
- POSTGRES_USER={{cookiecutter.project_slug}}
- USE_DOCKER=yes
volumes:
- .:/app
ports:
- "8000:8000"
{% if cookiecutter.use_pycharm == 'y' %}
pycharm:
build:
context: .
dockerfile: ./compose/django/Dockerfile-dev
dockerfile: ./compose/django/Dockerfile-local
depends_on:
- postgres
environment:
@ -41,10 +40,33 @@ services:
volumes:
- .:/app
{% endif %}
{% if cookiecutter.use_mailhog == 'y' %}
mailhog:
image: mailhog/mailhog
image: mailhog/mailhog:v1.0.0
ports:
- "8025:8025"
{% endif %}
{% if cookiecutter.use_celery == 'y' %}
redis:
image: redis:3.0
celeryworker:
# https://github.com/docker/compose/issues/3220
<<: *django
depends_on:
- redis
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
ports: []
command: /start-celeryworker-dev.sh
celerybeat:
# https://github.com/docker/compose/issues/3220
<<: *django
depends_on:
- redis
- postgres{% if cookiecutter.use_mailhog == 'y' %}
- mailhog{% endif %}
ports: []
command: /start-celerybeat-dev.sh
{% endif %}
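As with production, the local stack above is driven entirely through docker-compose; a usage sketch, assuming the file is saved as local.yml as the docs now reference:

docker-compose -f local.yml build
docker-compose -f local.yml up
# Django is published on port 8000, the mailhog web UI on 8025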

View File

@ -20,4 +20,10 @@ if __name__ == '__main__':
"forget to activate a virtual environment?"
)
raise
# This allows easy placement of apps within the interior
# {{ cookiecutter.project_slug }} directory.
current_path = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(current_path, '{{ cookiecutter.project_slug }}'))
execute_from_command_line(sys.argv)

View File

@ -3,8 +3,19 @@ version: '2'
volumes:
postgres_data: {}
postgres_backup: {}
caddy: {}
services:
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile
depends_on:
- postgres
- redis
command: /gunicorn.sh
env_file: .env
postgres:
build: ./compose/postgres
volumes:
@ -12,58 +23,24 @@ services:
- postgres_backup:/backups
env_file: .env
django:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
depends_on:
- postgres
- redis
command: /gunicorn.sh
env_file: .env
nginx:
build: ./compose/nginx
caddy:
build: ./compose/caddy
depends_on:
- django
{% if cookiecutter.use_lets_encrypt == 'y' %}
- certbot
{% endif %}
ports:
- "0.0.0.0:80:80"
{% if cookiecutter.use_lets_encrypt == 'y' %}
environment:
- MY_DOMAIN_NAME={{ cookiecutter.domain_name }}
ports:
- "0.0.0.0:80:80"
- "0.0.0.0:443:443"
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
certbot:
image: quay.io/letsencrypt/letsencrypt
command: bash -c "sleep 6 && certbot certonly -n --standalone -d {{ cookiecutter.domain_name }} --text --agree-tos --email {{ cookiecutter.email }} --server https://acme-v01.api.letsencrypt.org/directory --rsa-key-size 4096 --verbose --keep-until-expiring --standalone-supported-challenges http-01"
entrypoint: ""
volumes:
- /etc/letsencrypt:/etc/letsencrypt
- /var/lib/letsencrypt:/var/lib/letsencrypt
ports:
- "80"
- "443"
environment:
- TERM=xterm
{% endif %}
- caddy:/root/.caddy
env_file: .env
redis:
image: redis:latest
image: redis:3.0
{% if cookiecutter.use_celery == 'y' %}
celeryworker:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
env_file: .env
depends_on:
- postgres
@ -74,7 +51,6 @@ services:
build:
context: .
dockerfile: ./compose/django/Dockerfile
user: django
env_file: .env
depends_on:
- postgres

View File

@ -1,16 +1,15 @@
{% if cookiecutter.use_python3 == 'y' -%}
# Wheel 0.25+ needed to install certain packages on CPython 3.5+
# like Pillow and psycopg2
# See http://bitly.com/wheel-building-fails-CPython-35
# Verified bug on Python 3.5.1
wheel==0.29.0
{%- endif %}
# Bleeding edge Django
django==1.10.7 # pyup: >=1.10,<1.11
# Configuration
django-environ==0.4.1
django-environ==0.4.4
{% if cookiecutter.use_whitenoise == 'y' -%}
whitenoise==3.3.0
{%- endif %}
@ -21,24 +20,24 @@ django-braces==1.11.0
django-crispy-forms==1.6.1
# Models
django-model-utils==2.6.1
django-model-utils==3.0.0
# Images
Pillow==4.0.0
Pillow==4.2.1
# Password storage
argon2-cffi==16.3.0
# For user registration, either via email or social
# Well-built with regular release cycles!
django-allauth==0.31.0
django-allauth==0.33.0
{% if cookiecutter.windows == 'y' -%}
# On Windows, you must download/install psycopg2 manually
# from http://www.lfd.uci.edu/~gohlke/pythonlibs/#psycopg
{% else %}
# Python-PostgreSQL Database Adapter
psycopg2==2.7.1
psycopg2==2.7.3.1
{%- endif %}
# Unicode slugification
@ -48,16 +47,16 @@ awesome-slugify==1.6.5
pytz==2017.2
# Redis support
django-redis==4.7.0
django-redis==4.8.0
redis>=2.10.5
{% if cookiecutter.use_celery == "y" %}
celery==4.0.2
celery==3.1.25
{% endif %}
{% if cookiecutter.use_compressor == "y" %}
rcssmin==1.0.6 {% if cookiecutter.windows == 'y' %}--install-option="--without-c-extensions"{% endif %}
django-compressor==2.1.1
django-compressor==2.2
{% endif %}
# Your custom requirements go here

View File

@ -1,19 +1,19 @@
# Local development dependencies go here
-r base.txt
coverage==4.3.4
coverage==4.4.1
django-coverage-plugin==1.5.0
Sphinx==1.5.3
django-extensions==1.7.8
Werkzeug==0.12.1
django-test-plus==1.0.17
factory-boy==2.8.1
Sphinx==1.6.3
django-extensions==1.9.0
Werkzeug==0.12.2
django-test-plus==1.0.18
factory-boy==2.9.2
django-debug-toolbar==1.7
django-debug-toolbar==1.8
# improved REPL
ipdb==0.10.2
ipdb==0.10.3
pytest-django==3.1.2
pytest-sugar==0.8.0
pytest-sugar==0.9.0

View File

@ -6,30 +6,30 @@
# Python-PostgreSQL Database Adapter
# If using Win for dev, this assumes Unix in prod
# ------------------------------------------------
psycopg2==2.7.1
psycopg2==2.7.3.1
{%- endif %}
# WSGI Handler
# ------------------------------------------------
gevent==1.2.1
gevent==1.2.2
gunicorn==19.7.1
# Static and Media Storage
# ------------------------------------------------
boto==2.46.1
django-storages-redux==1.3.2
boto3==1.4.7
django-storages==1.6.5
{% if cookiecutter.use_whitenoise != 'y' -%}
Collectfast==0.5.2
{%- endif %}
# Email backends for Mailgun, Postmark, SendGrid and more
# -------------------------------------------------------
django-anymail==0.8
django-anymail==0.11.1
{% if cookiecutter.use_sentry_for_error_reporting == "y" -%}
# Raven is the Sentry client
# --------------------------
raven==6.0.0
raven==6.1.0
{%- endif %}
{% if cookiecutter.use_opbeat == "y" -%}

View File

@ -4,14 +4,14 @@
{% if cookiecutter.windows == 'y' -%}
# Python-PostgreSQL Database Adapter
# If using Win for dev, this assumes Unix in test/prod
psycopg2==2.7.1
psycopg2==2.7.3.1
{%- endif %}
coverage==4.3.4
flake8==3.3.0 # pyup: != 2.6.0
django-test-plus==1.0.17
factory-boy==2.8.1
coverage==4.4.1
flake8==3.4.1 # pyup: != 2.6.0
django-test-plus==1.0.18
factory-boy==2.9.2
# pytest
pytest-django==3.1.2
pytest-sugar==0.8.0
pytest-sugar==0.9.0

View File

@ -1 +1 @@
{% if cookiecutter.use_python3 == 'y' -%}python-3.5.1{% else %}python-2.7.10{%- endif %}
python-3.6.2

View File

@ -2,6 +2,6 @@
max-line-length = 120
exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
[pep8]
[pycodestyle]
max-line-length = 120
exclude=.tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
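The [pep8] section is renamed to [pycodestyle], matching the tool's current name. Assuming the flake8 pinned in requirements/test.txt is installed (it pulls in pycodestyle as a dependency), both checkers can be run from the project root:

flake8 .
pycodestyle .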

View File

@ -6,11 +6,7 @@ PROJECT_DIR="$(dirname "$WORK_DIR")"
pip --version >/dev/null 2>&1 || {
echo >&2 -e "\npip is required but it's not installed."
echo >&2 -e "You can install it by running the following command:\n"
{% if cookiecutter.use_python3 == 'y' -%}
echo >&2 "wget https://bootstrap.pypa.io/get-pip.py --output-document=get-pip.py; chmod +x get-pip.py; sudo -H python3 get-pip.py"
{% else %}
echo >&2 "wget https://bootstrap.pypa.io/get-pip.py --output-document=get-pip.py; chmod +x get-pip.py; sudo -H python2 get-pip.py"
{%- endif %}
echo >&2 -e "\n"
echo >&2 -e "\nFor more information, see pip documentation: https://pip.pypa.io/en/latest/"
exit 1;
@ -19,11 +15,7 @@ pip --version >/dev/null 2>&1 || {
virtualenv --version >/dev/null 2>&1 || {
echo >&2 -e "\nvirtualenv is required but it's not installed."
echo >&2 -e "You can install it by running the following command:\n"
{% if cookiecutter.use_python3 == 'y' -%}
echo >&2 "sudo -H pip3 install virtualenv"
{% else %}
echo >&2 "sudo -H pip2 install virtualenv"
{%- endif %}
echo >&2 -e "\n"
echo >&2 -e "\nFor more information, see virtualenv documentation: https://virtualenv.pypa.io/en/latest/"
exit 1;
@ -32,11 +24,7 @@ virtualenv --version >/dev/null 2>&1 || {
if [ -z "$VIRTUAL_ENV" ]; then
echo >&2 -e "\nYou need activate a virtualenv first"
echo >&2 -e 'If you do not have a virtualenv created, run the following command to create and automatically activate a new virtualenv named "venv" on current folder:\n'
{% if cookiecutter.use_python3 == 'y' -%}
echo >&2 -e "virtualenv venv --python=\`which python3\`"
{% else %}
echo >&2 -e "virtualenv venv --python=\`which python2\`"
{%- endif %}
echo >&2 -e "\nTo leave/disable the currently active virtualenv, run the following command:\n"
echo >&2 "deactivate"
echo >&2 -e "\nTo activate the virtualenv again, run the following command:\n"
@ -52,4 +40,3 @@ else
pip install -r $PROJECT_DIR/requirements.txt
{%- endif %}
fi

View File

@ -3,11 +3,7 @@
build-essential
#required to translate
gettext
{% if cookiecutter.use_python3 == 'y' -%}
python3-dev
{% else %}
python-dev
{%- endif %}
##shared dependencies of:
##Pillow, pylibmc

View File

@ -3,11 +3,7 @@
build-essential
#required to translate
gettext
{% if cookiecutter.use_python3 == 'y' -%}
python3-dev
{% else %}
python-dev
{%- endif %}
##shared dependencies of:
##Pillow, pylibmc
@ -25,4 +21,3 @@ libwebp-dev
##django-extensions
graphviz-dev

View File

@ -3,11 +3,7 @@
build-essential
#required to translate
gettext
{% if cookiecutter.use_python3 == 'y' -%}
python3-dev
{% else %}
python-dev
{%- endif %}
##shared dependencies of:
##Pillow, pylibmc
@ -25,4 +21,3 @@ libwebp-dev
##django-extensions
graphviz-dev

View File

@ -1,3 +1,2 @@
# -*- coding: utf-8 -*-
__version__ = '{{ cookiecutter.version }}'
__version_info__ = tuple([int(num) if num.isdigit() else num for num in __version__.replace('-', '.', 1).split('.')])

View File

@ -3,4 +3,3 @@ To understand why this file is here, please read:
http://cookiecutter-django.readthedocs.io/en/latest/faq.html#why-is-there-a-django-contrib-sites-directory-in-cookiecutter-django
"""
# -*- coding: utf-8 -*-

View File

@ -3,4 +3,3 @@ To understand why this file is here, please read:
http://cookiecutter-django.readthedocs.io/en/latest/faq.html#why-is-there-a-django-contrib-sites-directory-in-cookiecutter-django
"""
# -*- coding: utf-8 -*-

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
import django.contrib.sites.models
from django.contrib.sites.models import _simple_domain_name_validator
from django.db import migrations, models

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
import django.contrib.sites.models
from django.db import migrations, models

View File

@ -3,10 +3,6 @@ To understand why this file is here, please read:
http://cookiecutter-django.readthedocs.io/en/latest/faq.html#why-is-there-a-django-contrib-sites-directory-in-cookiecutter-django
"""
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations

View File

@ -3,4 +3,3 @@ To understand why this file is here, please read:
http://cookiecutter-django.readthedocs.io/en/latest/faq.html#why-is-there-a-django-contrib-sites-directory-in-cookiecutter-django
"""
# -*- coding: utf-8 -*-

View File

@ -1,3 +1,57 @@
{% if cookiecutter.custom_bootstrap_compilation == 'y' %}
@import "variables";
@import "custom_bootstrap_vars";
@import "mixins";
// Reset and dependencies
@import "normalize";
@import "print";
@import "glyphicons";
// Core CSS
@import "scaffolding";
@import "type";
@import "code";
@import "grid";
@import "tables";
@import "forms";
@import "buttons";
// Components
@import "component-animations";
@import "dropdowns";
@import "button-groups";
@import "input-groups";
@import "navs";
@import "navbar";
@import "breadcrumbs";
@import "pagination";
@import "pager";
@import "labels";
@import "badges";
@import "jumbotron";
@import "thumbnails";
@import "alerts";
@import "progress-bars";
@import "media";
@import "list-group";
@import "panels";
@import "responsive-embed";
@import "wells";
@import "close";
// Components w/ JavaScript
@import "modals";
@import "tooltip";
@import "popovers";
@import "carousel";
// Utility classes
@import "utilities";
@import "responsive-utilities";
{% endif %}
// project specific CSS goes here

View File

@ -1,5 +1,4 @@
{% if cookiecutter.use_celery == 'y' %}
from __future__ import absolute_import
import os
from celery import Celery
from django.apps import apps, AppConfig
@ -28,9 +27,19 @@ class CeleryConfig(AppConfig):
{% if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
if hasattr(settings, 'RAVEN_CONFIG'):
# Celery signal registration
{% if cookiecutter.use_pycharm == 'y' -%}
# Since raven is required in production only,
# imports might (most surely will) be wiped out
# during PyCharm code clean up started
# in other environments.
# @formatter:off
{%- endif %}
from raven import Client as RavenClient
from raven.contrib.celery import register_signal as raven_register_signal
from raven.contrib.celery import register_logger_signal as raven_register_logger_signal
{% if cookiecutter.use_pycharm == 'y' -%}
# @formatter:on
{%- endif %}
raven_client = RavenClient(dsn=settings.RAVEN_CONFIG['DSN'])
raven_register_logger_signal(raven_client)
@ -39,10 +48,20 @@ class CeleryConfig(AppConfig):
{% if cookiecutter.use_opbeat == 'y' -%}
if hasattr(settings, 'OPBEAT'):
{% if cookiecutter.use_pycharm == 'y' -%}
# Since opbeat is required in production only,
# imports might (most surely will) be wiped out
# during PyCharm code clean up started
# in other environments.
# @formatter:off
{%- endif %}
from opbeat.contrib.django.models import client as opbeat_client
from opbeat.contrib.django.models import logger as opbeat_logger
from opbeat.contrib.django.models import register_handlers as opbeat_register_handlers
from opbeat.contrib.celery import register_signal as opbeat_register_signal
{% if cookiecutter.use_pycharm == 'y' -%}
# @formatter:on
{%- endif %}
try:
opbeat_register_signal(opbeat_client)

View File

@ -1,4 +1,4 @@
{% raw %}{% load staticfiles i18n {% endraw %}{% if cookiecutter.use_compressor == "y" %}compress{% endif %}{% raw %}%}<!DOCTYPE html>
{% raw %}{% load static i18n {% endraw %}{% if cookiecutter.use_compressor == "y" %}compress{% endif %}{% raw %}%}<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
@ -15,7 +15,7 @@
{% block css %}
<!-- Latest compiled and minified Bootstrap 4 Alpha 4 CSS -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.4/css/bootstrap.min.css" integrity="sha384-2hfp1SzUoho7/TsGGGDaFdsuuDL0LX2hnUp6VkX3CUQ2K4K+xjboZdsXyp4oUHZj" crossorigin="anonymous">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.6/css/bootstrap.min.css" integrity="sha384-rwoIResjU2yc3z8GV/NPeZWAv56rSmLldC3R/AZzGRnGxQQKnKkoFVhFQhNUwEyJ" crossorigin="anonymous">
<!-- Your stuff: Third-party CSS libraries go here -->
{% endraw %}{% if cookiecutter.use_compressor == "y" %}{% raw %}{% compress css %}{% endraw %}{% endif %}{% raw %}
@ -29,44 +29,43 @@
<body>
<div class="m-b-1">
<nav class="navbar navbar-dark navbar-static-top bg-inverse">
<div class="container">
<a class="navbar-brand" href="/">{% endraw %}{{ cookiecutter.project_name }}{% raw %}</a>
<button type="button" class="navbar-toggler hidden-sm-up pull-xs-right" data-toggle="collapse" data-target="#bs-navbar-collapse-1">
&#9776;
<nav class="navbar navbar-toggleable-md navbar-light bg-faded">
<button class="navbar-toggler navbar-toggler-right" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation">
<span class="navbar-toggler-icon"></span>
</button>
<a class="navbar-brand" href="{% url 'home' %}">{% endraw %}{{ cookiecutter.project_name }}{% raw %}</a>
<!-- Collect the nav links, forms, and other content for toggling -->
<div class="collapse navbar-toggleable-xs" id="bs-navbar-collapse-1">
<ul class="nav navbar-nav">
<li class="nav-item">
<a class="nav-link" href="{% url 'home' %}">Home</a>
<div class="collapse navbar-collapse" id="navbarSupportedContent">
<ul class="navbar-nav mr-auto">
<li class="nav-item active">
<a class="nav-link" href="{% url 'home' %}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item">
<a class="nav-link" href="{% url 'about' %}">About</a>
</li>
</ul>
<ul class="nav navbar-nav pull-xs-right">
{% if request.user.is_authenticated %}
<li class="nav-item">
{# URL provided by django-allauth/account/urls.py #}
<a class="nav-link" href="{% url 'users:detail' request.user.username %}">{% trans "My Profile" %}</a>
</li>
<li class="nav-item">
{# URL provided by django-allauth/account/urls.py #}
<a class="nav-link" href="{% url 'account_logout' %}">{% trans "Sign Out" %}</a>
</li>
{% else %}
<li class="nav-item">
{# URL provided by django-allauth/account/urls.py #}
<a id="sign-up-link" class="nav-link" href="{% url 'account_signup' %}">{% trans "Sign Up" %}</a>
</li>
<li class="nav-item">
{# URL provided by django-allauth/account/urls.py #}
<a id="log-in-link" class="nav-link" href="{% url 'account_login' %}">{% trans "Sign In" %}</a>
</li>
{% endif %}
</ul>
</div>
</div>
</nav>
</div>
<div class="container">
@ -90,9 +89,9 @@
<!-- Placed at the end of the document so the pages load faster -->
{% block javascript %}
<!-- Required by Bootstrap v4 Alpha 4 -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.1/jquery.min.js" integrity="sha384-3ceskX3iaEnIogmQchP8opvBy3Mi7Ce34nWjpBIwVTHfGYWQS9jwHDVRnpKKHJg7" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/tether/1.3.7/js/tether.min.js" integrity="sha384-XTs3FgkjiBgo8qjEjBk0tGmf3wPrWtA6coPfQDfFEY8AnYJwjalXCiosYRBIBZX8" crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.4/js/bootstrap.min.js" integrity="sha384-VjEeINv9OSwtWFLAtmc4JCtEJXXBub00gtSnszmspDLCtC0I4z4nqz7rEFbIZLLU" crossorigin="anonymous"></script>
<script src="https://code.jquery.com/jquery-3.1.1.slim.min.js" integrity="sha384-A7FZj7v+d/sdmMqp/nOQwliLvUsJfDHW+k9Omg/a/EheAdgtzNs3hpfag6Ed950n" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/tether/1.4.0/js/tether.min.js" integrity="sha384-DztdAPBWPRXSA/3eYEEUWrWCy7G5KFbe8fFjk5JAIxUYHKkDx6Qin1DkWx51bBrb" crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0-alpha.6/js/bootstrap.min.js" integrity="sha384-vBWWzlZJ8ea9aCX4pEW3rVHjgjt7zpkNpZk+02D9phzyeVkE+jo0ieGizqPLForn" crossorigin="anonymous"></script>
<!-- Your stuff: Third-party javascript libraries go here -->

View File

@ -1 +0,0 @@
# -*- coding: utf-8 -*-

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
from django.conf import settings
from allauth.account.adapter import DefaultAccountAdapter
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals
from django import forms
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin as AuthUserAdmin

View File

@ -1,7 +1,3 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.1 on 2016-09-23 04:36
from __future__ import unicode_literals
import django.contrib.auth.models
import django.contrib.auth.validators
from django.db import migrations, models

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals, absolute_import
from django.contrib.auth.models import AbstractUser
from django.core.urlresolvers import reverse
from django.db import models

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals
from django.conf.urls import url
from . import views

View File

@ -1,6 +1,3 @@
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals
from django.core.urlresolvers import reverse
from django.views.generic import DetailView, ListView, RedirectView, UpdateView