Removed unused docs and files used by removed options.

Corey Oordt 2019-07-22 13:56:24 -05:00
parent b48eb3670b
commit f265b9bc4d
16 changed files with 6 additions and 597 deletions

View File

@ -1,121 +0,0 @@
Deployment on Heroku
====================
.. index:: Heroku
Commands to run
---------------
Run these commands to deploy the project to Heroku:
.. code-block:: bash
heroku create --buildpack https://github.com/heroku/heroku-buildpack-python
heroku addons:create heroku-postgresql:hobby-dev
# On Windows use double quotes for the time zone, e.g.
# heroku pg:backups schedule --at "02:00 America/Los_Angeles" DATABASE_URL
heroku pg:backups schedule --at '02:00 America/Los_Angeles' DATABASE_URL
heroku pg:promote DATABASE_URL
heroku addons:create heroku-redis:hobby-dev
heroku addons:create mailgun:starter
heroku config:set PYTHONHASHSEED=random
heroku config:set WEB_CONCURRENCY=4
heroku config:set DJANGO_DEBUG=False
heroku config:set DJANGO_SETTINGS_MODULE=config.settings.production
heroku config:set DJANGO_SECRET_KEY="$(openssl rand -base64 64)"
# Generating a 32 character-long random string without any of the visually similar characters "IOl01":
heroku config:set DJANGO_ADMIN_URL="$(openssl rand -base64 4096 | tr -dc 'A-HJ-NP-Za-km-z2-9' | head -c 32)/"
# Set this to your Heroku app url, e.g. 'bionic-beaver-28392.herokuapp.com'
heroku config:set DJANGO_ALLOWED_HOSTS=
# Assign with AWS_ACCESS_KEY_ID
heroku config:set DJANGO_AWS_ACCESS_KEY_ID=
# Assign with AWS_SECRET_ACCESS_KEY
heroku config:set DJANGO_AWS_SECRET_ACCESS_KEY=
# Assign with AWS_STORAGE_BUCKET_NAME
heroku config:set DJANGO_AWS_STORAGE_BUCKET_NAME=
git push heroku master
heroku run python manage.py createsuperuser
heroku run python manage.py check --deploy
heroku open
.. warning::
.. include:: mailgun.rst
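Before the first ``git push heroku master``, it can be worth sanity-checking what you've configured so far. A minimal check, assuming only the standard Heroku CLI commands used above:
.. code-block:: bash

    # Review every config var set so far (secret key, admin URL, AWS keys, ...)
    heroku config
    # Show general app info, including the app URL to use for DJANGO_ALLOWED_HOSTS
    heroku apps:info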
Optional actions
----------------
Celery
++++++
Celery requires a few extra environment variables to be fully operational. The worker is also created
(it's defined in the ``Procfile``) but is turned off by default:
.. code-block:: bash
# Set the broker URL to Redis
heroku config:set CELERY_BROKER_URL=`heroku config:get REDIS_URL`
# Scale dyno to 1 instance
heroku ps:scale worker=1
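To confirm the worker dyno actually came up, you can list the dynos and tail the worker's logs. This is just a quick check with standard Heroku CLI commands; nothing project-specific is assumed:
.. code-block:: bash

    # The worker process should be listed as "up"
    heroku ps
    # Watch the worker's output to confirm Celery connected to the Redis broker
    heroku logs --tail --dyno worker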
Sentry
++++++
If you've opted for Sentry error tracking, you can either install it through the `Sentry add-on`_:
.. code-block:: bash
heroku addons:create sentry:f1
Or add the DSN for your account, if you already have one:
.. code-block:: bash
heroku config:set SENTRY_DSN=https://xxxx@sentry.io/12345
.. _Sentry add-on: https://elements.heroku.com/addons/sentry
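Either way, you can verify that the DSN ended up in the app's environment (the add-on is expected to expose it as ``SENTRY_DSN``, the same variable set manually above):
.. code-block:: bash

    # Prints the DSN if it is set; empty output means Sentry isn't configured yet
    heroku config:get SENTRY_DSN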
Gulp & Bootstrap compilation
++++++++++++++++++++++++++++
If you've opted for a custom bootstrap build, you'll most likely need to set up
your app to use `multiple buildpacks`_: one for Python & one for Node.js:
.. code-block:: bash
heroku buildpacks:add --index 1 heroku/nodejs
At the time of writing, this should do the trick: during deployment,
Heroku should run ``npm install`` and then ``npm build``,
which runs Gulp in cookiecutter-django.
If things don't work, please refer to the Heroku docs.
.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
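If the build doesn't behave as expected, the first thing to check is usually the buildpack order: the Node.js buildpack has to run before the Python one so that the compiled assets already exist when ``collectstatic`` runs. A quick way to inspect it:
.. code-block:: bash

    # Buildpacks run in the order listed; heroku/nodejs should come first
    heroku buildpacks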
About Heroku & Docker
---------------------
Although Heroku has some sort of `Docker support`_, it's not supported by cookiecutter-django.
We invite you to follow the Heroku documentation about it.
.. _Docker support: https://devcenter.heroku.com/articles/build-docker-images-heroku-yml

View File

@ -1,181 +0,0 @@
Deployment on PythonAnywhere
============================
.. index:: PythonAnywhere
Overview
--------
Full instructions follow, but here's a high-level view.
**First time config**:
1. Pull your code down to PythonAnywhere using a *Bash console* and set up a virtualenv
2. Set your config variables in the *postactivate* script
3. Run the *manage.py* ``migrate`` and ``collectstatic`` commands
4. Add an entry to the PythonAnywhere *Web tab*
5. Set your config variables in the PythonAnywhere *WSGI config file*
Once you've been through this one-off config, future deployments are much simpler: just ``git pull`` and then hit the "Reload" button :)
Getting your code and dependencies installed on PythonAnywhere
--------------------------------------------------------------
Make sure your project is fully committed and pushed up to Bitbucket or GitHub or wherever it may be. Then, log into your PythonAnywhere account, open up a **Bash** console, clone your repo, and create a virtualenv:
.. code-block:: bash
git clone <my-repo-url> # you can also use hg
cd my-project-name
mkvirtualenv --python=/usr/bin/python3.6 my-project-name
pip install -r requirements/production.txt # may take a few minutes
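Before going further, it's worth confirming that the virtualenv is active and uses the interpreter you asked for. A minimal check (the path below assumes virtualenvwrapper's default ``~/.virtualenvs`` location):
.. code-block:: bash

    # Should report Python 3.6.x
    python -V
    # Should point inside ~/.virtualenvs/my-project-name/
    which python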
Setting environment variables in the console
--------------------------------------------
Generate a secret key for yourself, e.g. like this:
.. code-block:: bash
python -c 'import random;import string; print("".join(random.SystemRandom().choice(string.digits + string.ascii_letters + string.punctuation) for _ in range(50)))'
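Any sufficiently long random string will do; if you prefer, the ``openssl`` approach used in the Heroku instructions above is an equivalent alternative:
.. code-block:: bash

    # Equivalent: 64 random bytes, base64-encoded
    openssl rand -base64 64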
Make a note of it, since we'll need it here in the console and later on in the web app config tab.
Set environment variables via the virtualenv "postactivate" script (this will set them every time you use the virtualenv in a console):
.. code-block:: bash
vi $VIRTUAL_ENV/bin/postactivate
**TIP:** *If you don't like vi, you can also edit this file via the PythonAnywhere "Files" menu; look in the ".virtualenvs" folder*.
Add these exports:
.. code-block:: bash
export WEB_CONCURRENCY=4
export DJANGO_SETTINGS_MODULE='config.settings.production'
export DJANGO_SECRET_KEY='<secret key goes here>'
export DJANGO_ALLOWED_HOSTS='<www.your-domain.com>'
export DJANGO_ADMIN_URL='<not admin/>'
export MAILGUN_API_KEY='<mailgun key>'
export MAILGUN_DOMAIN='<mailgun sender domain (e.g. mg.yourdomain.com)>'
export DJANGO_AWS_ACCESS_KEY_ID=
export DJANGO_AWS_SECRET_ACCESS_KEY=
export DJANGO_AWS_STORAGE_BUCKET_NAME=
export DATABASE_URL='<see below>'
**NOTE:** *The AWS details are not required if you're using whitenoise or the built-in pythonanywhere static files service, but you do need to set them to blank, as above.*
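To double-check that the script is being picked up, re-source it and inspect the environment; the variable names below are just the ones exported above:
.. code-block:: bash

    source $VIRTUAL_ENV/bin/postactivate
    # All of the variables exported above should show up here
    env | grep -E '^(DJANGO_|MAILGUN_|WEB_CONCURRENCY)'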
Database setup:
---------------
Go to the PythonAnywhere **Databases tab** and configure your database.
* For Postgres, set up your superuser password, then open a Postgres console and run ``CREATE DATABASE my-db-name``. You should probably also set up a specific role and permissions for your app, rather than using the superuser credentials. Make a note of the address and port of your Postgres server.
* For MySQL, set the password and create a database. More info here: https://help.pythonanywhere.com/pages/UsingMySQL
* You can also use SQLite if you like! It's not recommended for anything beyond toy projects, though.
Now go back to the *postactivate* script and set the ``DATABASE_URL`` environment variable:
.. code-block:: bash
export DATABASE_URL='postgres://<postgres-username>:<postgres-password>@<postgres-address>:<postgres-port>/<database-name>'
# or
export DATABASE_URL='mysql://<pythonanywhere-username>:<mysql-password>@<mysql-address>/<database-name>'
# or
export DATABASE_URL='sqlite:////home/yourusername/path/to/db.sqlite'
If you're using MySQL, you may need to run ``pip install mysqlclient``, and maybe add ``mysqlclient`` to *requirements/production.txt* too.
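As an illustration only (every value below is hypothetical; use the host, port, and credentials from your Databases tab), a filled-in Postgres URL looks like the line below, and ``showmigrations`` is a cheap way to confirm Django can actually reach the database once the environment is loaded in the next step:
.. code-block:: bash

    # Hypothetical values -- substitute your own from the Databases tab
    export DATABASE_URL='postgres://myuser:mypassword@myuser-123.postgres.pythonanywhere-services.com:10123/mydb'
    # After sourcing postactivate (next step), this fails loudly if the details are wrong
    python manage.py showmigrations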
Now run the migrations and collectstatic:
.. code-block:: bash
source $VIRTUAL_ENV/bin/postactivate
python manage.py migrate
python manage.py collectstatic
# and, optionally
python manage.py createsuperuser
Configure the PythonAnywhere Web Tab
------------------------------------
Go to the PythonAnywhere **Web tab**, hit **Add new web app**, choose **Manual Config**, and then the version of Python you used for your virtualenv.
**NOTE:** *If you're using a custom domain (not on \*.pythonanywhere.com), then you'll need to set up a CNAME with your domain registrar.*
When you're redirected back to the web app config screen, set the **path to your virtualenv**. If you used virtualenvwrapper as above, you can just enter its name.
Click through to the **WSGI configuration file** link (near the top) and edit the WSGI file. Make it look something like this, repeating the environment variables you used earlier:
.. code-block:: python
import os
import sys
path = '/home/<your-username>/<your-project-directory>'
if path not in sys.path:
    sys.path.append(path)
os.environ['DJANGO_SETTINGS_MODULE'] = 'config.settings.production'
os.environ['DJANGO_SECRET_KEY'] = '<as above>'
os.environ['DJANGO_ALLOWED_HOSTS'] = '<as above>'
os.environ['DJANGO_ADMIN_URL'] = '<as above>'
os.environ['MAILGUN_API_KEY'] = '<as above>'
os.environ['MAILGUN_DOMAIN'] = '<as above>'
os.environ['DJANGO_AWS_ACCESS_KEY_ID'] = ''
os.environ['DJANGO_AWS_SECRET_ACCESS_KEY'] = ''
os.environ['DJANGO_AWS_STORAGE_BUCKET_NAME'] = ''
os.environ['DATABASE_URL'] = '<as above>'
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
Back on the Web tab, hit **Reload**, and your app should be live!
**NOTE:** *You may see security warnings until you set up your SSL certificates. If you
want to suppress them temporarily, set DJANGO_SECURE_SSL_REDIRECT to blank. Follow
the instructions here to get SSL set up: https://help.pythonanywhere.com/pages/SSLOwnDomains/*
Optional: static files
----------------------
If you want to use the PythonAnywhere static files service instead of using whitenoise or S3, you'll find its configuration section on the Web tab. Essentially you'll need an entry to match your ``STATIC_URL`` and ``STATIC_ROOT`` settings. There's more info here: https://help.pythonanywhere.com/pages/DjangoStaticFiles
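The exact directory to enter is whatever ``STATIC_ROOT`` resolves to. If in doubt, you can print both values from a Bash console (assuming the virtualenv and environment variables from earlier are active):
.. code-block:: bash

    # Prints the URL prefix and the absolute path to map in the Web tab
    python manage.py shell -c "from django.conf import settings; print(settings.STATIC_URL, settings.STATIC_ROOT)"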
Future deployments
------------------
For subsequent deployments, the procedure is much simpler. In a Bash console:
.. code-block:: bash
workon my-virtualenv-name
cd project-directory
git pull
python manage.py migrate
python manage.py collectstatic
Then go to the Web tab and hit **Reload**.
**TIP:** *If you're really keen, you can set up git-push based deployments: https://blog.pythonanywhere.com/87/*

View File

@ -18,8 +18,6 @@ Contents:
    settings
    linters
    testing
-   deployment-on-pythonanywhere
-   deployment-on-heroku
    deployment-with-docker
    docker-postgres-backups
    faq

View File

@ -42,8 +42,9 @@ cloud_provider:
     Select a cloud provider for static & media files. The choices are:

     1. AWS_
-    2. GCP_
-    3. None
+    2. Azure
+    3. GCP_
+    4. None

     Note that if you choose no cloud provider, media files won't work.
@ -63,10 +64,6 @@ use_sentry:
 use_whitenoise:
     Indicates whether the project should be configured to use WhiteNoise_.

-use_heroku:
-    Indicates whether the project should be configured so as to be deployable
-    to Heroku_.
-
 use_travisci:
     Indicates whether the project should be configured to use `Travis CI`_.

View File

@ -61,22 +61,6 @@ def remove_docker_files():
         os.remove(file_name)


-def remove_utility_files():
-    shutil.rmtree("utility")
-
-
-def remove_heroku_files():
-    file_names = ["Procfile", "runtime.txt", "requirements.txt"]
-    for file_name in file_names:
-        if (
-            file_name == "requirements.txt"
-            and "{{ cookiecutter.use_travisci }}".lower() == "y"
-        ):
-            # don't remove the file if we are using travisci but not using heroku
-            continue
-        os.remove(file_name)
-
-
 def remove_gulp_files():
     file_names = ["gulpfile.js"]
     for file_name in file_names:
@ -293,9 +277,6 @@ def main():
if "{{ cookiecutter.open_source_license}}" != "GPLv3": if "{{ cookiecutter.open_source_license}}" != "GPLv3":
remove_gplv3_files() remove_gplv3_files()
remove_utility_files()
remove_heroku_files()
append_to_gitignore_file(".env") append_to_gitignore_file(".env")
append_to_gitignore_file(".envs/*") append_to_gitignore_file(".envs/*")

View File

@ -1,5 +0,0 @@
release: python manage.py migrate
web: gunicorn config.wsgi:application
{% if cookiecutter.use_celery == "y" -%}
worker: celery worker --app=config.celery_app --loglevel=info
{%- endif %}

View File

@ -1 +0,0 @@
python-3.6.8

View File

@ -1,96 +0,0 @@
#!/bin/bash
WORK_DIR="$(dirname "$0")"
DISTRO_NAME=$(lsb_release -sc)
OS_REQUIREMENTS_FILENAME="requirements-$DISTRO_NAME.apt"
cd $WORK_DIR
# Check if a requirements file exist for the current distribution.
if [ ! -r "$OS_REQUIREMENTS_FILENAME" ]; then
cat <<-EOF >&2
There is no requirements file for your distribution.
You can check one of the files listed below to help find the equivalent packages for your system:
$(find ./ -name "requirements-*.apt" -printf " - %f\n")
EOF
exit 1;
fi
# Handle call with wrong command
function wrong_command()
{
echo "${0##*/} - unknown command: '${1}'" >&2
usage_message
}
# Print help / script usage
function usage_message()
{
cat <<-EOF
Usage: $WORK_DIR/${0##*/} <command>
Available commands are:
list        Print a list of all packages defined in the ${OS_REQUIREMENTS_FILENAME} file
help        Print this help
Commands that require superuser permission:
install     Install the packages defined in the ${OS_REQUIREMENTS_FILENAME} file. Note: this
            does not upgrade already installed packages to newer versions, even if a
            newer version is available in the repository.
upgrade     Same as install, but also upgrades already installed packages when a newer
            version is available.
EOF
}
# Read the requirements.apt file, and remove comments and blank lines
function list_packages(){
grep -v "#" "${OS_REQUIREMENTS_FILENAME}" | grep -v "^$";
}
function install_packages()
{
list_packages | xargs apt-get --no-upgrade install -y;
}
function upgrade_packages()
{
list_packages | xargs apt-get install -y;
}
function install_or_upgrade()
{
P=${1}
PARAN=${P:-"install"}
if [[ $EUID -ne 0 ]]; then
cat <<-EOF >&2
You must run this script with root privileges.
Please run:
sudo $WORK_DIR/${0##*/} $PARAN
EOF
exit 1
else
apt-get update
# Install the basic compilation dependencies and other required libraries of this project
if [ "$PARAN" == "install" ]; then
install_packages;
else
upgrade_packages;
fi
# cleaning downloaded packages from apt-get cache
apt-get clean
exit 0
fi
}
# Handle command argument
case "$1" in
install) install_or_upgrade;;
upgrade) install_or_upgrade "upgrade";;
list) list_packages;;
help|"") usage_message;;
*) wrong_command "$1";;
esac

View File

@ -1,39 +0,0 @@
#!/bin/bash
WORK_DIR="$(dirname "$0")"
PROJECT_DIR="$(dirname "$WORK_DIR")"
pip --version >/dev/null 2>&1 || {
echo >&2 -e "\npip is required but it's not installed."
echo >&2 -e "You can install it by running the following command:\n"
echo >&2 "wget https://bootstrap.pypa.io/get-pip.py --output-document=get-pip.py; chmod +x get-pip.py; sudo -H python3 get-pip.py"
echo >&2 -e "\n"
echo >&2 -e "\nFor more information, see pip documentation: https://pip.pypa.io/en/latest/"
exit 1;
}
virtualenv --version >/dev/null 2>&1 || {
echo >&2 -e "\nvirtualenv is required but it's not installed."
echo >&2 -e "You can install it by running the following command:\n"
echo >&2 "sudo -H pip3 install virtualenv"
echo >&2 -e "\n"
echo >&2 -e "\nFor more information, see virtualenv documentation: https://virtualenv.pypa.io/en/latest/"
exit 1;
}
if [ -z "$VIRTUAL_ENV" ]; then
echo >&2 -e "\nYou need activate a virtualenv first"
echo >&2 -e 'If you do not have a virtualenv created, run the following command to create and automatically activate a new virtualenv named "venv" on current folder:\n'
echo >&2 -e "virtualenv venv --python=\`which python3\`"
echo >&2 -e "\nTo leave/disable the currently active virtualenv, run the following command:\n"
echo >&2 "deactivate"
echo >&2 -e "\nTo activate the virtualenv again, run the following command:\n"
echo >&2 "source venv/bin/activate"
echo >&2 -e "\nFor more information, see virtualenv documentation: https://virtualenv.pypa.io/en/latest/"
echo >&2 -e "\n"
exit 1;
else
pip install -r $PROJECT_DIR/requirements/local.txt
pip install -r $PROJECT_DIR/requirements.txt
fi

View File

@ -1,23 +0,0 @@
##basic build dependencies of various Django apps for Ubuntu Bionic 18.04
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg8-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
libgraphviz-dev

View File

@ -1,23 +0,0 @@
##basic build dependencies of various Django apps for Debian Jessie 8.x
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg62-turbo-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
graphviz-dev

View File

@ -1,23 +0,0 @@
##basic build dependencies of various Django apps for Debian Stretch 9.x
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg62-turbo-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
graphviz-dev

View File

@ -1,23 +0,0 @@
##basic build dependencies of various Django apps for Ubuntu Trusty 14.04
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff4-dev
libjpeg8-dev
libfreetype6-dev
liblcms1-dev
libwebp-dev
##django-extensions
graphviz-dev

View File

@ -1,23 +0,0 @@
##basic build dependencies of various Django apps for Ubuntu Xenial 16.04
#build-essential metapackage install: make, gcc, g++,
build-essential
#required to translate
gettext
python3-dev
##shared dependencies of:
##Pillow, pylibmc
zlib1g-dev
##Postgresql and psycopg2 dependencies
libpq-dev
##Pillow dependencies
libtiff5-dev
libjpeg8-dev
libfreetype6-dev
liblcms2-dev
libwebp-dev
##django-extensions
graphviz-dev

View File

@ -26,49 +26,40 @@ CACHES = {
# EMAIL
 # ------------------------------------------------------------------------------
 {% if cookiecutter.use_mailhog == 'y' -%}
-# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
 EMAIL_HOST = env("EMAIL_HOST", default="mailhog")
 {%- else -%}
-# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
 EMAIL_BACKEND = env(
     "DJANGO_EMAIL_BACKEND", default="django.core.mail.backends.console.EmailBackend"
 )
-# https://docs.djangoproject.com/en/dev/ref/settings/#email-host
 EMAIL_HOST = "localhost"
 {%- endif %}
-# https://docs.djangoproject.com/en/dev/ref/settings/#email-port
 EMAIL_PORT = 1025

 # django-debug-toolbar
-# ------------------------------------------------------------------------------
 # https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#prerequisites
+# ------------------------------------------------------------------------------
 INSTALLED_APPS += ["debug_toolbar"]  # noqa F405
-# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#middleware
 MIDDLEWARE += ["debug_toolbar.middleware.DebugToolbarMiddleware"]  # noqa F405
-# https://django-debug-toolbar.readthedocs.io/en/latest/configuration.html#debug-toolbar-config
 DEBUG_TOOLBAR_CONFIG = {
     "DISABLE_PANELS": ["debug_toolbar.panels.redirects.RedirectsPanel"],
     "SHOW_TEMPLATE_CONTEXT": True,
 }
-# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html#internal-ips
 INTERNAL_IPS = ["127.0.0.1", "10.0.2.2"]
-if env("USE_DOCKER") == "yes":
+if env.bool("USE_DOCKER", False) == "yes":
     import socket

     hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
     INTERNAL_IPS += [ip[:-1] + "1" for ip in ips]

 # django-extensions
-# ------------------------------------------------------------------------------
 # https://django-extensions.readthedocs.io/en/latest/installation_instructions.html#configuration
+# ------------------------------------------------------------------------------
 INSTALLED_APPS += ["django_extensions"]  # noqa F405
 {% if cookiecutter.use_celery == 'y' -%}
 # Celery
 # ------------------------------------------------------------------------------
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-always-eager
 CELERY_TASK_ALWAYS_EAGER = True
-# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-eager-propagates
 CELERY_TASK_EAGER_PROPAGATES = True
 {%- endif %}