Group environment variables by the corresponding directories (#1295)

* Update generated project's .gitignore

* Post-gen gitignore .env/ and .env

* Fix linesep between gitignored entries

* Persist `.env/**/*` files into cookiecutter-django's VCS

* Rename .env/ to .envs/

* Reference the newly created .envs/**/.* files in local.yml

* Reference the newly created .envs/**/.* files in production.yml

* Delete .env.example

* Refactor post-gen-project.py

Closes #1299.

* Implement production-dotenv-files-to-dotenv-file merge script

* Create shared PyCharm Run Configuration for the automation script

* Randomize POSTGRES_PASSWORD in ./.envs/(.local|.production)/.postgres

* Default POSTGRES_PASSWORD and POSTGRES_USER to random values

* Fix jinja linebreaks in local.yml

* Spaces in production.yml

* Fix post-merge leftovers & set DJANGO_ADMIN_URL automatically

* Prettify here and there

* Fix FileNotFoundError

* Leave a TODO in post_gen_hook.py

* Introduce keep_local_envs_in_vcs option

* Remove envs when not opted for

* Inline pre_gen_project.py if-condition

* Get rid of PROJECT_DIR_PATH in post_gen_project.py

* Clean up the docs

* Match copyright notices

* Document envs ins and outs
Nikita Shupeyko 2018-03-08 15:56:15 +03:00 committed by GitHub
parent 6c8538abfe
commit 3f8aa26d0f
24 changed files with 415 additions and 270 deletions

View File

@ -1,4 +1,4 @@
Copyright (c) 2013-2018, Daniel Greenfeld Copyright (c) 2013-2018, Daniel Roy Greenfeld
All rights reserved. All rights reserved.
Redistribution and use in source and binary forms, with or without modification, Redistribution and use in source and binary forms, with or without modification,

View File

@ -129,7 +129,7 @@ Pyup brings you automated security and dependency updates used by Google and oth
Usage Usage
------ ------
Let's pretend you want to create a Django project called "redditclone". Rather than using `startproject` Let's pretend you want to create a Django project called "redditclone". Rather than using ``startproject``
and then editing the results to include your name, email, and various configuration issues that always get forgotten until the worst possible moment, get cookiecutter_ to do all the work. and then editing the results to include your name, email, and various configuration issues that always get forgotten until the worst possible moment, get cookiecutter_ to do all the work.
First, get Cookiecutter. Trust me, it's awesome:: First, get Cookiecutter. Trust me, it's awesome::
@ -192,6 +192,7 @@ Answer the prompts with your own desired options_. For example::
4 - Apache Software License 2.0 4 - Apache Software License 2.0
5 - Not open source 5 - Not open source
Choose from 1, 2, 3, 4, 5 [1]: 1 Choose from 1, 2, 3, 4, 5 [1]: 1
keep_local_envs_in_vcs [y]: y
Enter the project and take a look around:: Enter the project and take a look around::

View File

@ -39,5 +39,6 @@
"use_opbeat": "n", "use_opbeat": "n",
"use_whitenoise": "y", "use_whitenoise": "y",
"use_heroku": "n", "use_heroku": "n",
"use_travisci": "n" "use_travisci": "n",
"keep_local_envs_in_vcs": "y"
} }

View File

@ -42,7 +42,7 @@ master_doc = 'index'
# General information about the project. # General information about the project.
project = 'Cookiecutter Django' project = 'Cookiecutter Django'
copyright = "2013-2016, Daniel Roy Greenfeld".format(now.year) copyright = "2013-2018, Daniel Roy Greenfeld".format(now.year)
# The version info for the project you're documenting, acts as replacement for # The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the # |version| and |release|, also used in various other places throughout the

View File

@ -83,7 +83,7 @@ Database setup:
Go to the PythonAnywhere **Databases tab** and configure your database. Go to the PythonAnywhere **Databases tab** and configure your database.
* For Postgres, set up your superuser password, then open a Postgres console and run a `CREATE DATABASE my-db-name`. You should probably also set up a specific role and permissions for your app, rather than using the superuser credentials. Make a note of the address and port of your postgres server. * For Postgres, set up your superuser password, then open a Postgres console and run a ``CREATE DATABASE my-db-name``. You should probably also set up a specific role and permissions for your app, rather than using the superuser credentials. Make a note of the address and port of your postgres server.
* For MySQL, set the password and create a database. More info here: https://help.pythonanywhere.com/pages/UsingMySQL * For MySQL, set the password and create a database. More info here: https://help.pythonanywhere.com/pages/UsingMySQL

View File

@ -1,5 +1,5 @@
Deployment with Docker Deployment with Docker
======================= ======================
.. index:: Docker, deployment .. index:: Docker, deployment
@ -10,7 +10,7 @@ Prerequisites
* Docker Compose (at least 1.6) * Docker Compose (at least 1.6)
Understand the Compose Setup Understand the Compose Setup
-------------------------------- ----------------------------
Before you start, check out the `production.yml` file in the root of this project. This is where each component Before you start, check out the `production.yml` file in the root of this project. This is where each component
of this application gets its configuration from. Notice how it provides configuration for these services: of this application gets its configuration from. Notice how it provides configuration for these services:
@ -73,12 +73,12 @@ You can read more about this here at `Automatic HTTPS`_ in the Caddy docs.
.. _Automatic HTTPS: https://caddyserver.com/docs/automatic-https .. _Automatic HTTPS: https://caddyserver.com/docs/automatic-https
Optional: Postgres Data Volume Modifications (Optional) Postgres Data Volume Modifications
--------------------------------------------- ---------------------------------------------
Postgres is saving its database files to the `postgres_data` volume by default. Change that if you want something else and make sure to make backups since this is not done automatically. Postgres is saving its database files to the `postgres_data` volume by default. Change that if you want something else and make sure to make backups since this is not done automatically.
Run your app with docker-compose Run your app with Docker Compose
-------------------------------- --------------------------------
To get started, pull your code from source control (don't forget the `.env` file) and change to your project's root To get started, pull your code from source control (don't forget the `.env` file) and change to your project's root
@ -94,15 +94,15 @@ Once this is ready, you can run it with::
To run a migration, open up a second terminal and run:: To run a migration, open up a second terminal and run::
docker-compose -f production.yml run django python manage.py migrate docker-compose -f production.yml run --rm django python manage.py migrate
To create a superuser, run:: To create a superuser, run::
docker-compose -f production.yml run django python manage.py createsuperuser docker-compose -f production.yml run --rm django python manage.py createsuperuser
If you need a shell, run:: If you need a shell, run::
docker-compose -f production.yml run django python manage.py shell docker-compose -f production.yml run --rm django python manage.py shell
To get an output of all running containers. To get an output of all running containers.

View File

@ -6,23 +6,19 @@ Getting Up and Running Locally With Docker
The steps below will get you up and running with a local development environment. The steps below will get you up and running with a local development environment.
All of these commands assume you are in the root of your generated project. All of these commands assume you are in the root of your generated project.
Prerequisites Prerequisites
------------- -------------
You'll need at least Docker 1.10. * Docker; if you don't have it yet, follow the `installation instructions`_;
* Docker Compose; refer to the official documentation for the `installation guide`_.
If you don't already have it installed, follow the instructions for your OS: .. _`installation instructions`: https://docs.docker.com/install/#supported-platforms
.. _`installation guide`: https://docs.docker.com/compose/install/
- On Mac OS X, you'll need `Docker for Mac`_
- On Windows, you'll need `Docker for Windows`_
- On Linux, you'll need `docker-engine`_
.. _`Docker for Mac`: https://docs.docker.com/engine/installation/mac/ Attention, Windows Users
.. _`Docker for Windows`: https://docs.docker.com/engine/installation/windows/ ------------------------
.. _`docker-engine`: https://docs.docker.com/engine/installation/
Attention Windows users
-----------------------
Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docker containers for Windows users, while it is required by the generated Django project. To fix this, add ``psycopg2`` to the list of requirements inside ``requirements/base.txt``:: Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docker containers for Windows users, while it is required by the generated Django project. To fix this, add ``psycopg2`` to the list of requirements inside ``requirements/base.txt``::
@ -31,23 +27,21 @@ Currently PostgreSQL (``psycopg2`` python package) is not installed inside Docke
Doing this will prevent the project from being installed in a Windows-only environment (thus without usage of Docker). If you want to use this project without Docker, make sure to remove ``psycopg2`` from the requirements again. Doing this will prevent the project from being installed in a Windows-only environment (thus without usage of Docker). If you want to use this project without Docker, make sure to remove ``psycopg2`` from the requirements again.
Build the Stack Build the Stack
--------------- ---------------
This can take a while, especially the first time you run this particular command This can take a while, especially the first time you run this particular command on your development system::
on your development system::
$ docker-compose -f local.yml build $ docker-compose -f local.yml build
If you want to build the production environment, use ``production.yml`` as the -f argument (``docker-compose.yml`` or ``docker-compose.yaml`` are the defaults). Generally, if you want to emulate the production environment, use ``production.yml`` instead. And this is true for any other action you might need to perform: whenever a switch is required, just do it!
Boot the System
---------------
This brings up both Django and PostgreSQL. Run the Stack
-------------
The first time it is run it might take a while to get started, but subsequent This brings up both Django and PostgreSQL. The first time it is run it might take a while to get started, but subsequent runs will occur quickly.
runs will occur quickly.
Open a terminal at the project root and run the following for local development:: Open a terminal at the project root and run the following for local development::
@ -61,98 +55,108 @@ And then run::
$ docker-compose up $ docker-compose up
Running management commands To run in a detached (background) mode, just::
~~~~~~~~~~~~~~~~~~~~~~~~~~~
As with any shell command that we wish to run in our container, this is done $ docker-compose up -d
using the ``docker-compose -f local.yml run`` command.
To migrate your app and to create a superuser, run::
$ docker-compose -f local.yml run django python manage.py migrate Execute Management Commands
$ docker-compose -f local.yml run django python manage.py createsuperuser ---------------------------
Here we specify the ``django`` container as the location to run our management commands. As with any shell command that we wish to run in our container, this is done using the ``docker-compose -f local.yml run --rm`` command: ::
Add your Docker development server IP $ docker-compose -f local.yml run --rm django python manage.py migrate
------------------------------------- $ docker-compose -f local.yml run --rm django python manage.py createsuperuser
When ``DEBUG`` is set to `True`, the host is validated against ``['localhost', '127.0.0.1', '[::1]']``. This is adequate when running a ``virtualenv``. For Docker, in the ``config.settings.local``, add your host development server IP to ``INTERNAL_IPS`` or ``ALLOWED_HOSTS`` if the variable exists. Here, ``django`` is the target service we are executing the commands against.
Production Mode
~~~~~~~~~~~~~~~
Instead of using `local.yml`, you would use `production.yml`. (Optionally) Designate your Docker Development Server IP
--------------------------------------------------------
Other Useful Tips When ``DEBUG`` is set to ``True``, the host is validated against ``['localhost', '127.0.0.1', '[::1]']``. This is adequate when running a ``virtualenv``. For Docker, in the ``config.settings.local``, add your host development server IP to ``INTERNAL_IPS`` or ``ALLOWED_HOSTS`` if the variable exists.
-----------------
Make a machine the active unit
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This tells our computer that all future commands are specifically for the dev1 machine. Configuring the Environment
Using the ``eval`` command we can switch machines as needed. ---------------------------
:: This is the excerpt from your project's ``local.yml``: ::
# ...
postgres:
build:
context: .
dockerfile: ./compose/production/postgres/Dockerfile
volumes:
- postgres_data_local:/var/lib/postgresql/data
- postgres_backup_local:/backups
env_file:
- ./.envs/.local/.postgres
# ...
The most important thing for us here now is the ``env_file`` section listing ``./.envs/.local/.postgres``. Generally, the stack's behavior is governed by a number of environment variables (`env(s)`, for short) residing in ``.envs/``. For instance, this is what we generate for you: ::
.envs
├── .local
│   ├── .django
│   └── .postgres
└── .production
├── .caddy
├── .django
└── .postgres
By convention, for any service ``sI`` in environment ``e`` (you know ``someenv`` is an environment when there is a ``someenv.yml`` file in the project root), if ``sI`` requires configuration, a ``.envs/.e/.sI`` `service configuration` file exists.
Consider the aforementioned ``.envs/.local/.postgres``: ::
# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_USER=XgOWtQtJecsAbaIyslwGvFvPawftNaqO
POSTGRES_PASSWORD=jSljDz4whHuwO3aJIgVBrqEml5Ycbghorep4uVJ4xjDYQu0LfuTZdctj7y0YcCLu
The two envs we are presented with here are ``POSTGRES_USER``, and ``POSTGRES_PASSWORD`` (by the way, their values have also been generated for you). You might have figured out already where these definitions will end up; it's all the same with ``django`` and ``caddy`` service container envs.
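Nothing more is needed to consume these values: Compose exports every entry from the referenced files into the corresponding container's environment, and the settings modules simply read them from there. As a minimal illustration (assuming the generated project's ``django-environ``-based settings; the variable names are taken from the file above)::

    # Illustrative sketch only -- mirrors how config/settings/* read their envs.
    import environ

    env = environ.Env()

    POSTGRES_USER = env('POSTGRES_USER')          # injected via the env_file entries
    POSTGRES_PASSWORD = env('POSTGRES_PASSWORD')  # never hard-coded in the settings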
Tips & Tricks
-------------
Activate a Docker Machine
~~~~~~~~~~~~~~~~~~~~~~~~~
This tells our computer that all future commands are specifically for the dev1 machine. Using the ``eval`` command we can switch machines as needed.::
$ eval "$(docker-machine env dev1)" $ eval "$(docker-machine env dev1)"
Detached Mode
~~~~~~~~~~~~~
If you want to run the stack in detached mode (in the background), use the ``-d`` argument:
::
$ docker-compose -f local.yml up -d
Debugging Debugging
~~~~~~~~~~~~~ ~~~~~~~~~
ipdb ipdb
""""" """""
If you are using the following within your code to debug: If you are using the following within your code to debug: ::
::
import ipdb; ipdb.set_trace() import ipdb; ipdb.set_trace()
Then you may need to run the following for it to work as desired: Then you may need to run the following for it to work as desired: ::
:: $ docker-compose -f local.yml run --rm --service-ports django
$ docker-compose -f local.yml run --service-ports django
django-debug-toolbar django-debug-toolbar
"""""""""""""""""""" """"""""""""""""""""
In order for django-debug-toolbar to work with docker you need to add your docker-machine ip address to ``INTERNAL_IPS`` in ``local.py`` In order for ``django-debug-toolbar`` to work, designate your Docker Machine IP with ``INTERNAL_IPS`` in ``local.py``.
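For example, the addition to ``local.py`` could look like the sketch below (the second address is purely illustrative; substitute whatever ``docker-machine ip <your-machine>`` reports for your setup)::

    # config/settings/local.py -- illustrative sketch, not part of this diff
    INTERNAL_IPS = ['127.0.0.1', '192.168.99.100']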
.. May be a better place to put this, as it is not Docker specific. Mailhog
~~~~~~~
You may need to add the following to your css in order for the django-debug-toolbar to be visible (this applies whether Docker is being used or not): When developing locally you can go with MailHog_ for email testing provided ``use_mailhog`` was set to ``y`` on setup. To proceed,
.. code-block:: css #. make sure ``mailhog`` container is up and running;
/* Override Bootstrap 4 styling on Django Debug Toolbar */ #. open up ``http://127.0.0.1:8025``.
#djDebug[hidden], #djDebug [hidden] {
display: block !important;
}
#djDebug [hidden][style='display: none;'] {
display: none !important;
}
Using the Mailhog Docker Container
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In development you can (optionally) use MailHog_ for email testing. If you selected `use_docker`, MailHog is added as a Docker container. To use MailHog:
1. Make sure, that ``mailhog`` docker container is up and running
2. Open your browser and go to ``http://127.0.0.1:8025``
.. _Mailhog: https://github.com/mailhog/MailHog/ .. _Mailhog: https://github.com/mailhog/MailHog/

View File

@ -2,7 +2,7 @@
Database Backups with Docker Database Backups with Docker
============================ ============================
The database has to be running to create/restore a backup. These are local examples. If you want to use it on a remote server, remove ``-f local.yml`` from each example. The database has to be running to create/restore a backup. These are local examples. If you want to use it on a remote server, use ``-f production.yml`` instead.
Running Backups Running Backups
================ ================
@ -11,17 +11,17 @@ Run the app with `docker-compose -f local.yml up`.
To create a backup, run:: To create a backup, run::
docker-compose -f local.yml run postgres backup docker-compose -f local.yml run --rm postgres backup
To list backups, run:: To list backups, run::
docker-compose -f local.yml run postgres list-backups docker-compose -f local.yml run --rm postgres list-backups
To restore a backup, run:: To restore a backup, run::
docker-compose -f local.yml run postgres restore filename.sql docker-compose -f local.yml run --rm postgres restore filename.sql
Where <containerId> is the ID of the Postgres container. To get it, run:: Where <containerId> is the ID of the Postgres container. To get it, run::

View File

@ -1,5 +1,5 @@
FAQ FAQ
==== ===
.. index:: FAQ, 12-Factor App .. index:: FAQ, 12-Factor App
@ -17,11 +17,11 @@ Why aren't you using just one configuration file (12-Factor App)
---------------------------------------------------------------------- ----------------------------------------------------------------------
TODO TODO
.. TODO
Why doesn't this follow the layout from Two Scoops of Django 1.8? Why doesn't this follow the layout from Two Scoops of Django?
---------------------------------------------------------------------- -------------------------------------------------------------
You may notice that some elements of this project do not exactly match what we describe in chapter 3 of `Two Scoops of Django`_. The reason for that is this project, amongst other things, serves as a test bed for trying out new ideas and concepts. Sometimes they work, sometimes they don't, but the end result is that it won't necessarily match precisely what is described in the book I co-authored. You may notice that some elements of this project do not exactly match what we describe in chapter 3 of `Two Scoops of Django 1.11`_. The reason for that is this project, amongst other things, serves as a test bed for trying out new ideas and concepts. Sometimes they work, sometimes they don't, but the end result is that it won't necessarily match precisely what is described in the book I co-authored.
.. _Two Scoops of Django 1.11: https://www.twoscoopspress.com/collections/django/products/two-scoops-of-django-1-11
.. _`Two Scoops of Django`: http://twoscoopspress.com/products/two-scoops-of-django-1-8

View File

@ -10,10 +10,10 @@ project_slug [my_awesome_project]:
is needed. is needed.
description [Behold My Awesome Project!] description [Behold My Awesome Project!]
Describes your project and gets used in places like `README.rst` and such. Describes your project and gets used in places like ``README.rst`` and such.
author_name [Daniel Roy Greenfeld]: author_name [Daniel Roy Greenfeld]:
This is you! The value goes into places like `LICENSE` and such. This is you! The value goes into places like ``LICENSE`` and such.
email [daniel-roy-greenfeld@example.com]: email [daniel-roy-greenfeld@example.com]:
The email address you want to identify yourself in the project. The email address you want to identify yourself in the project.
@ -35,7 +35,7 @@ open_source_license [1]
5. Not open source 5. Not open source
timezone [UTC] timezone [UTC]
The value to be used for the `TIME_ZONE` setting of the project. The value to be used for the ``TIME_ZONE`` setting of the project.
windows [n] windows [n]
Indicates whether the project should be configured for development on Windows. Indicates whether the project should be configured for development on Windows.
@ -94,6 +94,11 @@ use_heroku [n]
use_travisci [n] use_travisci [n]
Indicates whether the project should be configured to use `Travis CI`_. Indicates whether the project should be configured to use `Travis CI`_.
keep_local_envs_in_vcs [y]
Indicates whether the project's ``.envs/.local/`` should be kept in VCS
(comes in handy when working in teams where local environment reproducibility
is strongly encouraged).
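With the default ``y``, the post-generation hook (see ``hooks/post_gen_project.py`` further down) leaves the generated ``.gitignore`` ending with entries along these lines, ignoring everything under ``.envs/`` except the local ones::

    .env
    .envs/**/*
    !.envs/.local/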
.. _MIT: https://opensource.org/licenses/MIT .. _MIT: https://opensource.org/licenses/MIT
.. _BSD: https://opensource.org/licenses/BSD-3-Clause .. _BSD: https://opensource.org/licenses/BSD-3-Clause

View File

@ -3,7 +3,8 @@ Troubleshooting
This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications. This page contains some advice about errors and problems commonly encountered during the development of Cookiecutter Django applications.
#. If you get the error ``jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'now'.`` , please upgrade your cookiecutter version to >= 1.4 (see issue # 528_ )
#. ``project_slug`` must be a valid Python module name or you will have issues on imports. #. ``project_slug`` must be a valid Python module name or you will have issues on imports.
#. ``jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'now'.``: please upgrade your cookiecutter version to >= 1.4 (see # 528_)
.. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373 .. _528: https://github.com/pydanny/cookiecutter-django/issues/528#issuecomment-212650373

View File

@ -7,12 +7,12 @@ NOTE:
TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only
""" """
from __future__ import print_function
import os import os
import random import random
import shutil import shutil
import string import string
import sys
try: try:
# Inspired by # Inspired by
@ -22,15 +22,19 @@ try:
except NotImplementedError: except NotImplementedError:
using_sysrandom = False using_sysrandom = False
PROJECT_DIR_PATH = os.path.realpath(os.path.curdir) TERMINATOR = "\x1b[0m"
WARNING = "\x1b[1;33m [WARNING]: "
INFO = "\x1b[1;33m [INFO]: "
HINT = "\x1b[3;33m"
SUCCESS = "\x1b[1;32m [SUCCESS]: "
def remove_open_source_project_only_files(): def remove_open_source_files():
file_names = [ file_names = [
'CONTRIBUTORS.txt', 'CONTRIBUTORS.txt',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_gplv3_files(): def remove_gplv3_files():
@ -38,21 +42,21 @@ def remove_gplv3_files():
'COPYING', 'COPYING',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_pycharm_files(): def remove_pycharm_files():
idea_dir_path = os.path.join(PROJECT_DIR_PATH, '.idea') idea_dir_path = '.idea'
if os.path.exists(idea_dir_path): if os.path.exists(idea_dir_path):
shutil.rmtree(idea_dir_path) shutil.rmtree(idea_dir_path)
docs_dir_path = os.path.join(PROJECT_DIR_PATH, 'docs', 'pycharm') docs_dir_path = os.path.join('docs', 'pycharm')
if os.path.exists(docs_dir_path): if os.path.exists(docs_dir_path):
shutil.rmtree(docs_dir_path) shutil.rmtree(docs_dir_path)
def remove_docker_files(): def remove_docker_files():
shutil.rmtree(os.path.join(PROJECT_DIR_PATH, 'compose')) shutil.rmtree('compose')
file_names = [ file_names = [
'local.yml', 'local.yml',
@ -60,7 +64,7 @@ def remove_docker_files():
'.dockerignore', '.dockerignore',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_heroku_files(): def remove_heroku_files():
@ -70,11 +74,7 @@ def remove_heroku_files():
'requirements.txt', 'requirements.txt',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_dotenv_file():
os.remove(os.path.join(PROJECT_DIR_PATH, '.env'))
def remove_grunt_files(): def remove_grunt_files():
@ -82,7 +82,7 @@ def remove_grunt_files():
'Gruntfile.js', 'Gruntfile.js',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_gulp_files(): def remove_gulp_files():
@ -90,7 +90,7 @@ def remove_gulp_files():
'gulpfile.js', 'gulpfile.js',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_packagejson_file(): def remove_packagejson_file():
@ -98,19 +98,19 @@ def remove_packagejson_file():
'package.json', 'package.json',
] ]
for file_name in file_names: for file_name in file_names:
os.remove(os.path.join(PROJECT_DIR_PATH, file_name)) os.remove(file_name)
def remove_celery_app(): def remove_celery_app():
shutil.rmtree(os.path.join(PROJECT_DIR_PATH, '{{ cookiecutter.project_slug }}', 'taskapp')) shutil.rmtree(os.path.join('{{ cookiecutter.project_slug }}', 'taskapp'))
def remove_dottravisyml_file(): def remove_dottravisyml_file():
os.remove(os.path.join(PROJECT_DIR_PATH, '.travis.yml')) os.remove('.travis.yml')
def append_to_project_gitignore(path): def append_to_project_gitignore(path):
gitignore_file_path = os.path.join(PROJECT_DIR_PATH, '.gitignore') gitignore_file_path = '.gitignore'
with open(gitignore_file_path, 'a') as gitignore_file: with open(gitignore_file_path, 'a') as gitignore_file:
gitignore_file.write(path) gitignore_file.write(path)
gitignore_file.write(os.linesep) gitignore_file.write(os.linesep)
@ -144,17 +144,19 @@ def generate_random_string(length,
def set_flag(file_path, def set_flag(file_path,
flag, flag,
value=None, value=None,
formatted=None,
*args, *args,
**kwargs): **kwargs):
if value is None: if value is None:
random_string = generate_random_string(*args, **kwargs) random_string = generate_random_string(*args, **kwargs)
if random_string is None: if random_string is None:
import sys print(
sys.stdout.write(
"We couldn't find a secure pseudo-random number generator on your system. " "We couldn't find a secure pseudo-random number generator on your system. "
"Please, make sure to manually {} later.".format(flag) "Please, make sure to manually {} later.".format(flag)
) )
random_string = flag random_string = flag
if formatted is not None:
random_string = formatted.format(random_string)
value = random_string value = random_string
with open(file_path, 'r+') as f: with open(file_path, 'r+') as f:
@ -170,21 +172,38 @@ def set_django_secret_key(file_path):
django_secret_key = set_flag( django_secret_key = set_flag(
file_path, file_path,
'!!!SET DJANGO_SECRET_KEY!!!', '!!!SET DJANGO_SECRET_KEY!!!',
length=50, length=64,
using_digits=True, using_digits=True,
using_ascii_letters=True using_ascii_letters=True
) )
return django_secret_key return django_secret_key
def set_django_admin_url(file_path):
django_admin_url = set_flag(
file_path,
'!!!SET DJANGO_ADMIN_URL!!!',
formatted='^{}/',
length=32,
using_digits=True,
using_ascii_letters=True
)
return django_admin_url
def generate_postgres_user():
return generate_random_string(
length=32,
using_ascii_letters=True
)
def set_postgres_user(file_path, def set_postgres_user(file_path,
value=None): value=None):
postgres_user = set_flag( postgres_user = set_flag(
file_path, file_path,
'!!!SET POSTGRES_USER!!!', '!!!SET POSTGRES_USER!!!',
value=value, value=value or generate_postgres_user()
length=8,
using_ascii_letters=True
) )
return postgres_user return postgres_user
@ -193,45 +212,50 @@ def set_postgres_password(file_path):
postgres_password = set_flag( postgres_password = set_flag(
file_path, file_path,
'!!!SET POSTGRES_PASSWORD!!!', '!!!SET POSTGRES_PASSWORD!!!',
length=42, length=64,
using_digits=True, using_digits=True,
using_ascii_letters=True using_ascii_letters=True
) )
return postgres_password return postgres_password
def initialize_dotenv(postgres_user): def append_to_gitignore_file(s):
# Initializing `env.example` first. with open('.gitignore', 'a') as gitignore_file:
envexample_file_path = os.path.join(PROJECT_DIR_PATH, 'env.example') gitignore_file.write(s)
set_django_secret_key(envexample_file_path) gitignore_file.write(os.linesep)
set_postgres_user(envexample_file_path, value=postgres_user)
set_postgres_password(envexample_file_path)
# Renaming `env.example` to `.env`.
dotenv_file_path = os.path.join(PROJECT_DIR_PATH, '.env')
shutil.move(envexample_file_path, dotenv_file_path)
def initialize_localyml(postgres_user): def set_flags_in_envs(postgres_user):
set_postgres_user(os.path.join(PROJECT_DIR_PATH, 'local.yml'), value=postgres_user) local_postgres_envs_path = os.path.join('.envs', '.local', '.postgres')
set_postgres_user(local_postgres_envs_path, value=postgres_user)
set_postgres_password(local_postgres_envs_path)
production_django_envs_path = os.path.join('.envs', '.production', '.django')
set_django_secret_key(production_django_envs_path)
set_django_admin_url(production_django_envs_path)
production_postgres_envs_path = os.path.join('.envs', '.production', '.postgres')
set_postgres_user(production_postgres_envs_path, value=postgres_user)
set_postgres_password(production_postgres_envs_path)
def initialize_local_settings(): def set_flags_in_settings_files():
set_django_secret_key(os.path.join(PROJECT_DIR_PATH, 'config', 'settings', 'local.py')) set_django_secret_key(os.path.join('config', 'settings', 'local.py'))
set_django_secret_key(os.path.join('config', 'settings', 'test.py'))
def initialize_test_settings(): def remove_envs_and_associated_files():
set_django_secret_key(os.path.join(PROJECT_DIR_PATH, 'config', 'settings', 'test.py')) shutil.rmtree('.envs')
os.remove('merge_production_dotenvs_in_dotenv.py')
def main(): def main():
postgres_user = generate_random_string(length=16, using_ascii_letters=True) postgres_user = generate_postgres_user()
initialize_dotenv(postgres_user) set_flags_in_envs(postgres_user)
initialize_localyml(postgres_user) set_flags_in_settings_files()
initialize_local_settings()
initialize_test_settings()
if '{{ cookiecutter.open_source_license }}' == 'Not open source': if '{{ cookiecutter.open_source_license }}' == 'Not open source':
remove_open_source_project_only_files() remove_open_source_files()
if '{{ cookiecutter.open_source_license}}' != 'GPLv3': if '{{ cookiecutter.open_source_license}}' != 'GPLv3':
remove_gplv3_files() remove_gplv3_files()
@ -245,7 +269,20 @@ def main():
remove_heroku_files() remove_heroku_files()
if '{{ cookiecutter.use_docker }}'.lower() == 'n' and '{{ cookiecutter.use_heroku }}'.lower() == 'n': if '{{ cookiecutter.use_docker }}'.lower() == 'n' and '{{ cookiecutter.use_heroku }}'.lower() == 'n':
remove_dotenv_file() if '{{ cookiecutter.keep_local_envs_in_vcs }}'.lower() == 'y':
print(
INFO +
".env(s) are only utilized when Docker Compose and/or "
"Heroku support is enabled so keeping them does not "
"make sense given your current setup." +
TERMINATOR
)
remove_envs_and_associated_files()
else:
append_to_gitignore_file('.env')
append_to_gitignore_file('.envs' + '/**/*')
if '{{ cookiecutter.keep_local_envs_in_vcs }}'.lower() == 'y':
append_to_gitignore_file('!.envs/.local/')
if '{{ cookiecutter.js_task_runner}}'.lower() == 'gulp': if '{{ cookiecutter.js_task_runner}}'.lower() == 'gulp':
remove_grunt_files() remove_grunt_files()
@ -255,18 +292,20 @@ def main():
remove_gulp_files() remove_gulp_files()
remove_grunt_files() remove_grunt_files()
remove_packagejson_file() remove_packagejson_file()
if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] \ if '{{ cookiecutter.js_task_runner }}'.lower() in ['grunt', 'gulp'] \
and '{{ cookiecutter.use_docker }}'.lower() == 'y': and '{{ cookiecutter.use_docker }}'.lower() == 'y':
TERMINATOR = "\x1b[0m" print(
INFO = "\x1b[1;33m [INFO]: " WARNING +
sys.stdout.write( "Docker and {} JS task runner ".format(
INFO + '{{ cookiecutter.js_task_runner }}'
"Docker and {} JS task runner ".format('{{ cookiecutter.js_task_runner }}'.lower().capitalize()) + .lower()
.capitalize()
) +
"working together not supported yet. " "working together not supported yet. "
"You can continue using the generated project like you normally would, " "You can continue using the generated project like you "
"however you would need to add a JS task runner service " "normally would, however you would need to add a JS "
"to your Docker Compose configuration manually." + "task runner service to your Docker Compose configuration "
"manually." +
TERMINATOR TERMINATOR
) )
@ -276,6 +315,12 @@ def main():
if '{{ cookiecutter.use_travisci }}'.lower() == 'n': if '{{ cookiecutter.use_travisci }}'.lower() == 'n':
remove_dottravisyml_file() remove_dottravisyml_file()
print(
SUCCESS +
"Project initialized, keep up the good work!" +
TERMINATOR
)
if __name__ == '__main__': if __name__ == '__main__':
main() main()
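The new ``formatted`` keyword is what turns a plain random string into a usable ``DJANGO_ADMIN_URL``. A condensed, runnable sketch of the idea (``generate_random_string`` is only referenced by the hunks above; the real helper prefers ``random.SystemRandom`` when the OS provides it)::

    import random
    import string

    def generate_random_string(length, using_digits=False, using_ascii_letters=False):
        # Simplified stand-in for the hook's helper.
        symbols = ''
        if using_digits:
            symbols += string.digits
        if using_ascii_letters:
            symbols += string.ascii_letters
        return ''.join(random.choice(symbols) for _ in range(length))

    # With formatted='^{}/' and length=32 the flag value becomes e.g. '^3fJ9aQ.../'.
    admin_url = '^{}/'.format(
        generate_random_string(length=32, using_digits=True, using_ascii_letters=True)
    )
    print(admin_url)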

View File

@ -6,6 +6,15 @@ NOTE:
TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only TODO: ? restrict Cookiecutter Django project initialization to Python 3.x environments only
""" """
from __future__ import print_function
import sys
TERMINATOR = "\x1b[0m"
WARNING = "\x1b[1;33m [WARNING]: "
INFO = "\x1b[1;33m [INFO]: "
HINT = "\x1b[3;33m"
SUCCESS = "\x1b[1;32m [SUCCESS]: "
project_slug = '{{ cookiecutter.project_slug }}' project_slug = '{{ cookiecutter.project_slug }}'
if hasattr(project_slug, 'isidentifier'): if hasattr(project_slug, 'isidentifier'):
@ -13,20 +22,10 @@ if hasattr(project_slug, 'isidentifier'):
assert "\\" not in "{{ cookiecutter.author_name }}", "Don't include backslashes in author name." assert "\\" not in "{{ cookiecutter.author_name }}", "Don't include backslashes in author name."
if '{{ cookiecutter.use_docker }}'.lower() == 'n':
using_docker = '{{ cookiecutter.use_docker }}'.lower()
if using_docker == 'n':
TERMINATOR = "\x1b[0m"
WARNING = "\x1b[1;33m [WARNING]: "
INFO = "\x1b[1;33m [INFO]: "
HINT = "\x1b[3;33m"
SUCCESS = "\x1b[1;32m [SUCCESS]: "
import sys
python_major_version = sys.version_info[0] python_major_version = sys.version_info[0]
if python_major_version == 2: if python_major_version == 2:
sys.stdout.write( print(
WARNING + WARNING +
"Cookiecutter Django does not support Python 2. " "Cookiecutter Django does not support Python 2. "
"Stability is guaranteed with Python 3.6+ only, " "Stability is guaranteed with Python 3.6+ only, "
@ -39,14 +38,14 @@ if using_docker == 'n':
if choice in yes_options: if choice in yes_options:
break break
elif choice in no_options: elif choice in no_options:
sys.stdout.write( print(
INFO + INFO +
"Generation process stopped as requested." + "Generation process stopped as requested." +
TERMINATOR TERMINATOR
) )
sys.exit(1) sys.exit(1)
else: else:
sys.stdout.write( print(
HINT + HINT +
"Please respond with {} or {}: ".format( "Please respond with {} or {}: ".format(
', '.join(["'{}'".format(o) for o in yes_options if not o == '']), ', '.join(["'{}'".format(o) for o in yes_options if not o == '']),
@ -54,9 +53,3 @@ if using_docker == 'n':
) + ) +
TERMINATOR TERMINATOR
) )
sys.stdout.write(
SUCCESS +
"Project initialized, keep up the good work!" +
TERMINATOR
)

View File

@ -0,0 +1,3 @@
# General
# ------------------------------------------------------------------------------
USE_DOCKER=yes

View File

@ -0,0 +1,4 @@
# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_USER=!!!SET POSTGRES_USER!!!
POSTGRES_PASSWORD=!!!SET POSTGRES_PASSWORD!!!

View File

@ -0,0 +1,3 @@
# Caddy
# ------------------------------------------------------------------------------
DOMAIN_NAME={{ cookiecutter.domain_name }}

View File

@ -0,0 +1,43 @@
# General
# ------------------------------------------------------------------------------
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_SETTINGS_MODULE=config.settings.production
DJANGO_SECRET_KEY=!!!SET DJANGO_SECRET_KEY!!!
DJANGO_ADMIN_URL=!!!SET DJANGO_ADMIN_URL!!!
DJANGO_ALLOWED_HOSTS=.{{ cookiecutter.domain_name }}
# Security
# ------------------------------------------------------------------------------
# TIP: better off using DNS, however, redirect is OK too
DJANGO_SECURE_SSL_REDIRECT=False
# Email
# ------------------------------------------------------------------------------
MAILGUN_API_KEY=
DJANGO_SERVER_EMAIL=
MAILGUN_DOMAIN=
# AWS
# ------------------------------------------------------------------------------
DJANGO_AWS_ACCESS_KEY_ID=
DJANGO_AWS_SECRET_ACCESS_KEY=
DJANGO_AWS_STORAGE_BUCKET_NAME=
# django-allauth
# ------------------------------------------------------------------------------
DJANGO_ACCOUNT_ALLOW_REGISTRATION=True
{% if cookiecutter.use_compressor == 'y' %}
# django-compressor
# ------------------------------------------------------------------------------
COMPRESS_ENABLED=
{% endif %}{% if cookiecutter.use_sentry_for_error_reporting == 'y' %}
# Sentry
# ------------------------------------------------------------------------------
DJANGO_SENTRY_DSN=
{% endif %}{% if cookiecutter.use_opbeat == 'y' %}
# opbeat
# ------------------------------------------------------------------------------
DJANGO_OPBEAT_ORGANIZATION_ID=
DJANGO_OPBEAT_APP_ID=
DJANGO_OPBEAT_SECRET_TOKEN=
{% endif %}

View File

@ -0,0 +1,4 @@
# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_USER=!!!SET POSTGRES_USER!!!
POSTGRES_PASSWORD=!!!SET POSTGRES_PASSWORD!!!

View File

@ -53,42 +53,23 @@ coverage.xml
# Django stuff: # Django stuff:
staticfiles/ staticfiles/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation # Sphinx documentation
docs/_build/ docs/_build/
# PyBuilder # PyBuilder
target/ target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv # pyenv
.python-version .python-version
# celery beat schedule file # celery beat schedule file
celerybeat-schedule celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments # Environments
.env
.venv .venv
env/
venv/ venv/
ENV/ ENV/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings # Rope project settings
.ropeproject .ropeproject

View File

@ -0,0 +1,21 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="merge_production_dotenvs_in_dotenv" type="PythonConfigurationType" factoryName="Python" singleton="true">
<option name="INTERPRETER_OPTIONS" value="" />
<option name="PARENT_ENVS" value="true" />
<envs>
<env name="PYTHONUNBUFFERED" value="1" />
</envs>
<option name="SDK_HOME" value="" />
<option name="WORKING_DIRECTORY" value="$PROJECT_DIR$" />
<option name="IS_MODULE_SDK" value="true" />
<option name="ADD_CONTENT_ROOTS" value="true" />
<option name="ADD_SOURCE_ROOTS" value="true" />
<module name="{{ cookiecutter.project_slug }}" />
<EXTENSION ID="PythonCoverageRunConfigurationExtension" enabled="false" sample_coverage="true" runner="coverage.py" />
<option name="SCRIPT_NAME" value="merge_production_dotenvs_in_dotenv.py" />
<option name="PARAMETERS" value="" />
<option name="SHOW_COMMAND_LINE" value="false" />
<option name="EMULATE_TERMINAL" value="false" />
<method />
</configuration>
</component>

View File

@ -1,46 +0,0 @@
# PostgreSQL
POSTGRES_PASSWORD=!!!SET POSTGRES_PASSWORD!!!
POSTGRES_USER=!!!SET POSTGRES_USER!!!
CONN_MAX_AGE=
# Gunicorn concurrency
WEB_CONCURRENCY=4
# Domain name, used by caddy
DOMAIN_NAME={{ cookiecutter.domain_name }}
# General settings
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_ADMIN_URL=
DJANGO_SETTINGS_MODULE=config.settings.production
DJANGO_SECRET_KEY=!!!SET DJANGO_SECRET_KEY!!!
DJANGO_ALLOWED_HOSTS=.{{ cookiecutter.domain_name }}
# AWS Settings
DJANGO_AWS_ACCESS_KEY_ID=
DJANGO_AWS_SECRET_ACCESS_KEY=
DJANGO_AWS_STORAGE_BUCKET_NAME=
# Used with email
MAILGUN_API_KEY=
DJANGO_SERVER_EMAIL=
MAILGUN_DOMAIN=
# Security! Better to use DNS for this task, but you can use redirect
DJANGO_SECURE_SSL_REDIRECT=False
# django-allauth
DJANGO_ACCOUNT_ALLOW_REGISTRATION=True
{% if cookiecutter.use_sentry_for_error_reporting == 'y' -%}
# Sentry
DJANGO_SENTRY_DSN=
{% endif %}
{% if cookiecutter.use_opbeat == 'y' -%}
DJANGO_OPBEAT_ORGANIZATION_ID=
DJANGO_OPBEAT_APP_ID=
DJANGO_OPBEAT_SECRET_TOKEN=
{% endif %}
{% if cookiecutter.use_compressor == 'y' -%}
COMPRESS_ENABLED=
{% endif %}

View File

@ -10,13 +10,15 @@ services:
context: . context: .
dockerfile: ./compose/local/django/Dockerfile dockerfile: ./compose/local/django/Dockerfile
depends_on: depends_on:
- postgres{% if cookiecutter.use_mailhog == 'y' %} - postgres
- mailhog{% endif %} {% if cookiecutter.use_mailhog == 'y' -%}
- mailhog
{%- endif %}
volumes: volumes:
- .:/app - .:/app
environment: env_file:
- POSTGRES_USER=!!!SET POSTGRES_USER!!! - ./.envs/.local/.django
- USE_DOCKER=yes - ./.envs/.local/.postgres
ports: ports:
- "8000:8000" - "8000:8000"
command: /start.sh command: /start.sh
@ -28,35 +30,36 @@ services:
volumes: volumes:
- postgres_data_local:/var/lib/postgresql/data - postgres_data_local:/var/lib/postgresql/data
- postgres_backup_local:/backups - postgres_backup_local:/backups
environment: env_file:
- POSTGRES_USER=!!!SET POSTGRES_USER!!! - ./.envs/.local/.postgres
{% if cookiecutter.use_mailhog == 'y' %} {% if cookiecutter.use_mailhog == 'y' %}
mailhog: mailhog:
image: mailhog/mailhog:v1.0.0 image: mailhog/mailhog:v1.0.0
ports: ports:
- "8025:8025" - "8025:8025"
{% endif %} {% endif %}{% if cookiecutter.use_celery == 'y' %}
{% if cookiecutter.use_celery == 'y' %}
redis: redis:
image: redis:3.0 image: redis:3.0
celeryworker: celeryworker:
# https://github.com/docker/compose/issues/3220
<<: *django <<: *django
depends_on: depends_on:
- redis - redis
- postgres{% if cookiecutter.use_mailhog == 'y' %} - postgres
- mailhog{% endif %} {% if cookiecutter.use_mailhog == 'y' -%}
- mailhog
{%- endif %}
ports: [] ports: []
command: /start-celeryworker.sh command: /start-celeryworker.sh
celerybeat: celerybeat:
# https://github.com/docker/compose/issues/3220
<<: *django <<: *django
depends_on: depends_on:
- redis - redis
- postgres{% if cookiecutter.use_mailhog == 'y' %} - postgres
- mailhog{% endif %} {% if cookiecutter.use_mailhog == 'y' -%}
- mailhog
{%- endif %}
ports: [] ports: []
command: /start-celerybeat.sh command: /start-celerybeat.sh
{% endif %} {% endif %}

View File

@ -0,0 +1,69 @@
import os
from typing import Sequence
import pytest
ROOT_DIR_PATH = os.path.dirname(os.path.realpath(__file__))
PRODUCTION_DOTENVS_DIR_PATH = os.path.join(ROOT_DIR_PATH, '.envs', '.production')
PRODUCTION_DOTENV_FILE_PATHS = [
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, '.django'),
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, '.postgres'),
os.path.join(PRODUCTION_DOTENVS_DIR_PATH, '.caddy'),
]
DOTENV_FILE_PATH = os.path.join(ROOT_DIR_PATH, '.env')
def merge(output_file_path: str,
merged_file_paths: Sequence[str],
append_linesep: bool = True) -> None:
with open(output_file_path, 'w') as output_file:
for merged_file_path in merged_file_paths:
with open(merged_file_path, 'r') as merged_file:
merged_file_content = merged_file.read()
output_file.write(merged_file_content)
if append_linesep:
output_file.write(os.linesep)
def main():
merge(DOTENV_FILE_PATH, PRODUCTION_DOTENV_FILE_PATHS)
@pytest.mark.parametrize('merged_file_count', range(3))
@pytest.mark.parametrize('append_linesep', [True, False])
def test_merge(tmpdir_factory,
merged_file_count: int,
append_linesep: bool):
tmp_dir_path = str(tmpdir_factory.getbasetemp())
output_file_path = os.path.join(tmp_dir_path, '.env')
expected_output_file_content = ''
merged_file_paths = []
for i in range(merged_file_count):
merged_file_ord = i + 1
merged_filename = '.service{}'.format(merged_file_ord)
merged_file_path = os.path.join(tmp_dir_path, merged_filename)
merged_file_content = merged_filename * merged_file_ord
with open(merged_file_path, 'w+') as file:
file.write(merged_file_content)
expected_output_file_content += merged_file_content
if append_linesep:
expected_output_file_content += os.linesep
merged_file_paths.append(merged_file_path)
merge(output_file_path, merged_file_paths, append_linesep)
with open(output_file_path, 'r') as output_file:
actual_output_file_content = output_file.read()
assert actual_output_file_content == expected_output_file_content
if __name__ == '__main__':
main()
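In practice the script is run once on the deployment target (``python merge_production_dotenvs_in_dotenv.py``), producing a single ``.env`` next to it; uncommenting ``DJANGO_READ_DOT_ENV_FILE=True`` in ``.envs/.production/.django`` above is what lets the settings read that merged file. The ``merge`` helper can also be driven directly, for example::

    # Hypothetical invocation, shown only to illustrate the call signature.
    merge(
        output_file_path='.env',
        merged_file_paths=[
            '.envs/.production/.django',
            '.envs/.production/.postgres',
            '.envs/.production/.caddy',
        ],
    )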

View File

@ -13,7 +13,9 @@ services:
depends_on: depends_on:
- postgres - postgres
- redis - redis
env_file: .env env_file:
- ./.envs/.production/.django
- ./.envs/.production/.postgres
command: /gunicorn.sh command: /gunicorn.sh
postgres: postgres:
@ -23,7 +25,8 @@ services:
volumes: volumes:
- postgres_data:/var/lib/postgresql/data - postgres_data:/var/lib/postgresql/data
- postgres_backup:/backups - postgres_backup:/backups
env_file: .env env_file:
- ./.envs/.production/.postgres
caddy: caddy:
build: build:
@ -33,19 +36,23 @@ services:
- django - django
volumes: volumes:
- caddy:/root/.caddy - caddy:/root/.caddy
env_file: .env env_file:
- ./.envs/.production/.caddy
ports: ports:
- "0.0.0.0:80:80" - "0.0.0.0:80:80"
- "0.0.0.0:443:443" - "0.0.0.0:443:443"
redis: redis:
image: redis:3.0 image: redis:3.0
{% if cookiecutter.use_celery == 'y' %} {% if cookiecutter.use_celery == 'y' %}
celeryworker: celeryworker:
<<: *django <<: *django
depends_on: depends_on:
- postgres - postgres
- redis - redis
env_file:
- ./.envs/.production/.django
- ./.envs/.production/.postgres
command: /start-celeryworker.sh command: /start-celeryworker.sh
celerybeat: celerybeat:
@ -53,5 +60,8 @@ services:
depends_on: depends_on:
- postgres - postgres
- redis - redis
env_file:
- ./.envs/.production/.django
- ./.envs/.production/.postgres
command: /start-celerybeat.sh command: /start-celerybeat.sh
{% endif %} {% endif %}