diff --git a/README.md b/README.md
index 555081316..1e40a4190 100644
--- a/README.md
+++ b/README.md
@@ -10,58 +10,58 @@
Powered by [Cookiecutter](https://github.com/cookiecutter/cookiecutter), Cookiecutter Django is a framework for jumpstarting
production-ready Django projects quickly.
-- Documentation:
-- See [Troubleshooting](https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html) for common errors and obstacles
-- If you have problems with Cookiecutter Django, please open [issues](https://github.com/cookiecutter/cookiecutter-django/issues/new) don't send
- emails to the maintainers.
+- Documentation:
+- See [Troubleshooting](https://cookiecutter-django.readthedocs.io/en/latest/troubleshooting.html) for common errors and obstacles
+- If you have problems with Cookiecutter Django, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues/new); don't send
+ emails to the maintainers.
## Features
-- For Django 3.2
-- Works with Python 3.9
-- Renders Django projects with 100% starting test coverage
-- Twitter [Bootstrap](https://github.com/twbs/bootstrap) v5
-- [12-Factor](http://12factor.net/) based settings via [django-environ](https://github.com/joke2k/django-environ)
-- Secure by default. We believe in SSL.
-- Optimized development and production settings
-- Registration via [django-allauth](https://github.com/pennersr/django-allauth)
-- Comes with custom user model ready to go
-- Optional basic ASGI setup for Websockets
-- Optional custom static build using Gulp and livereload
-- Send emails via [Anymail](https://github.com/anymail/django-anymail) (using [Mailgun](http://www.mailgun.com/) by default or Amazon SES if AWS is selected cloud provider, but switchable)
-- Media storage using Amazon S3 or Google Cloud Storage
-- Docker support using [docker-compose](https://github.com/docker/compose) for development and production (using [Traefik](https://traefik.io/) with [LetsEncrypt](https://letsencrypt.org/) support)
-- [Procfile](https://devcenter.heroku.com/articles/procfile) for deploying to Heroku
-- Instructions for deploying to [PythonAnywhere](https://www.pythonanywhere.com/)
-- Run tests with unittest or pytest
-- Customizable PostgreSQL version
-- Default integration with [pre-commit](https://github.com/pre-commit/pre-commit) for identifying simple issues before submission to code review
+- For Django 3.2
+- Works with Python 3.9
+- Renders Django projects with 100% starting test coverage
+- Twitter [Bootstrap](https://github.com/twbs/bootstrap) v5
+- [12-Factor](http://12factor.net/) based settings via [django-environ](https://github.com/joke2k/django-environ)
+- Secure by default. We believe in SSL.
+- Optimized development and production settings
+- Registration via [django-allauth](https://github.com/pennersr/django-allauth)
+- Comes with custom user model ready to go
+- Optional basic ASGI setup for Websockets
+- Optional custom static build using Gulp and livereload
+- Send emails via [Anymail](https://github.com/anymail/django-anymail) (using [Mailgun](http://www.mailgun.com/) by default or Amazon SES if AWS is the selected cloud provider, but switchable)
+- Media storage using Amazon S3 or Google Cloud Storage
+- Docker support using [docker-compose](https://github.com/docker/compose) for development and production (using [Traefik](https://traefik.io/) with [LetsEncrypt](https://letsencrypt.org/) support)
+- [Procfile](https://devcenter.heroku.com/articles/procfile) for deploying to Heroku
+- Instructions for deploying to [PythonAnywhere](https://www.pythonanywhere.com/)
+- Run tests with unittest or pytest
+- Supports Postgres and MySQL
+- Customizable database version
+- Default integration with [pre-commit](https://github.com/pre-commit/pre-commit) for identifying simple issues before submission to code review
## Optional Integrations
-*These features can be enabled during initial project setup.*
+_These features can be enabled during initial project setup._
-- Serve static files from Amazon S3, Google Cloud Storage or [Whitenoise](https://whitenoise.readthedocs.io/)
-- Configuration for [Celery](http://www.celeryproject.org/) and [Flower](https://github.com/mher/flower) (the latter in Docker setup only)
-- Integration with [MailHog](https://github.com/mailhog/MailHog) for local email testing
-- Integration with [Sentry](https://sentry.io/welcome/) for error logging
+- Serve static files from Amazon S3, Google Cloud Storage or [Whitenoise](https://whitenoise.readthedocs.io/)
+- Configuration for [Celery](http://www.celeryproject.org/) and [Flower](https://github.com/mher/flower) (the latter in Docker setup only)
+- Integration with [MailHog](https://github.com/mailhog/MailHog) for local email testing
+- Integration with [Sentry](https://sentry.io/welcome/) for error logging
## Constraints
-- Only maintained 3rd party libraries are used.
-- Uses PostgreSQL everywhere (10.19 - 14.1)
-- Environment variables for configuration (This won't work with Apache/mod_wsgi).
+- Only maintained 3rd party libraries are used.
+- Environment variables for configuration (This won't work with Apache/mod_wsgi).
## Support this Project!
This project is run by volunteers. Please support them in their efforts to maintain and improve Cookiecutter Django:
-- Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
-- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.
+- Daniel Roy Greenfeld, Project Lead ([GitHub](https://github.com/pydanny), [Patreon](https://www.patreon.com/danielroygreenfeld)): expertise in Django and AWS ELB.
+- Nikita Shupeyko, Core Developer ([GitHub](https://github.com/webyneter)): expertise in Python/Django, hands-on DevOps and frontend experience.
Projects that provide financial support to the maintainers:
-------------------------------------------------------------------------
+---
@@ -120,12 +120,13 @@ Answer the prompts with your own desired [options](http://cookiecutter-django.re
windows [n]: n
use_pycharm [n]: y
use_docker [n]: n
- Select postgresql_version:
- 1 - 14.1
- 2 - 13.5
- 3 - 12.9
- 4 - 11.14
- 5 - 10.19
+    Select database_engine:
+    1 - postgresql
+    2 - mysql
+    Choose from 1, 2 [1]: 1
+    Select database_version:
+    1 - postgresql@14.1
+    2 - postgresql@13.5
+    3 - postgresql@12.9
+    4 - postgresql@11.14
+    5 - postgresql@10.19
+    6 - mysql@5.7
-    Choose from 1, 2, 3, 4, 5 [1]: 1
+    Choose from 1, 2, 3, 4, 5, 6 [1]: 1
Select js_task_runner:
1 - None
@@ -182,14 +183,14 @@ Now take a look at your repo. Don't forget to carefully look at the generated RE
For local development, see the following:
-- [Developing locally](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html)
-- [Developing locally using docker](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html)
+- [Developing locally](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally.html)
+- [Developing locally using docker](http://cookiecutter-django.readthedocs.io/en/latest/developing-locally-docker.html)
## Community
-- Have questions? **Before you ask questions anywhere else**, please post your question on [Stack Overflow](http://stackoverflow.com/questions/tagged/cookiecutter-django) under the *cookiecutter-django* tag. We check there periodically for questions.
-- If you think you found a bug or want to request a feature, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues).
-- For anything else, you can chat with us on [Discord](https://discord.gg/9BrxzPKuEW).
+- Have questions? **Before you ask questions anywhere else**, please post your question on [Stack Overflow](http://stackoverflow.com/questions/tagged/cookiecutter-django) under the _cookiecutter-django_ tag. We check there periodically for questions.
+- If you think you found a bug or want to request a feature, please open an [issue](https://github.com/cookiecutter/cookiecutter-django/issues).
+- For anything else, you can chat with us on [Discord](https://discord.gg/9BrxzPKuEW).
## For Readers of Two Scoops of Django
@@ -197,7 +198,7 @@ You may notice that some elements of this project do not exactly match what we d
## For PyUp Users
-If you are using [PyUp](https://pyup.io) to keep your dependencies updated and secure, use the code *cookiecutter* during checkout to get 15% off every month.
+If you are using [PyUp](https://pyup.io) to keep your dependencies updated and secure, use the code _cookiecutter_ during checkout to get 15% off every month.
## "Your Stuff"
@@ -209,18 +210,18 @@ Need a stable release? You can find them at
+ # or
+ heroku addons:create jawsdb --app --version=
- # Assign with AWS_ACCESS_KEY_ID
- heroku config:set DJANGO_AWS_ACCESS_KEY_ID=
+    # Once the database is deployed, you can run the following command to get the connection URL:
+ heroku config:get JAWSDB_URL
+ >> mysql://username:password@hostname:port/default_schema
- # Assign with AWS_SECRET_ACCESS_KEY
- heroku config:set DJANGO_AWS_SECRET_ACCESS_KEY=
+    # Optionally, configure backup and maintenance windows when creating the add-on:
+ heroku addons:create jawsdb --bkpwindowstart 00:30 --bkpwindowend 01:00 --mntwindowstart Tue:23:30 --mntwindowend Wed:00:00 --app
- # Assign with AWS_STORAGE_BUCKET_NAME
- heroku config:set DJANGO_AWS_STORAGE_BUCKET_NAME=
+    # To learn more about JawsDB backups, see:
+ # https://devcenter.heroku.com/articles/jawsdb#backup-import-data-from-jawsdb-or-another-mysql-database
- git push heroku master
+#. Redis connection setup
- heroku run python manage.py createsuperuser
+ .. code-block:: bash
- heroku run python manage.py check --deploy
+ heroku addons:create heroku-redis:hobby-dev
- heroku open
+#. Mailgun
+
+ .. code-block:: bash
+
+ # Assuming you chose Mailgun as mail service (see below for others)
+ heroku addons:create mailgun:starter
+
+#. Setting up environment variables
+
+   .. code-block:: bash
+
+ heroku config:set PYTHONHASHSEED=random
+
+ heroku config:set WEB_CONCURRENCY=4
+
+ heroku config:set DJANGO_DEBUG=False
+ heroku config:set DJANGO_SETTINGS_MODULE=config.settings.production
+ heroku config:set DJANGO_SECRET_KEY="$(openssl rand -base64 64)"
+
+ # Generating a 32 character-long random string without any of the visually similar characters "IOl01":
+ heroku config:set DJANGO_ADMIN_URL="$(openssl rand -base64 4096 | tr -dc 'A-HJ-NP-Za-km-z2-9' | head -c 32)/"
+
+ # Set this to your Heroku app url, e.g. 'bionic-beaver-28392.herokuapp.com'
+ heroku config:set DJANGO_ALLOWED_HOSTS=
+
+ # Assign with AWS_ACCESS_KEY_ID
+ heroku config:set DJANGO_AWS_ACCESS_KEY_ID=
+
+ # Assign with AWS_SECRET_ACCESS_KEY
+ heroku config:set DJANGO_AWS_SECRET_ACCESS_KEY=
+
+ # Assign with AWS_STORAGE_BUCKET_NAME
+ heroku config:set DJANGO_AWS_STORAGE_BUCKET_NAME=
+
+#. Deploying
+
+ .. code-block:: bash
+
+ git push heroku master
+
+ heroku run python manage.py createsuperuser
+
+ heroku run python manage.py check --deploy
+
+ heroku open
+
+.. _Heroku Add-ons: https://elements.heroku.com/addons
Notes
-----
@@ -125,4 +171,4 @@ which runs Gulp in cookiecutter-django.
If things don't work, please refer to the Heroku docs.
-.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
\ No newline at end of file
+.. _multiple buildpacks: https://devcenter.heroku.com/articles/using-multiple-buildpacks-for-an-app
diff --git a/docs/deployment-with-docker.rst b/docs/deployment-with-docker.rst
index fcce7e6f5..815bdc6ad 100644
--- a/docs/deployment-with-docker.rst
+++ b/docs/deployment-with-docker.rst
@@ -17,7 +17,7 @@ Understanding the Docker Compose Setup
Before you begin, check out the ``production.yml`` file in the root of this project. Keep note of how it provides configuration for the following services:
* ``django``: your application running behind ``Gunicorn``;
-* ``postgres``: PostgreSQL database with the application's relational data;
+* ``postgres`` / ``mysql``: PostgreSQL or MySQL database with the application's relational data;
* ``redis``: Redis instance for caching;
* ``traefik``: Traefik reverse proxy with HTTPS on by default.
@@ -85,10 +85,12 @@ You can read more about this feature and how to configure it, at `Automatic HTTP
.. _Automatic HTTPS: https://docs.traefik.io/https/acme/
-(Optional) Postgres Data Volume Modifications
+(Optional) Database Data Volume Modifications
---------------------------------------------
-Postgres is saving its database files to the ``production_postgres_data`` volume by default. Change that if you want something else and make sure to make backups since this is not done automatically.
+PostgreSQL saves its database files to the ``production_postgres_data`` volume by default.
+Similarly, MySQL saves its database files to the ``production_mysql_data`` volume by default.
+Change that if you want something else, and make sure to make backups since this is not done automatically.
Building & Running Production Stack
diff --git a/docs/developing-locally-docker.rst b/docs/developing-locally-docker.rst
index e0b522c8c..f70363dc9 100644
--- a/docs/developing-locally-docker.rst
+++ b/docs/developing-locally-docker.rst
@@ -124,6 +124,30 @@ Consider the aforementioned ``.envs/.local/.postgres``: ::
The three envs we are presented with here are ``POSTGRES_DB``, ``POSTGRES_USER``, and ``POSTGRES_PASSWORD`` (by the way, their values have also been generated for you). You might have figured out already where these definitions will end up; it's all the same with ``django`` service container envs.
+.. note::
+
+   If you are using MySQL, the ``.envs`` directory structure will adapt accordingly. ::
+
+ .envs
+ ├── .local
+ │ ├── .django
+ │ └── .mysql
+ └── .production
+ ├── .django
+ └── .mysql
+
+   where each ``.mysql`` file will contain: ::
+
+ # MySQL
+ # ------------------------------------------------------------------------------
+ MYSQL_HOST=mysql
+ MYSQL_PORT=3306
+ MYSQL_DATABASE={{ cookiecutter.project_slug }}
+ MYSQL_USER=ssdDFA2FEaFeFDasdG2432TT23TWE
+ MYSQL_PASSWORD=aldAdds82FD89rnkDFFfsNFDaf8493H
+ MYSQL_ROOT_PASSWORD=jSljDz4whHuwO3aJIgVBrqEml5Ycbghorep4uVJ4xjDYQu0LfuTZdctj7y0YcCLu
+
+
One final touch: should you ever need to merge ``.envs/.production/*`` in a single ``.env`` run the ``merge_production_dotenvs_in_dotenv.py``: ::
$ python merge_production_dotenvs_in_dotenv.py
diff --git a/docs/developing-locally.rst b/docs/developing-locally.rst
index a9a54a03b..7468a9431 100644
--- a/docs/developing-locally.rst
+++ b/docs/developing-locally.rst
@@ -1,7 +1,7 @@
Getting Up and Running Locally
==============================
-.. index:: pip, virtualenv, PostgreSQL
+.. index:: pip, virtualenv, PostgreSQL, MySQL
Setting Up Development Environment
@@ -10,25 +10,25 @@ Setting Up Development Environment
Make sure to have the following on your host:
* Python 3.9
-* PostgreSQL_.
+* PostgreSQL_ / MySQL_
* Redis_, if using Celery
* Cookiecutter_
First things first.
-#. Create a virtualenv: ::
+1. Create a virtualenv: ::
$ python3.9 -m venv
-#. Activate the virtualenv you have just created: ::
+2. Activate the virtualenv you have just created: ::
$ source /bin/activate
-#. Install cookiecutter-django: ::
+3. Install cookiecutter-django: ::
$ cookiecutter gh:cookiecutter/cookiecutter-django
-#. Install development requirements: ::
+4. Install development requirements: ::
$ cd
$ pip install -r requirements/local.txt
@@ -40,6 +40,11 @@ First things first.
the `pre-commit` hook exists in the generated project as default.
For the details of `pre-commit`, follow the `pre-commit`_ site.
+Database Setup
+-------------------
+
+Setup with PostgreSQL_
+~~~~~~~~~~~~~~~~~~~~~~~
#. Create a new PostgreSQL database using createdb_: ::
$ createdb -U postgres --password
@@ -51,10 +56,31 @@ First things first.
the ``postgres`` user. The `postgres documentation`_ explains the syntax of the config file
that you need to change.
+Setup with MySQL_
+~~~~~~~~~~~~~~~~~~~~~~~
+ #. Create a new MySQL database: ::
-#. Set the environment variables for your database(s): ::
+ $ mysql -u root -p
+    Enter password: ******
+ mysql> CREATE DATABASE ;
+
+ mysql> quit
+
+ .. note::
+
+       If this is the first time you are using MySQL on your machine, you might need to install the database server
+ and set a root user and password. Visit `initial MySQL set up`_ for more details.
+
+
+5. Set the environment variables for your database(s): ::
+
+ # PostgreSQL
$ export DATABASE_URL=postgres://postgres:@127.0.0.1:5432/
+
+ # MySQL
+ $ export DATABASE_URL=mysql://root:@127.0.0.1:3306/
+
# Optional: set broker URL if using Celery
$ export CELERY_BROKER_URL=redis://localhost:6379/0
@@ -71,11 +97,11 @@ First things first.
will be read.
* Use a local environment manager like `direnv`_
-#. Apply migrations: ::
+6. Apply migrations: ::
$ python manage.py migrate
-#. If you're running synchronously, see the application being served through Django development server: ::
+7. If you're running synchronously, see the application being served through Django development server: ::
$ python manage.py runserver 0.0.0.0:8000
@@ -84,10 +110,12 @@ or if you're running asynchronously: ::
$ uvicorn config.asgi:application --host 0.0.0.0 --reload
.. _PostgreSQL: https://www.postgresql.org/download/
+.. _MySQL: https://dev.mysql.com/downloads/
.. _Redis: https://redis.io/download
.. _CookieCutter: https://github.com/cookiecutter/cookiecutter
.. _createdb: https://www.postgresql.org/docs/current/static/app-createdb.html
.. _initial PostgreSQL set up: https://web.archive.org/web/20190303010033/http://suite.opengeo.org/docs/latest/dataadmin/pgGettingStarted/firstconnect.html
+.. _initial MySQL set up: https://dev.mysql.com/doc/mysql-getting-started/en/#mysql-getting-started-installing
.. _postgres documentation: https://www.postgresql.org/docs/current/static/auth-pg-hba-conf.html
.. _pre-commit: https://pre-commit.com/
.. _direnv: https://direnv.net/
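
As a quick reference, the snippet below is a minimal sketch (assuming django-environ, which the generated ``config/settings/base.py`` uses via ``env.db()``; the credentials and database name are placeholders, not values from these docs) of how the MySQL ``DATABASE_URL`` exported above is parsed into a Django database configuration: ::

    # Illustrative only: django-environ parsing of a MySQL DATABASE_URL.
    import environ

    config = environ.Env.db_url_config("mysql://root:secret@127.0.0.1:3306/mydb")

    print(config["ENGINE"])                 # django.db.backends.mysql
    print(config["NAME"])                   # mydb
    print(config["HOST"], config["PORT"])   # 127.0.0.1 3306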
diff --git a/docs/docker-postgres-backups.rst b/docs/docker-database-backups.rst
similarity index 78%
rename from docs/docker-postgres-backups.rst
rename to docs/docker-database-backups.rst
index 875d737eb..965cccd09 100644
--- a/docs/docker-postgres-backups.rst
+++ b/docs/docker-database-backups.rst
@@ -1,4 +1,4 @@
-PostgreSQL Backups with Docker
+Database Backups with Docker
==============================
.. note:: For brevity it is assumed that you will be running the below commands against local environment, however, this is by no means mandatory so feel free to switch to ``production.yml`` when needed.
@@ -14,24 +14,40 @@ Prerequisites
Creating a Backup
-----------------
-To create a backup, run::
+PostgreSQL Backup
+~~~~~~~~~~~~~~~~~~
- $ docker-compose -f local.yml exec postgres backup
+ To create a backup, run::
+
+ $ docker-compose -f local.yml exec postgres backup
+
+MySQL Backup
+~~~~~~~~~~~~~
+
+ To create a backup, run::
+
+ $ docker-compose -f local.yml exec mysql backup
Assuming your project's database is named ``my_project`` here is what you will see: ::
Backing up the 'my_project' database...
SUCCESS: 'my_project' database backup 'backup_2018_03_13T09_05_07.sql.gz' has been created and placed in '/backups'.
-Keep in mind that ``/backups`` is the ``postgres`` container directory.
+Keep in mind that ``/backups`` is the directory inside the database container.
Viewing the Existing Backups
----------------------------
-To list existing backups, ::
+To list existing backups:
- $ docker-compose -f local.yml exec postgres backups
+ Postgres database backups: ::
+
+ $ docker-compose -f local.yml exec postgres backups
+
+ MySQL database backups: ::
+
+ $ docker-compose -f local.yml exec mysql backups
These are the sample contents of ``/backups``: ::
@@ -59,15 +75,25 @@ You can also get the container ID using ``docker-compose -f local.yml ps -q post
$ docker cp $(docker-compose -f local.yml ps -q postgres):/backups ./backups
+.. note::
+   The above example assumes ``postgres`` as the project database service. If you're using ``mysql``, just replace the container name
+   with ``mysql``.
+
.. _`command`: https://docs.docker.com/engine/reference/commandline/cp/
Restoring from the Existing Backup
----------------------------------
-To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example), ::
+To restore from one of the backups you have already got (take the ``backup_2018_03_13T09_05_07.sql.gz`` for example):
+
+PostgreSQL ::
$ docker-compose -f local.yml exec postgres restore backup_2018_03_13T09_05_07.sql.gz
+MySQL ::
+
+ $ docker-compose -f local.yml exec mysql restore backup_2018_03_13T09_05_07.sql.gz
+
You will see something like ::
Restoring the 'my_project' database from the '/backups/backup_2018_03_13T09_05_07.sql.gz' backup...
diff --git a/docs/index.rst b/docs/index.rst
index dae641d10..f393b3b3d 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -23,7 +23,7 @@ Contents
deployment-on-pythonanywhere
deployment-on-heroku
deployment-with-docker
- docker-postgres-backups
+ docker-database-backups
websocket
faq
troubleshooting
diff --git a/docs/project-generation-options.rst b/docs/project-generation-options.rst
index 26fb79d2f..d18458d83 100644
--- a/docs/project-generation-options.rst
+++ b/docs/project-generation-options.rst
@@ -52,8 +52,23 @@ use_pycharm:
use_docker:
Indicates whether the project should be configured to use Docker_ and `Docker Compose`_.
-postgresql_version:
- Select a PostgreSQL_ version to use. The choices are:
+database_engine:
+   Cookiecutter Django supports the following database engines:
+
+ 1. Postgres
+ 2. MySQL
+
+ Select the database you want to use for your project.
+
+database_version:
+ Select the version of the database you want to use for your project.
+
+ .. note::
+ | Database versions are shown in the format ``{database_engine}@{version}``.
+   | So make sure you select the version according to the database engine you selected in the previous step.
+   | For example, if you selected **Postgres** as your database, make sure to select one of the options starting with ``postgresql@{version_you_want}``, and if you selected **MySQL**, select one of the options starting with ``mysql@{version_you_want}``.
+
+   *Currently, the following PostgreSQL versions are supported:*
1. 14.1
2. 13.5
@@ -61,6 +76,10 @@ postgresql_version:
4. 11.14
5. 10.19
+   *Currently, the following MySQL versions are supported:*
+
+ 1. 5.7
+
js_task_runner:
Select a JavaScript task runner. The choices are:
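
The ``cookiecutter.json`` entries behind the two new prompts are not part of this excerpt. As a purely hypothetical sketch of their shape (inferred from the documented prompts and the supported-combination tests elsewhere in this change; the real file may differ), expressed as a Python dict: ::

    # Hypothetical: assumed shape of the new choice variables, not the actual cookiecutter.json.
    database_choices = {
        "database_engine": ["postgresql", "mysql"],
        "database_version": [
            "postgresql@14.1",
            "postgresql@13.5",
            "postgresql@12.9",
            "postgresql@11.14",
            "postgresql@10.19",
            "mysql@5.7",
        ],
    }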
diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py
index 50fcbea24..7e1374c2e 100644
--- a/hooks/post_gen_project.py
+++ b/hooks/post_gen_project.py
@@ -143,6 +143,32 @@ def remove_dotgithub_folder():
shutil.rmtree(".github")
+def remove_postgres_env_files():
+ local_postgres_envs_path = os.path.join(".envs", ".local", ".postgres")
+ production_postgres_envs_path = os.path.join(".envs", ".production", ".postgres")
+
+ os.remove(local_postgres_envs_path)
+ os.remove(production_postgres_envs_path)
+
+
+def remove_mysql_env_files():
+ local_mysql_envs_path = os.path.join(".envs", ".local", ".mysql")
+ production_mysql_envs_path = os.path.join(".envs", ".production", ".mysql")
+
+ os.remove(local_mysql_envs_path)
+ os.remove(production_mysql_envs_path)
+
+
+def remove_postgres_docker_folder():
+ postgres_compose_path = os.path.join("compose", "production", "postgres")
+ shutil.rmtree(postgres_compose_path)
+
+
+def remove_mysql_docker_folder():
+ mysql_compose_path = os.path.join("compose", "production", "mysql")
+ shutil.rmtree(mysql_compose_path)
+
+
def generate_random_string(
length, using_digits=False, using_ascii_letters=False, using_punctuation=False
):
@@ -217,25 +243,50 @@ def generate_random_user():
return generate_random_string(length=32, using_ascii_letters=True)
-def generate_postgres_user(debug=False):
+def generate_database_user(debug=False):
return DEBUG_VALUE if debug else generate_random_user()
-def set_postgres_user(file_path, value):
- postgres_user = set_flag(file_path, "!!!SET POSTGRES_USER!!!", value=value)
- return postgres_user
+def set_database_user(file_path: str, value: str, database_engine: str):
+ database_user = set_flag(
+ file_path, f"!!!SET {database_engine.upper()}_USER!!!", value=value
+ )
+ return database_user
-def set_postgres_password(file_path, value=None):
- postgres_password = set_flag(
+def set_database_password(file_path: str, database_engine: str, value: str = None):
+ database_password = set_flag(
file_path,
- "!!!SET POSTGRES_PASSWORD!!!",
+        f"!!!SET {database_engine.upper()}_PASSWORD!!!",
value=value,
length=64,
using_digits=True,
using_ascii_letters=True,
)
- return postgres_password
+ return database_password
+
+
+def get_database_env_path(env: str, database_engine: str):
+ local_postgres_envs_path = os.path.join(".envs", ".local", ".postgres")
+ production_postgres_envs_path = os.path.join(".envs", ".production", ".postgres")
+ local_mysql_envs_path = os.path.join(".envs", ".local", ".mysql")
+ production_mysql_envs_path = os.path.join(".envs", ".production", ".mysql")
+
+ is_mysql = database_engine == "mysql"
+ is_postgres = database_engine == "postgresql"
+
+ if env == "local":
+ if is_mysql:
+ return local_mysql_envs_path
+ if is_postgres:
+ return local_postgres_envs_path
+ if env == "prod":
+ if is_mysql:
+ return production_mysql_envs_path
+ if is_postgres:
+ return production_postgres_envs_path
+
+ return None
def set_celery_flower_user(file_path, value):
@@ -263,22 +314,35 @@ def append_to_gitignore_file(ignored_line):
gitignore_file.write("\n")
-def set_flags_in_envs(postgres_user, celery_flower_user, debug=False):
+def set_flags_in_envs(database_user, celery_flower_user, debug=False):
local_django_envs_path = os.path.join(".envs", ".local", ".django")
production_django_envs_path = os.path.join(".envs", ".production", ".django")
- local_postgres_envs_path = os.path.join(".envs", ".local", ".postgres")
- production_postgres_envs_path = os.path.join(".envs", ".production", ".postgres")
+
+ selected_database = "{{ cookiecutter.database_engine }}"
set_django_secret_key(production_django_envs_path)
set_django_admin_url(production_django_envs_path)
- set_postgres_user(local_postgres_envs_path, value=postgres_user)
- set_postgres_password(
- local_postgres_envs_path, value=DEBUG_VALUE if debug else None
+ set_database_user(
+ get_database_env_path(env="local", database_engine=selected_database),
+ value=database_user,
+ database_engine=selected_database,
)
- set_postgres_user(production_postgres_envs_path, value=postgres_user)
- set_postgres_password(
- production_postgres_envs_path, value=DEBUG_VALUE if debug else None
+ set_database_password(
+ get_database_env_path(env="local", database_engine=selected_database),
+ database_engine=selected_database,
+ value=DEBUG_VALUE if debug else None,
+ )
+
+ set_database_user(
+ get_database_env_path(env="prod", database_engine=selected_database),
+ value=database_user,
+ database_engine=selected_database,
+ )
+ set_database_password(
+ get_database_env_path(env="prod", database_engine=selected_database),
+ database_engine=selected_database,
+ value=DEBUG_VALUE if debug else None,
)
set_celery_flower_user(local_django_envs_path, value=celery_flower_user)
@@ -352,6 +416,10 @@ def main():
remove_pycharm_files()
if "{{ cookiecutter.use_docker }}".lower() == "y":
+ if "{{ cookiecutter.database_engine }}".lower() == "postgresql":
+ remove_mysql_docker_folder()
+ elif "{{ cookiecutter.database_engine }}".lower() == "mysql":
+ remove_postgres_docker_folder()
remove_utility_files()
else:
remove_docker_files()
@@ -367,6 +435,11 @@ def main():
elif "{{ cookiecutter.use_compressor }}".lower() == "n":
remove_heroku_build_hooks()
+ if "{{ cookiecutter.database_engine }}".lower() == "postgresql":
+ remove_mysql_env_files()
+ elif "{{ cookiecutter.database_engine }}".lower() == "mysql":
+ remove_postgres_env_files()
+
if (
"{{ cookiecutter.use_docker }}".lower() == "n"
and "{{ cookiecutter.use_heroku }}".lower() == "n"
diff --git a/hooks/pre_gen_project.py b/hooks/pre_gen_project.py
index 3fd4131c8..b07c736f0 100644
--- a/hooks/pre_gen_project.py
+++ b/hooks/pre_gen_project.py
@@ -17,6 +17,9 @@ INFO = "\x1b[1;33m [INFO]: "
HINT = "\x1b[3;33m"
SUCCESS = "\x1b[1;32m [SUCCESS]: "
+SUPPORTED_POSTGRES_VERSIONS = ["14.1", "13.5", "12.9", "11.14", "10.19"]
+SUPPORTED_MYSQL_VERSIONS = ["5.7"]
+
project_slug = "{{ cookiecutter.project_slug }}"
if hasattr(project_slug, "isidentifier"):
assert (
@@ -83,3 +86,17 @@ if (
"Mail Service for sending emails."
)
sys.exit(1)
+
+
+if (
+ "{{ cookiecutter.database_version }}".lower().split("@")[0]
+ != "{{ cookiecutter.database_engine }}"
+):
+ print(
+ WARNING + " You have selected {{ cookiecutter.database_engine }} "
+        "as your database engine, but "
+ "your selected database_version {{ cookiecutter.database_version }} is not "
+ "compatible with this "
+        "selection. Please retry and select an appropriate option." + TERMINATOR
+ )
+ sys.exit(1)
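
Stripped of the template syntax, the new guard is a prefix comparison between the two answers; a minimal sketch of the same logic: ::

    # Plain-Python restatement of the pre_gen_project.py compatibility check.
    def is_compatible(database_engine: str, database_version: str) -> bool:
        # Versions use the "{engine}@{version}" format, e.g. "mysql@5.7".
        return database_version.lower().split("@")[0] == database_engine


    assert is_compatible("postgresql", "postgresql@14.1")
    assert is_compatible("mysql", "mysql@5.7")
    assert not is_compatible("mysql", "postgresql@14.1")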
diff --git a/tests/test_cookiecutter_generation.py b/tests/test_cookiecutter_generation.py
index 49a0d2cc8..1d7048395 100755
--- a/tests/test_cookiecutter_generation.py
+++ b/tests/test_cookiecutter_generation.py
@@ -47,11 +47,12 @@ SUPPORTED_COMBINATIONS = [
{"use_pycharm": "n"},
{"use_docker": "y"},
{"use_docker": "n"},
- {"postgresql_version": "14.1"},
- {"postgresql_version": "13.5"},
- {"postgresql_version": "12.9"},
- {"postgresql_version": "11.14"},
- {"postgresql_version": "10.19"},
+ {"database_engine": "postgresql", "database_version": "postgresql@14.1"},
+ {"database_engine": "postgresql", "database_version": "postgresql@13.5"},
+ {"database_engine": "postgresql", "database_version": "postgresql@12.9"},
+ {"database_engine": "postgresql", "database_version": "postgresql@11.14"},
+    {"database_engine": "postgresql", "database_version": "postgresql@10.19"},
+ {"database_engine": "mysql", "database_version": "mysql@5.7"},
{"cloud_provider": "AWS", "use_whitenoise": "y"},
{"cloud_provider": "AWS", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "use_whitenoise": "y"},
@@ -117,6 +118,12 @@ UNSUPPORTED_COMBINATIONS = [
{"cloud_provider": "None", "use_whitenoise": "n"},
{"cloud_provider": "GCP", "mail_service": "Amazon SES"},
{"cloud_provider": "None", "mail_service": "Amazon SES"},
+ {"database_engine": "postgresql", "database_version": "mysql@5.7"},
+ {"database_engine": "mysql", "database_version": "postgresql@14.1"},
+ {"database_engine": "mysql", "database_version": "postgresql@13.5"},
+ {"database_engine": "mysql", "database_version": "postgresql@12.9"},
+ {"database_engine": "mysql", "database_version": "postgresql@11.14"},
+ {"database_engine": "mysql", "database_version": "postgresql@10.19"},
]
diff --git a/{{cookiecutter.project_slug}}/.envs/.local/.mysql b/{{cookiecutter.project_slug}}/.envs/.local/.mysql
new file mode 100644
index 000000000..765b69183
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/.envs/.local/.mysql
@@ -0,0 +1,9 @@
+# MySQL
+# ------------------------------------------------------------------------------
+MYSQL_HOST=mysql
+MYSQL_PORT=3306
+MYSQL_DATABASE={{ cookiecutter.project_slug }}
+MYSQL_USER=!!!SET MYSQL_USER!!!
+MYSQL_PASSWORD=!!!SET MYSQL_PASSWORD!!!
+MYSQL_ROOT_PASSWORD=!!!SET MYSQL_ROOT_PASSWORD!!!
+
diff --git a/{{cookiecutter.project_slug}}/.envs/.production/.mysql b/{{cookiecutter.project_slug}}/.envs/.production/.mysql
new file mode 100644
index 000000000..221c8e264
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/.envs/.production/.mysql
@@ -0,0 +1,8 @@
+# MySQL
+# ------------------------------------------------------------------------------
+MYSQL_HOST=mysql
+MYSQL_PORT=3306
+MYSQL_DATABASE={{ cookiecutter.project_slug }}
+MYSQL_USER=!!!SET MYSQL_USER!!!
+MYSQL_PASSWORD=!!!SET MYSQL_PASSWORD!!!
+MYSQL_ROOT_PASSWORD=!!!SET MYSQL_ROOT_PASSWORD!!!
diff --git a/{{cookiecutter.project_slug}}/.github/dependabot.yml b/{{cookiecutter.project_slug}}/.github/dependabot.yml
index cf88cf33b..0b792f748 100644
--- a/{{cookiecutter.project_slug}}/.github/dependabot.yml
+++ b/{{cookiecutter.project_slug}}/.github/dependabot.yml
@@ -57,8 +57,14 @@ updates:
# Enable version updates for Docker
- package-ecosystem: "docker"
+ {%- if cookiecutter.database_engine == 'postgresql' %}
# Look for a `Dockerfile` in the `compose/production/postgres` directory
directory: "compose/production/postgres/"
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ # Look for a `Dockerfile` in the `compose/production/mysql` directory
+ directory: "compose/production/mysql/"
+ {%- endif %}
# Check for updates to GitHub Actions every weekday
schedule:
interval: "daily"
diff --git a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
index c24f4cb94..4d83d7d91 100644
--- a/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
+++ b/{{cookiecutter.project_slug}}/.github/workflows/ci.yml
@@ -47,19 +47,35 @@ jobs:
ports:
- 6379:6379
{%- endif %}
+ {%- if cookiecutter.database_engine == 'postgresql' %}
postgres:
image: postgres:12
ports:
- 5432:5432
env:
POSTGRES_PASSWORD: postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ mysql:
+ image: mysql:8.0
+ ports:
+ - 3306:3306
+ env:
+          MYSQL_DATABASE: mysql
+          MYSQL_USER: mysql
+          MYSQL_PASSWORD: mysql
+          MYSQL_ROOT_PASSWORD: mysql
+ {%- endif %}
env:
{%- if cookiecutter.use_celery == 'y' %}
CELERY_BROKER_URL: "redis://localhost:6379/0"
{%- endif %}
+ {%- if cookiecutter.database_engine == 'postgresql' %}
# postgres://user:password@host:port/database
DATABASE_URL: "postgres://postgres:postgres@localhost:5432/postgres"
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ # mysql://user:password@host:port/database
+ DATABASE_URL: "mysql://mysql:mysql@localhost:3306/mysql"
+ {%- endif %}
{%- endif %}
steps:
diff --git a/{{cookiecutter.project_slug}}/.gitlab-ci.yml b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
index 711bfc392..5ee1aabfc 100644
--- a/{{cookiecutter.project_slug}}/.gitlab-ci.yml
+++ b/{{cookiecutter.project_slug}}/.gitlab-ci.yml
@@ -3,10 +3,18 @@ stages:
- test
variables:
+ {% if cookiecutter.database_engine == 'postgresql' -%}
POSTGRES_USER: '{{ cookiecutter.project_slug }}'
POSTGRES_PASSWORD: ''
POSTGRES_DB: 'test_{{ cookiecutter.project_slug }}'
POSTGRES_HOST_AUTH_METHOD: trust
+ {% elif cookiecutter.database_engine == 'mysql' -%}
+ MYSQL_USER: '{{ cookiecutter.project_slug }}'
+ MYSQL_PASSWORD: ''
+ MYSQL_DATABASE: 'test_{{ cookiecutter.project_slug }}'
+ MYSQL_ROOT_PASSWORD: ''
+ MYSQL_ALLOW_EMPTY_PASSWORD: 'yes'
+ {% endif -%}
{% if cookiecutter.use_celery == 'y' -%}
CELERY_BROKER_URL: 'redis://redis:6379/0'
{%- endif %}
@@ -34,14 +42,22 @@ pytest:
- docker-compose -f local.yml up -d
script:
- docker-compose -f local.yml run django pytest
- {%- else -%}
+ {%- else %}
image: python:3.9
tags:
- python
services:
- - postgres:{{ cookiecutter.postgresql_version }}
+ {%- if cookiecutter.database_engine == 'postgresql' %}
+ - postgres:{{ cookiecutter.database_version.split('@')[1] }}
+ {%- elif cookiecutter.database_engine == 'mysql' %}
+ - mysql:{{ cookiecutter.database_version.split('@')[1] }}
+ {%- endif %}
variables:
+ {%- if cookiecutter.database_engine == 'postgresql' %}
DATABASE_URL: pgsql://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres/$POSTGRES_DB
+ {%- elif cookiecutter.database_engine == 'mysql' %}
+ DATABASE_URL: mysql://$MYSQL_USER:$MYSQL_PASSWORD@mysql/$MYSQL_DATABASE
+ {%- endif %}
before_script:
- pip install -r requirements/local.txt
diff --git a/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile b/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
index 044ef4a74..190a9f518 100644
--- a/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/local/django/Dockerfile
@@ -12,8 +12,13 @@ ARG BUILD_ENVIRONMENT=local
RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages
build-essential \
+ {%- if cookiecutter.database_engine == "postgresql" %}
# psycopg2 dependencies
libpq-dev
+ {%- elif cookiecutter.database_engine == "mysql" %}
+ # mysql dependency
+ default-libmysqlclient-dev
+ {%- endif %}
# Requirements are installed here to ensure they will be cached.
COPY ./requirements .
@@ -37,8 +42,13 @@ WORKDIR ${APP_HOME}
# Install required system dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \
+ {%- if cookiecutter.database_engine == "postgresql" %}
# psycopg2 dependencies
libpq-dev \
+ {%- elif cookiecutter.database_engine == "mysql" %}
+ # mysql dependency
+ default-libmysqlclient-dev \
+ {%- endif %}
# Translations dependencies
gettext \
# cleaning up unused files
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile b/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
index 1dd47a2f7..84464a725 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/production/django/Dockerfile
@@ -25,8 +25,13 @@ ARG BUILD_ENVIRONMENT=production
RUN apt-get update && apt-get install --no-install-recommends -y \
# dependencies for building Python packages
build-essential \
+ {%- if cookiecutter.database_engine == "postgresql" %}
# psycopg2 dependencies
libpq-dev
+ {%- elif cookiecutter.database_engine == "mysql" %}
+ # mysql dependency
+ default-libmysqlclient-dev
+ {%- endif %}
# Requirements are installed here to ensure they will be cached.
COPY ./requirements .
@@ -54,8 +59,13 @@ RUN addgroup --system django \
# Install required system dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \
+ {%- if cookiecutter.database_engine == "postgresql" %}
# psycopg2 dependencies
libpq-dev \
+ {%- elif cookiecutter.database_engine == "mysql" %}
+ # mysql dependency
+ default-libmysqlclient-dev \
+ {%- endif %}
# Translations dependencies
gettext \
# cleaning up unused files
diff --git a/{{cookiecutter.project_slug}}/compose/production/django/entrypoint b/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
index 95ab8297a..109fcfdcc 100644
--- a/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
+++ b/{{cookiecutter.project_slug}}/compose/production/django/entrypoint
@@ -10,16 +10,26 @@ set -o nounset
export CELERY_BROKER_URL="${REDIS_URL}"
{% endif %}
+{%- if cookiecutter.database_engine == 'postgresql' %}
if [ -z "${POSTGRES_USER}" ]; then
base_postgres_image_default_user='postgres'
export POSTGRES_USER="${base_postgres_image_default_user}"
fi
export DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
+{%- endif %}
+{%- if cookiecutter.database_engine == 'mysql' %}
+if [ -z "${MYSQL_USER}" ]; then
+ base_mysql_image_default_user='mysql'
+ export MYSQL_USER="${base_mysql_image_default_user}"
+fi
+export DATABASE_URL="mysql://${MYSQL_USER}:${MYSQL_PASSWORD}@${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DATABASE}"
+{%- endif %}
-postgres_ready() {
+database_ready() {
python << END
import sys
+{%- if cookiecutter.database_engine == 'postgresql' %}
import psycopg2
try:
@@ -33,13 +43,28 @@ try:
except psycopg2.OperationalError:
sys.exit(-1)
sys.exit(0)
+{%- endif %}
+{%- if cookiecutter.database_engine == 'mysql' %}
+import MySQLdb
+
+try:
+ _db = MySQLdb._mysql.connect(
+ host="${MYSQL_HOST}",
+ user="${MYSQL_USER}",
+ password="${MYSQL_PASSWORD}",
+ database="${MYSQL_DATABASE}",
+ port=int("${MYSQL_PORT}")
+ )
+except MySQLdb._exceptions.OperationalError:
+ sys.exit(-1)
+{%- endif %}
END
}
-until postgres_ready; do
- >&2 echo 'Waiting for PostgreSQL to become available...'
+until database_ready; do
+ >&2 echo 'Waiting for {{ cookiecutter.database_engine.upper() }} to become available...'
sleep 1
done
->&2 echo 'PostgreSQL is available'
+>&2 echo '{{ cookiecutter.database_engine.upper() }} is available'
exec "$@"
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/Dockerfile b/{{cookiecutter.project_slug}}/compose/production/mysql/Dockerfile
new file mode 100644
index 000000000..71eabc804
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/Dockerfile
@@ -0,0 +1,6 @@
+FROM mysql:{{ cookiecutter.database_version.lower().split('@')[1] }}
+
+COPY ./compose/production/mysql/maintenance /usr/local/bin/maintenance
+RUN chmod +x /usr/local/bin/maintenance/*
+RUN mv /usr/local/bin/maintenance/* /usr/local/bin \
+ && rmdir /usr/local/bin/maintenance
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/constants.sh b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/constants.sh
new file mode 100644
index 000000000..6ca4f0ca9
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/constants.sh
@@ -0,0 +1,5 @@
+#!/usr/bin/env bash
+
+
+BACKUP_DIR_PATH='/backups'
+BACKUP_FILE_PREFIX='backup'
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/countdown.sh b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/countdown.sh
new file mode 100644
index 000000000..e6cbfb6ff
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/countdown.sh
@@ -0,0 +1,12 @@
+#!/usr/bin/env bash
+
+
+countdown() {
+ declare desc="A simple countdown. Source: https://superuser.com/a/611582"
+ local seconds="${1}"
+ local d=$(($(date +%s) + "${seconds}"))
+ while [ "$d" -ge `date +%s` ]; do
+ echo -ne "$(date -u --date @$(($d - `date +%s`)) +%H:%M:%S)\r";
+ sleep 0.1
+ done
+}
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/messages.sh b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/messages.sh
new file mode 100644
index 000000000..f6be756e9
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/messages.sh
@@ -0,0 +1,41 @@
+#!/usr/bin/env bash
+
+
+message_newline() {
+ echo
+}
+
+message_debug()
+{
+ echo -e "DEBUG: ${@}"
+}
+
+message_welcome()
+{
+ echo -e "\e[1m${@}\e[0m"
+}
+
+message_warning()
+{
+ echo -e "\e[33mWARNING\e[0m: ${@}"
+}
+
+message_error()
+{
+ echo -e "\e[31mERROR\e[0m: ${@}"
+}
+
+message_info()
+{
+ echo -e "\e[37mINFO\e[0m: ${@}"
+}
+
+message_suggestion()
+{
+ echo -e "\e[33mSUGGESTION\e[0m: ${@}"
+}
+
+message_success()
+{
+ echo -e "\e[32mSUCCESS\e[0m: ${@}"
+}
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/yes_no.sh b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/yes_no.sh
new file mode 100644
index 000000000..fd9cae161
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/_sourced/yes_no.sh
@@ -0,0 +1,16 @@
+#!/usr/bin/env bash
+
+
+yes_no() {
+ declare desc="Prompt for confirmation. \$\"\{1\}\": confirmation message."
+ local arg1="${1}"
+
+ local response=
+ read -r -p "${arg1} (y/[n])? " response
+ if [[ "${response}" =~ ^[Yy]$ ]]
+ then
+ exit 0
+ else
+ exit 1
+ fi
+}
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backup b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backup
new file mode 100644
index 000000000..0bdaa0264
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backup
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+
+### Create a database backup.
+###
+### Usage:
+### $ docker-compose -f .yml (exec |run --rm) mysql backup
+
+
+set -o errexit
+set -o pipefail
+set -o nounset
+
+
+working_dir="$(dirname ${0})"
+source "${working_dir}/_sourced/constants.sh"
+source "${working_dir}/_sourced/messages.sh"
+
+
+message_welcome "Backing up the '${MYSQL_DATABASE}' database..."
+
+
+if [[ "${MYSQL_USER}" == "root" ]]; then
+ message_error "Backing up as 'root' user is not supported. Assign 'MYSQL_USER' env with another one and try again."
+ exit 1
+fi
+
+export MYSQL_TCP_PORT="${MYSQL_PORT}"
+export MYSQL_HOST="${MYSQL_HOST}"
+
+backup_filename="${BACKUP_FILE_PREFIX}_$(date +'%Y_%m_%dT%H_%M_%S').sql.gz"
+backup_file_path="${BACKUP_DIR_PATH}/${backup_filename}"
+
+
+mysqldump --no-tablespaces --user=${MYSQL_USER} --password=${MYSQL_PASSWORD} --port=${MYSQL_PORT} ${MYSQL_DATABASE} | gzip > "${backup_file_path}"
+
+
+
+message_success "'${MYSQL_DATABASE}' database backup '${backup_filename}' has been created and placed in '${BACKUP_DIR_PATH}'."
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backups b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backups
new file mode 100644
index 000000000..9239d9fcd
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/backups
@@ -0,0 +1,22 @@
+#!/usr/bin/env bash
+
+
+### View backups.
+###
+### Usage:
+### $ docker-compose -f .yml (exec |run --rm) mysql backups
+
+
+set -o errexit
+set -o pipefail
+set -o nounset
+
+
+working_dir="$(dirname ${0})"
+source "${working_dir}/_sourced/constants.sh"
+source "${working_dir}/_sourced/messages.sh"
+
+
+message_welcome "These are the backups you have got:"
+
+ls -lht "${BACKUP_DIR_PATH}"
diff --git a/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/restore b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/restore
new file mode 100644
index 000000000..902f45381
--- /dev/null
+++ b/{{cookiecutter.project_slug}}/compose/production/mysql/maintenance/restore
@@ -0,0 +1,49 @@
+#!/usr/bin/env bash
+
+
+### Restore database from a backup.
+###
+### Parameters:
+### <1> filename of an existing backup.
+###
+### Usage:
+### $ docker-compose -f .yml (exec |run --rm) mysql restore <1>
+
+
+set -o errexit
+set -o pipefail
+set -o nounset
+
+
+working_dir="$(dirname ${0})"
+source "${working_dir}/_sourced/constants.sh"
+source "${working_dir}/_sourced/messages.sh"
+
+
+if [[ -z ${1+x} ]]; then
+ message_error "Backup filename is not specified yet it is a required parameter. Make sure you provide one and try again."
+ exit 1
+fi
+backup_filename="${BACKUP_DIR_PATH}/${1}"
+if [[ ! -f "${backup_filename}" ]]; then
+ message_error "No backup with the specified filename found. Check out the 'backups' maintenance script output to see if there is one and try again."
+ exit 1
+fi
+
+message_welcome "Restoring the '${MYSQL_DATABASE}' database from the '${backup_filename}' backup..."
+
+if [[ "${MYSQL_USER}" == "root" ]]; then
+    message_error "Restoring as 'root' user is not supported. Assign 'MYSQL_USER' env with another one and try again."
+ exit 1
+fi
+
+message_info "Dropping the database..."
+echo "DROP DATABASE IF EXISTS ${MYSQL_DATABASE};" | mysql --user=${MYSQL_USER} --password=${MYSQL_PASSWORD}
+
+message_info "Creating a new database..."
+echo "CREATE DATABASE IF NOT EXISTS ${MYSQL_DATABASE};" | mysql --user=${MYSQL_USER} --password=${MYSQL_PASSWORD}
+
+message_info "Applying the backup to the new database..."
+gunzip -c "${backup_filename}" | mysql --user=${MYSQL_USER} --password=${MYSQL_PASSWORD} ${MYSQL_DATABASE}
+
+message_success "The '${MYSQL_DATABASE}' database has been restored from the '${backup_filename}' backup."
diff --git a/{{cookiecutter.project_slug}}/compose/production/postgres/Dockerfile b/{{cookiecutter.project_slug}}/compose/production/postgres/Dockerfile
index eca29bada..fa9f2ac80 100644
--- a/{{cookiecutter.project_slug}}/compose/production/postgres/Dockerfile
+++ b/{{cookiecutter.project_slug}}/compose/production/postgres/Dockerfile
@@ -1,4 +1,4 @@
-FROM postgres:{{ cookiecutter.postgresql_version }}
+FROM postgres:{{ cookiecutter.database_version.lower().split('@')[1] }}
COPY ./compose/production/postgres/maintenance /usr/local/bin/maintenance
RUN chmod +x /usr/local/bin/maintenance/*
diff --git a/{{cookiecutter.project_slug}}/config/settings/base.py b/{{cookiecutter.project_slug}}/config/settings/base.py
index 5177c92fa..afae0ec4f 100644
--- a/{{cookiecutter.project_slug}}/config/settings/base.py
+++ b/{{cookiecutter.project_slug}}/config/settings/base.py
@@ -43,9 +43,15 @@ LOCALE_PATHS = [str(ROOT_DIR / "locale")]
{% if cookiecutter.use_docker == "y" -%}
DATABASES = {"default": env.db("DATABASE_URL")}
{%- else %}
+{% if cookiecutter.database_engine == 'postgresql' -%}
DATABASES = {
"default": env.db("DATABASE_URL", default="postgres://{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}"),
}
+{% elif cookiecutter.database_engine == 'mysql' -%}
+DATABASES = {
+ "default": env.db("DATABASE_URL", default="mysql://root:debug@{% if cookiecutter.windows == 'y' %}localhost{% endif %}/{{cookiecutter.project_slug}}"),
+}
+{%- endif %}
{%- endif %}
DATABASES["default"]["ATOMIC_REQUESTS"] = True
# https://docs.djangoproject.com/en/stable/ref/settings/#std:setting-DEFAULT_AUTO_FIELD
diff --git a/{{cookiecutter.project_slug}}/local.yml b/{{cookiecutter.project_slug}}/local.yml
index 5090c5cef..3678da4e6 100644
--- a/{{cookiecutter.project_slug}}/local.yml
+++ b/{{cookiecutter.project_slug}}/local.yml
@@ -1,8 +1,14 @@
version: '3'
volumes:
+ {%- if cookiecutter.database_engine == 'postgresql' %}
{{ cookiecutter.project_slug }}_local_postgres_data: {}
{{ cookiecutter.project_slug }}_local_postgres_data_backups: {}
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ {{ cookiecutter.project_slug }}_local_mysql_data: {}
+ {{ cookiecutter.project_slug }}_local_mysql_data_backups: {}
+ {%- endif %}
services:
django:{% if cookiecutter.use_celery == 'y' %} &django{% endif %}
@@ -12,7 +18,12 @@ services:
image: {{ cookiecutter.project_slug }}_local_django
container_name: {{ cookiecutter.project_slug }}_local_django
depends_on:
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - mysql
+ {%- endif %}
{%- if cookiecutter.use_celery == 'y' %}
- redis
{%- endif %}
@@ -23,11 +34,17 @@ services:
- .:/app:z
env_file:
- ./.envs/.local/.django
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- ./.envs/.local/.postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - ./.envs/.local/.mysql
+ {%- endif %}
ports:
- "8000:8000"
command: /start
+ {%- if cookiecutter.database_engine == 'postgresql' %}
postgres:
build:
context: .
@@ -39,6 +56,20 @@ services:
- {{ cookiecutter.project_slug }}_local_postgres_data_backups:/backups:z
env_file:
- ./.envs/.local/.postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ mysql:
+ build:
+ context: .
+ dockerfile: ./compose/production/mysql/Dockerfile
+ image: {{ cookiecutter.project_slug }}_production_mysql
+    container_name: {{ cookiecutter.project_slug }}_local_mysql
+ volumes:
+ - {{ cookiecutter.project_slug }}_local_mysql_data:/var/lib/mysql:z
+ - {{ cookiecutter.project_slug }}_local_mysql_data_backups:/backups:z
+ env_file:
+ - ./.envs/.local/.mysql
+ {%- endif %}
docs:
image: {{ cookiecutter.project_slug }}_local_docs
@@ -76,7 +107,12 @@ services:
container_name: {{ cookiecutter.project_slug }}_local_celeryworker
depends_on:
- redis
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - mysql
+ {%- endif %}
{%- if cookiecutter.use_mailhog == 'y' %}
- mailhog
{%- endif %}
@@ -89,7 +125,12 @@ services:
container_name: {{ cookiecutter.project_slug }}_local_celerybeat
depends_on:
- redis
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - mysql
+ {%- endif %}
{%- if cookiecutter.use_mailhog == 'y' %}
- mailhog
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/production.yml b/{{cookiecutter.project_slug}}/production.yml
index ea4292a0d..131108345 100644
--- a/{{cookiecutter.project_slug}}/production.yml
+++ b/{{cookiecutter.project_slug}}/production.yml
@@ -1,8 +1,14 @@
version: '3'
volumes:
+ {%- if cookiecutter.database_engine == 'postgresql' %}
production_postgres_data: {}
production_postgres_data_backups: {}
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ production_mysql_data: {}
+ production_mysql_data_backups: {}
+ {%- endif %}
production_traefik: {}
services:
@@ -12,13 +18,24 @@ services:
dockerfile: ./compose/production/django/Dockerfile
image: {{ cookiecutter.project_slug }}_production_django
depends_on:
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - mysql
+ {%- endif %}
- redis
env_file:
- ./.envs/.production/.django
+ {%- if cookiecutter.database_engine == 'postgresql' %}
- ./.envs/.production/.postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ - ./.envs/.production/.mysql
+ {%- endif %}
command: /start
+ {%- if cookiecutter.database_engine == 'postgresql' %}
postgres:
build:
context: .
@@ -29,6 +46,20 @@ services:
- production_postgres_data_backups:/backups:z
env_file:
- ./.envs/.production/.postgres
+ {%- endif %}
+ {%- if cookiecutter.database_engine == 'mysql' %}
+ mysql:
+ build:
+ context: .
+ dockerfile: ./compose/production/mysql/Dockerfile
+ image: {{ cookiecutter.project_slug }}_production_mysql
+    container_name: {{ cookiecutter.project_slug }}_production_mysql
+ volumes:
+ - production_mysql_data:/var/lib/mysql:z
+ - production_mysql_data_backups:/backups:z
+ env_file:
+ - ./.envs/.production/.mysql
+ {%- endif %}
traefik:
build:
diff --git a/{{cookiecutter.project_slug}}/requirements/local.txt b/{{cookiecutter.project_slug}}/requirements/local.txt
index 04e4bedd6..ee99e04e9 100644
--- a/{{cookiecutter.project_slug}}/requirements/local.txt
+++ b/{{cookiecutter.project_slug}}/requirements/local.txt
@@ -2,11 +2,16 @@
Werkzeug[watchdog]==2.0.2 # https://github.com/pallets/werkzeug
ipdb==0.13.9 # https://github.com/gotcha/ipdb
+{%- if cookiecutter.database_engine == "postgresql" %}
{%- if cookiecutter.use_docker == 'y' %}
psycopg2==2.9.3 # https://github.com/psycopg/psycopg2
{%- else %}
psycopg2-binary==2.9.3 # https://github.com/psycopg/psycopg2
{%- endif %}
+{%- endif %}
+{%- if cookiecutter.database_engine == "mysql" %}
+mysqlclient==2.1.0 # https://github.com/PyMySQL/mysqlclient
+{%- endif %}
{%- if cookiecutter.use_async == 'y' or cookiecutter.use_celery == 'y' %}
watchgod==0.7 # https://github.com/samuelcolvin/watchgod
{%- endif %}
diff --git a/{{cookiecutter.project_slug}}/requirements/production.txt b/{{cookiecutter.project_slug}}/requirements/production.txt
index 859af6049..67d0ae0c5 100644
--- a/{{cookiecutter.project_slug}}/requirements/production.txt
+++ b/{{cookiecutter.project_slug}}/requirements/production.txt
@@ -3,7 +3,11 @@
-r base.txt
gunicorn==20.1.0 # https://github.com/benoitc/gunicorn
+{%- if cookiecutter.database_engine == "postgresql" %}
psycopg2==2.9.3 # https://github.com/psycopg/psycopg2
+{%- elif cookiecutter.database_engine == "mysql" %}
+mysqlclient==2.1.0 # https://github.com/PyMySQL/mysqlclient
+{%- endif %}
{%- if cookiecutter.use_whitenoise == 'n' %}
Collectfast==2.2.0 # https://github.com/antonagestam/collectfast
{%- endif %}