Getting Up and Running Locally
==============================

.. index:: pip, virtualenv, PostgreSQL


Setting Up Development Environment
----------------------------------

Make sure to have the following on your host:

* Python 3.10
* PostgreSQL_
* Redis_, if using Celery
* Cookiecutter_

First things first.

#. Create a virtualenv: ::

    $ python3.10 -m venv <virtual env path>

#. Activate the virtualenv you have just created: ::

    $ source <virtual env path>/bin/activate

#.
   .. include:: generate-project-block.rst

#. Install development requirements: ::

    $ cd <what you have entered as the project_slug at setup stage>
    $ pip install -r requirements/local.txt
    $ git init # A git repo is required for pre-commit to install
    $ pre-commit install

   .. note::

      The ``pre-commit`` hooks are included in the generated project by default.
      For the details of ``pre-commit``, see the `pre-commit`_ site.
      To run the hooks by hand, see the example below.
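
   If you want to check the whole codebase against the hooks without making a commit (for
   instance, right after installing them), ``pre-commit`` can be invoked directly; this is
   standard ``pre-commit`` usage, not something specific to the generated project: ::

    $ pre-commit run --all-files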

#. Create a new PostgreSQL database using createdb_: ::

    $ createdb --username=postgres <project_slug>

   ``project_slug`` is what you have entered as the project_slug at the setup stage.

   .. note::

      If this is the first time a database is created on your machine, you might need an
      `initial PostgreSQL set up`_ to allow local connections and set a password for
      the ``postgres`` user. The `postgres documentation`_ explains the syntax of the config file
      that you need to change. A short example follows below.
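
   For instance, on a typical Linux installation where the ``postgres`` system user owns the
   cluster, setting a password could look roughly like this (an illustrative sketch; the exact
   steps depend on your OS and how PostgreSQL was installed): ::

    $ sudo -u postgres psql
    postgres=# ALTER USER postgres PASSWORD '<password>';
    postgres=# \q

   If local connections are still rejected afterwards, adjust ``pg_hba.conf`` as described in
   the `postgres documentation`_ and restart the PostgreSQL service.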

#. Set the environment variables for your database(s): ::

    $ export DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb>
    # Optional: set broker URL if using Celery
    $ export CELERY_BROKER_URL=redis://localhost:6379/0

   .. note::

      Check out the :ref:`settings` page for a comprehensive list of the environment variables.

   .. seealso::

      To help set up your environment variables, you have a few options:

      * Create an ``.env`` file in the root of your project and define all the variables you need in it.
        Then you just need to have ``DJANGO_READ_DOT_ENV_FILE=True`` on your machine and all the variables
        will be read. A minimal example is shown below.
      * Use a local environment manager like `direnv`_.
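
   For instance, a minimal ``.env`` in the project root might look like this (the values are
   placeholders; substitute your own password and database name): ::

    DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb>
    CELERY_BROKER_URL=redis://localhost:6379/0

   With ``DJANGO_READ_DOT_ENV_FILE=True`` exported in your shell (or set via `direnv`_), the
   settings will pick these variables up automatically.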

#. Apply migrations: ::

    $ python manage.py migrate

#. If you're running synchronously, see the application being served through the Django development server: ::

    $ python manage.py runserver 0.0.0.0:8000

   or if you're running asynchronously: ::

    $ uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'

.. _PostgreSQL: https://www.postgresql.org/download/
.. _Redis: https://redis.io/download
.. _Cookiecutter: https://github.com/cookiecutter/cookiecutter
.. _createdb: https://www.postgresql.org/docs/current/static/app-createdb.html
.. _initial PostgreSQL set up: https://web.archive.org/web/20190303010033/http://suite.opengeo.org/docs/latest/dataadmin/pgGettingStarted/firstconnect.html
.. _postgres documentation: https://www.postgresql.org/docs/current/static/auth-pg-hba-conf.html
.. _pre-commit: https://pre-commit.com/
.. _direnv: https://direnv.net/

Setup Email Backend
-------------------

MailHog
~~~~~~~

.. note:: In order for the project to support MailHog_, it must have been bootstrapped with ``use_mailhog`` set to ``y``.

MailHog is used to receive emails during development. It is written in Go and has no external dependencies.

For instance, one of the packages we depend upon, ``django-allauth``, sends verification emails to new users signing up as well as to existing ones who have not yet verified themselves.

#. `Download the latest MailHog release`_ for your OS.

#. Rename the build to ``MailHog``.

#. Copy the file to the project root.

#. Make it executable: ::

    $ chmod +x MailHog

#. Spin up another terminal window and start it there: ::

    $ ./MailHog

#. Check out `<http://127.0.0.1:8025/>`_ to see how it goes.

Now you have your own mail server running locally, ready to receive whatever you send it.
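
The generated project already points Django at MailHog when ``use_mailhog`` is ``y``, so no extra
configuration should be needed. For reference, the relevant settings look roughly like the sketch
below (illustrative only; check ``config/settings/local.py`` for the exact generated values): ::

    # MailHog accepts SMTP on port 1025 by default; its web UI runs on port 8025.
    EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
    EMAIL_HOST = "localhost"
    EMAIL_PORT = 1025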

.. _MailHog: https://github.com/mailhog/MailHog
.. _`Download the latest MailHog release`: https://github.com/mailhog/MailHog

Console
~~~~~~~

.. note:: If you have generated your project with ``use_mailhog`` set to ``n``, this will be the default setup.

Alternatively, deliver emails over the console via ``EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'``.

In production, we have Mailgun_ configured to have your back!

.. _Mailgun: https://www.mailgun.com/

Celery
------

If the project is configured to use Celery as a task scheduler then, by default, tasks are set to run on the main thread when developing locally instead of being sent to a broker. However, if you have Redis set up on your local machine, you can set the following in ``config/settings/local.py``::

    CELERY_TASK_ALWAYS_EAGER = False

Next, make sure `redis-server` is installed (per the `Getting started with Redis`_ guide) and run the server in one terminal::

    $ redis-server

Start the Celery worker by running the following command in another terminal::

    $ celery -A config.celery_app worker --loglevel=info

That Celery worker should be running whenever your app is running, typically as a background process,
so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.

The project comes with a simple task for manual testing purposes, inside ``<project_slug>/users/tasks.py``. To queue that task locally, start the Django shell, import the task, and call ``delay()`` on it::

    $ python manage.py shell
    >>> from <project_slug>.users.tasks import get_users_count
    >>> get_users_count.delay()
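
For reference, that task is an ordinary Celery ``shared_task``. The generated ``tasks.py`` contains
something roughly along these lines (a sketch, not necessarily the exact generated code)::

    from celery import shared_task

    from .models import User


    @shared_task()
    def get_users_count():
        """A simple task used to demonstrate Celery locally."""
        return User.objects.count()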

You can also schedule periodic tasks from the Django admin, thanks to the `django-celery-beat`_ package; a short sketch follows below.
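
For instance, with ``django-celery-beat`` installed and its migrations applied, a schedule can be
created either in the admin (under "Periodic tasks") or programmatically through its models.
``IntervalSchedule`` and ``PeriodicTask`` below are part of the django-celery-beat API; the task
path and timing are illustrative only::

    from django_celery_beat.models import IntervalSchedule, PeriodicTask

    # Run the demo task every 10 seconds.
    schedule, _ = IntervalSchedule.objects.get_or_create(
        every=10,
        period=IntervalSchedule.SECONDS,
    )
    PeriodicTask.objects.get_or_create(
        interval=schedule,
        name="Count users every 10 seconds",
        task="<project_slug>.users.tasks.get_users_count",
    )

Schedules like this are only dispatched while a beat process is running (for example
``celery -A config.celery_app beat``), assuming Celery is configured to use django-celery-beat's
database scheduler.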

.. _Getting started with Redis: https://redis.io/docs/getting-started/
.. _Celery Workers Guide: https://docs.celeryq.dev/en/stable/userguide/workers.html
.. _django-celery-beat: https://django-celery-beat.readthedocs.io/en/latest/

Sass Compilation & Live Reloading
---------------------------------

If you've opted for Gulp or Webpack as your front-end pipeline, the project comes configured with `Sass`_ compilation and `live reloading`_. As you change your Sass/JS source files, the task runner will automatically rebuild the corresponding CSS and JS assets and reload them in your browser without refreshing the page.

#. Make sure that `Node.js`_ v16 is installed on your machine.

#. In the project root, install the JS dependencies with::

    $ npm install

#. Now - with your virtualenv activated - start the application by running::

    $ npm run dev

   The app will now run with live reloading enabled, applying front-end changes dynamically.

.. note:: The task will start two processes in parallel: the static assets build loop on one side, and the Django server on the other. You do NOT need to run Django as you would normally with ``manage.py runserver``.

.. _Node.js: http://nodejs.org/download/
.. _Sass: https://sass-lang.com/
.. _live reloading: https://browsersync.io

Summary
-------

Congratulations, you have made it! Keep on reading to unleash the full potential of Cookiecutter Django.