.. _initial PostgreSQL set up: https://web.archive.org/web/20190303010033/http://suite.opengeo.org/docs/latest/dataadmin/pgGettingStarted/firstconnect.html
After setting up your environment, you're ready to add your first app. This project uses the setup from "Two Scoops of Django" with a two-tier layout:
- **Top Level Repository Root** has config files, documentation, ``manage.py``, and more.
- **Second Level Django Project Root** is where your Django apps live.
- **Second Level Configuration Root** holds settings and URL configurations.
The project layout looks something like this::

    <repository_root>/
    ├── config/
    │   ├── settings/
    │   │   ├── __init__.py
    │   │   ├── base.py
    │   │   ├── local.py
    │   │   └── production.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── <django_project_root>/
    │   ├── <name_of_the_app>/
    │   │   ├── migrations/
    │   │   ├── admin.py
    │   │   ├── apps.py
    │   │   ├── models.py
    │   │   ├── tests.py
    │   │   └── views.py
    │   ├── __init__.py
    │   └── ...
    ├── requirements/
    │   ├── base.txt
    │   ├── local.txt
    │   └── production.txt
    ├── manage.py
    ├── README.md
    └── ...
Following this structured approach, here's how to add a new app:
#. **Create the app** using Django's ``startapp`` command, replacing ``<name-of-the-app>`` with your desired app name::

       $ python manage.py startapp <name-of-the-app>
#. **Move the app** to the Django Project Root, maintaining the project's two-tier structure::

       $ mv <name-of-the-app> <django_project_root>/
#. **Edit the app's apps.py**, changing ``name = '<name-of-the-app>'`` to ``name = '<django_project_root>.<name-of-the-app>'``.
#. **Register the new app** by adding it to the ``LOCAL_APPS`` list in ``config/settings/base.py``, so Django loads it as part of your project (see the sketch below).
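
For illustration, here is a rough sketch of what the last two steps might look like, assuming a hypothetical app named ``blog`` and a Django project root named ``my_project`` (substitute your own names)::

    # my_project/blog/apps.py
    from django.apps import AppConfig


    class BlogConfig(AppConfig):
        # Was name = 'blog' before the app was moved under my_project/
        name = "my_project.blog"


    # config/settings/base.py
    LOCAL_APPS = [
        # ...existing local apps...
        "my_project.blog",  # the newly added app
    ]
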
For instance, one of the packages we depend upon, ``django-allauth``, sends verification emails to new users signing up, as well as to existing users who have not yet verified themselves.
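
If you just want to inspect those emails during local development and have no local mail server configured, one common option (not specific to this project) is Django's console email backend, which prints outgoing mail to the runserver console::

    # config/settings/local.py
    # Print outgoing emails (e.g. allauth verification emails) to the console.
    EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
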
If the project is configured to use Celery as a task scheduler then, by default, tasks are set to run on the main thread when developing locally instead of being sent to a broker. However, if you have Redis set up on your local machine (see the `Getting started with Redis guide`_), you can set something like the following in ``config/settings/local.py``::
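
    # Hand tasks to the broker instead of executing them eagerly in-process.
    CELERY_TASK_ALWAYS_EAGER = False

    # Example only -- adjust (or omit, if already configured elsewhere) to match
    # your local Redis instance.
    CELERY_BROKER_URL = "redis://localhost:6379/0"
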
Start the Celery worker by running the following command in another terminal::

    $ celery -A config.celery_app worker --loglevel=info

That Celery worker should be running whenever your app is running, typically as a background process,
so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.
The project comes with a simple task for manual testing purposes, inside ``<project_slug>/users/tasks.py``. To queue that task locally, start the Django shell, import the task, and call ``delay()`` on it::

    $ python manage.py shell
    >>> from <project_slug>.users.tasks import get_users_count
    >>> get_users_count.delay()
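
For reference, such a task might look like the minimal sketch below (the actual ``tasks.py`` in your generated project may differ)::

    # <project_slug>/users/tasks.py
    from celery import shared_task

    from .models import User


    @shared_task
    def get_users_count():
        """Return the number of users, as a trivial demo task."""
        return User.objects.count()
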
You can also use the Django admin to queue up tasks, thanks to the `django-celery-beat`_ package.
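
As a rough sketch (assuming the project's Celery app lives at ``config.celery_app``, as in the worker command above), periodic tasks defined in the Django admin are picked up by running a beat process with the database scheduler alongside the worker::

    $ celery -A config.celery_app beat --loglevel=info \
        --scheduler django_celery_beat.schedulers:DatabaseScheduler
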
.. _Getting started with Redis guide: https://redis.io/docs/getting-started/
If you've opted for Gulp or Webpack as your front-end pipeline, the project comes configured with `Sass`_ compilation and `live reloading`_. As you change your Sass/JS source files, the task runner will automatically rebuild the corresponding CSS and JS assets and reload them in your browser without refreshing the page.
This will start two processes in parallel: the static asset build loop on one side, and the Django server on the other.
#. Access your application at the address of the ``node`` service to see your styles rendered correctly. This is http://localhost:3000 by default.
.. note:: Do NOT access the application using the Django port (8000 by default), as it will result in broken styles and 404s when accessing static assets.