Compare commits


318 Commits
0.14.3 ... main

Author SHA1 Message Date
pre-commit-ci[bot]
7c9334952d
[pre-commit.ci] pre-commit autoupdate (#551)
updates:
- [github.com/psf/black: 24.10.0 → 25.1.0](https://github.com/psf/black/compare/24.10.0...25.1.0)
- [github.com/pycqa/isort: 5.13.2 → 6.0.1](https://github.com/pycqa/isort/compare/5.13.2...6.0.1)
- [github.com/PyCQA/flake8: 7.1.1 → 7.2.0](https://github.com/PyCQA/flake8/compare/7.1.1...7.2.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-04-18 06:58:38 -07:00
Andrew Sears
630caed915
Upgrade project metadata (#542) 2025-01-22 17:34:39 +01:00
pre-commit-ci[bot]
786e4c120a
[pre-commit.ci] pre-commit autoupdate (#546) 2025-01-07 07:33:08 +01:00
David Fuentes Baldomir
f0a3ec60e9
Add --nostatic and --insecure args to runserver command. (#450)
Fixes #449
2025-01-04 09:37:17 +01:00
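The two flags named in the commit title mirror Django's own runserver options. A minimal sketch of what they parse to, modeled with argparse (the `dest` names and defaults here are assumptions, not Daphne's actual code):

```python
import argparse

# Sketch only: flag names are from the commit title; dest names/defaults
# are assumed for illustration.
parser = argparse.ArgumentParser(prog="runserver")
parser.add_argument(
    "--nostatic", action="store_false", dest="use_static_handler", default=True,
    help="Do not serve static files via the staticfiles handler.",
)
parser.add_argument(
    "--insecure", action="store_true", dest="insecure_serving", default=False,
    help="Serve static files even when DEBUG is False.",
)

opts = parser.parse_args(["--nostatic", "--insecure"])
print(opts.use_static_handler, opts.insecure_serving)  # False True
```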
David Smith
32ac73e1a0
Added support for Python 3.13. (#539) 2024-11-18 20:57:11 +01:00
David Smith
e25b4bcc31
Target py39-plus with pyupgrade. (#540) 2024-11-18 20:56:40 +01:00
Carlton Gibson
06afd9b94a
Drop support for EOL Python 3.8. (#536) 2024-10-28 19:03:41 -07:00
pre-commit-ci[bot]
ffad9c3cfd
[pre-commit.ci] pre-commit autoupdate (#532)
updates:
- [github.com/asottile/pyupgrade: v3.16.0 → v3.17.0](https://github.com/asottile/pyupgrade/compare/v3.16.0...v3.17.0)
- [github.com/psf/black: 24.4.2 → 24.10.0](https://github.com/psf/black/compare/24.4.2...24.10.0)
- [github.com/PyCQA/flake8: 7.1.0 → 7.1.1](https://github.com/PyCQA/flake8/compare/7.1.0...7.1.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-10-07 17:10:56 -07:00
Jon Janzen
862ebcd08d
Include test support files in sdist (#530)
Closes: https://github.com/django/daphne/issues/522
2024-08-26 07:02:23 -07:00
Jon Janzen
b1902e8ccd
Remove pytest-runner (#528)
Closes: https://github.com/django/daphne/issues/523
2024-08-24 11:51:43 -07:00
Robert Schütz
9ec5798c0d
fix tests with Twisted 24.7.0 (#526)
In the fixed test cases the responses now contain `HTTP/1.1` rather than
`HTTP/1.0`.
2024-08-24 20:47:58 +02:00
Jon Janzen
3607351212
Revert "Remove pytest-runner"
This reverts commit 420f065d9e.
2024-08-24 11:36:44 -07:00
Jon Janzen
420f065d9e
Remove pytest-runner
Closes: https://github.com/django/daphne/issues/523
2024-08-24 11:34:33 -07:00
pre-commit-ci[bot]
4a55faca45
[pre-commit.ci] pre-commit autoupdate (#515)
updates:
- [github.com/asottile/pyupgrade: v3.15.2 → v3.16.0](https://github.com/asottile/pyupgrade/compare/v3.15.2...v3.16.0)
- [github.com/psf/black: 24.3.0 → 24.4.2](https://github.com/psf/black/compare/24.3.0...24.4.2)
- [github.com/PyCQA/flake8: 7.0.0 → 7.1.0](https://github.com/PyCQA/flake8/compare/7.0.0...7.1.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-02 09:38:28 +02:00
Carlton Gibson
0f15e4595b
Fixed packaging configuration. (#510)
Use auto-discovery, but make sure daphne package is correctly listed.
2024-04-11 15:23:49 +02:00
Carlton Gibson
cf9145985b Updated version and changelog for v4.1.1 release. 2024-04-10 17:42:01 +02:00
sdc50
c0b630834c
Fix twisted.plugin install (#507) 2024-04-10 17:33:21 +02:00
pre-commit-ci[bot]
63790936d1
[pre-commit.ci] pre-commit autoupdate (#503) 2024-04-02 07:14:14 +02:00
Carlton Gibson
df0680c9ad Updated change notes and version for 4.1 release. 2024-02-10 15:39:05 +01:00
Carlton Gibson
ef24796243
Validate HTTP header names as per RFC 9110. (#500)
Fixes #497.
2024-02-10 15:31:37 +01:00
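RFC 9110 restricts header field names to its "token" grammar (letters, digits, and a small set of punctuation, with no spaces or control characters). A sketch of that validation — not Daphne's exact implementation:

```python
import re

# RFC 9110 "token" characters for field names; this regex is a sketch
# of the validation idea, not Daphne's actual code.
TOKEN_RE = re.compile(rb"^[!#$%&'*+\-.^_`|~0-9A-Za-z]+$")

def is_valid_header_name(name: bytes) -> bool:
    return TOKEN_RE.match(name) is not None

print(is_valid_header_name(b"Content-Type"))  # True
print(is_valid_header_name(b"Bad Header"))    # False: space is not a token char
```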
Alejandro R. Sedeño
9a282dd627
Handle Daphne-Root-Path for websockets, adding root_path to scope. (#453)
Signed-off-by: Alejandro R. Sedeño <asedeno@mit.edu>
2024-02-06 09:15:03 +01:00
Carlton Gibson
5fdc9176e5
Ignored flake8-bugbear B036. (#499)
"except BaseException: without re-raising" used in testing.py
2024-02-06 09:04:10 +01:00
pre-commit-ci[bot]
993efe62ce
[pre-commit.ci] pre-commit autoupdate (#492) 2024-01-01 19:58:51 +01:00
dependabot[bot]
c07925d53f
Bump actions/setup-python from 4 to 5 (#491) 2023-12-12 07:19:48 +01:00
Paolo Melchiorre
4d24e22c72
Fixed #489 -- Add support for Python 3.12 (#490) 2023-11-26 16:12:23 +01:00
pre-commit-ci[bot]
2d4dcbf149
[pre-commit.ci] pre-commit autoupdate (#485) 2023-10-03 12:59:12 +02:00
InvalidInterrupt
3758c514fd
Raise minimum supported Python version to 3.8 (#481) 2023-09-07 09:09:10 +02:00
dependabot[bot]
f108bbc7c1
Bump actions/checkout from 3 to 4 (#480)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-05 07:51:07 +02:00
pre-commit-ci[bot]
e49c39a4e5
[pre-commit.ci] pre-commit autoupdate (#477)
updates:
- [github.com/asottile/pyupgrade: v3.3.1 → v3.8.0](https://github.com/asottile/pyupgrade/compare/v3.3.1...v3.8.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-07-04 05:48:51 +01:00
Carlton Gibson
1eaf2206cc
Set pre-commit to quarterly updates. (#472) 2023-04-16 10:33:27 +02:00
Abenezer Belachew
09da15dc4e
Double quotes during pip (#467)
Single quotes return => ERROR: Invalid requirement: "'Twisted[tls,http2]'"
2023-04-14 15:10:16 +02:00
pre-commit-ci[bot]
21513b84da
[pre-commit.ci] pre-commit autoupdate (#470)
updates:
- [github.com/psf/black: 23.1.0 → 23.3.0](https://github.com/psf/black/compare/23.1.0...23.3.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Carlton Gibson <carlton.gibson@noumenal.es>
2023-04-14 15:03:51 +02:00
Carlton Gibson
077dd81809
Added missing stacklevel warning parameter. (#471) 2023-04-14 14:53:12 +02:00
pre-commit-ci[bot]
b0204165b1
[pre-commit.ci] pre-commit autoupdate (#466)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-02-07 15:20:08 +00:00
pre-commit-ci[bot]
79fd65dec3
[pre-commit.ci] pre-commit autoupdate (#464)
updates:
- [github.com/pycqa/isort: 5.11.4 → 5.12.0](https://github.com/pycqa/isort/compare/5.11.4...5.12.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-01-31 06:52:32 +00:00
pre-commit-ci[bot]
5681d71c17
[pre-commit.ci] pre-commit autoupdate (#462)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-12-29 12:19:00 +00:00
pre-commit-ci[bot]
fdc944a280
[pre-commit.ci] pre-commit autoupdate (#461)
updates:
- [github.com/pycqa/isort: 5.11.0 → v5.11.3](https://github.com/pycqa/isort/compare/5.11.0...v5.11.3)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-12-20 10:37:51 +01:00
pre-commit-ci[bot]
c9343aa9d6
[pre-commit.ci] pre-commit autoupdate (#460)
updates:
- [github.com/asottile/pyupgrade: v3.3.0 → v3.3.1](https://github.com/asottile/pyupgrade/compare/v3.3.0...v3.3.1)
- [github.com/psf/black: 22.10.0 → 22.12.0](https://github.com/psf/black/compare/22.10.0...22.12.0)
- [github.com/pycqa/isort: 5.10.1 → 5.11.0](https://github.com/pycqa/isort/compare/5.10.1...5.11.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-12-12 23:15:35 +00:00
Adam Johnson
d59c2bd424
Upgrade to tox 4 (#458) 2022-12-08 01:06:41 +00:00
pre-commit-ci[bot]
2015ecdd8f
[pre-commit.ci] pre-commit autoupdate (#457)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-12-06 09:01:28 +00:00
pre-commit-ci[bot]
18e936eed1
[pre-commit.ci] pre-commit autoupdate (#456)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-29 20:34:37 +00:00
Michael K
ef946d5637
Upgrade GitHub Actions actions (#454)
And set up dependabot to take care of them in the future.
2022-11-18 11:33:03 +01:00
pre-commit-ci[bot]
a0b2ac0e8f
[pre-commit.ci] pre-commit autoupdate (#452)
updates:
- [github.com/asottile/pyupgrade: v3.2.0 → v3.2.2](https://github.com/asottile/pyupgrade/compare/v3.2.0...v3.2.2)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-14 20:47:53 +00:00
pre-commit-ci[bot]
afd0d51b83
[pre-commit.ci] pre-commit autoupdate (#448)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-01 11:09:39 +00:00
Michael K
d5fbdfc4cb
Run tests against Python 3.11 and add trove classifier (#446) 2022-10-25 13:46:20 +01:00
pre-commit-ci[bot]
71be46265d
[pre-commit.ci] pre-commit autoupdate (#444)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-10-17 23:09:03 +01:00
pre-commit-ci[bot]
91c61f4ff4
[pre-commit.ci] pre-commit autoupdate (#443)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-10-10 23:39:20 +01:00
Carlton Gibson
060202f491 Updated change log and version for v4.0 release. 2022-10-07 14:53:16 +02:00
Adam Johnson
12e543750b
Make DaphneProcess pickleable (#440) 2022-10-07 13:22:40 +02:00
Jakub Stawowy
fef1490eff
Removed deprecated --ws-protocols CLI option. (#387) 2022-10-07 12:21:22 +02:00
pre-commit-ci[bot]
898529c489
[pre-commit.ci] pre-commit autoupdate (#441)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-09-26 21:47:43 +01:00
pre-commit-ci[bot]
e9637419df
[pre-commit.ci] pre-commit autoupdate (#439)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-09-20 10:07:55 +01:00
pre-commit-ci[bot]
b3bfbd6135
[pre-commit.ci] pre-commit autoupdate (#438)
updates:
- [github.com/psf/black: 22.6.0 → 22.8.0](https://github.com/psf/black/compare/22.6.0...22.8.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-09-06 09:35:26 +02:00
Michael K
c9a7fdc669
Run tests on main branch, not master (#435) 2022-08-30 07:35:56 +02:00
Carlton Gibson
6a466b7bee
Bumped version and changelog for 4.0b1 release. (#434) 2022-08-25 12:04:14 +02:00
pre-commit-ci[bot]
7453ad9cc5
[pre-commit.ci] pre-commit autoupdate (#430)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-08 21:40:53 +01:00
Carlton Gibson
2b13b74ce2
Added runserver to Daphne. (#429)
* Made daphne installable as a Django app.
* Added system check to ensure daphne is installed before
  django.contrib.staticfiles.
* Moved runserver command from Channels.
* Added changelog entry for runserver command.
2022-08-08 14:10:03 +02:00
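The system check mentioned above implies an ordering constraint in settings. A sketch of the resulting configuration (app names are real; the surrounding settings file is assumed):

```python
# settings.py sketch: per the commit's system check, "daphne" must come
# before django.contrib.staticfiles so its runserver command takes over.
INSTALLED_APPS = [
    "daphne",
    "django.contrib.staticfiles",
    # ... the rest of your apps
]
```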
pre-commit-ci[bot]
438b7ad06d
[pre-commit.ci] pre-commit autoupdate (#427)
updates:
- [github.com/asottile/pyupgrade: v2.37.2 → v2.37.3](https://github.com/asottile/pyupgrade/compare/v2.37.2...v2.37.3)
- [github.com/PyCQA/flake8: 4.0.1 → 5.0.2](https://github.com/PyCQA/flake8/compare/4.0.1...5.0.2)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-02 08:56:41 +02:00
pre-commit-ci[bot]
e04b4077f4
[pre-commit.ci] pre-commit autoupdate (#426)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-07-25 23:59:09 +01:00
pre-commit-ci[bot]
5502d1d37b
[pre-commit.ci] pre-commit autoupdate (#424) 2022-07-13 06:24:58 +01:00
Carlton Gibson
71ba440761
Added support for ASGI_THREADS max worker limit. (#422)
Closes #319
2022-07-06 12:37:26 +02:00
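The idea behind an `ASGI_THREADS` limit can be sketched as capping a thread pool from an environment variable (the fallback default of 4 here is an assumption for illustration):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Sketch of the idea: bound the worker pool used for synchronous code
# via an ASGI_THREADS environment variable. The default of 4 is assumed.
max_workers = int(os.environ.get("ASGI_THREADS", "4"))
executor = ThreadPoolExecutor(max_workers=max_workers)
print(max_workers >= 1)  # True
executor.shutdown(wait=False)
```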
David Smith
6199d509c2
Merge pull request #421 from django/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2022-07-04 21:08:13 +01:00
pre-commit-ci[bot]
1df4f08fac
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/psf/black: 22.3.0 → 22.6.0](https://github.com/psf/black/compare/22.3.0...22.6.0)
2022-07-04 19:40:09 +00:00
pre-commit-ci[bot]
7d4316fd4a
[pre-commit.ci] pre-commit autoupdate (#418)
updates:
- [github.com/asottile/pyupgrade: v2.32.1 → v2.34.0](https://github.com/asottile/pyupgrade/compare/v2.32.1...v2.34.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-06-13 23:58:38 +01:00
baseplate-admin
54745d0f83
Set a default Server header for HTTP responses (#396)
Co-authored-by: Carlton Gibson <carlton.gibson@noumenal.es>
2022-05-24 12:40:02 +02:00
Abhimanyu Saharan
87bc5a7975 Added argument to change log format. (#414) 2022-05-23 16:41:45 +02:00
Carlton Gibson
5e709795b4
Updated supported Python and dependency versions. (#417)
* Updated Python support and dependencies.
* Updated Python support in README.
* Removed PY36 from GHA workflow.
* Remove pre-commit workflow. Fixes #397
* Updated Black in pre-commit to 22.3.0.
* Update all pre-commit hooks.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
2022-05-23 15:34:29 +02:00
Marcin Muszynski
eae1ff0df4
Set default attributes on WebRequest (#406) 2022-02-14 16:12:56 +01:00
Carlton Gibson
6a5093982c
Run CI tests on Windows. (#393)
* Updated minimum twisted to 19.7

Co-authored-by: Michael Käufl <django@c.michael-kaeufl.de>
2021-11-09 20:12:29 +01:00
Carlton Gibson
b62e58a023
Added Python 3.10 to CI. (#392) 2021-11-09 15:58:09 +01:00
Carlton Gibson
15a754d903
Unpinned test dependencies. (#391) 2021-11-09 15:55:18 +01:00
Adam Johnson
36ce9fd1ed
Use tox-py in CI (#369) 2021-04-16 18:21:51 +02:00
Carlton Gibson
e480917c1a Bumped version and change notes for 3.0.2 release. 2021-04-07 20:26:57 +02:00
Carlton Gibson
d5c41bf641 Updated various README URLs. 2021-04-07 20:26:57 +02:00
Adam Johnson
2b6f153616
Used partial() to wrap Server.handle_reply() (#364)
Fixes #332.
2021-04-07 20:14:02 +02:00
Adam Johnson
ca61162129
Lint with pre-commit (#365)
* Lint with pre-commit

* Move existing tox qa hooks into pre-commit.
* Set up GitHub Action based on https://github.com/pre-commit/action/ (we could also use https://pre-commit.ci ).
* Add `pyupgrade` to drop old Python syntax.
* Add `flake8-bugbear` plugin to prevent flake8 errors.

* Drop custom GHA
2021-04-07 20:11:21 +02:00
Carlton Gibson
aac4708a61 Bumped version and change notes for 3.0.1 release. 2020-11-12 20:34:13 +01:00
Patrick Gingras
aae0870971
Handle asyncio.CancelledError in Server.application_checker (#341)
As of [bpo-32528](https://bugs.python.org/issue32528), asyncio.CancelledError is
not a subclass of concurrent.futures.CancelledError. This means that if an
asyncio future raises an exception, it won't be caught. Therefore, the
exception will bubble past the try-except within the loop in application_checker,
resulting in done applications not being cleaned up, and the application_checker
task not being queued again.
2020-11-11 16:12:33 +01:00
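Since bpo-32528 (Python 3.8), `asyncio.CancelledError` subclasses `BaseException` rather than `Exception`, so a bare `except Exception` in a checker loop silently misses cancellations. A sketch of handling it explicitly, as the fix describes:

```python
import asyncio

async def check(task):
    # CancelledError must be caught by name: `except Exception` no longer
    # sees it, which is how cancelled apps escaped cleanup before this fix.
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled"
    except Exception as exc:
        return f"failed: {exc!r}"
    return "done"

async def main():
    task = asyncio.ensure_future(asyncio.sleep(10))
    task.cancel()
    return await check(task)

print(asyncio.run(main()))  # cancelled
```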
Carlton Gibson
a69723ca3f Version 3.0 release.
* Bump version number.
* Changelog.
* README.
* Update asgiref dependency specifier to match Django 3.1.
2020-10-28 20:52:00 +01:00
Ryan Fredericks
525b6d2dbb
Update README for shell compatibility. (#327) 2020-10-28 20:45:41 +01:00
Avinash Raj
15ba5c6776
Updated to use ASGI v3 applications internally. (#275)
Used guarantee_single_callable().
Removed unneeded --asgi-protocol CLI option.
Updated tests.

Co-authored-by: Carlton Gibson <carlton.gibson@noumenal.es>
2020-10-27 19:50:50 +01:00
Samori Gorse
e1b77e930b
Added request body chunking (#335)
The entire body was previously read into memory, which could lead to
the server being killed by the scheduler. This change reads the body
in 8 KiB chunks until it is fully consumed.

Co-authored-by: Samori Gorse <samori@codeinstyle.io>
2020-10-21 16:38:03 +02:00
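The chunked-read pattern the commit describes can be sketched generically over any binary stream (the helper name is invented for illustration):

```python
import io

CHUNK_SIZE = 8 * 1024  # 8 KiB, the chunk size the commit describes

def iter_body(stream):
    # Yield fixed-size chunks instead of loading the whole body at once.
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            return
        yield chunk

body = io.BytesIO(b"x" * 20_000)
print([len(c) for c in iter_body(body)])  # [8192, 8192, 3616]
```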
Michael K
b96720390f
Switch from Travis CI to GitHub Actions (#336)
* Add GitHub Actions
* Remove Travis CI
* Remove known first party from isort's config
2020-10-20 16:44:54 +02:00
Sergey Klyuykov
d0e841b41d Added support for executing from interpreter.
To run daphne from the Python interpreter as a module, use:
`python -m daphne [all daphne arguments]`
2020-10-16 08:16:34 +02:00
Carlton Gibson
9838a173d7 Releasing 2.5.0 2020-04-15 20:26:53 +02:00
Carlton Gibson
c3b88e5639 Corrected ignore pattern. 2020-04-15 20:12:40 +02:00
Chris Barber
1765187a17 Fixed race-condition with TestApplication pickle file. 2020-04-15 20:07:11 +02:00
Carlton Gibson
5cf15bd636 Set event loop policy on Windows with Python 3.8+. 2020-04-15 20:07:11 +02:00
Carlton Gibson
d689ca2eab
Updated git ignore with common files. (#316)
* Pyenv
* Pytest
* VS Code.
2020-04-13 16:38:24 +02:00
Michael
59b57a9f4b
Simplify travis config (#295) 2020-02-05 21:05:51 +01:00
LittlePony
61c8633c5d
Add logger traceback on application error. (#308) 2020-02-05 20:40:44 +01:00
Carlton Gibson
18f2d67f34 Releasing 2.4.1 2019-12-18 20:50:12 +01:00
Carlton Gibson
27f760a814 Avoid Twisted using the default event loop
When switching threads, e.g. when run via Django auto-reloader, the default run loop causes issues detecting async contexts.
Fixes https://github.com/django/channels/issues/1374
2019-12-18 20:50:12 +01:00
Carlton Gibson
eb582d1d43 Releasing 2.4.0 2019-11-20 20:41:07 +01:00
Carlton Gibson
beb836acce Remove macOS Travis build.
Travis' infrastructure is just too slow.
2019-11-20 20:27:00 +01:00
Carlton Gibson
4b7c027b98 Add testing against Python 3.8. 2019-11-20 20:27:00 +01:00
Carlton Gibson
a4efcd5c1d Reduced macOS Travis builds to single env.
Slow, and not any benefit in multiple runs.
2019-11-20 20:27:00 +01:00
Carlton Gibson
78be865eb4
Fixed #276 -- Ensured 500 response when app sends malformed headers. (#281) 2019-11-14 07:13:16 +01:00
Joonhyung Shin
7032f8e0f8 Resolve asyncio + multiprocessing problem when testing. (#247) 2019-11-06 19:51:00 +01:00
Carlton Gibson
d3630e0925
Pin hypothesis at 4.2.3 (#283)
https://travis-ci.org/django/daphne/jobs/595912612

Requirement already satisfied: attrs>=17.4.0 in /home/travis/virtualenv/python3.6.7/lib/python3.6/site-packages (from twisted) (18.2.0)

hypothesis 4.40.0 has requirement attrs>=19.2.0, but you'll have attrs 18.2.0 which is incompatible.
2019-10-10 05:05:08 +02:00
Simon Willison
333f4644d1 Added support for raw_path in scope. (#268)
As per https://github.com/django/asgiref/pull/92

Required valid URI path fragments to be used in tests:
- Test case must ensure paths are correctly quoted before calling
   run_daphne_request() & co.

Co-authored-by: Carlton Gibson <carlton.gibson@noumenal.es>
2019-07-03 20:22:03 +02:00
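Per the asgiref change referenced above, `scope["path"]` carries the percent-decoded path while `raw_path` preserves the original encoded bytes. A minimal illustration of the distinction:

```python
from urllib.parse import unquote

# Illustration of the spec distinction: "path" is percent-decoded text,
# "raw_path" keeps the original request-target bytes.
raw_path = b"/hello%20world"
path = unquote(raw_path.decode("ascii"))
print(path)      # /hello world
print(raw_path)  # b'/hello%20world'
```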
Simon Willison
ffd949f2ce Fix deprecated regex escape sequence. (#266) 2019-06-17 10:21:24 +02:00
Mario Rodas
f46c2448b1 Added compatibility for hypothesis 4 (#261)
hypothesis `average_size` argument was already deprecated [1], and was
effectively removed in hypothesis 4 [2].

[1] https://github.com/HypothesisWorks/hypothesis/pull/1162
[2] https://hypothesis.readthedocs.io/en/latest/changes.html#v4-0-0
2019-06-17 10:10:14 +02:00
Alan Rominger
a3494215cf Require installing Twisted TLS extras. (#257) 2019-04-13 15:04:56 +02:00
d.s.e
0be1d3ef01 Added missing LICENSE to distribution (#250)
Signed-off-by: Guenther Meyer <d.s.e@sordidmusic.com>
2019-04-13 12:09:29 +02:00
Carlton Gibson
1759643f1f Releasing 2.3.0 2019-04-09 11:42:31 +02:00
Tom Christie
f52960e587 Support ASGI3 (#255) 2019-04-09 11:36:18 +02:00
Andrew Godwin
67cfa98b00 Fixing test dependencies to actual versions 2019-01-31 17:43:23 -08:00
Andrew Godwin
1b77e247f8 Releasing 2.2.5 2019-01-31 17:36:30 -08:00
Avinash Raj
9c574083ff Support for passing server name as cli argument (#231) 2018-12-28 13:42:39 +00:00
Florian Apolloner
cc344c6e34
Fix typo in changelog 2018-12-26 15:49:33 +01:00
Andrew Godwin
699f9dd4cb Set the websocket handshake from the connect time 2018-12-24 16:04:53 +00:00
Andrew Godwin
8515524c2b Releasing 2.2.4 2018-12-15 13:28:13 -08:00
Kyle Agronick
c4125c66d4 Only set disconnected time when it is not already set (#237)
Fixes a memory leak where the time would never expire, as well as an additional case where send is called on an already-cleaned-up instance.
2018-11-27 12:20:27 -08:00
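The guard described above — record the disconnect time only on the first call, so the expiry timer can actually fire — can be sketched like this (class and attribute names are assumptions, not Daphne's internals):

```python
import time

class AppInstance:
    # Sketch of the fix; names here are invented for illustration.
    def __init__(self):
        self.disconnected = None

    def mark_disconnected(self):
        # Only the first disconnect is recorded; resetting on repeat calls
        # would keep the timestamp fresh forever (the leak this fixes).
        if self.disconnected is None:
            self.disconnected = time.time()

app = AppInstance()
app.mark_disconnected()
first = app.disconnected
app.mark_disconnected()  # a repeat call must not reset the timestamp
print(app.disconnected == first)  # True
```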
Sylvain Prat
de15dcb4d1 Fixed #234: Don't listen on port 8000 when provided a file descriptor 2018-11-19 14:13:03 -08:00
Andrew Godwin
5722d4e7ea Releasing 2.2.3 2018-11-06 10:27:18 -08:00
László Károlyi
20f2bc93d4 Add command-line options for proxy headers 2018-10-26 12:34:15 -07:00
Imblc
e93643ff5a Fixed #229: Allow bytes headers only
Previously Daphne was too lax and would happily accept strings too.
2018-09-28 09:45:03 -07:00
Andrew Godwin
3e4aab95e2 Fix Travis release stage 2018-08-29 17:57:06 -07:00
Andrew Godwin
c5554cb817 Tidying up 2018-08-27 14:21:40 +10:00
Andrew Godwin
02a299e5a7 Fix isort in travis 2018-08-27 12:40:51 +10:00
Andrew Godwin
460bdf64db Only lint the daphne and tests directories 2018-08-27 12:31:54 +10:00
Andrew Godwin
0ed6294406 Implement Black code formatting 2018-08-27 12:29:57 +10:00
Michael
88792984e7 Run tests against Python 3.7 (#224) 2018-08-25 09:46:04 +10:00
Andrew Godwin
47358c7c79 Releasing 2.2.2 2018-08-16 21:34:50 -07:00
Andrew Godwin
5fe47cbbed
Add an issue template 2018-08-09 11:36:22 -07:00
Nick Sellen
2f94210321 Add x-forwarded-proto support (#219) 2018-07-24 13:25:03 -07:00
Anders Jensen
adb622d4f5 Removed deferToThread for ASGI instance constructor (#218)
The previous behaviour was from an older spec.
2018-07-22 09:54:42 -07:00
Andrew Godwin
e16b58bcb5 Releasing 2.2.1 2018-07-22 09:47:14 -07:00
Brian May
d5611bccb6 Don't crash if connection closed before application started (#213)
Fixes #205.
2018-07-05 18:26:34 -07:00
Andrew Godwin
6dcc0d52b3 send() should not block once connection is closed 2018-06-24 16:33:54 -07:00
Andrew Godwin
bb54f41736 Releasing 2.2.0 2018-06-13 11:55:20 -07:00
Andrew Godwin
ece52b8e79 Don't try and read requests that are closed already (#205) 2018-06-02 06:45:02 +01:00
Andrew Godwin
8c031239ad Remove HTTP timeout by default, and mid-response error for it 2018-05-30 09:52:47 -07:00
Andrew Godwin
84466d4ae4 Fixed #207: Do header transforms for WebSocket XFF right 2018-05-26 12:16:07 +02:00
Andrew Godwin
9f7e19cf2d Use clean headers to fix decoding issues 2018-05-25 15:11:09 +02:00
Andrew Godwin
fa3c764433 Fixed #206: Check applications exist before timing them out 2018-05-25 12:33:46 +02:00
Andrew Godwin
c4360fd70a Releasing 2.1.2 2018-05-24 14:15:56 +02:00
Andrew Godwin
f046a35dbc Only validate header names (values are already done) 2018-05-24 12:43:18 +02:00
Andrew Godwin
b3c097aabd Enforce that header names and values are bytes 2018-05-24 12:31:18 +02:00
Andrew Godwin
dd2c8b2a0f Don't try to send disconnect when we never made an app instance 2018-05-03 09:47:12 -07:00
Andrew Godwin
097f3ba8e8 Releasing 2.1.1 2018-04-18 10:59:25 -07:00
Andrew Godwin
a7ccfab495 Run server constructor in a threadpool as it's synchronous 2018-04-18 10:57:58 -07:00
Andrew Godwin
cc6af549a6 Releasing 2.1.0 2018-03-05 20:43:48 -08:00
Andrew Godwin
446fc69408 Fixed #150: Correctly handle bad querystrings 2018-03-04 09:48:33 -08:00
Andrew Godwin
388bbc5c24 Accept ws_protocols for now but ignore the contents 2018-02-24 10:47:09 -08:00
Andrew Godwin
4eb6cab9aa Fix #180: asgiref is not a required dependency 2018-02-24 10:45:04 -08:00
Andrew Godwin
f877d54942 Remove subprotocol support (handled by apps now) 2018-02-23 16:53:25 -08:00
Andrew Godwin
9b3e2b4b28 Releasing 2.0.4 2018-02-21 22:04:26 -08:00
Andrew Godwin
0a2c6c2ff2 Fix #175: Check finished as well as channel 2018-02-21 09:50:59 -08:00
Andrew Godwin
853771ec95 Move testing to use multiprocessing for better reliability
We can also hopefully reuse this for LiveServerTestCase
2018-02-19 20:58:47 -08:00
Andrew Godwin
173617ad3b Fixed #172: Outgoing frames do not reset ping clock (incoming does) 2018-02-16 09:56:40 -08:00
Andrew Godwin
de0811f13e Fixed #169: Don't try to send messages to a closed client 2018-02-14 14:52:49 -08:00
Andrew Godwin
f53eb0dda6 Don't put commas in the header hypothesis tests 2018-02-07 14:15:28 -08:00
Andrew Godwin
12437e2677 Releasing 2.0.3 2018-02-07 12:11:57 -08:00
Andrew Godwin
13511d2ca6 Fixed #162: Test suite now uses port 0 binding 2018-02-07 12:03:54 -08:00
Andrew Godwin
678a97ec7f Fixed #152: Give ASGI apps a grace period after close before killing
Also adds a warning when they do not correctly exit before killing.
2018-02-07 12:02:30 -08:00
Jonas Lidén
d46429247f Unix socket fix (#161)
Fix error on listeners when passing a unix socket
2018-02-06 00:04:44 -08:00
Andrew Godwin
3bffe981f6 Releasing 2.0.2 2018-02-04 12:22:13 -08:00
Andrew Godwin
826a8ce0de Better Twisted reactor detection 2018-02-04 12:18:44 -08:00
Andrew Godwin
0f8f731b2c Rename Travis CI stage to "Release" 2018-02-04 12:09:26 -08:00
Andrew Godwin
105e1d5436 Don't apply HTTP timeout to WebSocket connections! 2018-02-04 12:08:57 -08:00
Andrew Godwin
6eeb280e1b Put a last line for the warning traceback that's not the string 2018-02-03 23:38:17 -08:00
Andrew Godwin
bb4d46f09c Fix string concatenation (duh) 2018-02-03 23:29:37 -08:00
Andrew Godwin
7949b244b8 Try to uninstall previous reactors if they're found 2018-02-03 22:57:15 -08:00
Andrew Godwin
3b5721c699 Move deploy to second stage 2018-02-03 22:56:24 -08:00
Andrew Godwin
06a3727f8b Releasing 2.0.1 2018-02-03 12:35:42 -08:00
Andrew Godwin
981b6988db Add auto-release for Travis 2018-02-03 12:31:12 -08:00
Andrew Godwin
3b2fb6f78e Use loggers rather than the logging module directly
Refs django/channels#846
2018-02-02 20:08:33 -08:00
Artem Malyshev
990656a36d Correct project name in the README. 2018-02-02 14:20:30 +03:00
Andrew Godwin
f18078e53d Remove Python 2.7 classifiers 2018-02-01 23:20:39 -08:00
Andrew Godwin
66d20c2563 Releasing 2.0.0 2018-02-01 21:27:50 -08:00
Andrew Godwin
eb7468059d Make Daphne process tests try harder. 2018-02-01 21:22:15 -08:00
Andrew Godwin
0572b1dbcd Further improve flaky header tests 2018-02-01 21:18:13 -08:00
Andrew Godwin
db68c43de1 Fix imports and use of six 2018-02-01 21:02:37 -08:00
Andrew Godwin
51c2de3f8c Fix header matching in websocket tests 2018-02-01 21:02:27 -08:00
Andrew Godwin
228142cab5 Merge branch 'master' into 2.0 2018-02-01 20:35:43 -08:00
Andrew Godwin
e10b72f33f Use plain reST, not Sphinx reST, in the README 2018-02-01 20:33:55 -08:00
Andrew Godwin
bc9400322d Update readme to include 1.x note 2018-02-01 20:32:19 -08:00
Andrew Godwin
b287a74236 Make test port selection less flaky 2018-02-01 20:32:08 -08:00
Andrew Godwin
9460cc166f Allow listening on port 0 and add hooks to get that port out on start
Used in the ChannelsLiveServerTestCase
2018-02-01 20:12:56 -08:00
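Binding to port 0 asks the OS for any free port, and reading it back with `getsockname()` is the hook a test harness uses to learn the real port. A minimal demonstration with a plain socket:

```python
import socket

# Port 0 means "OS, pick a free port"; getsockname() reports the one chosen.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("127.0.0.1", 0))
port = sock.getsockname()[1]
print(0 < port < 65536)  # True
sock.close()
```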
Andrew Godwin
691151b097 Releasing 1.4.2 2018-01-05 00:41:58 -08:00
Andrew Godwin
f335232373 Fix poorly captured second argument on Python 2 2018-01-05 00:39:46 -08:00
Andrew Godwin
0c633fa968 Releasing 1.4.1 2018-01-02 13:31:41 -08:00
Andrew Godwin
3fcfe45e84 Add missing proto header argument to HTTPFactory 2018-01-02 13:29:03 -08:00
Andrew Godwin
13e7804187 Releasing 1.4.0 2018-01-02 11:39:58 -08:00
Andrew Godwin
26fa870540 Move to "body" everywhere in HTTP messages 2017-11-29 21:27:24 -08:00
Andrew Godwin
44c1d0905f Update version and README 2017-11-29 00:12:14 -08:00
Andrew Godwin
3358767814 Increase hypothesis deadlines for slower systems/Travis 2017-11-29 00:03:29 -08:00
Andrew Godwin
29db466c48 Oldest supported twisted version is 17.5 now. 2017-11-28 23:57:44 -08:00
Andrew Godwin
b55fc382e8 Tox is no longer needed for tests. 2017-11-28 23:49:51 -08:00
Andrew Godwin
b04e6a5a64 Ignore eggs from flake8 2017-11-28 23:47:11 -08:00
Andrew Godwin
7f5fe7370f Add flake8 linting 2017-11-28 23:42:35 -08:00
Andrew Godwin
08e7841718 Fix import ordering 2017-11-28 18:03:29 -08:00
Andrew Godwin
03aa8548fe Stop using tox and start linting 2017-11-28 17:59:59 -08:00
Andrew Godwin
a57ef2fa54 Detect listening failures 2017-11-28 17:38:22 -08:00
Andrew Godwin
20ff8fec28 Match to the new ASGI-HTTP spec. 2017-11-28 17:28:35 -08:00
Andrew Godwin
7fb3e9a167 Clean up comments, names and imports 2017-11-27 00:02:37 -08:00
Andrew Godwin
567c27504d Add websocket tests to make sure everything important is covered. 2017-11-27 00:00:34 -08:00
Andrew Godwin
1ca1c67032 Add HTTP response test suite 2017-11-26 00:06:23 -08:00
Andrew Godwin
e0e60e4117 Full HTTP request test suite 2017-11-25 23:19:27 -08:00
Andrew Godwin
b3115e8dcf Start fixing travis config 2017-11-25 18:35:12 -08:00
Andrew Godwin
b72349d2c1 HTTP protocol tests 2017-11-25 18:23:54 -08:00
Andrew Godwin
0626f39214 Unify all strings to double quotes 2017-11-25 13:41:38 -08:00
Andrew Godwin
22aa56e196 Start on fixing tests 2017-11-25 13:39:46 -08:00
Andrew Godwin
017797c05b Change to scope-based code 2017-11-12 16:32:30 -08:00
Thomas Steen Rasmussen
f9233d4b47 Make sure headers are always correctly encoded
WebSocket headers were not correctly encoding as bytestrings.
2017-10-12 11:06:18 -07:00
Buky
d24fd06460 Update Readme.rst (#138)
Fix pip command line
2017-09-12 10:22:14 -07:00
Andrew Godwin
01f174bf26 Trying out asyncio based interface 2017-09-07 21:24:14 -07:00
Nick Sellen
05bd4ac258 Parse X-Forwarded-Proto header (#136)
Adds the ability to use this header for HTTPS detection.
2017-08-25 10:24:24 -07:00
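The detection the commit adds can be sketched as: trust the `X-Forwarded-Proto` header set by a TLS-terminating proxy when deciding between http and https (function name invented; headers modeled as lowercased byte pairs as in ASGI scopes):

```python
def detect_scheme(headers, default="http"):
    # Sketch: a TLS-terminating proxy sets X-Forwarded-Proto; use it for
    # HTTPS detection, falling back to the direct scheme otherwise.
    for name, value in headers:
        if name.lower() == b"x-forwarded-proto":
            return value.decode("ascii")
    return default

print(detect_scheme([(b"x-forwarded-proto", b"https")]))  # https
print(detect_scheme([]))                                  # http
```

Note that in production this header should only be trusted when the request actually came from the proxy.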
Artem Malyshev
3161715238 Log interface listener errors.
Fix #133.
2017-08-13 19:18:54 +03:00
Andrew Godwin
a656c9f4c6 Initial refactor to get HTTP working in new style 2017-08-07 14:15:35 +10:00
Tom Turner
a69d69490b Removed an unused import (urlencode) (#131) 2017-07-30 22:28:08 +12:00
Adam Johnson
79927fbe10 Travis - test on Trusty (#129)
As per [their blog post](https://blog.travis-ci.com/2017-07-11-trusty-as-default-linux-is-coming) they're making it the new default, best to be ahead of the curve.
2017-07-29 14:42:50 +12:00
Andrew Godwin
f3b5d854ca Fix slow hypothesis test 2017-07-24 10:57:47 -07:00
Andrew Godwin
8fd8f794a4 Fixed #128: Could not use file descriptor 0 2017-07-24 10:41:54 -07:00
Andrew Godwin
cf7e7f3924 Fix flaky test that times out on Python 3.5 only 2017-06-16 10:50:20 +08:00
Andrew Godwin
fbb080c600 Releasing 1.3.0 2017-06-16 10:37:30 +08:00
Andrew Godwin
3b4801527d Fixed #123: Add default websocket timeout. 2017-06-12 10:06:54 +08:00
ElRoberto538
c5385fb253 Added websocket_handshake_timeout option/server param
Added an optional parameter to Server and HTTPFactory to allow Autobahn openHandshakeTimeout to be overridden.
2017-06-02 16:08:58 -07:00
Eric Menendez
4ff2384337 Set HTTP Server header to "Daphne" to avoid revealing Autobahn version number. (#122) 2017-05-30 15:56:16 -07:00
Camilo Nova
63d2166320 Fix typo (#120) 2017-05-26 15:13:27 -07:00
John Miller
3683e71afc Fix Minor Typo in CLI Help Message, "WeSocket"->"WebSocket" (#117) 2017-05-23 18:19:11 -07:00
Andrew Godwin
eb195c6004 Don't break if protocol is removed before it's put into reply_protocols 2017-05-15 09:39:03 -07:00
Maik Hoepfel
8787c1dfe2 Check query string for spec conformance again (#112)
This check was skipped because of
https://github.com/django/daphne/issues/110. As this issue is now fixed,
we can re-enable the check again.
2017-05-02 10:23:49 -07:00
Andrew Godwin
a59bd123a9 Fix ws query test 2017-04-29 19:14:25 -07:00
Andrew Godwin
6318bae452 Fixed #110: Use raw WS query string rather than reconstructing it 2017-04-29 19:09:07 -07:00
Maik Hoepfel
2bcec3fe94 Websockets test and unicode fix for Python 2 (#111)
* Python 2 fix for host address

This is a copy of
57051a48cd
for the Websocket protocol.

In Python 2, Twisted returns a byte string for the host address, while
the spec requires a unicode string. A simple cast gives us consistency.

* Test suite for websocket tests

This commit

* introduces some new helpers to test the Websocket protocol
* renames the old ASGITestCase class to ASGIHTTPTestCase, and
  introduces a test case for testing Websockets
* moves some helper methods that are shared between HTTP and Websockets
  into a mutual base class
* uses the new helpers to simplify the existing tests
* and adds a couple new tests.
2017-04-28 14:45:07 -07:00
Andrew Godwin
bd03fabce6 Don't log discarded old protocol messages as ERROR 2017-04-04 10:51:42 +02:00
Yoan Blanc
382318b6d2 Run tox tests from travis. (#104) 2017-04-03 15:49:08 +02:00
Andrew Godwin
46656aad24 Releasing version 1.2.0
Includes some test fixes for the new reply channel style.
2017-04-01 15:26:57 +01:00
Andrew Godwin
cddb0aa89e Fix unicode channel names in Python 2 2017-03-28 11:33:53 -07:00
Andrew Godwin
2d2a0cf822 Fix unicode-ness of http version 2017-03-28 11:28:59 -07:00
Andrew Godwin
4a764f7966 Try to handle HTTP version and SSL correctly (plus better errors) 2017-03-28 11:23:51 -07:00
Andrew Godwin
3937489c4a Update Daphne for new process-local channel style 2017-03-27 19:49:50 -07:00
Andrew Godwin
bd9b8d0068 Improve accept flow handling to allow accept: False and match spec 2017-03-27 10:02:35 -07:00
Raul
9f4f057e4c Support websocket options for cli and infinite time to timeouts (#99)
* add websocket timeout and websocket connection timeout to the CLI. Add support for infinite time to websocket timeout and websocket connection timeout

* change test
2017-03-24 20:47:59 -07:00
Maik Hoepfel
d68461920f HTTP responses: two fixes and some tests (#96)
* Fix: Always call Request.write()

The spec says 'content' is an optional key, defaulting to b''.
But before this commit, if 'content' wasn't specified, Request.write()
was not called. In conjunction with setting 'more_content' to True,
this would result in nothing being written on the transport. If
'content' was set to b'' instead, the HTTP preamble and any headers were
written as expected. That smells like a bug, so I'm making sure we're
always calling Request.write().

* Require status key in first message to response channel

Previous to this commit, it was possible to not pass in a 'status' key.
This would result in any passed in headers being ignored as well.

Instead of relying on user data ('status' being present or not), this
commit now enforces that the first message to a response channel is
indeed an HTTP Response-style message, and hence contains status. It will
complain loudly if that isn't the case.

* Helper for getting HTTP Response for a given channel message

To test Daphne's message-to-HTTP part, we need an easy way to fetch the
HTTP response for a given response channel message. I borrowed the
approach from Andrew's existing code. I feel like we might be able to do
with less scaffolding at some point, but didn't have time to
investigate. It's good enough for now.

* Add assert method to check a response for spec conformance

Similarly to the method for checking HTTP requests for spec conformance,
we're adding a method to do the same for HTTP responses. This one is a bit
less exciting because we're testing raw HTTP responses.

* Add Hypothesis tests for HTTP responses

Similarly to what I did for HTTP requests, this commit adds a couple
test that try to check different parts of the ASGI spec. Because going
from message to HTTP response is more straightforward than going from
HTTP request to channel message, there's not a whole lot going on here.
2017-03-22 15:55:28 -07:00
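The fix described above (treat 'content' as optional with a ``b""`` default, require 'status' in the first message, and always call ``write()``) can be sketched roughly as follows; the class and function names are illustrative stand-ins, not Daphne's actual code:

```python
# Minimal sketch of the behaviour described in the commit above:
# 'content' is optional and defaults to b"", and write() is always
# called so the HTTP preamble and headers are flushed even for empty bodies.

class FakeTransport:
    """Stand-in for Twisted's Request/transport pair, for illustration only."""

    def __init__(self):
        self.written = []

    def write(self, data):
        self.written.append(data)


def handle_response_chunk(transport, message):
    # The first message on a response channel must contain 'status';
    # complain loudly if it doesn't.
    if "status" not in message:
        raise ValueError("First message on a response channel must contain 'status'")
    # 'content' is an optional key per the spec, defaulting to b"".
    content = message.get("content", b"")
    # Always call write(), even for b"", so headers actually get sent.
    transport.write(content)


transport = FakeTransport()
handle_response_chunk(transport, {"status": 200})
print(transport.written)  # [b'']
```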
Andrew Godwin
04118cab7e Releasing 1.1.0 2017-03-18 12:38:52 -07:00
Andrew Godwin
2edfe5d7d5 Ah yes, Twisted 17 releases start at 17.1 2017-03-18 12:32:04 -07:00
Andrew Godwin
fa2841c101 Update the other things mentioning Twisted 16.0 2017-03-18 12:30:20 -07:00
Andrew Godwin
5eff45482a Update tox config for Twisted release range 2017-03-18 12:28:02 -07:00
Andrew Godwin
ea7544d8b7 Update README with HTTP/2 details 2017-03-18 12:25:56 -07:00
Andrew Godwin
a925ce32cd Add in HTTP/2 support with right deps and log info 2017-03-18 12:10:20 -07:00
Cory Benfield
e6e4240c0e Implement IProtocolNegotiationFactory. 2017-03-18 10:48:07 -07:00
Cory Benfield
d3f26a6bf2 Add a custom override of buildProtocol. 2017-03-18 10:48:07 -07:00
Andrew Godwin
d86d7dd3c4 Fixed #93: Don't try to send disconnect if it never connected 2017-03-16 19:06:11 -07:00
Artem Malyshev
3cd048d594 Store endpoint listen results. (#92)
* Store endpoint listen results.

* Rename ports to listeners.
2017-03-16 19:04:02 -07:00
Maik Hoepfel
7f92a48293 Full test suite for HTTP requests (#91)
* Add Hypothesis for property-based tests

Hypothesis:
"It works by letting you write tests that assert that
something should be true for every case, not just the ones you happen to
think of."

I think it's well suited for the task of ensuring Daphne conforms to the
ASGI specification.

* Fix accidental cast to byte string under Python 2

While grepping for calls to str(), I found this bit which looks like a
cast to unicode was intended under Python 2.

* ASGITestCase - checking channel messages for spec conformance

This commit introduces a new test case class, with its main method
assert_valid_http_request_message. The idea is
that this method is a translation of the ASGI spec to code, and can be
used to check channel messages for conformance with that part of the
spec.

I plan to add further methods for other parts of the spec.

* Add Hypothesis strategies for generating HTTP requests

Hypothesis, our framework for test data generation, contains only
general so-called strategies for generating data. This commit adds a few
which will be useful for generating the data for our tests.

Also see http://hypothesis.readthedocs.io/en/latest/data.html.

* Add and convert tests for HTTP requests

This commit introduces a few Hypothesis tests to test the HTTP request
part of the specification. To keep things organized, I split the
existing tests module into two: one concerned with requests, and one
concerned with responses. I anticipate that we'll also add modules for
chunks and server push later.

daphne already had tests for the HTTP protocol. Some of them I converted
to Hypothesis tests to increase what was tested. Some were also
concerned with HTTP responses, so they were moved to the new response
module. And three tests were concerned with proxy behaviour, which I
wasn't sure about, and I just kept them as-is, but also moved them
to the request tests.

* Fix byte string issue in Python 2

Twisted seems to return a byte string for the client and server IP
address. It is easily rectified by casting to the required unicode
string. Also added a test to ensure this is also handled correctly in
the X-Forwarded-For header parsing.

* Check order of header values

I'm in the process of updating the ASGI spec to require that the order
of header values is kept. To match that work, I'm adding matching
assertions to the tests.

The code unfortunately is not as elegant as I'd like, but then it's only
a result of the underlying HTTP spec.

* Suppress warning about slow test

The kitchen sink test expectedly can be slow. So far it wasn't slow
enough for hypothesis to trigger a warning, but today Travis must've had
a bad day. I think for this test it is acceptable to exceed Hypothesis'
warning threshold.
2017-03-14 14:12:07 -07:00
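The spec-to-code assertion helper described above can be sketched in plain Python. The checks below are a simplified, illustrative approximation of an ``assert_valid_http_request_message``-style method, not Daphne's actual test suite (Hypothesis, which drives the real tests with generated data, is omitted here):

```python
# Illustrative sketch of a conformance check translating part of the
# (ASGI v1-era) http.request message format into assertions.

def assert_valid_http_request_message(message):
    # Required keys and their expected shapes.
    assert message["http_version"] in ("1.0", "1.1", "2")
    assert isinstance(message["method"], str) and message["method"].isupper()
    assert isinstance(message["path"], str)
    # Headers are [name, value] pairs of byte strings with lowercased names.
    for name, value in message["headers"]:
        assert isinstance(name, bytes) and name == name.lower()
        assert isinstance(value, bytes)


assert_valid_http_request_message(
    {
        "http_version": "1.1",
        "method": "GET",
        "path": "/chat/",
        "headers": [[b"host", b"example.com"]],
    }
)
print("message conforms")
```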
Andrew Godwin
c55bc8a94b Fixed #90: X-Forwarded-For now does v6 address properly
It also now ignores ports, as I can't find a good example of them being put into the XFF header.
2017-02-25 18:18:17 -08:00
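The behaviour described above (pick the leftmost X-Forwarded-For entry, tolerate IPv6, and ignore any port) can be sketched as below; this is an illustrative approximation, not Daphne's actual header-parsing code:

```python
# Rough sketch: the leftmost entry is the original client, later entries
# are relays; an optional port is stripped for both IPv4 and bracketed IPv6.

def leftmost_forwarded_for(header_value):
    first = header_value.split(",")[0].strip()
    if first.startswith("["):
        # Bracketed IPv6 with port, e.g. "[2001:db8::1]:443".
        first = first[1 : first.index("]")]
    elif first.count(":") == 1:
        # Exactly one colon means IPv4 with a port, e.g. "1.2.3.4:8000";
        # a bare IPv6 address has several colons and is left untouched.
        first = first.split(":")[0]
    return first


print(leftmost_forwarded_for("10.0.0.1:8000, 10.0.0.2"))      # 10.0.0.1
print(leftmost_forwarded_for("[2001:db8::1]:443, 10.0.0.2"))  # 2001:db8::1
print(leftmost_forwarded_for("2001:db8::1"))                  # 2001:db8::1
```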
Andrew Godwin
360a445f68 Fix missed test 2017-02-16 10:18:32 -08:00
Andrew Godwin
b8c96d7fb2 Fixed #86: Use left-most X-Forwarded-For value. 2017-02-16 10:01:27 -08:00
NotSqrt
80bacf1ea1 Handle both dicts and twisted Headers (#84)
Fix #78
2017-02-14 18:15:00 -08:00
Andrew Godwin
412d9a48dc Releasing version 1.0.3 2017-02-12 22:44:38 -08:00
Andrew Godwin
b65140b158 Fix WebSockets to work with Twisted 17.1
Underlying PR that broke things: https://github.com/twisted/twisted/pull/591
We're relying on a private API so this is not really Twisted's fault.
2017-02-12 18:25:32 -08:00
Andrew Godwin
9853bf1740 Remove call to receive_many 2017-02-11 16:29:04 -08:00
Andrew Godwin
ecf88ee72a Fix broken proxy util code 2017-02-11 16:27:29 -08:00
Andrew Godwin
7d1123d39a Further fix for #78 and the shared util function 2017-02-11 06:59:43 -08:00
Gennady Chibisov
630609fce7 IPV6 interface binding (#80) 2017-02-10 18:24:50 -08:00
Andrew Godwin
4d23655b5c Fixed #78: Use right variable for WS headers 2017-02-10 09:52:16 -08:00
Andrew Godwin
60952b34bf Releasing version 1.0.2 2017-02-01 12:09:21 -08:00
Maik Hoepfel
cf94ec01fa Test against Python 3.4 and multiple Twisted versions (#75)
* Test against Python 3.4 and multiple Twisted versions

This commit adds tox to be able to test against different dependencies
locally. We agreed that Python 3.4 should be supported across all Channels
projects, so it is also added with this commit.

Furthermore, I think it makes sense to support a broad range of Twisted
releases, as users of daphne are not unlikely to have other Twisted code
running. It's not feasible to test against all releases since 16.0, and
it would require constant maintenance to add new releases as they come
out. So I opted to keep things simple for now, and only test against the
oldest supported and the current Twisted release.

I did consider @jpic's great idea from
https://github.com/django/daphne/pull/19 to just use tox to avoid having
to duplicate the dependency matrix. But it does lead to slower test runs
as it bypasses Travis' caching, and is slightly more verbose.

* Require asgiref 1.0 and use receive instead of receive_many

As both daphne and asgiref had a 1.0 release, I think it makes sense to
require the presumably more stable asgiref 1.0. It's also a good
occasion to fix the deprecation warnings when running the tests by
switching to receive instead of receive_many.

* Document supported Python and Twisted versions
2017-01-30 17:24:17 -08:00
David Marquis
07dd777ef1 Fix #72: Allowing null origins for Cordova/PhoneGap WS clients (#73)
* Fix #72: Configuring the underlying Autobahn Websocket connection factory so that 'null' origins are accepted, effectively allowing connections from Cordova/PhoneGap clients

* Fix #72: Adding test for no origin header use case
2017-01-30 10:18:11 -08:00
David Marquis
ccc804fd6a Tweaked TestWebSocketProtocol class documentation (copy/paste typo) (#74) 2017-01-30 10:17:49 -08:00
Хасанов Булат
71f6052fb3 Remove domain from file descriptor in serverFromString (#68) 2017-01-23 11:22:10 -08:00
Andrew Godwin
7597576db2 Add License 2017-01-23 10:09:15 -08:00
Andrew Godwin
c69bf09b7c Releasing 1.0.1 2017-01-09 18:28:52 -08:00
Andrew Godwin
3d628ba941 Fixed channels/#470: Bad python 2 handling of endpoint string 2017-01-09 18:27:37 -08:00
Andrew Godwin
aa5e7dd48f Releasing 1.0.0 2017-01-08 17:14:28 -08:00
Andrew Godwin
fd83678276 Merge pull request #37 from mcallistersean/ticket_10
use twisted endpoint description strings to bind to ports and sockets
2016-11-14 09:29:32 -08:00
Sean Mc Allister
b1c238d377 fix tests 2016-11-14 11:04:09 +01:00
Sean Mc Allister
5ea0749c5f build_endpoint_description_string is now a normal function 2016-11-14 10:59:15 +01:00
Andrew Godwin
c6e9b5a3a1 Merge pull request #60 from hackedd/ws-tests
Add test for WebSocket connection.
2016-11-06 15:15:06 +01:00
Paul Hooijenga
b8c1fe54eb Add test for WebSocket connection. 2016-11-05 17:16:43 +01:00
Andrew Godwin
3e65675f78 Merge pull request #59 from raphaelm/patch-1
Fixed a regression introduced in fixing #55
2016-11-05 14:26:35 +01:00
Raphael Michel
2ffed27a77 Fixed a regression introduced in fixing #55 2016-11-05 14:18:54 +01:00
Sean Mc Allister
76b441ecdf Merge branch 'master' into ticket_10 2016-11-05 14:07:49 +01:00
Sean Mc Allister
a2458ac47c test mixed endpoints and some cleanup 2016-11-05 13:48:24 +01:00
Andrew Godwin
a672da7b21 Merge pull request #58 from rixx/
Respond with a code when closing a connection
2016-11-05 13:38:25 +01:00
Sean Mc Allister
f115f808c3 Merge branch 'master' into ticket_10 2016-11-05 12:38:03 +01:00
Tobias Kunze
c6e4ea25d1 Respond with a code when closing a connection
Regards django/channels#414
2016-11-05 12:27:16 +01:00
Andrew Godwin
dc98b09dfd Merge pull request #57 from raphaelm/issue55
Fix #55 -- Optionally parse X-Forwarded-For header
2016-11-05 12:06:44 +01:00
Raphael Michel
61ed32d500 Fix #55 -- Optionally parse X-Forwarded-For header 2016-11-05 11:45:48 +01:00
Sean Mc Allister
f9f799f75f Merge branch 'master' into ticket_10 2016-11-04 14:40:40 +01:00
Andrew Godwin
3c8c21b352 Merge pull request #54 from indevgr/access-log-buffering
Access log buffering
2016-10-25 09:16:23 -07:00
Stratos Moros
bd0530d147 make access log line buffered
fixes #53
2016-10-25 15:43:44 +03:00
Stratos Moros
cd8b709ce8 fix variable name 2016-10-25 15:43:38 +03:00
Sean Mc Allister
e38c7541da Merge remote-tracking branch 'upstream/master' into ticket_10 2016-10-21 17:51:52 +02:00
Andrew Godwin
5f6f14a8d1 Merge pull request #51 from mcallistersean/issue/50
log full uri for http response
2016-10-11 13:21:07 -07:00
Sean Mc Allister
3e2370b5f1 log full uri for http response 2016-10-11 17:09:34 +02:00
Andrew Godwin
b537bed180 Make accept silently pass if already accepted 2016-10-05 16:00:11 -07:00
Andrew Godwin
8c637ff728 Fix default verbosity 2016-10-05 14:46:54 -07:00
Andrew Godwin
685f3aed1e Switch to new explicit WebSocket acceptance 2016-10-05 13:45:12 -07:00
Andrew Godwin
1bdf2a7518 Merge pull request #48 from adamchainz/setup.py
Tidy up setup.py a bit
2016-09-22 14:30:40 -07:00
Adam Chainz
5a5fd08633 Tidy up setup.py a bit
* Remove unused import 'sys'
* Github is HTTPS
* Add some trove classifiers based upon Django's
2016-09-22 22:28:55 +01:00
Andrew Godwin
6df13290b2 Merge pull request #47 from Krukov/fix-285-channels-issue
Catching error at receive_many from channel layer
2016-09-22 14:27:20 -07:00
Andrew Godwin
bfce80a062 Merge pull request #45 from Krukov/patch-add-verbosity
Verbosity
2016-09-22 14:23:25 -07:00
Andrew Godwin
a83d2f41bd Merge pull request #46 from adamchainz/readthedocs.io
Convert readthedocs links for their .org -> .io migration for hosted projects
2016-09-22 14:04:36 -07:00
Krukov Dima
790c482cb6 Catching error at receive_many from channel layer 2016-09-21 18:24:05 +00:00
Krukov Dima
cf096bab5c Logging to the python standard library 2016-09-21 18:12:35 +00:00
Adam Chainz
bcaf1de155 Convert readthedocs links for their .org -> .io migration for hosted projects
As per [their blog post of the 27th April](https://blog.readthedocs.com/securing-subdomains/) ‘Securing subdomains’:

> Starting today, Read the Docs will start hosting projects from subdomains on the domain readthedocs.io, instead of on readthedocs.org. This change addresses some security concerns around site cookies while hosting user generated data on the same domain as our dashboard.

Test Plan: Manually visited all the links I’ve modified.
2016-09-21 08:43:24 +01:00
Krukov Dima
0844061296 Add verbosity to log twisted log 2016-09-16 20:36:31 +00:00
Krukov Dima
c6270e7e4e Pep8 2016-09-16 20:34:20 +00:00
Andrew Godwin
2176b209f7 Django-ification 2016-09-09 13:27:25 +01:00
Andrew Godwin
8662b02daf Add maintenance and security README 2016-09-09 12:52:12 +01:00
Andrew Godwin
c0919a1cd0 Merge pull request #41 from damycra/303-attach-path-to-http-disconnect
Attach path to http.disconnect
2016-08-28 19:59:57 -07:00
Steven Davidson
edb67cac74 Attach path to http.disconnect
https://github.com/andrewgodwin/channels/issues/303
2016-08-29 00:03:42 +01:00
Andrew Godwin
1a5cce9c75 Releasing 0.15.0 2016-08-28 11:27:05 -07:00
Andrew Godwin
0b37e80614 Add attribute check for #31 and remove version pin 2016-08-28 11:14:13 -07:00
Andrew Godwin
c8883eea50 Merge pull request #40 from bpeschier/master
Tell Twisted to keep producing data after connection upgrade
2016-08-28 10:52:05 -07:00
Bas Peschier
65da239aa2 Tell Twisted to keep producing data after connection upgrade 2016-08-28 14:39:08 +02:00
Sean Mc Allister
ea399ecc1b updated README with an example of endpoint string usage 2016-08-12 10:19:41 +02:00
Sean Mc Allister
c3585c463b py2 compatibility 2016-08-11 22:36:57 +02:00
Sean Mc Allister
95351ffebb use twisted endpoint description strings to bind to ports and sockets 2016-08-11 17:52:27 +02:00
Andrew Godwin
fca52d4850 Correctly catch send dispatch errors 2016-08-05 22:17:38 -07:00
Andrew Godwin
c71a035004 Merge pull request #34 from globophobe/master
Implement connection force-close via ping timeout
2016-08-04 15:40:01 -07:00
globophobe
9a2748d7da Implement connection force-close via ping timeout
Logging will show WSDISCONNECT.
2016-08-04 20:26:57 +09:00
Andrew Godwin
dd3bf9b0b0 Use twisted variant of receive_many if available. 2016-07-26 20:00:11 +01:00
42 changed files with 4140 additions and 580 deletions

11
.flake8 Normal file

@ -0,0 +1,11 @@
[flake8]
exclude =
.venv,
.tox,
docs,
testproject,
js_client,
.eggs
extend-ignore = E123, E128, E266, E402, W503, E731, W601, B036
max-line-length = 120

14
.github/ISSUE_TEMPLATE.md vendored Normal file

@ -0,0 +1,14 @@
Issues are for **concrete, actionable bugs and feature requests** only - if you're just asking for debugging help or technical support we have to direct you elsewhere. If you just have questions or support requests please use:
- Stack Overflow
- The Django Users mailing list django-users@googlegroups.com (https://groups.google.com/forum/#!forum/django-users)
We have to limit this because of limited volunteer time to respond to issues!
Please also try and include, if you can:
- Your OS and runtime environment, and browser if applicable
- A `pip freeze` output showing your package versions
- What you expected to happen vs. what actually happened
- How you're running Channels (runserver? daphne/runworker? Nginx/Apache in front?)
- Console logs and full tracebacks of any errors

6
.github/dependabot.yml vendored Normal file

@ -0,0 +1,6 @@
version: 2
updates:
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: weekly

43
.github/workflows/tests.yml vendored Normal file

@ -0,0 +1,43 @@
name: Tests
on:
push:
branches:
- main
pull_request:
workflow_dispatch:
permissions:
contents: read
jobs:
tests:
runs-on: ${{ matrix.os }}-latest
strategy:
fail-fast: false
matrix:
os:
- ubuntu
- windows
python-version:
- "3.9"
- "3.10"
- "3.11"
- "3.12"
- "3.13"
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools wheel
python -m pip install --upgrade tox
- name: Run tox targets for ${{ matrix.python-version }}
run: tox run -f py$(echo ${{ matrix.python-version }} | tr -d .)

11
.gitignore vendored

@ -1,5 +1,16 @@
.idea/
*.egg-info
*.pyc
__pycache__
dist/
build/
/.tox
.hypothesis
.cache
.eggs
test_layer*
test_consumer*
.python-version
.pytest_cache/
.vscode
.coverage

23
.pre-commit-config.yaml Normal file

@ -0,0 +1,23 @@
repos:
- repo: https://github.com/asottile/pyupgrade
rev: v3.19.1
hooks:
- id: pyupgrade
args: [--py39-plus]
- repo: https://github.com/psf/black
rev: 25.1.0
hooks:
- id: black
language_version: python3
- repo: https://github.com/pycqa/isort
rev: 6.0.1
hooks:
- id: isort
- repo: https://github.com/PyCQA/flake8
rev: 7.2.0
hooks:
- id: flake8
additional_dependencies:
- flake8-bugbear
ci:
autoupdate_schedule: quarterly

.travis.yml Deleted file

@ -1,9 +0,0 @@
sudo: false
language: python
python:
- "2.7"
- "3.5"
install:
- if [[ $TRAVIS_PYTHON_VERSION == 2.7 ]]; then pip install unittest2; fi
- pip install asgiref twisted autobahn
script: if [[ $TRAVIS_PYTHON_VERSION == 2.7 ]]; then python -m unittest2; else python -m unittest; fi

CHANGELOG.txt

@ -1,3 +1,394 @@
4.2.0 (to be released)
----------------------
* Added support for Python 3.13.
* Drop support for EOL Python 3.8.
* Removed unused pytest-runner
* Fixed sdist file to ensure it includes all tests
4.1.2 (2024-04-11)
------------------
* Fixed a setuptools configuration error in 4.1.1.
4.1.1 (2024-04-10)
------------------
* Fixed a twisted.plugin packaging error in 4.1.0.
Thanks to sdc50.
4.1.0 (2024-02-10)
------------------
* Added support for Python 3.12.
* Dropped support for EOL Python 3.7.
* Handled root path for websocket scopes.
* Validate HTTP header names as per RFC 9110.
4.0.0 (2022-10-07)
------------------
Major version targeting use with Channels 4.0 and beyond. Except where
noted, it should remain usable with Channels v3 projects, but updating Channels to the latest version is recommended.
* Added a ``runserver`` command to run an ASGI Django development server.
Added ``"daphne"`` to the ``INSTALLED_APPS`` setting, before
``"django.contrib.staticfiles"`` to enable:
INSTALLED_APPS = [
"daphne",
...
]
This replaces the Channels implementation of ``runserver``, which is removed
in Channels 4.0.
* Made the ``DaphneProcess`` tests helper class compatible with the ``spawn``
process start method, which is used on macOS and Windows.
Note that this requires Channels v4 if used with ``ChannelsLiveServerTestCase``.
* Dropped support for Python 3.6.
* Updated dependencies to the latest versions.
Previously, a range of Twisted versions was supported. Recent Twisted
releases (22.2, 22.4) have issued security fixes, so those are now the
minimum supported version. Given the stability of Twisted, supporting a
range of versions does not represent a good use of maintainer time. Going
forward the latest Twisted version will be required.
* Set ``daphne`` as default ``Server`` header.
This can be configured with the ``--server-name`` CLI argument.
Added the new ``--no-server-name`` CLI argument to disable the ``Server``
header, which is equivalent to ``--server-name=`` (an empty name).
* Added ``--log-fmt`` CLI argument.
* Added support for ``ASGI_THREADS`` environment variable, setting the maximum
number of workers used by a ``SyncToAsync`` thread-pool executor.
Set e.g. ``ASGI_THREADS=4 daphne ...`` when running to limit the number of
workers.
* Removed deprecated ``--ws_protocols`` CLI option.
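The ``ASGI_THREADS`` behaviour noted above can be sketched as sizing a thread-pool executor from the environment variable; the fallback default used here is illustrative, not necessarily the default asgiref applies when the variable is unset:

```python
# Sketch: cap the number of workers used for synchronous code by reading
# ASGI_THREADS from the environment, e.g. `ASGI_THREADS=4 daphne ...`.

import os
from concurrent.futures import ThreadPoolExecutor


def make_sync_executor(default_workers=4):
    # default_workers is an illustrative fallback for when ASGI_THREADS
    # is unset; the real default is up to the executor/asgiref.
    max_workers = int(os.environ.get("ASGI_THREADS", default_workers))
    return ThreadPoolExecutor(max_workers=max_workers)


os.environ["ASGI_THREADS"] = "2"
executor = make_sync_executor()
print(executor._max_workers)  # 2
```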
3.0.2 (2021-04-07)
------------------
* Fixed a bug where ``send`` passed to applications wasn't a true async
function but a lambda wrapper, preventing it from being used with
``asgiref.sync.async_to_sync()``.
3.0.1 (2020-11-12)
------------------
* Fixed a bug where ``asyncio.CancelledError`` was not correctly handled on
Python 3.8+, resulting in incorrect protocol application cleanup.
3.0.0 (2020-10-28)
------------------
* Updates internals to use ASGI v3 throughout. ``asgiref.compatibility`` is
used for older applications.
* Consequently, the `--asgi-protocol` command-line option is removed.
* HTTP request bodies are now read, and passed to the application, in chunks.
* Added support for Python 3.9.
* Dropped support for Python 3.5.
2.5.0 (2020-04-15)
------------------
* Fixes compatibility for Twisted when running Python 3.8+ on Windows, by
setting ``asyncio.WindowsSelectorEventLoopPolicy`` as the event loop policy
in this case.
* The internal ``daphne.testing.TestApplication`` now requires an additional
``lock`` argument to ``__init__()``. This is expected to be an instance of
``multiprocessing.Lock``.
2.4.1 (2019-12-18)
------------------
* Avoids Twisted using the default event loop, for compatibility with Django
3.0's ``async_unsafe()`` decorator in threaded contexts, such as using the
auto-reloader.
2.4.0 (2019-11-20)
------------------
* Adds CI testing against and support for Python 3.8.
* Adds support for ``raw_path`` in ASGI scope.
* Ensures an error response is sent to the client if the application sends
malformed headers.
* Resolves an asyncio + multiprocessing problem when testing that would cause
the test suite to fail/hang on macOS.
* Requires installing Twisted's TLS extras, via ``install_requires``.
* Adds missing LICENSE to distribution.
2.3.0 (2019-04-09)
------------------
* Added support for ASGI v3.
2.2.5 (2019-01-31)
------------------
* WebSocket handshakes are now affected by the websocket connect timeout, so
you can limit them from the command line.
* Server name can now be set using --server-name
2.2.4 (2018-12-15)
------------------
* No longer listens on port 8000 when a file descriptor is provided with --fd
* Fixed a memory leak with WebSockets
2.2.3 (2018-11-06)
------------------
* Enforce that response headers are only bytestrings, rather than allowing
unicode strings and coercing them into bytes.
* New command-line options to set proxy header names: --proxy-headers-host and
--proxy-headers-port.
2.2.2 (2018-08-16)
------------------
* X-Forwarded-Proto support is now present and enabled if you turn on the
--proxy-headers flag
* ASGI applications are no longer instantiated in a thread (the ASGI spec
was finalised to say all constructors must be non-blocking on the main thread)
2.2.1 (2018-07-22)
------------------
* Python 3.7 compatibility is flagged and ensured by using Twisted 18.7 and
above as a dependency.
* The send() awaitable in applications no longer blocks if the connection is
closed.
* Fixed a race condition where applications would be cleaned up before they
had even started.
2.2.0 (2018-06-13)
------------------
* HTTP timeouts have been removed by default, as they were only needed
with ASGI/Channels 1. You can re-enable them with the --http-timeout
argument to Daphne.
* Occasional errors on application timeout for non-fully-opened sockets
and for trying to read closed requests under high load are fixed.
* X-Forwarded-For headers are now correctly decoded in all environments
and no longer have unicode matching issues.
2.1.2 (2018-05-24)
------------------
* Fixed spurious errors caused by websockets disconnecting before their
application was instantiated.
* Stronger checking for type-safety of headers as bytestrings
2.1.1 (2018-04-18)
------------------
* ASGI application constructors are now run in a threadpool as they might
contain blocking synchronous code.
2.1.0 (2018-03-05)
------------------
* Removed subprotocol support from server, as it never really worked. Subprotocols
can instead be negotiated by ASGI applications now.
* Non-ASCII query strings now raise a 400 Bad Request error rather than silently
breaking the logger
2.0.4 (2018-02-21)
------------------
* Ping timeouts no longer reset on outgoing data, only incoming data
* No more errors when connections close prematurely
2.0.3 (2018-02-07)
------------------
* Unix socket listening no longer errors during startup (introduced in 2.0.2)
* ASGI Applications are now not immediately killed on disconnection but instead
given --application-close-timeout seconds to exit (defaults to 10)
2.0.2 (2018-02-04)
------------------
* WebSockets are no longer closed after the duration of http_timeout
2.0.1 (2018-02-03)
------------------
* Updated logging to correctly route exceptions through the main Daphne logger
2.0.0 (2018-02-01)
------------------
* Major rewrite to the new async-based ASGI specification and to support
Channels 2. Not backwards compatible.
1.3.0 (2017-06-16)
------------------
* Ability to set the websocket connection timeout
* Server no longer reveals the exact Autobahn version number for security
* A few unicode fixes for Python 2/3 compatibility
* Stopped logging messages to already-closed connections as ERROR
1.2.0 (2017-04-01)
------------------
* The new process-specific channel support is now implemented, resulting in
significantly less traffic to your channel backend.
* Native twisted blocking support for channel layers that support it is now
used. While it is a lot more efficient, it is also sometimes slightly more
latent; you can disable it using --force-sync.
* Native SSL termination is now correctly reflected in the ASGI-HTTP `scheme`
key.
* accept: False is now a valid way to deny a connection, as well as close: True.
* HTTP version is now correctly sent as one of "1.0", "1.1" or "2".
* More command line options for websocket timeouts
1.1.0 (2017-03-18)
------------------
* HTTP/2 termination is now supported natively. The Twisted dependency has been
increased to at least 17.1 as a result; for more information about setting up
HTTP/2, see the README.
* X-Forwarded-For decoding support understands IPv6 addresses, and picks the
most remote (leftmost) entry if there are multiple relay hosts.
* Fixed an error where `disconnect` messages would still try and get sent even
if the client never finished a request.
1.0.3 (2017-02-12)
------------------
* IPv6 addresses are correctly accepted as bind targets on the command line
* Twisted 17.1 compatibility fixes for WebSocket receiving/keepalive and
proxy header detection.
1.0.2 (2017-02-01)
------------------
* The "null" WebSocket origin (including file:// and no value) is now accepted
by Daphne and passed onto the application to accept/deny.
* Listening on file descriptors works properly again.
* The DeprecationError caused by not passing endpoints into a Server class
directly is now a warning instead.
1.0.1 (2017-01-09)
------------------
* Endpoint unicode strings now work correctly on Python 2 and Python 3
1.0.0 (2017-01-08)
------------------
* BREAKING CHANGE: Daphne now requires acceptance of WebSocket connections
before it finishes the socket handshake and relays incoming packets.
You must upgrade to at least Channels 1.0.0 as well; see
http://channels.readthedocs.io/en/latest/releases/1.0.0.html for more.
* http.disconnect now has a `path` key
* WebSockets can now be closed with a specific code
* X-Forwarded-For header support; defaults to X-Forwarded-For, override with
--proxy-headers on the commandline.
* Twisted endpoint description string support with `-e` on the command line
(allowing for SNI/ACME support, among other things)
* Logging/error verbosity fixes and access log flushes properly
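The endpoint description strings accepted by ``-e`` follow Twisted's colon-separated ``key=value`` format. A rough sketch of building one, in the spirit of Daphne's ``build_endpoint_description_string`` helper mentioned in the commit log, but simplified and illustrative:

```python
# Sketch of composing a Twisted server endpoint description string,
# e.g. "tcp:port=8000:interface=127.0.0.1". Extra key=value pairs
# (such as TLS options) are appended in the same colon-separated format.

def endpoint_description(host="127.0.0.1", port=8000, **kwargs):
    parts = [f"tcp:port={port}", f"interface={host}"]
    parts += [f"{key}={value}" for key, value in kwargs.items()]
    return ":".join(parts)


print(endpoint_description())  # tcp:port=8000:interface=127.0.0.1
```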
0.15.0 (2016-08-28)
-------------------
* Connections now force-close themselves after pings fail for a certain
timeframe, controllable via the new --ping-timeout option.
* Badly-formatted websocket response messages now log to console in
all situations
* Compatibility with Twisted 16.3 and up
0.14.3 (2016-07-21)
-------------------

27
LICENSE Normal file

@ -0,0 +1,27 @@
Copyright (c) Django Software Foundation and individual contributors.
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of Django nor the names of its contributors may be used
to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

2
MANIFEST.in Normal file

@ -0,0 +1,2 @@
include LICENSE
recursive-include tests *.py

Makefile

@ -13,4 +13,4 @@ endif
git tag $(version)
git push
git push --tags
#python setup.py sdist bdist_wheel upload

README.rst

@@ -1,15 +1,13 @@
 daphne
 ======
-.. image:: https://api.travis-ci.org/andrewgodwin/daphne.svg
-    :target: https://travis-ci.org/andrewgodwin/daphne
 .. image:: https://img.shields.io/pypi/v/daphne.svg
     :target: https://pypi.python.org/pypi/daphne
 
 Daphne is a HTTP, HTTP2 and WebSocket protocol server for
-`ASGI <http://channels.readthedocs.org/en/latest/asgi.html>`_, and developed
-to power Django Channels.
+`ASGI <https://github.com/django/asgiref/blob/main/specs/asgi.rst>`_ and
+`ASGI-HTTP <https://github.com/django/asgiref/blob/main/specs/www.rst>`_,
+developed to power Django Channels.
 
 It supports automatic negotiation of protocols; there's no need for URL
 prefixing to determine WebSocket endpoints versus HTTP endpoints.
@@ -18,26 +16,76 @@ prefixing to determine WebSocket endpoints versus HTTP endpoints.
 Running
 -------
 
-Simply point Daphne to your ASGI channel layer instance, and optionally
+Simply point Daphne to your ASGI application, and optionally
 set a bind address and port (defaults to localhost, port 8000)::
 
-    daphne -b 0.0.0.0 -p 8001 django_project.asgi:channel_layer
+    daphne -b 0.0.0.0 -p 8001 django_project.asgi:application
 
 If you intend to run daphne behind a proxy server you can use UNIX
 sockets to communicate between the two::
 
-    daphne -u /tmp/daphne.sock django_project.asgi:channel_layer
+    daphne -u /tmp/daphne.sock django_project.asgi:application
 
-If daphne is being run inside a process manager such as
-`Circus <https://github.com/circus-tent/circus/>`_ you might
+If daphne is being run inside a process manager, you might
 want it to bind to a file descriptor passed down from a parent process.
 To achieve this you can use the --fd flag::
 
-    daphne --fd 5 django_project.asgi:channel_layer
+    daphne --fd 5 django_project.asgi:application
 
+If you want more control over the port/socket bindings you can fall back to
+using `twisted's endpoint description strings
+<http://twistedmatrix.com/documents/current/api/twisted.internet.endpoints.html#serverFromString>`_
+by using the `--endpoint (-e)` flag, which can be used multiple times.
+This line would start a SSL server on port 443, assuming that `key.pem` and `crt.pem`
+exist in the current directory (requires pyopenssl to be installed)::
+
+    daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:application
+
+Endpoints even let you use the ``txacme`` endpoint syntax to get automatic certificates
+from Let's Encrypt, which you can read more about at http://txacme.readthedocs.io/en/stable/.
+
+To see all available command line options run daphne with the ``-h`` flag.
+
+HTTP/2 Support
+--------------
+
+Daphne supports terminating HTTP/2 connections natively. You'll
+need to do a couple of things to get it working, though. First, you need to
+make sure you install the Twisted ``http2`` and ``tls`` extras::
+
+    pip install -U "Twisted[tls,http2]"
+
+Next, because all current browsers only support HTTP/2 when using TLS, you will
+need to start Daphne with TLS turned on, which can be done using the Twisted endpoint syntax::
+
+    daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:application
+
+Alternatively, you can use the ``txacme`` endpoint syntax or anything else that
+enables TLS under the hood.
+
+You will also need to be on a system that has **OpenSSL 1.0.2 or greater**; if you are
+using Ubuntu, this means you need at least Ubuntu 16.04.
+
+Now, when you start up Daphne, it should tell you this in the log::
+
+    2017-03-18 19:14:02,741 INFO     Starting server at ssl:port=8000:privateKey=privkey.pem:certKey=cert.pem, channel layer django_project.asgi:channel_layer.
+    2017-03-18 19:14:02,742 INFO     HTTP/2 support enabled
+
+Then, connect with a browser that supports HTTP/2, and everything should be
+working. It's often hard to tell that HTTP/2 is working, as the log Daphne gives you
+will be identical (it's HTTP, after all), and most browsers don't make it obvious
+in their network inspector windows. There are browser extensions that will let
+you know clearly if it's working or not.
+
+Daphne only supports "normal" requests over HTTP/2 at this time; there is not
+yet support for extended features like Server Push. It will, however, result in
+much faster connections and lower overheads.
+
+If you have a reverse proxy in front of your site to serve static files or
+similar, HTTP/2 will only work if that proxy understands and passes through the
+connection correctly.
-To see all available command line options run daphne with the *-h* flag.
 
 Root Path (SCRIPT_NAME)
 -----------------------
@@ -54,4 +102,36 @@ WSGI ``SCRIPT_NAME`` setting, you have two options:
 
 The header takes precedence if both are set. As with ``SCRIPT_ALIAS``, the value
 should start with a slash, but not end with one; for example::
 
-    daphne --root-path=/forum django_project.asgi:channel_layer
+    daphne --root-path=/forum django_project.asgi:application
Python Support
--------------
Daphne requires Python 3.9 or later.
Contributing
------------
Please refer to the
`main Channels contributing docs <https://github.com/django/channels/blob/main/CONTRIBUTING.rst>`_.
To run tests, make sure you have installed the ``tests`` extra with the package::
cd daphne/
pip install -e '.[tests]'
pytest
Maintenance and Security
------------------------
To report security issues, please contact security@djangoproject.com. For GPG
signatures and more security process information, see
https://docs.djangoproject.com/en/dev/internals/security/.
To report bugs or request new features, please open a new GitHub issue.
This repository is part of the Channels project. For the shepherd and maintenance team, please see the
`main Channels readme <https://github.com/django/channels/blob/main/README.rst>`_.

@@ -1 +1,14 @@
-__version__ = "0.14.3"
+import sys
+
+__version__ = "4.1.3"
+
+# Windows on Python 3.8+ uses ProactorEventLoop, which is not compatible with
+# Twisted. Does not implement add_writer/add_reader.
+# See https://bugs.python.org/issue37373
+# and https://twistedmatrix.com/trac/ticket/9766
+PY38_WIN = sys.version_info >= (3, 8) and sys.platform == "win32"
+if PY38_WIN:
+    import asyncio
+
+    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

daphne/__main__.py (new file)
@@ -0,0 +1,3 @@
from daphne.cli import CommandLineInterface
CommandLineInterface.entrypoint()

@@ -1,7 +1,7 @@
 import datetime
 
 
-class AccessLogGenerator(object):
+class AccessLogGenerator:
     """
     Object that implements the Daphne "action logger" internal interface in
     order to provide an access log in something resembling NCSA format.
@@ -17,33 +17,48 @@ class AccessLogGenerator(object):
         # HTTP requests
         if protocol == "http" and action == "complete":
             self.write_entry(
-                host=details['client'],
+                host=details["client"],
                 date=datetime.datetime.now(),
                 request="%(method)s %(path)s" % details,
-                status=details['status'],
-                length=details['size'],
+                status=details["status"],
+                length=details["size"],
             )
         # Websocket requests
+        elif protocol == "websocket" and action == "connecting":
+            self.write_entry(
+                host=details["client"],
+                date=datetime.datetime.now(),
+                request="WSCONNECTING %(path)s" % details,
+            )
+        elif protocol == "websocket" and action == "rejected":
+            self.write_entry(
+                host=details["client"],
+                date=datetime.datetime.now(),
+                request="WSREJECT %(path)s" % details,
+            )
         elif protocol == "websocket" and action == "connected":
             self.write_entry(
-                host=details['client'],
+                host=details["client"],
                 date=datetime.datetime.now(),
                 request="WSCONNECT %(path)s" % details,
             )
         elif protocol == "websocket" and action == "disconnected":
             self.write_entry(
-                host=details['client'],
+                host=details["client"],
                 date=datetime.datetime.now(),
                 request="WSDISCONNECT %(path)s" % details,
             )
 
-    def write_entry(self, host, date, request, status=None, length=None, ident=None, user=None):
+    def write_entry(
+        self, host, date, request, status=None, length=None, ident=None, user=None
+    ):
         """
         Writes an NCSA-style entry to the log file (some liberty is taken with
         what the entries are for non-HTTP)
         """
         self.stream.write(
-            "%s %s %s [%s] \"%s\" %s %s\n" % (
+            '%s %s %s [%s] "%s" %s %s\n'
+            % (
                 host,
                 ident or "-",
                 user or "-",
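The NCSA-style line that ``write_entry`` assembles can be sketched standalone. Note the timestamp pattern (``%d/%b/%Y:%H:%M:%S``) is an assumption here, since the compare view cuts off before the date-formatting line:

```python
import datetime


def format_ncsa(host, date, request, status=None, length=None, ident=None, user=None):
    # Mirrors the '%s %s %s [%s] "%s" %s %s' format string used by write_entry
    # above; the strftime pattern is assumed, not taken from the diff.
    return '%s %s %s [%s] "%s" %s %s' % (
        host,
        ident or "-",
        user or "-",
        date.strftime("%d/%b/%Y:%H:%M:%S"),
        request,
        status or "-",
        length or "-",
    )


print(format_ncsa("127.0.0.1", datetime.datetime(2024, 1, 2, 3, 4, 5), "GET /", 200, 1024))
# 127.0.0.1 - - [02/Jan/2024:03:04:05] "GET /" 200 1024
```

Missing fields fall back to ``-``, which is why the WebSocket entries (which carry no status or length) still produce well-formed lines.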

daphne/apps.py (new file)
@@ -0,0 +1,16 @@
# Import the server here to ensure the reactor is installed very early on in case other
# packages import twisted.internet.reactor (e.g. raven does this).
from django.apps import AppConfig
from django.core import checks
import daphne.server # noqa: F401
from .checks import check_daphne_installed
class DaphneConfig(AppConfig):
name = "daphne"
verbose_name = "Daphne"
def ready(self):
checks.register(check_daphne_installed, checks.Tags.staticfiles)

daphne/checks.py (new file)
@@ -0,0 +1,21 @@
# Django system check to ensure daphne app is listed in INSTALLED_APPS before django.contrib.staticfiles.
from django.core.checks import Error, register
@register()
def check_daphne_installed(app_configs, **kwargs):
from django.apps import apps
from django.contrib.staticfiles.apps import StaticFilesConfig
from daphne.apps import DaphneConfig
for app in apps.get_app_configs():
if isinstance(app, DaphneConfig):
return []
if isinstance(app, StaticFilesConfig):
return [
Error(
"Daphne must be listed before django.contrib.staticfiles in INSTALLED_APPS.",
id="daphne.E001",
)
]
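A settings fragment that satisfies this check might look like the following; the app list is a hypothetical example, not taken from the diff:

```python
# Hypothetical Django settings fragment: "daphne" must appear before
# "django.contrib.staticfiles" so daphne can take over the runserver command.
INSTALLED_APPS = [
    "daphne",
    "django.contrib.contenttypes",
    "django.contrib.staticfiles",
]

# The system check above effectively enforces this ordering:
assert INSTALLED_APPS.index("daphne") < INSTALLED_APPS.index("django.contrib.staticfiles")
```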

@@ -1,95 +1,167 @@
-import sys
 import argparse
 import logging
-import importlib
-from .server import Server
-from .access import AccessLogGenerator
+import sys
+from argparse import ArgumentError, Namespace
+
+from asgiref.compatibility import guarantee_single_callable
+
+from .access import AccessLogGenerator
+from .endpoints import build_endpoint_description_strings
+from .server import Server
+from .utils import import_by_path
 
 logger = logging.getLogger(__name__)
+DEFAULT_HOST = "127.0.0.1"
+DEFAULT_PORT = 8000
+
 
-class CommandLineInterface(object):
+class CommandLineInterface:
     """
     Acts as the main CLI entry point for running the server.
     """
 
     description = "Django HTTP/WebSocket server"
+
+    server_class = Server
     def __init__(self):
-        self.parser = argparse.ArgumentParser(
-            description=self.description,
+        self.parser = argparse.ArgumentParser(description=self.description)
+        self.parser.add_argument(
+            "-p", "--port", type=int, help="Port number to listen on", default=None
         )
         self.parser.add_argument(
-            '-p',
-            '--port',
-            type=int,
-            help='Port number to listen on',
-            default=8000,
-        )
-        self.parser.add_argument(
-            '-b',
-            '--bind',
-            dest='host',
-            help='The host/address to bind to',
-            default="127.0.0.1",
-        )
-        self.parser.add_argument(
-            '-u',
-            '--unix-socket',
-            dest='unix_socket',
-            help='Bind to a UNIX socket rather than a TCP host/port',
+            "-b",
+            "--bind",
+            dest="host",
+            help="The host/address to bind to",
             default=None,
         )
         self.parser.add_argument(
-            '--fd',
+            "--websocket_timeout",
             type=int,
-            dest='file_descriptor',
-            help='Bind to a file descriptor rather than a TCP host/port or named unix socket',
+            help="Maximum time to allow a websocket to be connected. -1 for infinite.",
+            default=86400,
+        )
+        self.parser.add_argument(
+            "--websocket_connect_timeout",
+            type=int,
+            help="Maximum time to allow a connection to handshake. -1 for infinite",
+            default=5,
+        )
+        self.parser.add_argument(
+            "-u",
+            "--unix-socket",
+            dest="unix_socket",
+            help="Bind to a UNIX socket rather than a TCP host/port",
             default=None,
         )
         self.parser.add_argument(
-            '-v',
-            '--verbosity',
+            "--fd",
             type=int,
-            help='How verbose to make the output',
+            dest="file_descriptor",
+            help="Bind to a file descriptor rather than a TCP host/port or named unix socket",
+            default=None,
+        )
+        self.parser.add_argument(
+            "-e",
+            "--endpoint",
+            dest="socket_strings",
+            action="append",
+            help="Use raw server strings passed directly to twisted",
+            default=[],
+        )
+        self.parser.add_argument(
+            "-v",
+            "--verbosity",
+            type=int,
+            help="How verbose to make the output",
             default=1,
         )
         self.parser.add_argument(
-            '-t',
-            '--http-timeout',
+            "-t",
+            "--http-timeout",
             type=int,
-            help='How long to wait for worker server before timing out HTTP connections',
-            default=120,
-        )
-        self.parser.add_argument(
-            '--access-log',
-            help='Where to write the access log (- for stdout, the default for verbosity=1)',
+            help="How long to wait for worker before timing out HTTP connections",
             default=None,
         )
         self.parser.add_argument(
-            '--ping-interval',
+            "--access-log",
+            help="Where to write the access log (- for stdout, the default for verbosity=1)",
+            default=None,
+        )
+        self.parser.add_argument(
+            "--log-fmt",
+            help="Log format to use",
+            default="%(asctime)-15s %(levelname)-8s %(message)s",
+        )
+        self.parser.add_argument(
+            "--ping-interval",
             type=int,
-            help='The number of seconds a WebSocket must be idle before a keepalive ping is sent',
+            help="The number of seconds a WebSocket must be idle before a keepalive ping is sent",
             default=20,
         )
         self.parser.add_argument(
-            'channel_layer',
-            help='The ASGI channel layer instance to use as path.to.module:instance.path',
+            "--ping-timeout",
+            type=int,
+            help="The number of seconds before a WebSocket is closed if no response to a keepalive ping",
+            default=30,
         )
         self.parser.add_argument(
-            '--ws-protocol',
-            nargs='*',
-            dest='ws_protocols',
-            help='The WebSocket protocols you wish to support',
-            default=None,
+            "--application-close-timeout",
+            type=int,
+            help="The number of seconds an ASGI application has to exit after client disconnect before it is killed",
+            default=10,
         )
         self.parser.add_argument(
-            '--root-path',
-            dest='root_path',
-            help='The setting for the ASGI root_path variable',
+            "--root-path",
+            dest="root_path",
+            help="The setting for the ASGI root_path variable",
             default="",
         )
+        self.parser.add_argument(
+            "--proxy-headers",
+            dest="proxy_headers",
+            help="Enable parsing and using of X-Forwarded-For and X-Forwarded-Port headers and using that as the "
+            "client address",
+            default=False,
+            action="store_true",
+        )
+        self.arg_proxy_host = self.parser.add_argument(
+            "--proxy-headers-host",
+            dest="proxy_headers_host",
+            help="Specify which header will be used for getting the host "
+            "part. Can be omitted, requires --proxy-headers to be specified "
+            'when passed. "X-Real-IP" (when passed by your webserver) is a '
+            "good candidate for this.",
+            default=False,
+            action="store",
+        )
+        self.arg_proxy_port = self.parser.add_argument(
+            "--proxy-headers-port",
+            dest="proxy_headers_port",
+            help="Specify which header will be used for getting the port "
+            "part. Can be omitted, requires --proxy-headers to be specified "
+            "when passed.",
+            default=False,
+            action="store",
+        )
+        self.parser.add_argument(
+            "application",
+            help="The application to dispatch to as path.to.module:instance.path",
+        )
+        self.parser.add_argument(
+            "-s",
+            "--server-name",
+            dest="server_name",
+            help="specify which value should be passed to response header Server attribute",
+            default="daphne",
+        )
+        self.parser.add_argument(
+            "--no-server-name", dest="server_name", action="store_const", const=""
+        )
+
+        self.server = None
@@ -98,6 +170,37 @@ class CommandLineInterface(object):
     @classmethod
     def entrypoint(cls):
         """
         cls().run(sys.argv[1:])
def _check_proxy_headers_passed(self, argument: str, args: Namespace):
"""Raise if the `--proxy-headers` weren't specified."""
if args.proxy_headers:
return
raise ArgumentError(
argument=argument,
message="--proxy-headers has to be passed for this parameter.",
)
def _get_forwarded_host(self, args: Namespace):
"""
Return the default host header from which the remote hostname/ip
will be extracted.
"""
if args.proxy_headers_host:
self._check_proxy_headers_passed(argument=self.arg_proxy_host, args=args)
return args.proxy_headers_host
if args.proxy_headers:
return "X-Forwarded-For"
def _get_forwarded_port(self, args: Namespace):
"""
Return the default host header from which the remote hostname/ip
will be extracted.
"""
if args.proxy_headers_port:
self._check_proxy_headers_passed(argument=self.arg_proxy_port, args=args)
return args.proxy_headers_port
if args.proxy_headers:
return "X-Forwarded-Port"
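The precedence implemented by `_get_forwarded_host` / `_get_forwarded_port` above can be condensed into a standalone sketch (function name and `ValueError` are illustrative stand-ins for the `ArgumentError` raised by the real CLI):

```python
# Sketch of the header-selection precedence: an explicit
# --proxy-headers-host/--proxy-headers-port value wins, otherwise
# X-Forwarded-For / X-Forwarded-Port apply when --proxy-headers is set,
# and nothing is used when proxy headers are disabled.
def forwarded_header(proxy_headers, explicit=None, default=None):
    if explicit:
        if not proxy_headers:
            # the real CLI raises argparse.ArgumentError here
            raise ValueError("--proxy-headers has to be passed for this parameter.")
        return explicit
    if proxy_headers:
        return default
    return None


print(forwarded_header(True, default="X-Forwarded-For"))   # X-Forwarded-For
print(forwarded_header(True, explicit="X-Real-IP"))        # X-Real-IP
print(forwarded_header(False))                             # None
```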
     def run(self, args):
         """
         Pass in raw argument list and it will decode them
@@ -107,12 +210,13 @@ class CommandLineInterface(object):
         args = self.parser.parse_args(args)
         # Set up logging
         logging.basicConfig(
-            level = {
+            level={
                 0: logging.WARN,
                 1: logging.INFO,
                 2: logging.DEBUG,
+                3: logging.DEBUG,  # Also turns on asyncio debug
             }[args.verbosity],
-            format = "%(asctime)-15s %(levelname)-8s %(message)s" ,
+            format=args.log_fmt,
         )
         # If verbosity is 1 or greater, or they told us explicitly, set up access log
         access_log_stream = None
@@ -120,30 +224,62 @@ class CommandLineInterface(object):
             if args.access_log == "-":
                 access_log_stream = sys.stdout
             else:
-                access_log_stream = open(args.access_log, "a")
+                access_log_stream = open(args.access_log, "a", 1)
         elif args.verbosity >= 1:
             access_log_stream = sys.stdout
 
-        # Import channel layer
+        # Import application
         sys.path.insert(0, ".")
-        module_path, object_path = args.channel_layer.split(":", 1)
-        channel_layer = importlib.import_module(module_path)
-        for bit in object_path.split("."):
-            channel_layer = getattr(channel_layer, bit)
-        # Run server
-        logger.info(
-            "Starting server at %s, channel layer %s",
-            (args.unix_socket if args.unix_socket else "%s:%s" % (args.host, args.port)),
-            args.channel_layer,
-        )
-        Server(
-            channel_layer=channel_layer,
+        application = import_by_path(args.application)
+        application = guarantee_single_callable(application)
+
+        # Set up port/host bindings
+        if not any(
+            [
+                args.host,
+                args.port is not None,
+                args.unix_socket,
+                args.file_descriptor is not None,
+                args.socket_strings,
+            ]
+        ):
+            # no advanced binding options passed, patch in defaults
+            args.host = DEFAULT_HOST
+            args.port = DEFAULT_PORT
+        elif args.host and args.port is None:
+            args.port = DEFAULT_PORT
+        elif args.port is not None and not args.host:
+            args.host = DEFAULT_HOST
+        # Build endpoint description strings from (optional) cli arguments
+        endpoints = build_endpoint_description_strings(
             host=args.host,
             port=args.port,
             unix_socket=args.unix_socket,
             file_descriptor=args.file_descriptor,
+        )
+        endpoints = sorted(args.socket_strings + endpoints)
+        # Start the server
+        logger.info("Starting server at {}".format(", ".join(endpoints)))
+        self.server = self.server_class(
+            application=application,
+            endpoints=endpoints,
             http_timeout=args.http_timeout,
             ping_interval=args.ping_interval,
-            action_logger=AccessLogGenerator(access_log_stream) if access_log_stream else None,
-            ws_protocols=args.ws_protocols,
+            ping_timeout=args.ping_timeout,
+            websocket_timeout=args.websocket_timeout,
+            websocket_connect_timeout=args.websocket_connect_timeout,
+            websocket_handshake_timeout=args.websocket_connect_timeout,
+            application_close_timeout=args.application_close_timeout,
+            action_logger=(
+                AccessLogGenerator(access_log_stream) if access_log_stream else None
+            ),
             root_path=args.root_path,
-        ).run()
+            verbosity=args.verbosity,
+            proxy_forwarded_address_header=self._get_forwarded_host(args=args),
+            proxy_forwarded_port_header=self._get_forwarded_port(args=args),
+            proxy_forwarded_proto_header=(
+                "X-Forwarded-Proto" if args.proxy_headers else None
+            ),
+            server_name=args.server_name,
+        )
+        self.server.run()

daphne/endpoints.py (new file)
@@ -0,0 +1,22 @@
def build_endpoint_description_strings(
host=None, port=None, unix_socket=None, file_descriptor=None
):
"""
Build a list of twisted endpoint description strings that the server will listen on.
This is to streamline the generation of twisted endpoint description strings from easier
to use command line args such as host, port, unix sockets etc.
"""
socket_descriptions = []
if host and port is not None:
host = host.strip("[]").replace(":", r"\:")
socket_descriptions.append("tcp:port=%d:interface=%s" % (int(port), host))
elif any([host, port]):
raise ValueError("TCP binding requires both port and host kwargs.")
if unix_socket:
socket_descriptions.append("unix:%s" % unix_socket)
if file_descriptor is not None:
socket_descriptions.append("fd:fileno=%d" % int(file_descriptor))
return socket_descriptions
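A quick illustration of the strings this helper yields; the function body is copied from the file above so the example is self-contained:

```python
def build_endpoint_description_strings(
    host=None, port=None, unix_socket=None, file_descriptor=None
):
    # Inlined copy of daphne/endpoints.py's helper shown above.
    socket_descriptions = []
    if host and port is not None:
        # IPv6 brackets are stripped and colons escaped for twisted's parser
        host = host.strip("[]").replace(":", r"\:")
        socket_descriptions.append("tcp:port=%d:interface=%s" % (int(port), host))
    elif any([host, port]):
        raise ValueError("TCP binding requires both port and host kwargs.")
    if unix_socket:
        socket_descriptions.append("unix:%s" % unix_socket)
    if file_descriptor is not None:
        socket_descriptions.append("fd:fileno=%d" % int(file_descriptor))
    return socket_descriptions


print(build_endpoint_description_strings(host="127.0.0.1", port=8000))
# ['tcp:port=8000:interface=127.0.0.1']
print(build_endpoint_description_strings(unix_socket="/tmp/daphne.sock", file_descriptor=5))
# ['unix:/tmp/daphne.sock', 'fd:fileno=5']
```

These are exactly the strings the CLI's `run()` method hands to twisted, which is why the `-e/--endpoint` flag can be freely mixed with `-b`/`-p`/`-u`/`--fd`.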

@@ -1,15 +1,15 @@
-from __future__ import unicode_literals
-
 import logging
-import six
 import time
 import traceback
+from urllib.parse import unquote
 
-from six.moves.urllib_parse import unquote, unquote_plus
+from twisted.internet.defer import inlineCallbacks, maybeDeferred
+from twisted.internet.interfaces import IProtocolNegotiationFactory
 from twisted.protocols.policies import ProtocolWrapper
 from twisted.web import http
+from zope.interface import implementer
 
-from .ws_protocol import WebSocketProtocol, WebSocketFactory
+from .utils import HEADER_NAME_RE, parse_x_forwarded_for
 
 logger = logging.getLogger(__name__)
@@ -23,7 +23,8 @@ class WebRequest(http.Request):
     GET and POST out.
     """
 
-    error_template = """
+    error_template = (
+        """
         <html>
             <head>
                 <title>%(title)s</title>
@@ -40,33 +41,64 @@ class WebRequest(http.Request):
             <footer>Daphne</footer>
         </body>
     </html>
-    """.replace("\n", "").replace("    ", " ").replace("   ", " ").replace("  ", " ")  # Shorten it a bit, bytes wise
+    """.replace(
+            "\n", ""
+        )
+        .replace("    ", " ")
+        .replace("   ", " ")
+        .replace("  ", " ")
+    )  # Shorten it a bit, bytes wise
 
     def __init__(self, *args, **kwargs):
-        http.Request.__init__(self, *args, **kwargs)
-        # Easy factory link
-        self.factory = self.channel.factory
-        # Make a name for our reply channel
-        self.reply_channel = self.factory.channel_layer.new_channel("http.response!")
-        # Tell factory we're that channel's client
-        self.last_keepalive = time.time()
-        self.factory.reply_protocols[self.reply_channel] = self
-        self._got_response_start = False
+        self.client_addr = None
+        self.server_addr = None
+        try:
+            http.Request.__init__(self, *args, **kwargs)
+            # Easy server link
+            self.server = self.channel.factory.server
+            self.application_queue = None
+            self._response_started = False
+            self.server.protocol_connected(self)
+        except Exception:
+            logger.error(traceback.format_exc())
+            raise
### Twisted progress callbacks
@inlineCallbacks
def process(self): def process(self):
try: try:
self.request_start = time.time() self.request_start = time.time()
# Validate header names.
for name, _ in self.requestHeaders.getAllRawHeaders():
if not HEADER_NAME_RE.fullmatch(name):
self.basic_error(400, b"Bad Request", "Invalid header name")
return
# Get upgrade header # Get upgrade header
upgrade_header = None upgrade_header = None
if self.requestHeaders.hasHeader(b"Upgrade"): if self.requestHeaders.hasHeader(b"Upgrade"):
upgrade_header = self.requestHeaders.getRawHeaders(b"Upgrade")[0] upgrade_header = self.requestHeaders.getRawHeaders(b"Upgrade")[0]
# Get client address if possible # Get client address if possible
if hasattr(self.client, "host") and hasattr(self.client, "port"): if hasattr(self.client, "host") and hasattr(self.client, "port"):
self.client_addr = [self.client.host, self.client.port] # client.host and host.host are byte strings in Python 2, but spec
self.server_addr = [self.host.host, self.host.port] # requires unicode string.
else: self.client_addr = [str(self.client.host), self.client.port]
self.client_addr = None self.server_addr = [str(self.host.host), self.host.port]
self.server_addr = None
self.client_scheme = "https" if self.isSecure() else "http"
# See if we need to get the address from a proxy header instead
if self.server.proxy_forwarded_address_header:
self.client_addr, self.client_scheme = parse_x_forwarded_for(
self.requestHeaders,
self.server.proxy_forwarded_address_header,
self.server.proxy_forwarded_port_header,
self.server.proxy_forwarded_proto_header,
self.client_addr,
self.client_scheme,
)
# Check for unicodeish path (or it'll crash when trying to parse) # Check for unicodeish path (or it'll crash when trying to parse)
try: try:
self.path.decode("ascii") self.path.decode("ascii")
@ -78,17 +110,25 @@ class WebRequest(http.Request):
self.query_string = b"" self.query_string = b""
if b"?" in self.uri: if b"?" in self.uri:
self.query_string = self.uri.split(b"?", 1)[1] self.query_string = self.uri.split(b"?", 1)[1]
try:
self.query_string.decode("ascii")
except UnicodeDecodeError:
self.basic_error(400, b"Bad Request", "Invalid query string")
return
# Is it WebSocket? IS IT?! # Is it WebSocket? IS IT?!
if upgrade_header and upgrade_header.lower() == b"websocket": if upgrade_header and upgrade_header.lower() == b"websocket":
# Make WebSocket protocol to hand off to # Make WebSocket protocol to hand off to
protocol = self.factory.ws_factory.buildProtocol(self.transport.getPeer()) protocol = self.server.ws_factory.buildProtocol(
self.transport.getPeer()
)
if not protocol: if not protocol:
# If protocol creation fails, we signal "internal server error" # If protocol creation fails, we signal "internal server error"
self.setResponseCode(500) self.setResponseCode(500)
logger.warn("Could not make WebSocket protocol") logger.warn("Could not make WebSocket protocol")
self.finish() self.finish()
# Give it the raw query string
protocol._raw_query_string = self.query_string
# Port across transport # Port across transport
protocol.set_main_factory(self.factory)
transport, self.transport = self.transport, None transport, self.transport = self.transport, None
if isinstance(transport, ProtocolWrapper): if isinstance(transport, ProtocolWrapper):
# i.e. TLS is a wrapping protocol # i.e. TLS is a wrapping protocol
@ -97,143 +137,203 @@ class WebRequest(http.Request):
transport.protocol = protocol transport.protocol = protocol
protocol.makeConnection(transport) protocol.makeConnection(transport)
# Re-inject request # Re-inject request
data = self.method + b' ' + self.uri + b' HTTP/1.1\x0d\x0a' data = self.method + b" " + self.uri + b" HTTP/1.1\x0d\x0a"
for h in self.requestHeaders.getAllRawHeaders(): for h in self.requestHeaders.getAllRawHeaders():
data += h[0] + b': ' + b",".join(h[1]) + b'\x0d\x0a' data += h[0] + b": " + b",".join(h[1]) + b"\x0d\x0a"
data += b"\x0d\x0a" data += b"\x0d\x0a"
data += self.content.read() data += self.content.read()
protocol.dataReceived(data) protocol.dataReceived(data)
# Remove our HTTP reply channel association # Remove our HTTP reply channel association
if hasattr(protocol, "reply_channel"): logger.debug("Upgraded connection %s to WebSocket", self.client_addr)
logger.debug("Upgraded connection %s to WebSocket %s", self.reply_channel, protocol.reply_channel) self.server.protocol_disconnected(self)
else: # Resume the producer so we keep getting data, if it's available as a method
logger.debug("Connection %s did not get successful WS handshake.", self.reply_channel) self.channel._networkProducer.resumeProducing()
del self.factory.reply_protocols[self.reply_channel]
self.reply_channel = None
# Boring old HTTP. # Boring old HTTP.
else: else:
# Sanitize and decode headers, potentially extracting root path # Sanitize and decode headers, potentially extracting root path
self.clean_headers = [] self.clean_headers = []
self.root_path = self.factory.root_path self.root_path = self.server.root_path
for name, values in self.requestHeaders.getAllRawHeaders(): for name, values in self.requestHeaders.getAllRawHeaders():
# Prevent CVE-2015-0219 # Prevent CVE-2015-0219
if b"_" in name: if b"_" in name:
continue continue
for value in values: for value in values:
if name.lower() == b"daphne-root-path": if name.lower() == b"daphne-root-path":
self.root_path = self.unquote(value) self.root_path = unquote(value.decode("ascii"))
else: else:
self.clean_headers.append((name.lower(), value)) self.clean_headers.append((name.lower(), value))
logger.debug("HTTP %s request for %s", self.method, self.reply_channel) logger.debug("HTTP %s request for %s", self.method, self.client_addr)
self.content.seek(0, 0) self.content.seek(0, 0)
# Work out the application scope and create application
try:
    self.application_queue = yield maybeDeferred(
        self.server.create_application,
        self,
        {
            "type": "http",
            # TODO: Correctly say if it's 1.1 or 1.0
            "http_version": self.clientproto.split(b"/")[-1].decode("ascii"),
            "method": self.method.decode("ascii"),
            "path": unquote(self.path.decode("ascii")),
            "raw_path": self.path,
            "root_path": self.root_path,
            "scheme": self.client_scheme,
            "query_string": self.query_string,
            "headers": self.clean_headers,
            "client": self.client_addr,
            "server": self.server_addr,
        },
    )
    # Check they didn't close an unfinished request
    if self.application_queue is None or self.content.closed:
        # Not much we can do, the request is prematurely abandoned.
        return
    # Run application against request
    buffer_size = self.server.request_buffer_size
    while True:
        chunk = self.content.read(buffer_size)
        more_body = not (len(chunk) < buffer_size)
        payload = {
            "type": "http.request",
            "body": chunk,
            "more_body": more_body,
        }
        self.application_queue.put_nowait(payload)
        if not more_body:
            break
except Exception:
    logger.error(traceback.format_exc())
    self.basic_error(
        500, b"Internal Server Error", "Daphne HTTP processing error"
    )
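The request-body loop above signals the final chunk by a short read: whenever `content.read()` returns fewer bytes than the buffer size, `more_body` goes false. That logic can be sketched in isolation; `read_chunks` below is an illustrative helper, not part of Daphne:

```python
import io

def read_chunks(stream, buffer_size):
    # Mirrors the request-body loop: a read shorter than buffer_size
    # marks the final http.request chunk (more_body=False).
    while True:
        chunk = stream.read(buffer_size)
        more_body = not (len(chunk) < buffer_size)
        yield {"type": "http.request", "body": chunk, "more_body": more_body}
        if not more_body:
            break

messages = list(read_chunks(io.BytesIO(b"x" * 10), 4))
print([m["more_body"] for m in messages])  # [True, True, False]
```

Note that a body whose length is an exact multiple of the buffer size produces one extra empty final chunk, which is still a valid ASGI message.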
def connectionLost(self, reason):
    """
    Cleans up reply channel on close.
    """
    if self.application_queue:
        self.send_disconnect()
        logger.debug("HTTP disconnect for %s", self.client_addr)
    http.Request.connectionLost(self, reason)
    self.server.protocol_disconnected(self)

def finish(self):
    """
    Cleans up reply channel on close.
    """
    if self.application_queue:
        self.send_disconnect()
        logger.debug("HTTP close for %s", self.client_addr)
    http.Request.finish(self)
    self.server.protocol_disconnected(self)
### Server reply callbacks

def handle_reply(self, message):
    """
    Handles a reply from the client
    """
    # Handle connections that are already closed
    if self.finished or self.channel is None:
        return
    # Check message validity
    if "type" not in message:
        raise ValueError("Message has no type defined")
    # Handle message
    if message["type"] == "http.response.start":
        if self._response_started:
            raise ValueError("HTTP response has already been started")
        self._response_started = True
        if "status" not in message:
            raise ValueError(
                "Specifying a status code is required for a Response message."
            )
        # Set HTTP status code
        self.setResponseCode(message["status"])
        # Write headers
        for header, value in message.get("headers", {}):
            self.responseHeaders.addRawHeader(header, value)
        if self.server.server_name and not self.responseHeaders.hasHeader("server"):
            self.setHeader(b"server", self.server.server_name.encode())
        logger.debug(
            "HTTP %s response started for %s", message["status"], self.client_addr
        )
    elif message["type"] == "http.response.body":
        if not self._response_started:
            raise ValueError(
                "HTTP response has not yet been started but got %s"
                % message["type"]
            )
        # Write out body
        http.Request.write(self, message.get("body", b""))
        # End if there's no more content
        if not message.get("more_body", False):
            self.finish()
            logger.debug("HTTP response complete for %s", self.client_addr)
            try:
                uri = self.uri.decode("ascii")
            except UnicodeDecodeError:
                # The path is malformed somehow - do our best to log something
                uri = repr(self.uri)
            try:
                self.server.log_action(
                    "http",
                    "complete",
                    {
                        "path": uri,
                        "status": self.code,
                        "method": self.method.decode("ascii", "replace"),
                        "client": (
                            "%s:%s" % tuple(self.client_addr)
                            if self.client_addr
                            else None
                        ),
                        "time_taken": self.duration(),
                        "size": self.sentLength,
                    },
                )
            except Exception:
                logger.error(traceback.format_exc())
        else:
            logger.debug("HTTP response chunk for %s", self.client_addr)
    else:
        raise ValueError("Cannot handle message type %s!" % message["type"])
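The reply handler consumes the standard ASGI HTTP response messages (`http.response.start` followed by one or more `http.response.body` messages). A minimal application that would drive it looks roughly like this; the stub `receive`/`send` below stand in for Daphne's real plumbing and are purely illustrative:

```python
import asyncio

async def application(scope, receive, send):
    # Minimal ASGI app: one response start, one (final) body chunk.
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"Hello",
        # "more_body" defaults to False, so this message ends the response.
    })

async def main():
    sent = []

    async def send(message):
        sent.append(message)

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    await application({"type": "http", "method": "GET"}, receive, send)
    return sent

sent = asyncio.run(main())
```

Sending a body message before a start message, or a second start message, would trip the `ValueError` checks above.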
def handle_exception(self, exception):
"""
Called by the server when our application tracebacks
"""
self.basic_error(500, b"Internal Server Error", "Exception inside application.")
def check_timeouts(self):
"""
Called periodically to see if we should timeout something
"""
# Web timeout checking
if self.server.http_timeout and self.duration() > self.server.http_timeout:
if self._response_started:
logger.warning("Application timed out while sending response")
self.finish()
else:
self.basic_error(
503,
b"Service Unavailable",
"Application failed to respond within time limit.",
)
### Utility functions
def send_disconnect(self):
"""
Sends a http.disconnect message.
Useful only really for long-polling.
"""
# If we don't yet have a path, then don't send as we never opened.
if self.path:
self.application_queue.put_nowait({"type": "http.disconnect"})
def duration(self):
    """
@@ -247,25 +347,34 @@ class WebRequest(http.Request):
    """
    Responds with a server-level error page (very basic)
    """
    self.handle_reply(
        {
            "type": "http.response.start",
            "status": status,
            "headers": [(b"Content-Type", b"text/html; charset=utf-8")],
        }
    )
    self.handle_reply(
        {
            "type": "http.response.body",
            "body": (
                self.error_template
                % {
                    "title": str(status) + " " + status_text.decode("ascii"),
                    "body": body,
                }
            ).encode("utf8"),
        }
    )
def __hash__(self):
    return hash(id(self))

def __eq__(self, other):
    return id(self) == id(other)
@implementer(IProtocolNegotiationFactory)
class HTTPFactory(http.HTTPFactory):
    """
    Factory which takes care of tracking which protocol
@@ -274,69 +383,34 @@ class HTTPFactory(http.HTTPFactory):
    routed appropriately.
    """

    def __init__(self, server):
        http.HTTPFactory.__init__(self)
        self.server = server

    def buildProtocol(self, addr):
        """
        Builds protocol instances. This override is used to ensure we use our
        own Request object instead of the default.
        """
        try:
            protocol = http.HTTPFactory.buildProtocol(self, addr)
            protocol.requestFactory = WebRequest
            return protocol
        except Exception:
            logger.error("Cannot build protocol: %s" % traceback.format_exc())
            raise

    # IProtocolNegotiationFactory
    def acceptableProtocols(self):
        """
        Protocols this server can speak after ALPN negotiation. Currently that
        is HTTP/1.1 and optionally HTTP/2. Websockets cannot be negotiated
        using ALPN, so that doesn't go here: anyone wanting websockets will
        negotiate HTTP/1.1 and then do the upgrade dance.
        """
        baseProtocols = [b"http/1.1"]
        if http.H2_ENABLED:
            baseProtocols.insert(0, b"h2")
        return baseProtocols
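The ALPN ordering described in the docstring is small enough to sketch standalone. `acceptable_protocols` below is a hypothetical free-function version of the same logic, taking the HTTP/2 flag as a parameter instead of reading `http.H2_ENABLED`:

```python
def acceptable_protocols(h2_enabled):
    # Same ordering as acceptableProtocols: prefer h2 when available,
    # and always offer HTTP/1.1 as a fallback.
    protocols = [b"http/1.1"]
    if h2_enabled:
        protocols.insert(0, b"h2")
    return protocols

print(acceptable_protocols(True))  # [b'h2', b'http/1.1']
```

Order matters here: ALPN clients pick the first mutually supported entry, so listing `h2` first is what makes HTTP/2 the preferred protocol when both sides support it.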

@@ -0,0 +1,203 @@
import datetime
import importlib
import logging
import sys
from django.apps import apps
from django.conf import settings
from django.contrib.staticfiles.handlers import ASGIStaticFilesHandler
from django.core.exceptions import ImproperlyConfigured
from django.core.management import CommandError
from django.core.management.commands.runserver import Command as RunserverCommand
from daphne import __version__
from daphne.endpoints import build_endpoint_description_strings
from daphne.server import Server
logger = logging.getLogger("django.channels.server")
def get_default_application():
"""
Gets the default application, set in the ASGI_APPLICATION setting.
"""
try:
path, name = settings.ASGI_APPLICATION.rsplit(".", 1)
except (ValueError, AttributeError):
raise ImproperlyConfigured("Cannot find ASGI_APPLICATION setting.")
try:
module = importlib.import_module(path)
except ImportError:
raise ImproperlyConfigured("Cannot import ASGI_APPLICATION module %r" % path)
try:
value = getattr(module, name)
except AttributeError:
raise ImproperlyConfigured(
f"Cannot find {name!r} in ASGI_APPLICATION module {path}"
)
return value
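The rsplit/import/getattr dance in `get_default_application` can be exercised on its own. `resolve_dotted` below is an illustrative stand-in, minus the Django settings lookup and `ImproperlyConfigured` wrapping, and uses a stdlib path in place of a real `ASGI_APPLICATION` value:

```python
import importlib

def resolve_dotted(dotted):
    # Split "package.module.attr" into module path and attribute name,
    # import the module, then pull the attribute off it.
    path, name = dotted.rsplit(".", 1)
    module = importlib.import_module(path)
    return getattr(module, name)

# Stand-in for e.g. "myproject.asgi.application":
func = resolve_dotted("os.path.join")
```

Because `rsplit(".", 1)` splits only at the last dot, module paths with any depth work, but a plain name with no dot raises `ValueError`, which is exactly the case the command reports as a misconfigured setting.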
class Command(RunserverCommand):
protocol = "http"
server_cls = Server
def add_arguments(self, parser):
super().add_arguments(parser)
parser.add_argument(
"--noasgi",
action="store_false",
dest="use_asgi",
default=True,
help="Run the old WSGI-based runserver rather than the ASGI-based one",
)
parser.add_argument(
"--http_timeout",
action="store",
dest="http_timeout",
type=int,
default=None,
help=(
"Specify the daphne http_timeout interval in seconds "
"(default: no timeout)"
),
)
parser.add_argument(
"--websocket_handshake_timeout",
action="store",
dest="websocket_handshake_timeout",
type=int,
default=5,
help=(
"Specify the daphne websocket_handshake_timeout interval in "
"seconds (default: 5)"
),
)
parser.add_argument(
"--nostatic",
action="store_false",
dest="use_static_handler",
help="Tells Django to NOT automatically serve static files at STATIC_URL.",
)
parser.add_argument(
"--insecure",
action="store_true",
dest="insecure_serving",
help="Allows serving static files even if DEBUG is False.",
)
def handle(self, *args, **options):
self.http_timeout = options.get("http_timeout", None)
self.websocket_handshake_timeout = options.get("websocket_handshake_timeout", 5)
# Check Channels is installed right
if options["use_asgi"] and not hasattr(settings, "ASGI_APPLICATION"):
raise CommandError(
"You have not set ASGI_APPLICATION, which is needed to run the server."
)
# Dispatch upward
super().handle(*args, **options)
def inner_run(self, *args, **options):
# Maybe they want the wsgi one?
if not options.get("use_asgi", True):
if hasattr(RunserverCommand, "server_cls"):
self.server_cls = RunserverCommand.server_cls
return RunserverCommand.inner_run(self, *args, **options)
# Run checks
self.stdout.write("Performing system checks...\n\n")
self.check(display_num_errors=True)
self.check_migrations()
# Print helpful text
quit_command = "CTRL-BREAK" if sys.platform == "win32" else "CONTROL-C"
now = datetime.datetime.now().strftime("%B %d, %Y - %X")
self.stdout.write(now)
self.stdout.write(
(
"Django version %(version)s, using settings %(settings)r\n"
"Starting ASGI/Daphne version %(daphne_version)s development server"
" at %(protocol)s://%(addr)s:%(port)s/\n"
"Quit the server with %(quit_command)s.\n"
)
% {
"version": self.get_version(),
"daphne_version": __version__,
"settings": settings.SETTINGS_MODULE,
"protocol": self.protocol,
"addr": "[%s]" % self.addr if self._raw_ipv6 else self.addr,
"port": self.port,
"quit_command": quit_command,
}
)
# Launch server in 'main' thread. Signals are disabled as it's still
# actually a subthread under the autoreloader.
logger.debug("Daphne running, listening on %s:%s", self.addr, self.port)
# build the endpoint description string from host/port options
endpoints = build_endpoint_description_strings(host=self.addr, port=self.port)
try:
self.server_cls(
application=self.get_application(options),
endpoints=endpoints,
signal_handlers=not options["use_reloader"],
action_logger=self.log_action,
http_timeout=self.http_timeout,
root_path=getattr(settings, "FORCE_SCRIPT_NAME", "") or "",
websocket_handshake_timeout=self.websocket_handshake_timeout,
).run()
logger.debug("Daphne exited")
except KeyboardInterrupt:
shutdown_message = options.get("shutdown_message", "")
if shutdown_message:
self.stdout.write(shutdown_message)
return
def get_application(self, options):
"""
Returns the static files serving application wrapping the default application,
if static files should be served. Otherwise just returns the default
handler.
"""
staticfiles_installed = apps.is_installed("django.contrib.staticfiles")
use_static_handler = options.get("use_static_handler", staticfiles_installed)
insecure_serving = options.get("insecure_serving", False)
if use_static_handler and (settings.DEBUG or insecure_serving):
return ASGIStaticFilesHandler(get_default_application())
else:
return get_default_application()
def log_action(self, protocol, action, details):
"""
Logs various different kinds of requests to the console.
"""
# HTTP requests
if protocol == "http" and action == "complete":
msg = "HTTP %(method)s %(path)s %(status)s [%(time_taken).2f, %(client)s]"
# Utilize terminal colors, if available
if 200 <= details["status"] < 300:
# Put 2XX first, since it should be the common case
logger.info(self.style.HTTP_SUCCESS(msg), details)
elif 100 <= details["status"] < 200:
logger.info(self.style.HTTP_INFO(msg), details)
elif details["status"] == 304:
logger.info(self.style.HTTP_NOT_MODIFIED(msg), details)
elif 300 <= details["status"] < 400:
logger.info(self.style.HTTP_REDIRECT(msg), details)
elif details["status"] == 404:
logger.warning(self.style.HTTP_NOT_FOUND(msg), details)
elif 400 <= details["status"] < 500:
logger.warning(self.style.HTTP_BAD_REQUEST(msg), details)
else:
# Any 5XX, or any other response
logger.error(self.style.HTTP_SERVER_ERROR(msg), details)
# Websocket requests
elif protocol == "websocket" and action == "connected":
logger.info("WebSocket CONNECT %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "disconnected":
logger.info("WebSocket DISCONNECT %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "connecting":
logger.info("WebSocket HANDSHAKING %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "rejected":
logger.info("WebSocket REJECT %(path)s [%(client)s]", details)
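`build_endpoint_description_strings` (imported above) turns the host/port options into Twisted endpoint description strings for `serverFromString`. The sketch below covers only the common TCP case and is not the real helper, which also handles unix sockets and file descriptors:

```python
def tcp_endpoint(host, port):
    # Twisted's serverFromString accepts descriptions like
    # "tcp:port=8000:interface=127.0.0.1". IPv6 hosts need their
    # brackets escaped for Twisted; that detail is omitted here.
    return "tcp:port=%d:interface=%s" % (port, host)

print(tcp_endpoint("127.0.0.1", 8000))  # tcp:port=8000:interface=127.0.0.1
```

This string format is why the server can treat TCP sockets, unix sockets, and inherited file descriptors uniformly: each becomes one endpoint description in the same list.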

@@ -1,97 +1,342 @@
# This has to be done first as Twisted is import-order-sensitive with reactors
import asyncio  # isort:skip
import os  # isort:skip
import sys  # isort:skip
import warnings  # isort:skip
from concurrent.futures import ThreadPoolExecutor  # isort:skip
from twisted.internet import asyncioreactor  # isort:skip

twisted_loop = asyncio.new_event_loop()
if "ASGI_THREADS" in os.environ:
    twisted_loop.set_default_executor(
        ThreadPoolExecutor(max_workers=int(os.environ["ASGI_THREADS"]))
    )

current_reactor = sys.modules.get("twisted.internet.reactor", None)
if current_reactor is not None:
    if not isinstance(current_reactor, asyncioreactor.AsyncioSelectorReactor):
        warnings.warn(
            "Something has already installed a non-asyncio Twisted reactor. Attempting to uninstall it; "
            + "you can fix this warning by importing daphne.server early in your codebase or "
            + "finding the package that imports Twisted and importing it later on.",
            UserWarning,
            stacklevel=2,
        )
        del sys.modules["twisted.internet.reactor"]
        asyncioreactor.install(twisted_loop)
else:
    asyncioreactor.install(twisted_loop)

import logging
import time
from concurrent.futures import CancelledError
from functools import partial

from twisted.internet import defer, reactor
from twisted.internet.endpoints import serverFromString
from twisted.logger import STDLibLogObserver, globalLogBeginner
from twisted.web import http

from .http_protocol import HTTPFactory
from .ws_protocol import WebSocketFactory

logger = logging.getLogger(__name__)
class Server:
    def __init__(
        self,
        application,
        endpoints=None,
        signal_handlers=True,
        action_logger=None,
        http_timeout=None,
        request_buffer_size=8192,
        websocket_timeout=86400,
        websocket_connect_timeout=20,
        ping_interval=20,
        ping_timeout=30,
        root_path="",
        proxy_forwarded_address_header=None,
        proxy_forwarded_port_header=None,
        proxy_forwarded_proto_header=None,
        verbosity=1,
        websocket_handshake_timeout=5,
        application_close_timeout=10,
        ready_callable=None,
        server_name="daphne",
    ):
        self.application = application
        self.endpoints = endpoints or []
        self.listeners = []
        self.listening_addresses = []
        self.signal_handlers = signal_handlers
        self.action_logger = action_logger
        self.http_timeout = http_timeout
        self.ping_interval = ping_interval
        self.ping_timeout = ping_timeout
        self.request_buffer_size = request_buffer_size
        self.proxy_forwarded_address_header = proxy_forwarded_address_header
        self.proxy_forwarded_port_header = proxy_forwarded_port_header
        self.proxy_forwarded_proto_header = proxy_forwarded_proto_header
        self.websocket_timeout = websocket_timeout
        self.websocket_connect_timeout = websocket_connect_timeout
        self.websocket_handshake_timeout = websocket_handshake_timeout
        self.application_close_timeout = application_close_timeout
        self.root_path = root_path
        self.verbosity = verbosity
        self.abort_start = False
        self.ready_callable = ready_callable
        self.server_name = server_name
        # Check our construction is actually sensible
        if not self.endpoints:
            logger.error("No endpoints. This server will not listen on anything.")
            sys.exit(1)
    def run(self):
        # A dict of protocol: {"application_instance":, "connected":, "disconnected":} dicts
        self.connections = {}
        # Make the factory
        self.http_factory = HTTPFactory(self)
        self.ws_factory = WebSocketFactory(self, server=self.server_name)
        self.ws_factory.setProtocolOptions(
            autoPingTimeout=self.ping_timeout,
            allowNullOrigin=True,
            openHandshakeTimeout=self.websocket_handshake_timeout,
        )
        if self.verbosity <= 1:
            # Redirect the Twisted log to nowhere
            globalLogBeginner.beginLoggingTo(
                [lambda _: None], redirectStandardIO=False, discardBuffer=True
            )
        else:
            globalLogBeginner.beginLoggingTo([STDLibLogObserver(__name__)])

        # Detect what Twisted features are enabled
        if http.H2_ENABLED:
            logger.info("HTTP/2 support enabled")
        else:
            logger.info(
                "HTTP/2 support not enabled (install the http2 and tls Twisted extras)"
            )

        # Kick off the timeout loop
        reactor.callLater(1, self.application_checker)
        reactor.callLater(2, self.timeout_checker)

        for socket_description in self.endpoints:
            logger.info("Configuring endpoint %s", socket_description)
            ep = serverFromString(reactor, str(socket_description))
            listener = ep.listen(self.http_factory)
            listener.addCallback(self.listen_success)
            listener.addErrback(self.listen_error)
            self.listeners.append(listener)

        # Set the asyncio reactor's event loop as global
        # TODO: Should we instead pass the global one into the reactor?
        asyncio.set_event_loop(reactor._asyncioEventloop)

        # Verbosity 3 turns on asyncio debug to find those blocking yields
        if self.verbosity >= 3:
            asyncio.get_event_loop().set_debug(True)

        reactor.addSystemEventTrigger("before", "shutdown", self.kill_all_applications)
        if not self.abort_start:
            # Trigger the ready flag if we had one
            if self.ready_callable:
                self.ready_callable()
            # Run the reactor
            reactor.run(installSignalHandlers=self.signal_handlers)
def listen_success(self, port):
    """
    Called when a listen succeeds so we can store port details (if there are any)
    """
    if hasattr(port, "getHost"):
        host = port.getHost()
        if hasattr(host, "host") and hasattr(host, "port"):
            self.listening_addresses.append((host.host, host.port))
            logger.info(
                "Listening on TCP address %s:%s",
                port.getHost().host,
                port.getHost().port,
            )
def listen_error(self, failure):
logger.critical("Listen failure: %s", failure.getErrorMessage())
self.stop()
def stop(self):
"""
Force-stops the server.
"""
if reactor.running:
reactor.stop()
else:
self.abort_start = True
### Protocol handling
def protocol_connected(self, protocol):
"""
Adds a protocol as a current connection.
"""
if protocol in self.connections:
raise RuntimeError("Protocol %r was added to main list twice!" % protocol)
self.connections[protocol] = {"connected": time.time()}
def protocol_disconnected(self, protocol):
# Set its disconnected time (the loops will come and clean it up)
# Do not set it if it is already set. Overwriting it might
# cause it to never be cleaned up.
# See https://github.com/django/channels/issues/1181
if "disconnected" not in self.connections[protocol]:
self.connections[protocol]["disconnected"] = time.time()
### Internal event/message handling
def create_application(self, protocol, scope):
"""
Creates a new application instance that fronts a Protocol instance
for one of our supported protocols. Pass it the protocol,
and it will work out the type, supply appropriate callables, and
return you the application's input queue
"""
# Make sure the protocol has not had another application made for it
assert "application_instance" not in self.connections[protocol]
# Make an instance of the application
input_queue = asyncio.Queue()
scope.setdefault("asgi", {"version": "3.0"})
application_instance = self.application(
scope=scope,
receive=input_queue.get,
send=partial(self.handle_reply, protocol),
)
# Run it, and stash the future for later checking
if protocol not in self.connections:
return None
self.connections[protocol]["application_instance"] = asyncio.ensure_future(
application_instance,
loop=asyncio.get_event_loop(),
)
return input_queue
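The wiring in `create_application` (the app's `receive` is just an `asyncio.Queue.get`, and `send` is a `partial` bound to the owning connection) can be demonstrated with a toy app. Everything below is illustrative, with a string standing in for the protocol object:

```python
import asyncio
from functools import partial

async def demo():
    # receive() is a queue getter; send() is bound to the connection.
    input_queue = asyncio.Queue()
    replies = []

    async def send(connection, message):
        replies.append((connection, message))

    async def app(scope, receive, send):
        message = await receive()
        await send({"type": "echo", "body": message["body"]})

    task = asyncio.ensure_future(
        app({"type": "http"}, input_queue.get, partial(send, "conn-1"))
    )
    input_queue.put_nowait({"type": "http.request", "body": b"hi"})
    await task
    return replies

replies = asyncio.run(demo())
print(replies)  # [('conn-1', {'type': 'echo', 'body': b'hi'})]
```

Binding `send` with `partial` is what lets one server-level `handle_reply` coroutine serve every connection: each application instance gets a callable that already knows which protocol its messages belong to.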
async def handle_reply(self, protocol, message):
    """
    Coroutine that jumps the reply message from asyncio to Twisted
    """
    # Don't do anything if the connection is closed or does not exist
    if protocol not in self.connections or self.connections[protocol].get(
        "disconnected", None
    ):
        return
    try:
        self.check_headers_type(message)
    except ValueError:
        # Ensure to send SOME reply.
        protocol.basic_error(500, b"Server Error", "Server Error")
        raise
    # Let the protocol handle it
    protocol.handle_reply(message)
@staticmethod
def check_headers_type(message):
if not message["type"] == "http.response.start":
return
for k, v in message.get("headers", []):
if not isinstance(k, bytes):
raise ValueError(
"Header name '{}' expected to be `bytes`, but got `{}`".format(
k, type(k)
)
)
if not isinstance(v, bytes):
raise ValueError(
"Header value '{}' expected to be `bytes`, but got `{}`".format(
v, type(v)
)
)
### Utility
def application_checker(self):
"""
Goes through the set of current application Futures and cleans up
any that are done/prints exceptions for any that errored.
"""
for protocol, details in list(self.connections.items()):
disconnected = details.get("disconnected", None)
application_instance = details.get("application_instance", None)
# First, see if the protocol disconnected and the app has taken
# too long to close up
if (
disconnected
and time.time() - disconnected > self.application_close_timeout
):
if application_instance and not application_instance.done():
logger.warning(
"Application instance %r for connection %s took too long to shut down and was killed.",
application_instance,
repr(protocol),
)
application_instance.cancel()
# Then see if the app is done and we should reap it
if application_instance and application_instance.done():
try:
exception = application_instance.exception()
except (CancelledError, asyncio.CancelledError):
# Future cancellation. We can ignore this.
pass
else:
if exception:
if isinstance(exception, KeyboardInterrupt):
# Protocol is asking the server to exit (likely during test)
self.stop()
else:
logger.error(
"Exception inside application: %s",
exception,
exc_info=exception,
)
if not disconnected:
protocol.handle_exception(exception)
del self.connections[protocol]["application_instance"]
application_instance = None
# Check to see if protocol is closed and app is closed so we can remove it
if not application_instance and disconnected:
del self.connections[protocol]
reactor.callLater(1, self.application_checker)
def kill_all_applications(self):
"""
Kills all application coroutines before reactor exit.
"""
# Send cancel to all coroutines
wait_for = []
for details in self.connections.values():
application_instance = details["application_instance"]
if not application_instance.done():
application_instance.cancel()
wait_for.append(application_instance)
logger.info("Killed %i pending application instances", len(wait_for))
# Make Twisted wait until they're all dead
wait_deferred = defer.Deferred.fromFuture(asyncio.gather(*wait_for))
wait_deferred.addErrback(lambda x: None)
return wait_deferred
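The cancel-then-gather shape of `kill_all_applications` can be shown with plain asyncio tasks, minus the `Deferred.fromFuture` bridge back into Twisted:

```python
import asyncio

async def demo():
    # Cancel every still-pending task, then wait for the cancellations
    # to land; gather(return_exceptions=True) swallows the CancelledErrors.
    tasks = [asyncio.ensure_future(asyncio.sleep(60)) for _ in range(3)]
    await asyncio.sleep(0)  # let the tasks start
    for task in tasks:
        if not task.done():
            task.cancel()
    await asyncio.gather(*tasks, return_exceptions=True)
    return tasks

tasks = asyncio.run(demo())
print(all(t.cancelled() for t in tasks))  # True
```

Awaiting the gather matters: `cancel()` only requests cancellation, and the reactor must not exit until every coroutine has actually unwound.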
def timeout_checker(self):
    """
    Called periodically to enforce timeout rules on all connections.
    Also checks pings at the same time.
    """
    for protocol in list(self.connections.keys()):
        protocol.check_timeouts()
    reactor.callLater(2, self.timeout_checker)
def log_action(self, protocol, action, details):
"""
Dispatches to any registered action logger, if there is one.
"""
if self.action_logger:
self.action_logger(protocol, action, details)

daphne/testing.py (new file, 309 lines)
@@ -0,0 +1,309 @@
import logging
import multiprocessing
import os
import pickle
import tempfile
import traceback
from concurrent.futures import CancelledError
class BaseDaphneTestingInstance:
"""
Launches an instance of Daphne in a subprocess, with a host and port
attribute allowing you to call it.
Works as a context manager.
"""
startup_timeout = 2
def __init__(
self, xff=False, http_timeout=None, request_buffer_size=None, *, application
):
self.xff = xff
self.http_timeout = http_timeout
self.host = "127.0.0.1"
self.request_buffer_size = request_buffer_size
self.application = application
def get_application(self):
return self.application
def __enter__(self):
# Optional Daphne features
kwargs = {}
if self.request_buffer_size:
kwargs["request_buffer_size"] = self.request_buffer_size
# Optionally enable X-Forwarded-For support.
if self.xff:
kwargs["proxy_forwarded_address_header"] = "X-Forwarded-For"
kwargs["proxy_forwarded_port_header"] = "X-Forwarded-Port"
kwargs["proxy_forwarded_proto_header"] = "X-Forwarded-Proto"
if self.http_timeout:
kwargs["http_timeout"] = self.http_timeout
# Start up process
self.process = DaphneProcess(
host=self.host,
get_application=self.get_application,
kwargs=kwargs,
setup=self.process_setup,
teardown=self.process_teardown,
)
self.process.start()
# Wait for the port
if self.process.ready.wait(self.startup_timeout):
self.port = self.process.port.value
return self
else:
if self.process.errors.empty():
raise RuntimeError("Daphne did not start up, no error caught")
else:
error, traceback = self.process.errors.get(False)
raise RuntimeError("Daphne did not start up:\n%s" % traceback)
def __exit__(self, exc_type, exc_value, traceback):
# Shut down the process
self.process.terminate()
del self.process
def process_setup(self):
"""
Called by the process just before it starts serving.
"""
pass
def process_teardown(self):
"""
Called by the process just after it stops serving.
"""
pass
def get_received(self):
pass
class DaphneTestingInstance(BaseDaphneTestingInstance):
def __init__(self, *args, **kwargs):
self.lock = multiprocessing.Lock()
super().__init__(*args, **kwargs, application=TestApplication(lock=self.lock))
def __enter__(self):
# Clear result storage
TestApplication.delete_setup()
TestApplication.delete_result()
return super().__enter__()
def get_received(self):
"""
Returns the scope and messages the test application has received
so far. Note you'll get all messages since scope start, not just any
new ones since the last call.
Also checks for any exceptions in the application. If there are,
raises them.
"""
try:
with self.lock:
inner_result = TestApplication.load_result()
except FileNotFoundError:
raise ValueError("No results available yet.")
# Check for exception
if "exception" in inner_result:
raise inner_result["exception"]
return inner_result["scope"], inner_result["messages"]
def add_send_messages(self, messages):
"""
Adds messages for the application to send back.
The next time it receives an incoming message, it will reply with these.
"""
TestApplication.save_setup(response_messages=messages)
class DaphneProcess(multiprocessing.Process):
"""
Process subclass that launches and runs a Daphne instance, communicating the
port it ends up listening on back to the parent process.
"""
def __init__(self, host, get_application, kwargs=None, setup=None, teardown=None):
super().__init__()
self.host = host
self.get_application = get_application
self.kwargs = kwargs or {}
self.setup = setup
self.teardown = teardown
self.port = multiprocessing.Value("i")
self.ready = multiprocessing.Event()
self.errors = multiprocessing.Queue()
def run(self):
# OK, now we are in a forked child process, and want to use the reactor.
# However, BSD-derived systems such as macOS do not fork the underlying
# kqueue, which asyncio (and hence asyncioreactor) is built on.
# Therefore, we should uninstall the broken reactor and install a new one.
_reinstall_reactor()
from twisted.internet import reactor
from .endpoints import build_endpoint_description_strings
from .server import Server
application = self.get_application()
try:
# Create the server class
endpoints = build_endpoint_description_strings(host=self.host, port=0)
self.server = Server(
application=application,
endpoints=endpoints,
signal_handlers=False,
**self.kwargs,
)
# Set up a poller to look for the port
reactor.callLater(0.1, self.resolve_port)
# Run with setup/teardown
if self.setup is not None:
self.setup()
try:
self.server.run()
finally:
if self.teardown is not None:
self.teardown()
except BaseException as e:
# Put the error on our queue so the parent gets it
self.errors.put((e, traceback.format_exc()))
def resolve_port(self):
from twisted.internet import reactor
if self.server.listening_addresses:
self.port.value = self.server.listening_addresses[0][1]
self.ready.set()
else:
reactor.callLater(0.1, self.resolve_port)
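DaphneProcess reports the bound port back to its parent through a shared `multiprocessing.Value` and signals readiness with a `multiprocessing.Event`. A minimal standalone sketch of that handshake (the `Worker` class and port number here are illustrative, not part of daphne):

```python
import multiprocessing


class Worker(multiprocessing.Process):
    def __init__(self):
        super().__init__()
        # "i" = signed int, lives in shared memory visible to both processes.
        self.port = multiprocessing.Value("i", 0)
        self.ready = multiprocessing.Event()

    def run(self):
        # Pretend to bind a listening port, then signal the parent.
        self.port.value = 8123
        self.ready.set()


if __name__ == "__main__":
    w = Worker()
    w.start()
    # Parent blocks (with a timeout) until the child signals readiness,
    # then reads the port out of the shared Value - same shape as
    # BaseDaphneTestingInstance.__enter__ waiting on process.ready.
    assert w.ready.wait(2)
    assert w.port.value == 8123
    w.join()
```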
class TestApplication:
"""
An application that receives one or more messages, sends a response,
and then quits the server. For testing.
"""
setup_storage = os.path.join(tempfile.gettempdir(), "setup.testio")
result_storage = os.path.join(tempfile.gettempdir(), "result.testio")
def __init__(self, lock):
self.lock = lock
self.messages = []
async def __call__(self, scope, receive, send):
self.scope = scope
# Receive input and send output
logging.debug("test app coroutine alive")
try:
while True:
# Receive a message and save it into the result store
self.messages.append(await receive())
self.lock.acquire()
logging.debug("test app received %r", self.messages[-1])
self.save_result(self.scope, self.messages)
self.lock.release()
# See if there are any messages to send back
setup = self.load_setup()
self.delete_setup()
for message in setup["response_messages"]:
await send(message)
logging.debug("test app sent %r", message)
except Exception as e:
if isinstance(e, CancelledError):
# Don't catch task-cancelled errors!
raise
else:
self.save_exception(e)
@classmethod
def save_setup(cls, response_messages):
"""
Stores setup information.
"""
with open(cls.setup_storage, "wb") as fh:
pickle.dump({"response_messages": response_messages}, fh)
@classmethod
def load_setup(cls):
"""
Returns setup details.
"""
try:
with open(cls.setup_storage, "rb") as fh:
return pickle.load(fh)
except FileNotFoundError:
return {"response_messages": []}
@classmethod
def save_result(cls, scope, messages):
"""
Saves details of what happened to the result storage.
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump({"scope": scope, "messages": messages}, fh)
@classmethod
def save_exception(cls, exception):
"""
Saves details of what happened to the result storage.
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump({"exception": exception}, fh)
@classmethod
def load_result(cls):
"""
Returns result details.
"""
with open(cls.result_storage, "rb") as fh:
return pickle.load(fh)
@classmethod
def delete_setup(cls):
"""
Clears setup storage files.
"""
try:
os.unlink(cls.setup_storage)
except OSError:
pass
@classmethod
def delete_result(cls):
"""
Clears result storage files.
"""
try:
os.unlink(cls.result_storage)
except OSError:
pass
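TestApplication shuttles state between the Daphne subprocess and the test process via pickle files in the temp directory. The same save/load handshake in isolation (the file name here is arbitrary):

```python
import os
import pickle
import tempfile

# Arbitrary demo path; TestApplication uses setup.testio / result.testio.
storage = os.path.join(tempfile.gettempdir(), "demo.testio")


def save_result(scope, messages):
    # Writer side: serialize everything the reader will need in one dict.
    with open(storage, "wb") as fh:
        pickle.dump({"scope": scope, "messages": messages}, fh)


def load_result():
    # Reader side: raises FileNotFoundError if nothing was saved yet,
    # which DaphneTestingInstance.get_received turns into a ValueError.
    with open(storage, "rb") as fh:
        return pickle.load(fh)


save_result({"type": "http"}, [{"body": b"hi"}])
result = load_result()
assert result["scope"] == {"type": "http"}
os.unlink(storage)
```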
def _reinstall_reactor():
import asyncio
import sys
from twisted.internet import asyncioreactor
# Uninstall the reactor.
if "twisted.internet.reactor" in sys.modules:
del sys.modules["twisted.internet.reactor"]
# The daphne.server module may have already installed the reactor.
# If so, code importing it would still get the uninstalled reactor,
# so we must reimport that module too.
if "daphne.server" in sys.modules:
del sys.modules["daphne.server"]
event_loop = asyncio.new_event_loop()
asyncioreactor.install(event_loop)
asyncio.set_event_loop(event_loop)

(deleted file, -68 lines)
# coding: utf8
from __future__ import unicode_literals
from unittest import TestCase
from asgiref.inmemory import ChannelLayer
from twisted.test import proto_helpers
from ..http_protocol import HTTPFactory
class TestHTTPProtocol(TestCase):
"""
Tests that the HTTP protocol class correctly generates and parses messages.
"""
def setUp(self):
self.channel_layer = ChannelLayer()
self.factory = HTTPFactory(self.channel_layer)
self.proto = self.factory.buildProtocol(('127.0.0.1', 0))
self.tr = proto_helpers.StringTransport()
self.proto.makeConnection(self.tr)
def test_basic(self):
"""
Tests basic HTTP parsing
"""
# Send a simple request to the protocol
self.proto.dataReceived(
b"GET /te%20st-%C3%A0/?foo=+bar HTTP/1.1\r\n" +
b"Host: somewhere.com\r\n" +
b"\r\n"
)
# Get the resulting message off of the channel layer
_, message = self.channel_layer.receive_many(["http.request"])
self.assertEqual(message['http_version'], "1.1")
self.assertEqual(message['method'], "GET")
self.assertEqual(message['scheme'], "http")
self.assertEqual(message['path'], "/te st-à/")
self.assertEqual(message['query_string'], b"foo=+bar")
self.assertEqual(message['headers'], [(b"host", b"somewhere.com")])
self.assertFalse(message.get("body", None))
self.assertTrue(message['reply_channel'])
# Send back an example response
self.factory.dispatch_reply(
message['reply_channel'],
{
"status": 201,
"status_text": b"Created",
"content": b"OH HAI",
"headers": [[b"X-Test", b"Boom!"]],
}
)
# Make sure that comes back right on the protocol
self.assertEqual(self.tr.value(), b"HTTP/1.1 201 Created\r\nTransfer-Encoding: chunked\r\nX-Test: Boom!\r\n\r\n6\r\nOH HAI\r\n0\r\n\r\n")
def test_root_path_header(self):
"""
Tests root path header handling
"""
# Send a simple request to the protocol
self.proto.dataReceived(
b"GET /te%20st-%C3%A0/?foo=bar HTTP/1.1\r\n" +
b"Host: somewhere.com\r\n" +
b"Daphne-Root-Path: /foobar%20/bar\r\n" +
b"\r\n"
)
# Get the resulting message off of the channel layer, check root_path
_, message = self.channel_layer.receive_many(["http.request"])
self.assertEqual(message['root_path'], "/foobar /bar")

(new file, +24 lines)
import socket
from twisted.internet import endpoints
from twisted.internet.interfaces import IStreamServerEndpointStringParser
from twisted.plugin import IPlugin
from zope.interface import implementer
@implementer(IPlugin, IStreamServerEndpointStringParser)
class _FDParser:
prefix = "fd"
def _parseServer(self, reactor, fileno, domain=socket.AF_INET):
fileno = int(fileno)
return endpoints.AdoptedStreamServerEndpoint(reactor, fileno, domain)
def parseStreamServer(self, reactor, *args, **kwargs):
# Delegate to another function with a sane signature. This function has
# an insane signature to trick zope.interface into believing the
# interface is correctly implemented.
return self._parseServer(reactor, *args, **kwargs)
parser = _FDParser()

daphne/utils.py (new file, +89 lines)
import importlib
import re
from twisted.web.http_headers import Headers
# Header name regex as per h11.
# https://github.com/python-hyper/h11/blob/a2c68948accadc3876dffcf979d98002e4a4ed27/h11/_abnf.py#L10-L21
HEADER_NAME_RE = re.compile(rb"[-!#$%&'*+.^_`|~0-9a-zA-Z]+")
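The regex above accepts exactly the token characters h11 allows in a header name, so a full match fails on whitespace, colons, and other separators:

```python
import re

# Same pattern as daphne.utils.HEADER_NAME_RE (token chars per h11's ABNF).
HEADER_NAME_RE = re.compile(rb"[-!#$%&'*+.^_`|~0-9a-zA-Z]+")

assert HEADER_NAME_RE.fullmatch(b"X-Forwarded-For") is not None
assert HEADER_NAME_RE.fullmatch(b"Bad Header") is None   # space not a token char
assert HEADER_NAME_RE.fullmatch(b"Bad:Header") is None   # colon not a token char
```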
def import_by_path(path):
"""
Given a dotted/colon path, like project.module:ClassName.callable,
returns the object at the end of the path.
"""
module_path, object_path = path.split(":", 1)
target = importlib.import_module(module_path)
for bit in object_path.split("."):
target = getattr(target, bit)
return target
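For example, the same dotted/colon resolution applied to stdlib paths (function body copied from above so the snippet is self-contained):

```python
import importlib


def import_by_path(path):
    # "module.path:attr.attr" - import the module, then walk attributes.
    module_path, object_path = path.split(":", 1)
    target = importlib.import_module(module_path)
    for bit in object_path.split("."):
        target = getattr(target, bit)
    return target


assert import_by_path("string:ascii_lowercase").startswith("abc")
assert import_by_path("json:JSONDecoder.decode").__name__ == "decode"
```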
def header_value(headers, header_name):
value = headers[header_name]
if isinstance(value, list):
value = value[0]
return value.decode("utf-8")
def parse_x_forwarded_for(
headers,
address_header_name="X-Forwarded-For",
port_header_name="X-Forwarded-Port",
proto_header_name="X-Forwarded-Proto",
original_addr=None,
original_scheme=None,
):
"""
Parses an X-Forwarded-For header and returns a host/port pair as a list.
@param headers: The twisted-style object containing a request's headers
@param address_header_name: The name of the expected host header
@param port_header_name: The name of the expected port header
@param proto_header_name: The name of the expected proto header
@param original_addr: A host/port pair that should be returned if the headers are not in the request
@param original_scheme: A scheme that should be returned if the headers are not in the request
@return: A list containing a host (string) as the first entry and a port (int) as the second.
"""
if not address_header_name:
return original_addr, original_scheme
# Convert twisted-style headers into dicts
if isinstance(headers, Headers):
headers = dict(headers.getAllRawHeaders())
# Lowercase all header names in the dict
headers = {name.lower(): values for name, values in headers.items()}
# Make sure header names are bytes (values are checked in header_value)
assert all(isinstance(name, bytes) for name in headers.keys())
address_header_name = address_header_name.lower().encode("utf-8")
result_addr = original_addr
result_scheme = original_scheme
if address_header_name in headers:
address_value = header_value(headers, address_header_name)
if "," in address_value:
address_value = address_value.split(",")[0].strip()
result_addr = [address_value, 0]
if port_header_name:
# We only want to parse the X-Forwarded-Port header if we also parsed the X-Forwarded-For
# header to avoid inconsistent results.
port_header_name = port_header_name.lower().encode("utf-8")
if port_header_name in headers:
port_value = header_value(headers, port_header_name)
try:
result_addr[1] = int(port_value)
except ValueError:
pass
if proto_header_name:
proto_header_name = proto_header_name.lower().encode("utf-8")
if proto_header_name in headers:
result_scheme = header_value(headers, proto_header_name)
return result_addr, result_scheme
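A trimmed, dict-only sketch of the parsing above (the twisted `Headers` branch and header-name arguments are omitted; `parse_xff` is an illustrative name, not daphne's API). Note that, as in the real function, port and proto are only honored when X-Forwarded-For itself is present:

```python
def header_value(headers, name):
    value = headers[name]
    if isinstance(value, list):
        value = value[0]
    return value.decode("utf-8")


def parse_xff(headers, original_addr=None, original_scheme=None):
    # headers: dict of lowercased bytes names -> bytes (or list of bytes).
    result_addr, result_scheme = original_addr, original_scheme
    if b"x-forwarded-for" in headers:
        address = header_value(headers, b"x-forwarded-for")
        if "," in address:
            # Multiple hops: keep only the first (original client) entry.
            address = address.split(",")[0].strip()
        result_addr = [address, 0]
        if b"x-forwarded-port" in headers:
            try:
                result_addr[1] = int(header_value(headers, b"x-forwarded-port"))
            except ValueError:
                pass
        if b"x-forwarded-proto" in headers:
            result_scheme = header_value(headers, b"x-forwarded-proto")
    return result_addr, result_scheme


addr, scheme = parse_xff({
    b"x-forwarded-for": [b"10.1.2.3, 192.168.0.1"],
    b"x-forwarded-port": [b"443"],
    b"x-forwarded-proto": [b"https"],
})
assert addr == ["10.1.2.3", 443]
assert scheme == "https"
```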

(modified file)
import logging
import time
import traceback
from urllib.parse import unquote

from autobahn.twisted.websocket import (
    ConnectionDeny,
    WebSocketServerFactory,
    WebSocketServerProtocol,
)
from twisted.internet import defer

from .utils import parse_x_forwarded_for

logger = logging.getLogger(__name__)


class WebSocketProtocol(WebSocketServerProtocol):
    """
    Protocol which supports WebSockets and forwards incoming messages to
    the websocket channels.
    """

    application_type = "websocket"

    # If we should send no more messages (e.g. we error-closed the socket)
    muted = False

    def onConnect(self, request):
        self.server = self.factory.server_class
        self.server.protocol_connected(self)
        self.request = request
        self.protocol_to_accept = None
        self.root_path = self.server.root_path
        self.socket_opened = time.time()
        self.last_ping = time.time()
        try:
            # Sanitize and decode headers, potentially extracting root path
            self.clean_headers = []
            for name, value in request.headers.items():
                name = name.encode("ascii")
                # Prevent CVE-2015-0219
                if b"_" in name:
                    continue
                if name.lower() == b"daphne-root-path":
                    self.root_path = unquote(value)
                else:
                    self.clean_headers.append((name.lower(), value.encode("latin1")))
            # Get client address if possible
            peer = self.transport.getPeer()
            host = self.transport.getHost()
            if hasattr(peer, "host") and hasattr(peer, "port"):
                self.client_addr = [str(peer.host), peer.port]
                self.server_addr = [str(host.host), host.port]
            else:
                self.client_addr = None
                self.server_addr = None
            if self.server.proxy_forwarded_address_header:
                self.client_addr, self.client_scheme = parse_x_forwarded_for(
                    dict(self.clean_headers),
                    self.server.proxy_forwarded_address_header,
                    self.server.proxy_forwarded_port_header,
                    self.server.proxy_forwarded_proto_header,
                    self.client_addr,
                )
            # Decode websocket subprotocol options
            subprotocols = []
            for header, value in self.clean_headers:
                if header == b"sec-websocket-protocol":
                    subprotocols = [
                        x.strip() for x in unquote(value.decode("ascii")).split(",")
                    ]
            # Make new application instance with scope
            self.path = request.path.encode("ascii")
            self.application_deferred = defer.maybeDeferred(
                self.server.create_application,
                self,
                {
                    "type": "websocket",
                    "path": unquote(self.path.decode("ascii")),
                    "raw_path": self.path,
                    "root_path": self.root_path,
                    "headers": self.clean_headers,
                    "query_string": self._raw_query_string,  # Passed by HTTP protocol
                    "client": self.client_addr,
                    "server": self.server_addr,
                    "subprotocols": subprotocols,
                },
            )
            if self.application_deferred is not None:
                self.application_deferred.addCallback(self.applicationCreateWorked)
                self.application_deferred.addErrback(self.applicationCreateFailed)
        except Exception:
            # Exceptions here are not displayed right, just 500.
            # Turn them into an ERROR log.
            logger.error(traceback.format_exc())
            raise
        # Make a deferred and return it - we'll either call it or err it later on
        self.handshake_deferred = defer.Deferred()
        return self.handshake_deferred

    def applicationCreateWorked(self, application_queue):
        """
        Called when the background thread has successfully made the application
        instance.
        """
        # Store the application's queue
        self.application_queue = application_queue
        # Send over the connect message
        self.application_queue.put_nowait({"type": "websocket.connect"})
        self.server.log_action(
            "websocket",
            "connecting",
            {
                "path": self.request.path,
                "client": (
                    "%s:%s" % tuple(self.client_addr) if self.client_addr else None
                ),
            },
        )

    def applicationCreateFailed(self, failure):
        """
        Called when application creation fails.
        """
        logger.error(failure)
        return failure

    ### Twisted event handling

    def onOpen(self):
        # Send news that this channel is open
        logger.debug("WebSocket %s open and established", self.client_addr)
        self.server.log_action(
            "websocket",
            "connected",
            {
                "path": self.request.path,
                "client": (
                    "%s:%s" % tuple(self.client_addr) if self.client_addr else None
                ),
            },
        )

    def onMessage(self, payload, isBinary):
        # If we're muted, do nothing.
        if self.muted:
            logger.debug("Muting incoming frame on %s", self.client_addr)
            return
        logger.debug("WebSocket incoming frame on %s", self.client_addr)
        self.last_ping = time.time()
        if isBinary:
            self.application_queue.put_nowait(
                {"type": "websocket.receive", "bytes": payload}
            )
        else:
            self.application_queue.put_nowait(
                {"type": "websocket.receive", "text": payload.decode("utf8")}
            )

    def onClose(self, wasClean, code, reason):
        """
        Called when Twisted closes the socket.
        """
        self.server.protocol_disconnected(self)
        logger.debug("WebSocket closed for %s", self.client_addr)
        if not self.muted and hasattr(self, "application_queue"):
            self.application_queue.put_nowait(
                {"type": "websocket.disconnect", "code": code}
            )
        self.server.log_action(
            "websocket",
            "disconnected",
            {
                "path": self.request.path,
                "client": (
                    "%s:%s" % tuple(self.client_addr) if self.client_addr else None
                ),
            },
        )

    ### Internal event handling

    def handle_reply(self, message):
        if "type" not in message:
            raise ValueError("Message has no type defined")
        if message["type"] == "websocket.accept":
            self.serverAccept(message.get("subprotocol", None))
        elif message["type"] == "websocket.close":
            if self.state == self.STATE_CONNECTING:
                self.serverReject()
            else:
                self.serverClose(code=message.get("code", None))
        elif message["type"] == "websocket.send":
            if self.state == self.STATE_CONNECTING:
                raise ValueError("Socket has not been accepted, so cannot send over it")
            if message.get("bytes", None) and message.get("text", None):
                raise ValueError(
                    "Got invalid WebSocket reply message on %s - contains both bytes and text keys"
                    % (message,)
                )
            if message.get("bytes", None):
                self.serverSend(message["bytes"], True)
            if message.get("text", None):
                self.serverSend(message["text"], False)

    def handle_exception(self, exception):
        """
        Called by the server when our application tracebacks
        """
        if hasattr(self, "handshake_deferred"):
            # If the handshake is still ongoing, we need to emit a HTTP error
            # code rather than a WebSocket one.
            self.handshake_deferred.errback(
                ConnectionDeny(code=500, reason="Internal server error")
            )
        else:
            self.sendCloseFrame(code=1011)

    def serverAccept(self, subprotocol=None):
        """
        Called when we get a message saying to accept the connection.
        """
        self.handshake_deferred.callback(subprotocol)
        del self.handshake_deferred
        logger.debug("WebSocket %s accepted by application", self.client_addr)

    def serverReject(self):
        """
        Called when we get a message saying to reject the connection.
        """
        self.handshake_deferred.errback(
            ConnectionDeny(code=403, reason="Access denied")
        )
        del self.handshake_deferred
        self.server.protocol_disconnected(self)
        logger.debug("WebSocket %s rejected by application", self.client_addr)
        self.server.log_action(
            "websocket",
            "rejected",
            {
                "path": self.request.path,
                "client": (
                    "%s:%s" % tuple(self.client_addr) if self.client_addr else None
                ),
            },
        )

    def serverSend(self, content, binary=False):
        """
        Server-side channel message to send a message.
        """
        if self.state == self.STATE_CONNECTING:
            self.serverAccept()
        logger.debug("Sent WebSocket packet to client for %s", self.client_addr)
        if binary:
            self.sendMessage(content, binary)
        else:
            self.sendMessage(content.encode("utf8"), binary)

    def serverClose(self, code=None):
        """
        Server-side channel message to close the socket
        """
        code = 1000 if code is None else code
        self.sendClose(code=code)

    ### Utils

    def duration(self):
        """
        Returns the time since the socket was opened.
        """
        return time.time() - self.socket_opened

    def check_timeouts(self):
        """
        Called periodically to see if we should timeout something
        """
        # Web timeout checking
        if (
            self.duration() > self.server.websocket_timeout
            and self.server.websocket_timeout >= 0
        ):
            self.serverClose()
        # Ping check
        # If we're still connecting, deny the connection
        if self.state == self.STATE_CONNECTING:
            if self.duration() > self.server.websocket_connect_timeout:
                self.serverReject()
        elif self.state == self.STATE_OPEN:
            if (time.time() - self.last_ping) > self.server.ping_interval:
                self._sendAutoPing()
                self.last_ping = time.time()

    def __hash__(self):
        return hash(id(self))

    def __eq__(self, other):
        return id(self) == id(other)

    def __repr__(self):
        return f"<WebSocketProtocol client={self.client_addr!r} path={self.path!r}>"


class WebSocketFactory(WebSocketServerFactory):
    """
    Factory subclass that remembers what server it belongs to, so WebSocket
    protocols can access it to get reply ID info.
    """

    protocol = WebSocketProtocol

    def __init__(self, server_class, *args, **kwargs):
        self.server_class = server_class
        WebSocketServerFactory.__init__(self, *args, **kwargs)

    def buildProtocol(self, addr):
        """
        Builds protocol instances. We use this to inject the factory object into the protocol.
        """
        try:
            protocol = super().buildProtocol(addr)
            protocol.factory = self
            return protocol
        except Exception:
            logger.error("Cannot build protocol: %s" % traceback.format_exc())
            raise

pyproject.toml (new file, +81 lines)
[project]
name = "daphne"
dynamic = ["version"]
description = "Django ASGI (HTTP/WebSocket) server"
requires-python = ">=3.9"
authors = [
{ name = "Django Software Foundation", email = "foundation@djangoproject.com" },
]
license = { text = "BSD" }
classifiers = [
"Development Status :: 4 - Beta",
"Environment :: Web Environment",
"Intended Audience :: Developers",
"License :: OSI Approved :: BSD License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Internet :: WWW/HTTP",
]
dependencies = ["asgiref>=3.5.2,<4", "autobahn>=22.4.2", "twisted[tls]>=22.4"]
[project.optional-dependencies]
tests = [
"django",
"hypothesis",
"pytest",
"pytest-asyncio",
"pytest-cov",
"black",
"tox",
"flake8",
"flake8-bugbear",
"mypy",
]
[project.urls]
homepage = "https://github.com/django/daphne"
documentation = "https://channels.readthedocs.io"
repository = "https://github.com/django/daphne.git"
changelog = "https://github.com/django/daphne/blob/main/CHANGELOG.txt"
issues = "https://github.com/django/daphne/issues"
[project.scripts]
daphne = "daphne.cli:CommandLineInterface.entrypoint"
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
packages = ["daphne"]
[tool.setuptools.dynamic]
version = { attr = "daphne.__version__" }
readme = { file = "README.rst", content-type = "text/x-rst" }
[tool.isort]
profile = "black"
[tool.pytest]
testpaths = ["tests"]
asyncio_mode = "strict"
filterwarnings = ["ignore::pytest.PytestDeprecationWarning"]
[tool.coverage.run]
omit = ["tests/*"]
concurrency = ["multiprocessing"]
[tool.coverage.report]
show_missing = "true"
skip_covered = "true"
[tool.coverage.html]
directory = "reports/coverage_html_report"

(deleted file, -2 lines)
[bdist_wheel]
universal=1

(deleted file, -31 lines)
import os
import sys
from setuptools import find_packages, setup
from daphne import __version__
# We use the README as the long_description
readme_path = os.path.join(os.path.dirname(__file__), "README.rst")
setup(
name='daphne',
version=__version__,
url='http://www.djangoproject.com/',
author='Django Software Foundation',
author_email='foundation@djangoproject.com',
description='Django ASGI (HTTP/WebSocket) server',
long_description=open(readme_path).read(),
license='BSD',
zip_safe=False,
packages=find_packages(),
include_package_data=True,
install_requires=[
'asgiref>=0.13',
'twisted>=15.5,<16.3',
'autobahn>=0.12',
],
entry_points={'console_scripts': [
'daphne = daphne.cli:CommandLineInterface.entrypoint',
]},
)

tests/http_base.py (new file, +284 lines)
import socket
import struct
import time
import unittest
from http.client import HTTPConnection
from urllib import parse
from daphne.testing import DaphneTestingInstance, TestApplication
class DaphneTestCase(unittest.TestCase):
"""
Base class for Daphne integration test cases.
Boots up a copy of Daphne on a test port and sends it a request, and
retrieves the response. Uses a custom ASGI application and temporary files
to store/retrieve the request/response messages.
"""
### Plain HTTP helpers
def run_daphne_http(
self,
method,
path,
params,
body,
responses,
headers=None,
timeout=1,
xff=False,
request_buffer_size=None,
):
"""
Runs Daphne with the given request callback (given the base URL)
and response messages.
"""
with DaphneTestingInstance(
xff=xff, request_buffer_size=request_buffer_size
) as test_app:
# Add the response messages
test_app.add_send_messages(responses)
# Send it the request. We have to do this the long way to allow
# duplicate headers.
conn = HTTPConnection(test_app.host, test_app.port, timeout=timeout)
if params:
path += "?" + parse.urlencode(params, doseq=True)
conn.putrequest(method, path, skip_accept_encoding=True, skip_host=True)
# Manually send over headers
if headers:
for header_name, header_value in headers:
conn.putheader(header_name, header_value)
# Send body if provided.
if body:
conn.putheader("Content-Length", str(len(body)))
conn.endheaders(message_body=body)
else:
conn.endheaders()
try:
response = conn.getresponse()
except socket.timeout:
# See if they left an exception for us to load
test_app.get_received()
raise RuntimeError(
"Daphne timed out handling request, no exception found."
)
# Return scope, messages, response
return test_app.get_received() + (response,)
def run_daphne_raw(self, data, *, responses=None, timeout=1):
"""
Runs Daphne and sends it the given raw bytestring over a socket.
Accepts list of response messages the application will reply with.
Returns what Daphne sends back.
"""
assert isinstance(data, bytes)
with DaphneTestingInstance() as test_app:
if responses is not None:
test_app.add_send_messages(responses)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(timeout)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.connect((test_app.host, test_app.port))
s.send(data)
try:
return s.recv(1000000)
except socket.timeout:
raise RuntimeError(
"Daphne timed out handling raw request, no exception found."
)
def run_daphne_request(
self,
method,
path,
params=None,
body=None,
headers=None,
xff=False,
request_buffer_size=None,
):
"""
Convenience method for just testing request handling.
Returns (scope, messages)
"""
scope, messages, _ = self.run_daphne_http(
method=method,
path=path,
params=params,
body=body,
headers=headers,
xff=xff,
request_buffer_size=request_buffer_size,
responses=[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": b"OK"},
],
)
return scope, messages
def run_daphne_response(self, response_messages):
"""
Convenience method for just testing response handling.
Returns the response object.
"""
_, _, response = self.run_daphne_http(
method="GET", path="/", params={}, body=b"", responses=response_messages
)
return response
### WebSocket helpers
def websocket_handshake(
self,
test_app,
path="/",
params=None,
headers=None,
subprotocols=None,
timeout=1,
):
"""
Runs a WebSocket handshake negotiation and returns the raw socket
object & the selected subprotocol.
You'll need to inject an accept or reject message before this
to let it complete.
"""
# Send it the request. We have to do this the long way to allow
# duplicate headers.
conn = HTTPConnection(test_app.host, test_app.port, timeout=timeout)
if params:
path += "?" + parse.urlencode(params, doseq=True)
conn.putrequest("GET", path, skip_accept_encoding=True, skip_host=True)
# Do WebSocket handshake headers + any other headers
if headers is None:
headers = []
headers.extend(
[
(b"Host", b"example.com"),
(b"Upgrade", b"websocket"),
(b"Connection", b"Upgrade"),
(b"Sec-WebSocket-Key", b"x3JJHMbDL1EzLkh9GBhXDw=="),
(b"Sec-WebSocket-Version", b"13"),
(b"Origin", b"http://example.com"),
]
)
if subprotocols:
headers.append((b"Sec-WebSocket-Protocol", ", ".join(subprotocols)))
if headers:
for header_name, header_value in headers:
conn.putheader(header_name, header_value)
conn.endheaders()
# Read out the response
try:
response = conn.getresponse()
except socket.timeout:
# See if they left an exception for us to load
test_app.get_received()
raise RuntimeError("Daphne timed out handling request, no exception found.")
# Check we got a good response code
if response.status != 101:
raise RuntimeError("WebSocket upgrade did not result in status code 101")
# Prepare headers for subprotocol searching
response_headers = {n.lower(): v for n, v in response.getheaders()}
response.read()
assert not response.closed
# Return the raw socket and any subprotocol
return conn.sock, response_headers.get("sec-websocket-protocol", None)
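The handshake helper above only checks for status 101; a conforming server also proves it saw the client's key by returning a matching `Sec-WebSocket-Accept` header. A minimal sketch of how that value is derived per RFC 6455 (using the same `Sec-WebSocket-Key` hard-coded in the handshake headers above):

```python
import base64
import hashlib

# Fixed GUID defined by RFC 6455 for computing Sec-WebSocket-Accept.
WS_GUID = b"258EAFA5-E914-47DA-95CA-C5AB0DC85B11"


def websocket_accept(key: bytes) -> str:
    """Derive the Sec-WebSocket-Accept value for a client handshake key."""
    digest = hashlib.sha1(key + WS_GUID).digest()
    return base64.b64encode(digest).decode("ascii")


# The key used in the handshake headers above.
assert websocket_accept(b"x3JJHMbDL1EzLkh9GBhXDw==") == "HSmrc0sMlYUkAGmm5OPpG2HaGWk="
```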
def websocket_send_frame(self, sock, value):
"""
Sends a WebSocket text or binary frame. Cannot handle long frames.
"""
# Header and text opcode
if isinstance(value, str):
frame = b"\x81"
value = value.encode("utf8")
else:
frame = b"\x82"
# Length plus masking signal bit
frame += struct.pack("!B", len(value) | 0b10000000)
# Mask badly
frame += b"\0\0\0\0"
# Payload
frame += value
sock.sendall(frame)
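`websocket_send_frame` builds the smallest possible client frame: a FIN+opcode byte, a 7-bit length with the mask bit set, an all-zero masking key (so the XOR-masked payload equals the plain payload, hence "Mask badly"), then the payload. The same layout as a standalone sketch:

```python
import struct


def build_client_frame(value) -> bytes:
    """Build a short (<126-byte payload) masked client frame, as the helper does."""
    if isinstance(value, str):
        frame, payload = b"\x81", value.encode("utf8")  # FIN + text opcode
    else:
        frame, payload = b"\x82", value  # FIN + binary opcode
    frame += struct.pack("!B", len(payload) | 0b10000000)  # mask bit + length
    frame += b"\x00\x00\x00\x00"  # all-zero mask key: masking is a no-op
    return frame + payload


assert build_client_frame("Hi") == b"\x81\x82\x00\x00\x00\x00Hi"
```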
def receive_from_socket(self, sock, length, timeout=1):
"""
Receives the given amount of bytes from the socket, or times out.
"""
buf = b""
started = time.time()
while len(buf) < length:
buf += sock.recv(length - len(buf))
time.sleep(0.001)
if time.time() - started > timeout:
raise ValueError("Timed out reading from socket")
return buf
def websocket_receive_frame(self, sock):
"""
Receives a WebSocket frame. Cannot handle long frames.
"""
# Read header byte
# TODO: Proper receive buffer handling
opcode = self.receive_from_socket(sock, 1)
if opcode in [b"\x81", b"\x82"]:
# Read length
length = struct.unpack("!B", self.receive_from_socket(sock, 1))[0]
# Read payload
payload = self.receive_from_socket(sock, length)
if opcode == b"\x81":
payload = payload.decode("utf8")
return payload
else:
raise ValueError("Unknown websocket opcode: %r" % opcode)
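The receive side mirrors this: server-to-client frames are unmasked, so under the same "no long frames" assumption a frame is just an opcode byte, a length byte, and the payload. A standalone sketch of the parsing done above:

```python
def parse_server_frame(data: bytes):
    """Parse a short unmasked server frame; text frames are decoded to str."""
    opcode, length = data[0:1], data[1]
    if opcode not in (b"\x81", b"\x82"):
        raise ValueError("Unknown websocket opcode: %r" % opcode)
    payload = data[2 : 2 + length]
    return payload.decode("utf8") if opcode == b"\x81" else payload


assert parse_server_frame(b"\x81\x02Hi") == "Hi"
```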
### Assertions and test management
def tearDown(self):
"""
Ensures any storage files are cleared.
"""
TestApplication.delete_setup()
TestApplication.delete_result()
def assert_is_ip_address(self, address):
"""
Tests whether a given address string is a valid IPv4 or IPv6 address.
"""
try:
socket.inet_aton(address)
except OSError:
try:
socket.inet_pton(socket.AF_INET6, address)
except OSError:
self.fail("'%s' is not a valid IP address." % address)
def assert_key_sets(self, required_keys, optional_keys, actual_keys):
"""
Asserts that all required_keys are in actual_keys, and that there
are no keys in actual_keys that aren't required or optional.
"""
present_keys = set(actual_keys)
# Make sure all required keys are present
self.assertTrue(required_keys <= present_keys)
# Assert that no other keys are present
self.assertEqual(set(), present_keys - required_keys - optional_keys)
def assert_valid_path(self, path):
"""
Checks the path is valid and already url-decoded.
"""
self.assertIsInstance(path, str)
# Assert that it's already url decoded
self.assertEqual(path, parse.unquote(path))
def assert_valid_address_and_port(self, host):
"""
Asserts the value is a valid (host, port) tuple.
"""
address, port = host
self.assertIsInstance(address, str)
self.assert_is_ip_address(address)
self.assertIsInstance(port, int)

tests/http_strategies.py (new file)
@@ -0,0 +1,126 @@
import string
from urllib import parse
from hypothesis import strategies
HTTP_METHODS = ["OPTIONS", "GET", "HEAD", "POST", "PUT", "DELETE", "TRACE", "CONNECT"]
# Unicode characters of the "Letter" category
letters = strategies.characters(
whitelist_categories=("Lu", "Ll", "Lt", "Lm", "Lo", "Nl")
)
def http_method():
return strategies.sampled_from(HTTP_METHODS)
def _http_path_portion():
alphabet = string.ascii_letters + string.digits + "-._~"
return strategies.text(min_size=1, max_size=128, alphabet=alphabet)
def http_path():
"""
Returns a URL path (not encoded).
"""
return strategies.lists(_http_path_portion(), min_size=0, max_size=10).map(
lambda s: "/" + "/".join(s)
)
def http_body():
"""
Returns random binary body data.
"""
return strategies.binary(min_size=0, max_size=1500)
def valid_bidi(value):
"""
Rejects strings with nonsensical Unicode text-direction flags.
Relying on random Unicode characters means that some combinations don't make
sense from a text-direction point of view. This little helper just rejects those.
"""
try:
value.encode("idna")
except UnicodeError:
return False
else:
return True
def _domain_label():
return strategies.text(alphabet=letters, min_size=1, max_size=63).filter(valid_bidi)
def international_domain_name():
"""
Returns a byte string of a domain name, IDNA-encoded.
"""
return strategies.lists(_domain_label(), min_size=2).map(
lambda s: (".".join(s)).encode("idna")
)
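`str.encode("idna")` both validates each label (rejecting the nonsensical bidi combinations that `valid_bidi` filters out) and produces the ASCII-compatible encoding, for example:

```python
# ASCII labels pass through; non-ASCII labels are punycode-encoded per label.
assert "example".encode("idna") == b"example"
assert "bücher.example".encode("idna") == b"xn--bcher-kva.example"
```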
def _query_param():
return strategies.text(alphabet=letters, min_size=1, max_size=255).map(
lambda s: s.encode("utf8")
)
def query_params():
"""
Returns a list of two-tuples of byte strings, ready for encoding with urlencode.
We're aiming for a total length of a URL below 2083 characters, so this strategy
ensures that the total urlencoded query string is not longer than 1500 characters.
"""
return strategies.lists(
strategies.tuples(_query_param(), _query_param()), min_size=0
).filter(lambda x: len(parse.urlencode(x)) < 1500)
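`parse.urlencode` accepts byte-string pairs directly and quotes both sides, which is why the strategy can emit bytes and still measure the encoded length it filters on:

```python
from urllib import parse

params = [(b"q", b"a b"), (b"lang", b"en")]
encoded = parse.urlencode(params)  # spaces become '+' via quote_plus
assert encoded == "q=a+b&lang=en"
assert len(encoded) < 1500  # the bound the strategy enforces
```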
def header_name():
"""
Strategy returning something that looks like an HTTP header field name.
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields suggests they are
between 4 and 20 characters long.
"""
return strategies.text(
alphabet=string.ascii_letters + string.digits + "-", min_size=1, max_size=30
).map(lambda s: s.encode("utf-8"))
def header_value():
"""
Strategy returning something that looks like an HTTP header value.
"For example, the Apache 2.3 server by default limits the size of each field to 8190 bytes"
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields
"""
return (
strategies.text(
alphabet=string.ascii_letters
+ string.digits
+ string.punctuation.replace(",", "")
+ " \t",
min_size=1,
max_size=8190,
)
.map(lambda s: s.encode("utf-8"))
.filter(lambda s: len(s) < 8190)
)
def headers():
"""
Strategy returning a list of tuples, containing HTTP header fields and their values.
"[Apache 2.3] there can be at most 100 header fields in a single request."
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields
"""
return strategies.lists(
strategies.tuples(header_name(), header_value()), min_size=0, max_size=100
)

tests/test_checks.py (new file)
@@ -0,0 +1,21 @@
import django
from django.conf import settings
from django.test.utils import override_settings
from daphne.checks import check_daphne_installed
def test_check_daphne_installed():
"""
Test that a check error is raised if daphne is not listed before staticfiles,
and that no error is raised when it is.
"""
settings.configure(
INSTALLED_APPS=["daphne.apps.DaphneConfig", "django.contrib.staticfiles"]
)
django.setup()
errors = check_daphne_installed(None)
assert len(errors) == 0
with override_settings(INSTALLED_APPS=["django.contrib.staticfiles", "daphne"]):
errors = check_daphne_installed(None)
assert len(errors) == 1
assert errors[0].id == "daphne.E001"

tests/test_cli.py (new file)
@@ -0,0 +1,267 @@
import logging
import os
from argparse import ArgumentError
from unittest import TestCase, skipUnless
from daphne.cli import CommandLineInterface
from daphne.endpoints import build_endpoint_description_strings as build
class TestEndpointDescriptions(TestCase):
"""
Tests that the endpoint parsing/generation works as intended.
"""
def testBasics(self):
self.assertEqual(build(), [], msg="Empty list returned when no kwargs given")
def testTcpPortBindings(self):
self.assertEqual(
build(port=1234, host="example.com"),
["tcp:port=1234:interface=example.com"],
)
self.assertEqual(
build(port=8000, host="127.0.0.1"), ["tcp:port=8000:interface=127.0.0.1"]
)
self.assertEqual(
build(port=8000, host="[200a::1]"), [r"tcp:port=8000:interface=200a\:\:1"]
)
self.assertEqual(
build(port=8000, host="200a::1"), [r"tcp:port=8000:interface=200a\:\:1"]
)
# incomplete port/host kwargs raise errors
self.assertRaises(ValueError, build, port=123)
self.assertRaises(ValueError, build, host="example.com")
def testUnixSocketBinding(self):
self.assertEqual(
build(unix_socket="/tmp/daphne.sock"), ["unix:/tmp/daphne.sock"]
)
def testFileDescriptorBinding(self):
self.assertEqual(build(file_descriptor=5), ["fd:fileno=5"])
def testMultipleEndpoints(self):
self.assertEqual(
sorted(
build(
file_descriptor=123,
unix_socket="/tmp/daphne.sock",
port=8080,
host="10.0.0.1",
)
),
sorted(
[
"tcp:port=8080:interface=10.0.0.1",
"unix:/tmp/daphne.sock",
"fd:fileno=123",
]
),
)
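The IPv6 cases above depend on colons being escaped in Twisted endpoint strings (and surrounding brackets stripped). A simplified sketch of the TCP behaviour these tests expect — `tcp_endpoint` is a hypothetical stand-in, not Daphne's actual `build_endpoint_description_strings`:

```python
def tcp_endpoint(port: int, host: str) -> str:
    """Sketch: colons in the interface must be escaped for Twisted's parser."""
    host = host.strip("[]").replace(":", "\\:")
    return f"tcp:port={port}:interface={host}"


assert tcp_endpoint(8000, "[200a::1]") == r"tcp:port=8000:interface=200a\:\:1"
assert tcp_endpoint(1234, "example.com") == "tcp:port=1234:interface=example.com"
```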
class TestCLIInterface(TestCase):
"""
Tests the overall CLI class.
"""
class TestedCLI(CommandLineInterface):
"""
CommandLineInterface subclass used for testing (has a fake
server class).
"""
class TestedServer:
"""
Mock server object for testing.
"""
def __init__(self, **kwargs):
self.init_kwargs = kwargs
def run(self):
pass
server_class = TestedServer
def setUp(self):
logging.disable(logging.CRITICAL)
def tearDown(self):
logging.disable(logging.NOTSET)
def assertCLI(self, args, server_kwargs):
"""
Asserts that the CLI class passes the right args to the server class.
Passes in a fake application automatically.
"""
cli = self.TestedCLI()
cli.run(
args + ["daphne:__version__"]
) # We just pass something importable as app
# Check the server got all arguments as intended
for key, value in server_kwargs.items():
# Get the value and sort it if it's a list (for endpoint checking)
actual_value = cli.server.init_kwargs.get(key)
if isinstance(actual_value, list):
actual_value.sort()
# Check values
self.assertEqual(
value,
actual_value,
"Wrong value for server kwarg %s: %r != %r"
% (key, value, actual_value),
)
def testCLIBasics(self):
"""
Tests basic endpoint generation.
"""
self.assertCLI([], {"endpoints": ["tcp:port=8000:interface=127.0.0.1"]})
self.assertCLI(
["-p", "123"], {"endpoints": ["tcp:port=123:interface=127.0.0.1"]}
)
self.assertCLI(
["-b", "10.0.0.1"], {"endpoints": ["tcp:port=8000:interface=10.0.0.1"]}
)
self.assertCLI(
["-b", "200a::1"], {"endpoints": [r"tcp:port=8000:interface=200a\:\:1"]}
)
self.assertCLI(
["-b", "[200a::1]"], {"endpoints": [r"tcp:port=8000:interface=200a\:\:1"]}
)
self.assertCLI(
["-p", "8080", "-b", "example.com"],
{"endpoints": ["tcp:port=8080:interface=example.com"]},
)
def testUnixSockets(self):
self.assertCLI(
["-p", "8080", "-u", "/tmp/daphne.sock"],
{
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"unix:/tmp/daphne.sock",
]
},
)
self.assertCLI(
["-b", "example.com", "-u", "/tmp/daphne.sock"],
{
"endpoints": [
"tcp:port=8000:interface=example.com",
"unix:/tmp/daphne.sock",
]
},
)
self.assertCLI(
["-u", "/tmp/daphne.sock", "--fd", "5"],
{"endpoints": ["fd:fileno=5", "unix:/tmp/daphne.sock"]},
)
def testMixedCLIEndpointCreation(self):
"""
Tests mixing the shortcut options with the endpoint string options.
"""
self.assertCLI(
["-p", "8080", "-e", "unix:/tmp/daphne.sock"],
{
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"unix:/tmp/daphne.sock",
]
},
)
self.assertCLI(
["-p", "8080", "-e", "tcp:port=8080:interface=127.0.0.1"],
{
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"tcp:port=8080:interface=127.0.0.1",
]
},
)
def testCustomEndpoints(self):
"""
Tests entirely custom endpoints
"""
self.assertCLI(["-e", "imap:"], {"endpoints": ["imap:"]})
def test_default_proxyheaders(self):
"""
Passing `--proxy-headers` without a parameter will use the
`X-Forwarded-For` header.
"""
self.assertCLI(
["--proxy-headers"], {"proxy_forwarded_address_header": "X-Forwarded-For"}
)
def test_custom_proxyhost(self):
"""
Passing `--proxy-headers-host` will set the used host header to
the passed one, and `--proxy-headers` is mandatory.
"""
self.assertCLI(
["--proxy-headers", "--proxy-headers-host", "blah"],
{"proxy_forwarded_address_header": "blah"},
)
with self.assertRaises(expected_exception=ArgumentError) as exc:
self.assertCLI(
["--proxy-headers-host", "blah"],
{"proxy_forwarded_address_header": "blah"},
)
self.assertEqual(exc.exception.argument_name, "--proxy-headers-host")
self.assertEqual(
exc.exception.message,
"--proxy-headers has to be passed for this parameter.",
)
def test_custom_proxyport(self):
"""
Passing `--proxy-headers-port` will set the used port header to
the passed one, and `--proxy-headers` is mandatory.
"""
self.assertCLI(
["--proxy-headers", "--proxy-headers-port", "blah2"],
{"proxy_forwarded_port_header": "blah2"},
)
with self.assertRaises(expected_exception=ArgumentError) as exc:
self.assertCLI(
["--proxy-headers-port", "blah2"],
{"proxy_forwarded_address_header": "blah2"},
)
self.assertEqual(exc.exception.argument_name, "--proxy-headers-port")
self.assertEqual(
exc.exception.message,
"--proxy-headers has to be passed for this parameter.",
)
def test_custom_servername(self):
"""
Passing `--server-name` will set the default server header
from 'daphne' to the passed one.
"""
self.assertCLI([], {"server_name": "daphne"})
self.assertCLI(["--server-name", ""], {"server_name": ""})
self.assertCLI(["--server-name", "python"], {"server_name": "python"})
def test_no_servername(self):
"""
Passing `--no-server-name` will set server name to '' (empty string)
"""
self.assertCLI(["--no-server-name"], {"server_name": ""})
@skipUnless(os.getenv("ASGI_THREADS"), "ASGI_THREADS environment variable not set.")
class TestASGIThreads(TestCase):
def test_default_executor(self):
from daphne.server import twisted_loop
executor = twisted_loop._default_executor
self.assertEqual(executor._max_workers, int(os.getenv("ASGI_THREADS")))

@@ -0,0 +1,49 @@
import unittest
from daphne.http_protocol import WebRequest
class MockServer:
"""
Mock server object for testing.
"""
def protocol_connected(self, *args, **kwargs):
pass
class MockFactory:
"""
Mock factory object for testing.
"""
def __init__(self):
self.server = MockServer()
class MockChannel:
"""
Mock channel object for testing.
"""
def __init__(self):
self.factory = MockFactory()
self.transport = None
def getPeer(self, *args, **kwargs):
return "peer"
def getHost(self, *args, **kwargs):
return "host"
class TestHTTPProtocol(unittest.TestCase):
"""
Tests the HTTP protocol classes.
"""
def test_web_request_initialisation(self):
channel = MockChannel()
request = WebRequest(channel)
self.assertIsNone(request.client_addr)
self.assertIsNone(request.server_addr)

tests/test_http_request.py (new file)
@@ -0,0 +1,324 @@
import collections
from urllib import parse
import http_strategies
from http_base import DaphneTestCase
from hypothesis import assume, given, settings
from hypothesis.strategies import integers
class TestHTTPRequest(DaphneTestCase):
"""
Tests the HTTP request handling.
"""
def assert_valid_http_scope(
self, scope, method, path, params=None, headers=None, scheme=None
):
"""
Checks that the passed scope is a valid ASGI HTTP scope regarding types
and some urlencoding things.
"""
# Check overall keys
self.assert_key_sets(
required_keys={
"asgi",
"type",
"http_version",
"method",
"path",
"raw_path",
"query_string",
"headers",
},
optional_keys={"scheme", "root_path", "client", "server"},
actual_keys=scope.keys(),
)
self.assertEqual(scope["asgi"]["version"], "3.0")
# Check that it is the right type
self.assertEqual(scope["type"], "http")
# Method (uppercased unicode string)
self.assertIsInstance(scope["method"], str)
self.assertEqual(scope["method"], method.upper())
# Path
self.assert_valid_path(scope["path"])
# HTTP version
self.assertIn(scope["http_version"], ["1.0", "1.1", "1.2"])
# Scheme
self.assertIn(scope["scheme"], ["http", "https"])
if scheme:
self.assertEqual(scheme, scope["scheme"])
# Query string (byte string and still url encoded)
query_string = scope["query_string"]
self.assertIsInstance(query_string, bytes)
if params:
self.assertEqual(
query_string, parse.urlencode(params or []).encode("ascii")
)
# Ordering of header names is not important, but the order of values for a header
# name is. To assert whether that order is kept, we transform both the request
# headers and the channel message headers into a dictionary
# {name: [value1, value2, ...]} and check if they're equal.
transformed_scope_headers = collections.defaultdict(list)
for name, value in scope["headers"]:
transformed_scope_headers[name].append(value)
transformed_request_headers = collections.defaultdict(list)
for name, value in headers or []:
expected_name = name.lower().strip()
expected_value = value.strip()
transformed_request_headers[expected_name].append(expected_value)
for name, value in transformed_request_headers.items():
self.assertIn(name, transformed_scope_headers)
self.assertEqual(value, transformed_scope_headers[name])
# Root path
self.assertIsInstance(scope.get("root_path", ""), str)
# Client and server addresses
client = scope.get("client")
if client is not None:
self.assert_valid_address_and_port(client)
server = scope.get("server")
if server is not None:
self.assert_valid_address_and_port(server)
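The order-preservation check in `assert_valid_http_scope` boils down to grouping values under a normalized name while keeping per-name order. In isolation:

```python
import collections


def group_headers(headers):
    """Group header values by lowercased, stripped name, preserving order."""
    grouped = collections.defaultdict(list)
    for name, value in headers:
        grouped[name.lower().strip()].append(value.strip())
    return grouped


raw = [(b"X-Thing", b" a "), (b"x-thing", b"b")]
assert group_headers(raw) == {b"x-thing": [b"a", b"b"]}
```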
def assert_valid_http_request_message(self, message, body=None):
"""
Asserts that a message is a valid http.request message
"""
# Check overall keys
self.assert_key_sets(
required_keys={"type"},
optional_keys={"body", "more_body"},
actual_keys=message.keys(),
)
# Check that it is the right type
self.assertEqual(message["type"], "http.request")
# If there's a body present, check its type
self.assertIsInstance(message.get("body", b""), bytes)
if body is not None:
self.assertEqual(body, message.get("body", b""))
def test_minimal_request(self):
"""
Smallest viable example. Mostly verifies that our request building works.
"""
scope, messages = self.run_daphne_request("GET", "/")
self.assert_valid_http_scope(scope, "GET", "/")
self.assert_valid_http_request_message(messages[0], body=b"")
@given(
request_path=http_strategies.http_path(),
request_params=http_strategies.query_params(),
)
@settings(max_examples=5, deadline=5000)
def test_get_request(self, request_path, request_params):
"""
Tests a typical HTTP GET request, with a path and query parameters
"""
scope, messages = self.run_daphne_request(
"GET", request_path, params=request_params
)
self.assert_valid_http_scope(scope, "GET", request_path, params=request_params)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(request_path=http_strategies.http_path(), chunk_size=integers(min_value=1))
@settings(max_examples=5, deadline=5000)
def test_request_body_chunking(self, request_path, chunk_size):
"""
Tests request body chunking logic.
"""
body = b"The quick brown fox jumps over the lazy dog"
_, messages = self.run_daphne_request(
"POST",
request_path,
body=body,
request_buffer_size=chunk_size,
)
# Avoid running those asserts when there's a single "http.disconnect"
if len(messages) > 1:
assert messages[0]["body"].decode() == body.decode()[:chunk_size]
assert not messages[-2]["more_body"]
assert messages[-1] == {"type": "http.disconnect"}
@given(
request_path=http_strategies.http_path(),
request_body=http_strategies.http_body(),
)
@settings(max_examples=5, deadline=5000)
def test_post_request(self, request_path, request_body):
"""
Tests a typical HTTP POST request, with a path and body.
"""
scope, messages = self.run_daphne_request(
"POST", request_path, body=request_body
)
self.assert_valid_http_scope(scope, "POST", request_path)
self.assert_valid_http_request_message(messages[0], body=request_body)
def test_raw_path(self):
"""
Tests that /foo%2Fbar produces raw_path and a decoded path
"""
scope, _ = self.run_daphne_request("GET", "/foo%2Fbar")
self.assertEqual(scope["path"], "/foo/bar")
self.assertEqual(scope["raw_path"], b"/foo%2Fbar")
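The `raw_path`/`path` distinction exists because percent-decoding is lossy: once `%2F` becomes `/`, the original byte sequence cannot be recovered from the decoded path alone.

```python
from urllib import parse

# Decoding collapses the escaped slash into a path separator...
assert parse.unquote("/foo%2Fbar") == "/foo/bar"
# ...and re-encoding cannot restore it, so raw_path must be kept verbatim.
assert parse.quote("/foo/bar") == "/foo/bar"
```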
@given(request_headers=http_strategies.headers())
@settings(max_examples=5, deadline=5000)
def test_headers(self, request_headers):
"""
Tests that HTTP header fields are handled as specified
"""
request_path = parse.quote("/te st-à/")
scope, messages = self.run_daphne_request(
"OPTIONS", request_path, headers=request_headers
)
self.assert_valid_http_scope(
scope, "OPTIONS", request_path, headers=request_headers
)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(request_headers=http_strategies.headers())
@settings(max_examples=5, deadline=5000)
def test_duplicate_headers(self, request_headers):
"""
Tests that duplicate header values are preserved
"""
# Make sure there's duplicate headers
assume(len(request_headers) >= 2)
header_name = request_headers[0][0]
duplicated_headers = [(header_name, header[1]) for header in request_headers]
# Run the request
request_path = parse.quote("/te st-à/")
scope, messages = self.run_daphne_request(
"OPTIONS", request_path, headers=duplicated_headers
)
self.assert_valid_http_scope(
scope, "OPTIONS", request_path, headers=duplicated_headers
)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(
request_method=http_strategies.http_method(),
request_path=http_strategies.http_path(),
request_params=http_strategies.query_params(),
request_headers=http_strategies.headers(),
request_body=http_strategies.http_body(),
)
@settings(max_examples=2, deadline=5000)
def test_kitchen_sink(
self,
request_method,
request_path,
request_params,
request_headers,
request_body,
):
"""
Throw everything at Daphne that we dare. The idea is that if a combination
of method/path/headers/body would break the spec, hypothesis will eventually find it.
"""
scope, messages = self.run_daphne_request(
request_method,
request_path,
params=request_params,
headers=request_headers,
body=request_body,
)
self.assert_valid_http_scope(
scope,
request_method,
request_path,
params=request_params,
headers=request_headers,
)
self.assert_valid_http_request_message(messages[0], body=request_body)
def test_headers_are_lowercased_and_stripped(self):
"""
Make sure headers are normalized as the spec says they are.
"""
headers = [(b"MYCUSTOMHEADER", b" foobar ")]
scope, messages = self.run_daphne_request("GET", "/", headers=headers)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
# Note that Daphne returns a list of tuples here, which is fine, because the spec
# asks to treat them interchangeably.
assert [list(x) for x in scope["headers"]] == [[b"mycustomheader", b"foobar"]]
@given(daphne_path=http_strategies.http_path())
@settings(max_examples=5, deadline=5000)
def test_root_path_header(self, daphne_path):
"""
Tests root_path handling.
"""
# Daphne-Root-Path must be URL encoded when submitting as HTTP header field
headers = [("Daphne-Root-Path", parse.quote(daphne_path.encode("utf8")))]
scope, messages = self.run_daphne_request("GET", "/", headers=headers)
# Daphne-Root-Path is not included in the returned 'headers' section. So we expect
# empty headers.
self.assert_valid_http_scope(scope, "GET", "/", headers=[])
self.assert_valid_http_request_message(messages[0], body=b"")
# And what we're looking for, root_path being set.
assert scope["root_path"] == daphne_path
def test_x_forwarded_for_ignored(self):
"""
Make sure that, by default, X-Forwarded-For is ignored.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"], [b"X-Forwarded-Port", b"80"]]
scope, messages = self.run_daphne_request("GET", "/", headers=headers)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
# It should NOT appear in the client scope item
self.assertNotEqual(scope["client"], ["10.1.2.3", 80])
def test_x_forwarded_for_parsed(self):
"""
When X-Forwarded-For is enabled, make sure it is respected.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"], [b"X-Forwarded-Port", b"80"]]
scope, messages = self.run_daphne_request("GET", "/", headers=headers, xff=True)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
# It should now appear in the client scope item
self.assertEqual(scope["client"], ["10.1.2.3", 80])
def test_x_forwarded_for_no_port(self):
"""
When X-Forwarded-For is enabled but only the host is passed, make sure
that at least makes it through.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"]]
scope, messages = self.run_daphne_request("GET", "/", headers=headers, xff=True)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
# It should now appear in the client scope item
self.assertEqual(scope["client"], ["10.1.2.3", 0])
def test_bad_requests(self):
"""
Tests that requests with invalid (non-ASCII) characters fail.
"""
# Bad path
response = self.run_daphne_raw(
b"GET /\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n"
)
self.assertTrue(b"400 Bad Request" in response)
# Bad querystring
response = self.run_daphne_raw(
b"GET /?\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n"
)
self.assertTrue(b"400 Bad Request" in response)
def test_invalid_header_name(self):
"""
Tests that requests with invalid header names fail.
"""
# Test cases follow those used by h11
# https://github.com/python-hyper/h11/blob/a2c68948accadc3876dffcf979d98002e4a4ed27/h11/tests/test_headers.py#L24-L35
for header_name in [b"foo bar", b"foo\x00bar", b"foo\xffbar", b"foo\x01bar"]:
response = self.run_daphne_raw(
f"GET / HTTP/1.0\r\n{header_name}: baz\r\n\r\n".encode("ascii")
)
self.assertTrue(b"400 Bad Request" in response)

tests/test_http_response.py (new file)
@@ -0,0 +1,187 @@
import http_strategies
from http_base import DaphneTestCase
from hypothesis import given, settings
class TestHTTPResponse(DaphneTestCase):
"""
Tests HTTP response handling.
"""
def normalize_headers(self, headers):
"""
Lowercases and sorts headers, and strips transfer-encoding ones.
"""
return sorted(
[(b"server", b"daphne")]
+ [
(name.lower(), value.strip())
for name, value in headers
if name.lower() not in (b"server", b"transfer-encoding")
]
)
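In isolation, this normalization turns arbitrarily cased, padded headers into a canonical sorted list that always carries Daphne's own server header:

```python
def normalize_headers(headers):
    """Lowercase names, strip values, drop server/transfer-encoding, sort."""
    return sorted(
        [(b"server", b"daphne")]
        + [
            (name.lower(), value.strip())
            for name, value in headers
            if name.lower() not in (b"server", b"transfer-encoding")
        ]
    )


assert normalize_headers([(b"X-Foo", b" bar ")]) == [
    (b"server", b"daphne"),
    (b"x-foo", b"bar"),
]
```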
def encode_headers(self, headers):
def encode(s):
return s if isinstance(s, bytes) else s.encode("utf-8")
return [[encode(k), encode(v)] for k, v in headers]
def test_minimal_response(self):
"""
Smallest viable example. Mostly verifies that our response building works.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": b"hello world"},
]
)
self.assertEqual(response.status, 200)
self.assertEqual(response.read(), b"hello world")
def test_status_code_required(self):
"""
Asserts that passing in the 'status' key is required.
Previous versions of Daphne did not enforce this, so this test is here
to make sure it stays required.
"""
with self.assertRaises(ValueError):
self.run_daphne_response(
[
{"type": "http.response.start"},
{"type": "http.response.body", "body": b"hello world"},
]
)
def test_custom_status_code(self):
"""
Tries a non-default status code.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"i made a thing!"},
]
)
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"i made a thing!")
def test_chunked_response(self):
"""
Tries sending a response in multiple parts.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"chunk 1 ", "more_body": True},
{"type": "http.response.body", "body": b"chunk 2"},
]
)
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"chunk 1 chunk 2")
def test_chunked_response_empty(self):
"""
Tries sending a response in multiple parts and an empty end.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"chunk 1 ", "more_body": True},
{"type": "http.response.body", "body": b"chunk 2", "more_body": True},
{"type": "http.response.body"},
]
)
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"chunk 1 chunk 2")
@given(body=http_strategies.http_body())
@settings(max_examples=5, deadline=5000)
def test_body(self, body):
"""
Tries body variants.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": body},
]
)
self.assertEqual(response.status, 200)
self.assertEqual(response.read(), body)
@given(headers=http_strategies.headers())
@settings(max_examples=5, deadline=5000)
def test_headers(self, headers):
# The ASGI spec requires us to lowercase our header names
response = self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": self.normalize_headers(headers),
},
{"type": "http.response.body"},
]
)
# Check headers in a sensible way. Ignore transfer-encoding.
self.assertEqual(
self.normalize_headers(self.encode_headers(response.getheaders())),
self.normalize_headers(headers),
)
def test_headers_type(self):
"""
Headers should be `bytes`
"""
with self.assertRaises(ValueError) as context:
self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": [["foo", b"bar"]],
},
{"type": "http.response.body", "body": b""},
]
)
self.assertEqual(
str(context.exception),
"Header name 'foo' expected to be `bytes`, but got `<class 'str'>`",
)
with self.assertRaises(ValueError) as context:
self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": [[b"foo", True]],
},
{"type": "http.response.body", "body": b""},
]
)
self.assertEqual(
str(context.exception),
"Header value 'True' expected to be `bytes`, but got `<class 'bool'>`",
)
def test_headers_type_raw(self):
"""
Daphne returns a 500 error response if the application sends invalid
headers.
"""
response = self.run_daphne_raw(
b"GET / HTTP/1.0\r\n\r\n",
responses=[
{
"type": "http.response.start",
"status": 200,
"headers": [["foo", b"bar"]],
},
{"type": "http.response.body", "body": b""},
],
)
self.assertTrue(response.startswith(b"HTTP/1.0 500 Internal Server Error"))

tests/test_utils.py (new file)
@@ -0,0 +1,84 @@
from unittest import TestCase
from twisted.web.http_headers import Headers
from daphne.utils import parse_x_forwarded_for
class TestXForwardedForHttpParsing(TestCase):
"""
Tests that the parse_x_forwarded_for util correctly parses twisted Header.
"""
def test_basic(self):
headers = Headers(
{
b"X-Forwarded-For": [b"10.1.2.3"],
b"X-Forwarded-Port": [b"1234"],
b"X-Forwarded-Proto": [b"https"],
}
)
result = parse_x_forwarded_for(headers)
self.assertEqual(result, (["10.1.2.3", 1234], "https"))
self.assertIsInstance(result[0][0], str)
self.assertIsInstance(result[1], str)
def test_address_only(self):
headers = Headers({b"X-Forwarded-For": [b"10.1.2.3"]})
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
def test_v6_address(self):
headers = Headers({b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"]})
self.assertEqual(parse_x_forwarded_for(headers), (["1043::a321:0001", 0], None))
def test_multiple_proxies(self):
headers = Headers({b"X-Forwarded-For": [b"10.1.2.3, 10.1.2.4"]})
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
def test_original(self):
headers = Headers({})
self.assertEqual(
parse_x_forwarded_for(headers, original_addr=["127.0.0.1", 80]),
(["127.0.0.1", 80], None),
)
def test_no_original(self):
headers = Headers({})
self.assertEqual(parse_x_forwarded_for(headers), (None, None))
class TestXForwardedForWsParsing(TestCase):
"""
Tests that the parse_x_forwarded_for util correctly parses dict headers.
"""
def test_basic(self):
headers = {
b"X-Forwarded-For": b"10.1.2.3",
b"X-Forwarded-Port": b"1234",
b"X-Forwarded-Proto": b"https",
}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 1234], "https"))
def test_address_only(self):
headers = {b"X-Forwarded-For": b"10.1.2.3"}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
def test_v6_address(self):
headers = {b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"]}
self.assertEqual(parse_x_forwarded_for(headers), (["1043::a321:0001", 0], None))
def test_multiple_proxies(self):
headers = {b"X-Forwarded-For": b"10.1.2.3, 10.1.2.4"}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
def test_original(self):
headers = {}
self.assertEqual(
parse_x_forwarded_for(headers, original_addr=["127.0.0.1", 80]),
(["127.0.0.1", 80], None),
)
def test_no_original(self):
headers = {}
self.assertEqual(parse_x_forwarded_for(headers), (None, None))
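The behaviour these tests pin down can be sketched in a few lines. This is a simplified stand-in handling dict-style headers only — not Daphne's actual `parse_x_forwarded_for`, which also accepts Twisted `Headers` objects:

```python
def parse_xff(headers, original_addr=None):
    # Simplified illustration of the parsing behaviour tested above.
    addr, proto = original_addr, None
    value = headers.get(b"X-Forwarded-For")
    if value:
        if isinstance(value, list):  # some callers pass a list of values
            value = value[0]
        # The left-most entry is the original client when proxies chain.
        host = value.decode("ascii").split(",")[0].strip()
        port = int(headers.get(b"X-Forwarded-Port") or 0)
        addr = [host, port]
        raw_proto = headers.get(b"X-Forwarded-Proto")
        if raw_proto:
            proto = raw_proto.decode("ascii")
    return addr, proto
```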

tests/test_websocket.py (new file)

@@ -0,0 +1,338 @@
import collections
import time
from urllib import parse
import http_strategies
from http_base import DaphneTestCase, DaphneTestingInstance
from hypothesis import given, settings
from daphne.testing import BaseDaphneTestingInstance
class TestWebsocket(DaphneTestCase):
"""
Tests WebSocket handshake, send and receive.
"""
def assert_valid_websocket_scope(
self, scope, path="/", params=None, headers=None, scheme=None, subprotocols=None
):
"""
Checks that the passed scope is a valid ASGI WebSocket scope regarding types
and URL-encoding details.
"""
# Check overall keys
self.assert_key_sets(
required_keys={
"asgi",
"type",
"path",
"raw_path",
"query_string",
"headers",
},
optional_keys={"scheme", "root_path", "client", "server", "subprotocols"},
actual_keys=scope.keys(),
)
self.assertEqual(scope["asgi"]["version"], "3.0")
# Check that it is the right type
self.assertEqual(scope["type"], "websocket")
# Path
self.assert_valid_path(scope["path"])
# Scheme
self.assertIn(scope.get("scheme", "ws"), ["ws", "wss"])
if scheme:
self.assertEqual(scheme, scope["scheme"])
# Query string (byte string and still url encoded)
query_string = scope["query_string"]
self.assertIsInstance(query_string, bytes)
if params:
self.assertEqual(
query_string, parse.urlencode(params or []).encode("ascii")
)
# Ordering of header names is not important, but the order of values for a header
# name is. To assert whether that order is kept, we transform both the request
# headers and the channel message headers into a dictionary
# {name: [value1, value2, ...]} and check if they're equal.
transformed_scope_headers = collections.defaultdict(list)
for name, value in scope["headers"]:
transformed_scope_headers.setdefault(name, [])
# Make sure to split out any headers collapsed with commas
for bit in value.split(b","):
if bit.strip():
transformed_scope_headers[name].append(bit.strip())
transformed_request_headers = collections.defaultdict(list)
for name, value in headers or []:
expected_name = name.lower().strip()
expected_value = value.strip()
# Make sure to split out any headers collapsed with commas
transformed_request_headers.setdefault(expected_name, [])
for bit in expected_value.split(b","):
if bit.strip():
transformed_request_headers[expected_name].append(bit.strip())
for name, value in transformed_request_headers.items():
self.assertIn(name, transformed_scope_headers)
self.assertEqual(value, transformed_scope_headers[name])
# Root path
self.assertIsInstance(scope.get("root_path", ""), str)
# Client and server addresses
client = scope.get("client")
if client is not None:
self.assert_valid_address_and_port(client)
server = scope.get("server")
if server is not None:
self.assert_valid_address_and_port(server)
# Subprotocols
scope_subprotocols = scope.get("subprotocols", [])
if scope_subprotocols:
assert all(isinstance(x, str) for x in scope_subprotocols)
if subprotocols:
assert sorted(scope_subprotocols) == sorted(subprotocols)
def assert_valid_websocket_connect_message(self, message):
"""
Asserts that a message is a valid websocket.connect message
"""
# Check overall keys
self.assert_key_sets(
required_keys={"type"}, optional_keys=set(), actual_keys=message.keys()
)
# Check that it is the right type
self.assertEqual(message["type"], "websocket.connect")
def test_accept(self):
"""
Tests we can open and accept a socket.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(test_app)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
self.assert_valid_websocket_scope(scope)
self.assert_valid_websocket_connect_message(messages[0])
def test_reject(self):
"""
Tests we can reject a socket and it won't complete the handshake.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.close"}])
with self.assertRaises(RuntimeError):
self.websocket_handshake(test_app)
def test_subprotocols(self):
"""
Tests that we can ask for subprotocols and then select one.
"""
subprotocols = ["proto1", "proto2"]
with DaphneTestingInstance() as test_app:
test_app.add_send_messages(
[{"type": "websocket.accept", "subprotocol": "proto2"}]
)
_, subprotocol = self.websocket_handshake(
test_app, subprotocols=subprotocols
)
# Validate the scope and messages we got
assert subprotocol == "proto2"
scope, messages = test_app.get_received()
self.assert_valid_websocket_scope(scope, subprotocols=subprotocols)
self.assert_valid_websocket_connect_message(messages[0])
def test_xff(self):
"""
Tests that X-Forwarded-For headers get parsed right
"""
headers = [["X-Forwarded-For", "10.1.2.3"], ["X-Forwarded-Port", "80"]]
with DaphneTestingInstance(xff=True) as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(test_app, headers=headers)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
self.assert_valid_websocket_scope(scope)
self.assert_valid_websocket_connect_message(messages[0])
assert scope["client"] == ["10.1.2.3", 80]
@given(
request_path=http_strategies.http_path(),
request_params=http_strategies.query_params(),
request_headers=http_strategies.headers(),
)
@settings(max_examples=5, deadline=2000)
def test_http_bits(self, request_path, request_params, request_headers):
"""
Tests that various HTTP-level bits (query string params, path, headers)
carry over into the scope.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(
test_app,
path=parse.quote(request_path),
params=request_params,
headers=request_headers,
)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
self.assert_valid_websocket_scope(
scope, path=request_path, params=request_params, headers=request_headers
)
self.assert_valid_websocket_connect_message(messages[0])
def test_raw_path(self):
"""
Tests that /foo%2Fbar produces raw_path and a decoded path
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(test_app, path="/foo%2Fbar")
# Validate the scope and messages we got
scope, _ = test_app.get_received()
self.assertEqual(scope["path"], "/foo/bar")
self.assertEqual(scope["raw_path"], b"/foo%2Fbar")
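The split between the two keys can be reproduced with the standard library alone: `scope["raw_path"]` carries the bytes exactly as sent on the wire, while `scope["path"]` is the percent-decoded text. A sketch of that relationship (not Daphne's code):

```python
from urllib.parse import unquote

raw_path = b"/foo%2Fbar"                  # as sent on the wire -> scope["raw_path"]
path = unquote(raw_path.decode("ascii"))  # percent-decoded form -> scope["path"]
```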
@given(daphne_path=http_strategies.http_path())
@settings(max_examples=5, deadline=2000)
def test_root_path(self, *, daphne_path):
"""
Tests root_path handling.
"""
headers = [("Daphne-Root-Path", parse.quote(daphne_path))]
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(
test_app,
path="/",
headers=headers,
)
# Validate the scope and messages we got
scope, _ = test_app.get_received()
# Daphne-Root-Path is not included in the returned 'headers' section.
self.assertNotIn(
"daphne-root-path", (header[0].lower() for header in scope["headers"])
)
# And what we're looking for, root_path being set.
self.assertEqual(scope["root_path"], daphne_path)
def test_text_frames(self):
"""
Tests we can send and receive text frames.
"""
with DaphneTestingInstance() as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Prep frame for it to send
test_app.add_send_messages(
[{"type": "websocket.send", "text": "here be dragons 🐉"}]
)
# Send it a frame
self.websocket_send_frame(sock, "what is here? 🌍")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == "here be dragons 🐉"
# Make sure it got our frame
_, messages = test_app.get_received()
assert messages[1] == {
"type": "websocket.receive",
"text": "what is here? 🌍",
}
def test_binary_frames(self):
"""
Tests we can send and receive binary frames with things that are very
much not valid UTF-8.
"""
with DaphneTestingInstance() as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Prep frame for it to send
test_app.add_send_messages(
[{"type": "websocket.send", "bytes": b"here be \xe2 bytes"}]
)
# Send it a frame
self.websocket_send_frame(sock, b"what is here? \xe2")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == b"here be \xe2 bytes"
# Make sure it got our frame
_, messages = test_app.get_received()
assert messages[1] == {
"type": "websocket.receive",
"bytes": b"what is here? \xe2",
}
def test_http_timeout(self):
"""
Tests that the HTTP timeout doesn't kick in for WebSockets
"""
with DaphneTestingInstance(http_timeout=1) as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Wait 2 seconds
time.sleep(2)
# Prep frame for it to send
test_app.add_send_messages([{"type": "websocket.send", "text": "cake"}])
# Send it a frame
self.websocket_send_frame(sock, "still alive?")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == "cake"
def test_application_checker_handles_asyncio_cancellederror(self):
with CancellingTestingInstance() as app:
# Connect to the websocket app, it will immediately raise
# asyncio.CancelledError
sock, _ = self.websocket_handshake(app)
# Disconnect from the socket
sock.close()
# Wait for application_checker to clean up the applications for
# disconnected clients, and for the server to be stopped.
time.sleep(3)
# Make sure we received either no error, or a ConnectionsNotEmpty
while not app.process.errors.empty():
err, _tb = app.process.errors.get()
if not isinstance(err, ConnectionsNotEmpty):
raise err
self.fail(
"Server connections were not cleaned up after an asyncio.CancelledError was raised"
)
async def cancelling_application(scope, receive, send):
import asyncio
from twisted.internet import reactor
# Stop the server after a short delay so that the teardown is run.
reactor.callLater(2, reactor.stop)
await send({"type": "websocket.accept"})
raise asyncio.CancelledError()
class ConnectionsNotEmpty(Exception):
pass
class CancellingTestingInstance(BaseDaphneTestingInstance):
def __init__(self):
super().__init__(application=cancelling_application)
def process_teardown(self):
import multiprocessing
# Get a hold of the enclosing DaphneProcess (we're currently running in
# the same process as the application).
proc = multiprocessing.current_process()
# By now the (only) socket should have disconnected, and the
# application_checker should have run. If there are any connections
# still, it means that the application_checker did not clean them up.
if proc.server.connections:
raise ConnectionsNotEmpty()
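The cleanup behaviour being tested hinges on an asyncio detail: when an application coroutine raises CancelledError itself, the wrapping task is marked as cancelled, and `task.exception()` re-raises instead of returning the error, so checker code must guard for it explicitly. A minimal, self-contained illustration with plain asyncio (no Daphne involved):

```python
import asyncio


async def app():
    # Like cancelling_application above: raise CancelledError directly.
    raise asyncio.CancelledError()


def check(task):
    # An application_checker-style loop cannot just call task.exception():
    # for a cancelled task, that call re-raises CancelledError.
    try:
        return task.exception()
    except asyncio.CancelledError:
        return "cancelled"


async def main():
    task = asyncio.ensure_future(app())
    try:
        await task
    except asyncio.CancelledError:
        pass  # the cancellation propagates to anyone awaiting the task
    return check(task)


result = asyncio.run(main())
```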

tox.ini (new file)

@@ -0,0 +1,8 @@
[tox]
envlist =
    py{39,310,311,312,313}

[testenv]
extras = tests
commands =
    pytest -v {posargs}