Compare commits


No commits in common. "main" and "2.2.2" have entirely different histories.
main ... 2.2.2

39 changed files with 815 additions and 1838 deletions

.flake8

@ -1,11 +0,0 @@
[flake8]
exclude =
.venv,
.tox,
docs,
testproject,
js_client,
.eggs
extend-ignore = E123, E128, E266, E402, W503, E731, W601, B036
max-line-length = 120

.github/dependabot.yml

@ -1,6 +0,0 @@
version: 2
updates:
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: weekly

.github/workflows/tests.yml

@ -1,43 +0,0 @@
name: Tests
on:
push:
branches:
- main
pull_request:
workflow_dispatch:
permissions:
contents: read
jobs:
tests:
runs-on: ${{ matrix.os }}-latest
strategy:
fail-fast: false
matrix:
os:
- ubuntu
- windows
python-version:
- "3.9"
- "3.10"
- "3.11"
- "3.12"
- "3.13"
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools wheel
python -m pip install --upgrade tox
- name: Run tox targets for ${{ matrix.python-version }}
run: tox run -f py$(echo ${{ matrix.python-version }} | tr -d .)
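Here ``tr -d .`` turns a matrix entry such as "3.13" into the tox factor ``py313``, so ``tox run -f py313`` selects every environment carrying that factor. The repository's tox.ini is not part of this diff; a minimal envlist compatible with this matrix might look like::

    [tox]
    envlist = py{39,310,311,312,313}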

.gitignore

@ -1,4 +1,3 @@
.idea/
*.egg-info
*.pyc
__pycache__
@ -10,7 +9,3 @@ build/
.eggs
test_layer*
test_consumer*
.python-version
.pytest_cache/
.vscode
.coverage

.pre-commit-config.yaml

@ -1,23 +0,0 @@
repos:
- repo: https://github.com/asottile/pyupgrade
rev: v3.20.0
hooks:
- id: pyupgrade
args: [--py39-plus]
- repo: https://github.com/psf/black
rev: 25.1.0
hooks:
- id: black
language_version: python3
- repo: https://github.com/pycqa/isort
rev: 6.0.1
hooks:
- id: isort
- repo: https://github.com/PyCQA/flake8
rev: 7.3.0
hooks:
- id: flake8
additional_dependencies:
- flake8-bugbear
ci:
autoupdate_schedule: quarterly

.travis.yml

@ -0,0 +1,34 @@
sudo: false
language: python
python:
- '3.5'
- '3.6'
env:
- TWISTED="twisted==17.5.0"
- TWISTED="twisted"
install:
- pip install $TWISTED isort unify flake8 -e .[tests]
- pip freeze
script:
- pytest
- flake8
- isort --check-only --diff --recursive daphne tests
- unify --check-only --recursive --quote \" daphne tests
jobs:
include:
- stage: release
script: skip
deploy:
provider: pypi
user: andrewgodwin_bot
on:
tags: true
distributions: sdist bdist_wheel
password:
secure: IA+dvSmMKN+fT47rgRb6zdmrExhK5QCVEDH8kheC6kAacw80ORBZKo6sMX9GQBJ3BlfhTqrzAhItHkDUxonb579rJDvmlJ7FPg7axZpsY9Fmls6q1rJC/La8iGWx20+ctberejKSH3wSwa0LH0imJXGDoKKzf1DLmk5pEEWjG2QqhKdEtyAcnzOPnDWcRCs+DKfQcMzETH7lMFN8oe3aBhHLLtcg4yA78cN5CeyyH92lmbaVp7k/b1FqXXFgf16bi5tlgLrb6DhmcnNjwLMSHRafNoPCXkWQOwh6gEHeHRR3OsHsBueyJHIikuHNrpmgpAqjYlVQ5WqmfgMlhCfRm9xL+G4G+KK9n8AJNGAszUfxVlPvMTw+nkOSd/bmxKrdCqqYnDIvDLucXJ86TstNzklfAwr3FL+wBlucRtOMLhQlHIaPTXYcNpOuh6B4ELjC+WjDGh8EdRKvcsZz7+5AS5ZaDDccuviMzQFsXVcE2d4HiosbARVrkxJ7j3MWp0OGgWVxXgRO2EQIksbgGSIjI8PqFjBqht2WT6MhVZPCc9XHUlP2CiAR5+QY8JgTIztbEDuhpgr0cRAtiHwJEAxDR9tJR/j/v4X/Pau2ZdR0C0yW77lVgD75spLL0khAnU7q+qgiF0hyQ7gRRVy0tElT0HBenVbzjzHowdJX8lSPjRg=

CHANGELOG.txt

@ -1,212 +1,3 @@
4.2.1 (2025-07-02)
------------------
* Fixed a packaging error in 4.2.0.
* Removed --nostatic and --insecure args to runserver command when staticfiles
app is not installed.
4.2.0 (2025-05-16)
------------------
Daphne 4.2 is a maintenance release in the 4.x series.
* Added support for Python 3.13.
* Dropped support for EOL Python 3.8.
* Updated pyupgrade configuration to target Python 3.9.
* Added a `load_asgi_app` hook to CLI class, useful for compiled or frozen
applications.
* Allowed assigning a port in the DaphneProcess test helper, useful for live
server test cases, such as that provided by Channels.
* Added --nostatic and --insecure args to runserver command to match Django's
built-in command.
* Moved metadata to use pyproject.toml.
* Updated sdist file to include tests and changelog.
* Removed unused pytest-runner.
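The ``load_asgi_app`` hook mentioned in this entry is an override point on the CLI class (see daphne/cli.py below); a minimal sketch for a frozen build, with invented class and module names, could be::

    from daphne.cli import CommandLineInterface

    class FrozenCLI(CommandLineInterface):
        def load_asgi_app(self, asgi_app_path):
            # Return the bundled application object instead of importing it
            # by dotted path at runtime (module name is hypothetical).
            from myfrozenapp.asgi import application
            return application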
4.1.2 (2024-04-11)
------------------
* Fixed a setuptools configuration error in 4.1.1.
4.1.1 (2024-04-10)
------------------
* Fixed a twisted.plugin packaging error in 4.1.0.
Thanks to sdc50.
4.1.0 (2024-02-10)
------------------
* Added support for Python 3.12.
* Dropped support for EOL Python 3.7.
* Handled root path for websocket scopes.
* Validate HTTP header names as per RFC 9110.
4.0.0 (2022-10-07)
------------------
Major versioning targeting use with Channels 4.0 and beyond. Except where
noted, it should remain usable with Channels v3 projects, but updating Channels to the latest version is recommended.
* Added a ``runserver`` command to run an ASGI Django development server.
Added ``"daphne"`` to the ``INSTALLED_APPS`` setting, before
``"django.contrib.staticfiles"`` to enable:
INSTALLED_APPS = [
"daphne",
...
]
This replaces the Channels implementation of ``runserver``, which is removed
in Channels 4.0.
* Made the ``DaphneProcess`` tests helper class compatible with the ``spawn``
process start method, which is used on macOS and Windows.
Note that requires Channels v4 if using with ``ChannelsLiveServerTestCase``.
* Dropped support for Python 3.6.
* Updated dependencies to the latest versions.
Previously a range of Twisted versions have been supported. Recent Twisted
releases (22.2, 22.4) have issued security fixes, so those are now the
minimum supported version. Given the stability of Twisted, supporting a
range of versions does not represent a good use of maintainer time. Going
forward the latest Twisted version will be required.
* Set ``daphne`` as default ``Server`` header.
This can be configured with the ``--server-name`` CLI argument.
Added the new ``--no-server-name`` CLI argument to disable the ``Server``
header, which is equivalent to ``--server-name=`` (an empty name).
* Added ``--log-fmt`` CLI argument.
* Added support for ``ASGI_THREADS`` environment variable, setting the maximum
number of workers used by a ``SyncToAsync`` thread-pool executor.
Set e.g. ``ASGI_THREADS=4 daphne ...`` when running to limit the number of
workers.
* Removed deprecated ``--ws_protocols`` CLI option.
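A rough sketch of how the CLI options from this entry combine (the application path is a placeholder)::

    # Limit the SyncToAsync thread pool and suppress the Server header
    ASGI_THREADS=4 daphne --no-server-name myproject.asgi:application
    # Or advertise a custom Server header and log format instead
    daphne --server-name myserver --log-fmt "%(asctime)s %(message)s" myproject.asgi:application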
3.0.2 (2021-04-07)
------------------
* Fixed a bug where ``send`` passed to applications wasn't a true async
function but a lambda wrapper, preventing it from being used with
``asgiref.sync.async_to_sync()``.
3.0.1 (2020-11-12)
------------------
* Fixed a bug where ``asyncio.CancelledError`` was not correctly handled on
Python 3.8+, resulting in incorrect protocol application cleanup.
3.0.0 (2020-10-28)
------------------
* Updates internals to use ASGI v3 throughout. ``asgiref.compatibility`` is
used for older applications.
* Consequently, the `--asgi-protocol` command-line option is removed.
* HTTP request bodies are now read, and passed to the application, in chunks.
* Added support for Python 3.9.
* Dropped support for Python 3.5.
2.5.0 (2020-04-15)
------------------
* Fixes compatibility for twisted when running Python 3.8+ on Windows, by
setting ``asyncio.WindowsSelectorEventLoopPolicy`` as the event loop policy
in this case.
* The internal ``daphne.testing.TestApplication`` now requires an additional
``lock`` argument to ``__init__()``. This is expected to be an instance of
``multiprocessing.Lock``.
2.4.1 (2019-12-18)
------------------
* Avoids Twisted using the default event loop, for compatibility with Django
3.0's ``async_unsafe()`` decorator in threaded contexts, such as using the
auto-reloader.
2.4.0 (2019-11-20)
------------------
* Adds CI testing against and support for Python 3.8.
* Adds support for ``raw_path`` in ASGI scope.
* Ensures an error response is sent to the client if the application sends
malformed headers.
* Resolves an asyncio + multiprocessing problem when testing that would cause
the test suite to fail/hang on macOS.
* Requires installing Twisted's TLS extras, via ``install_requires``.
* Adds missing LICENSE to distribution.
2.3.0 (2019-04-09)
------------------
* Added support for ASGI v3.
2.2.5 (2019-01-31)
------------------
* WebSocket handshakes are now affected by the websocket connect timeout, so
you can limit them from the command line.
* Server name can now be set using --server-name
2.2.4 (2018-12-15)
------------------
* No longer listens on port 8000 when a file descriptor is provided with --fd
* Fixed a memory leak with WebSockets
2.2.3 (2018-11-06)
------------------
* Enforce that response headers are only bytestrings, rather than allowing
unicode strings and coercing them into bytes.
* New command-line options to set proxy header names: --proxy-headers-host and
--proxy-headers-port.
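For example, behind a proxy that forwards the client address in ``X-Real-IP`` (the header name depends on your proxy; the application path is a placeholder)::

    daphne --proxy-headers --proxy-headers-host X-Real-IP myproject.asgi:application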
2.2.2 (2018-08-16)
------------------
@ -589,4 +380,4 @@ noted should remain usable with Channels v3 projects, but updating Channels to t
* http.disconnect messages are now sent
* Request handling speed significantly improved
* Request handling speed significantly improved

MANIFEST.in

@ -1,3 +0,0 @@
include LICENSE
include CHANGELOG.txt
recursive-include tests *.py

README.rst

@ -1,17 +1,24 @@
daphne
======
.. image:: https://api.travis-ci.org/django/daphne.svg
:target: https://travis-ci.org/django/daphne
.. image:: https://img.shields.io/pypi/v/daphne.svg
:target: https://pypi.python.org/pypi/daphne
Daphne is a HTTP, HTTP2 and WebSocket protocol server for
`ASGI <https://github.com/django/asgiref/blob/main/specs/asgi.rst>`_ and
`ASGI-HTTP <https://github.com/django/asgiref/blob/main/specs/www.rst>`_,
`ASGI <https://github.com/django/asgiref/blob/master/specs/asgi.rst>`_ and
`ASGI-HTTP <https://github.com/django/asgiref/blob/master/specs/www.rst>`_,
developed to power Django Channels.
It supports automatic negotiation of protocols; there's no need for URL
prefixing to determine WebSocket endpoints versus HTTP endpoints.
*Note:* Daphne 2 is not compatible with Channels 1.x applications, only with
Channels 2.x and other ASGI applications. Install a 1.x version of Daphne
for Channels 1.x support.
Running
-------
@ -54,7 +61,7 @@ Daphne supports terminating HTTP/2 connections natively. You'll
need to do a couple of things to get it working, though. First, you need to
make sure you install the Twisted ``http2`` and ``tls`` extras::
pip install -U "Twisted[tls,http2]"
pip install -U Twisted[tls,http2]
Next, because all current browsers only support HTTP/2 when using TLS, you will
need to start Daphne with TLS turned on, which can be done using the Twisted endpoint syntax::
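A typical TLS invocation using that endpoint syntax (key, certificate, and application names are placeholders) looks like::

    daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem myproject.asgi:application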
@ -108,19 +115,19 @@ should start with a slash, but not end with one; for example::
Python Support
--------------
Daphne requires Python 3.9 or later.
Daphne requires Python 3.5 or later.
Contributing
------------
Please refer to the
`main Channels contributing docs <https://github.com/django/channels/blob/main/CONTRIBUTING.rst>`_.
`main Channels contributing docs <https://github.com/django/channels/blob/master/CONTRIBUTING.rst>`_.
To run tests, make sure you have installed the ``tests`` extra with the package::
cd daphne/
pip install -e '.[tests]'
pip install -e .[tests]
pytest
@ -134,4 +141,4 @@ https://docs.djangoproject.com/en/dev/internals/security/.
To report bugs or request new features, please open a new GitHub issue.
This repository is part of the Channels project. For the shepherd and maintenance team, please see the
`main Channels readme <https://github.com/django/channels/blob/main/README.rst>`_.
`main Channels readme <https://github.com/django/channels/blob/master/README.rst>`_.

daphne/__init__.py

@ -1,14 +1 @@
import sys
__version__ = "4.2.1"
# Windows on Python 3.8+ uses ProactorEventLoop, which is not compatible with
# Twisted. Does not implement add_writer/add_reader.
# See https://bugs.python.org/issue37373
# and https://twistedmatrix.com/trac/ticket/9766
PY38_WIN = sys.version_info >= (3, 8) and sys.platform == "win32"
if PY38_WIN:
import asyncio
asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
__version__ = "2.2.2"

daphne/__main__.py

@ -1,3 +0,0 @@
from daphne.cli import CommandLineInterface
CommandLineInterface.entrypoint()

daphne/access.py

@ -1,7 +1,7 @@
import datetime
class AccessLogGenerator:
class AccessLogGenerator(object):
"""
Object that implements the Daphne "action logger" internal interface in
order to provide an access log in something resembling NCSA format.
@ -49,16 +49,13 @@ class AccessLogGenerator:
request="WSDISCONNECT %(path)s" % details,
)
def write_entry(
self, host, date, request, status=None, length=None, ident=None, user=None
):
def write_entry(self, host, date, request, status=None, length=None, ident=None, user=None):
"""
Writes an NCSA-style entry to the log file (some liberty is taken with
what the entries are for non-HTTP)
"""
self.stream.write(
'%s %s %s [%s] "%s" %s %s\n'
% (
"%s %s %s [%s] \"%s\" %s %s\n" % (
host,
ident or "-",
user or "-",

daphne/apps.py

@ -1,16 +0,0 @@
# Import the server here to ensure the reactor is installed very early on in case other
# packages import twisted.internet.reactor (e.g. raven does this).
from django.apps import AppConfig
from django.core import checks
import daphne.server # noqa: F401
from .checks import check_daphne_installed
class DaphneConfig(AppConfig):
name = "daphne"
verbose_name = "Daphne"
def ready(self):
checks.register(check_daphne_installed, checks.Tags.staticfiles)

daphne/checks.py

@ -1,21 +0,0 @@
# Django system check to ensure daphne app is listed in INSTALLED_APPS before django.contrib.staticfiles.
from django.core.checks import Error, register
@register()
def check_daphne_installed(app_configs, **kwargs):
from django.apps import apps
from django.contrib.staticfiles.apps import StaticFilesConfig
from daphne.apps import DaphneConfig
for app in apps.get_app_configs():
if isinstance(app, DaphneConfig):
return []
if isinstance(app, StaticFilesConfig):
return [
Error(
"Daphne must be listed before django.contrib.staticfiles in INSTALLED_APPS.",
id="daphne.E001",
)
]

daphne/cli.py

@ -1,9 +1,6 @@
import argparse
import logging
import sys
from argparse import ArgumentError, Namespace
from asgiref.compatibility import guarantee_single_callable
from .access import AccessLogGenerator
from .endpoints import build_endpoint_description_strings
@ -16,7 +13,7 @@ DEFAULT_HOST = "127.0.0.1"
DEFAULT_PORT = 8000
class CommandLineInterface:
class CommandLineInterface(object):
"""
Acts as the main CLI entry point for running the server.
"""
@ -26,9 +23,15 @@ class CommandLineInterface:
server_class = Server
def __init__(self):
self.parser = argparse.ArgumentParser(description=self.description)
self.parser = argparse.ArgumentParser(
description=self.description,
)
self.parser.add_argument(
"-p", "--port", type=int, help="Port number to listen on", default=None
"-p",
"--port",
type=int,
help="Port number to listen on",
default=None,
)
self.parser.add_argument(
"-b",
@ -90,11 +93,6 @@ class CommandLineInterface:
help="Where to write the access log (- for stdout, the default for verbosity=1)",
default=None,
)
self.parser.add_argument(
"--log-fmt",
help="Log format to use",
default="%(asctime)-15s %(levelname)-8s %(message)s",
)
self.parser.add_argument(
"--ping-interval",
type=int,
@ -113,6 +111,13 @@ class CommandLineInterface:
help="The number of seconds an ASGI application has to exit after client disconnect before it is killed",
default=10,
)
self.parser.add_argument(
"--ws-protocol",
nargs="*",
dest="ws_protocols",
help="The WebSocket protocols you wish to support",
default=None,
)
self.parser.add_argument(
"--root-path",
dest="root_path",
@ -123,43 +128,14 @@ class CommandLineInterface:
"--proxy-headers",
dest="proxy_headers",
help="Enable parsing and using of X-Forwarded-For and X-Forwarded-Port headers and using that as the "
"client address",
"client address",
default=False,
action="store_true",
)
self.arg_proxy_host = self.parser.add_argument(
"--proxy-headers-host",
dest="proxy_headers_host",
help="Specify which header will be used for getting the host "
"part. Can be omitted, requires --proxy-headers to be specified "
'when passed. "X-Real-IP" (when passed by your webserver) is a '
"good candidate for this.",
default=False,
action="store",
)
self.arg_proxy_port = self.parser.add_argument(
"--proxy-headers-port",
dest="proxy_headers_port",
help="Specify which header will be used for getting the port "
"part. Can be omitted, requires --proxy-headers to be specified "
"when passed.",
default=False,
action="store",
)
self.parser.add_argument(
"application",
help="The application to dispatch to as path.to.module:instance.path",
)
self.parser.add_argument(
"-s",
"--server-name",
dest="server_name",
help="specify which value should be passed to response header Server attribute",
default="daphne",
)
self.parser.add_argument(
"--no-server-name", dest="server_name", action="store_const", const=""
)
self.server = None
@ -170,43 +146,6 @@ class CommandLineInterface:
"""
cls().run(sys.argv[1:])
def _check_proxy_headers_passed(self, argument: str, args: Namespace):
"""Raise if the `--proxy-headers` weren't specified."""
if args.proxy_headers:
return
raise ArgumentError(
argument=argument,
message="--proxy-headers has to be passed for this parameter.",
)
def _get_forwarded_host(self, args: Namespace):
"""
Return the default host header from which the remote hostname/ip
will be extracted.
"""
if args.proxy_headers_host:
self._check_proxy_headers_passed(argument=self.arg_proxy_host, args=args)
return args.proxy_headers_host
if args.proxy_headers:
return "X-Forwarded-For"
def _get_forwarded_port(self, args: Namespace):
"""
Return the default host header from which the remote hostname/ip
will be extracted.
"""
if args.proxy_headers_port:
self._check_proxy_headers_passed(argument=self.arg_proxy_port, args=args)
return args.proxy_headers_port
if args.proxy_headers:
return "X-Forwarded-Port"
def load_asgi_app(self, asgi_app_path: str):
"""
Return the imported application.
"""
return import_by_path(asgi_app_path)
def run(self, args):
"""
Pass in raw argument list and it will decode them
@ -222,7 +161,7 @@ class CommandLineInterface:
2: logging.DEBUG,
3: logging.DEBUG, # Also turns on asyncio debug
}[args.verbosity],
format=args.log_fmt,
format="%(asctime)-15s %(levelname)-8s %(message)s",
)
# If verbosity is 1 or greater, or they told us explicitly, set up access log
access_log_stream = None
@ -233,22 +172,11 @@ class CommandLineInterface:
access_log_stream = open(args.access_log, "a", 1)
elif args.verbosity >= 1:
access_log_stream = sys.stdout
# Import application
sys.path.insert(0, ".")
application = self.load_asgi_app(args.application)
application = guarantee_single_callable(application)
application = import_by_path(args.application)
# Set up port/host bindings
if not any(
[
args.host,
args.port is not None,
args.unix_socket,
args.file_descriptor is not None,
args.socket_strings,
]
):
if not any([args.host, args.port is not None, args.unix_socket, args.file_descriptor, args.socket_strings]):
# no advanced binding options passed, patch in defaults
args.host = DEFAULT_HOST
args.port = DEFAULT_PORT
@ -261,11 +189,16 @@ class CommandLineInterface:
host=args.host,
port=args.port,
unix_socket=args.unix_socket,
file_descriptor=args.file_descriptor,
file_descriptor=args.file_descriptor
)
endpoints = sorted(
args.socket_strings + endpoints
)
endpoints = sorted(args.socket_strings + endpoints)
# Start the server
logger.info("Starting server at {}".format(", ".join(endpoints)))
logger.info(
"Starting server at %s" %
(", ".join(endpoints), )
)
self.server = self.server_class(
application=application,
endpoints=endpoints,
@ -274,20 +207,13 @@ class CommandLineInterface:
ping_timeout=args.ping_timeout,
websocket_timeout=args.websocket_timeout,
websocket_connect_timeout=args.websocket_connect_timeout,
websocket_handshake_timeout=args.websocket_connect_timeout,
application_close_timeout=args.application_close_timeout,
action_logger=(
AccessLogGenerator(access_log_stream) if access_log_stream else None
),
action_logger=AccessLogGenerator(access_log_stream) if access_log_stream else None,
ws_protocols=args.ws_protocols,
root_path=args.root_path,
verbosity=args.verbosity,
proxy_forwarded_address_header=self._get_forwarded_host(args=args),
proxy_forwarded_port_header=self._get_forwarded_port(args=args),
proxy_forwarded_proto_header=(
"X-Forwarded-Proto" if args.proxy_headers else None
),
server_name=args.server_name,
proxy_forwarded_address_header="X-Forwarded-For" if args.proxy_headers else None,
proxy_forwarded_port_header="X-Forwarded-Port" if args.proxy_headers else None,
proxy_forwarded_proto_header="X-Forwarded-Proto" if args.proxy_headers else None,
)
self.server.run()
if self.server.abort_start:
exit(1)

daphne/endpoints.py

@ -1,5 +1,10 @@
def build_endpoint_description_strings(
host=None, port=None, unix_socket=None, file_descriptor=None
host=None,
port=None,
unix_socket=None,
file_descriptor=None
):
"""
Build a list of twisted endpoint description strings that the server will listen on.
@ -8,7 +13,7 @@ def build_endpoint_description_strings(
"""
socket_descriptions = []
if host and port is not None:
host = host.strip("[]").replace(":", r"\:")
host = host.strip("[]").replace(":", "\:")
socket_descriptions.append("tcp:port=%d:interface=%s" % (int(port), host))
elif any([host, port]):
raise ValueError("TCP binding requires both port and host kwargs.")
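A minimal usage sketch of this helper (host and port values are illustrative):

    from daphne.endpoints import build_endpoint_description_strings

    # Produces twisted endpoint strings such as ["tcp:port=8000:interface=127.0.0.1"]
    endpoints = build_endpoint_description_strings(host="127.0.0.1", port=8000)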

daphne/http_protocol.py

@ -9,7 +9,7 @@ from twisted.protocols.policies import ProtocolWrapper
from twisted.web import http
from zope.interface import implementer
from .utils import HEADER_NAME_RE, parse_x_forwarded_for
from .utils import parse_x_forwarded_for
logger = logging.getLogger(__name__)
@ -23,8 +23,7 @@ class WebRequest(http.Request):
GET and POST out.
"""
error_template = (
"""
error_template = """
<html>
<head>
<title>%(title)s</title>
@ -41,17 +40,9 @@ class WebRequest(http.Request):
<footer>Daphne</footer>
</body>
</html>
""".replace(
"\n", ""
)
.replace("    ", " ")
.replace("   ", " ")
.replace("  ", " ")
) # Shorten it a bit, bytes wise
""".replace("\n", "").replace(" ", " ").replace(" ", " ").replace(" ", " ") # Shorten it a bit, bytes wise
def __init__(self, *args, **kwargs):
self.client_addr = None
self.server_addr = None
try:
http.Request.__init__(self, *args, **kwargs)
# Easy server link
@ -69,13 +60,6 @@ class WebRequest(http.Request):
def process(self):
try:
self.request_start = time.time()
# Validate header names.
for name, _ in self.requestHeaders.getAllRawHeaders():
if not HEADER_NAME_RE.fullmatch(name):
self.basic_error(400, b"Bad Request", "Invalid header name")
return
# Get upgrade header
upgrade_header = None
if self.requestHeaders.hasHeader(b"Upgrade"):
@ -86,6 +70,9 @@ class WebRequest(http.Request):
# requires unicode string.
self.client_addr = [str(self.client.host), self.client.port]
self.server_addr = [str(self.host.host), self.host.port]
else:
self.client_addr = None
self.server_addr = None
self.client_scheme = "https" if self.isSecure() else "http"
@ -97,7 +84,7 @@ class WebRequest(http.Request):
self.server.proxy_forwarded_port_header,
self.server.proxy_forwarded_proto_header,
self.client_addr,
self.client_scheme,
self.client_scheme
)
# Check for unicodeish path (or it'll crash when trying to parse)
try:
@ -118,9 +105,7 @@ class WebRequest(http.Request):
# Is it WebSocket? IS IT?!
if upgrade_header and upgrade_header.lower() == b"websocket":
# Make WebSocket protocol to hand off to
protocol = self.server.ws_factory.buildProtocol(
self.transport.getPeer()
)
protocol = self.server.ws_factory.buildProtocol(self.transport.getPeer())
if not protocol:
# If protocol creation fails, we signal "internal server error"
self.setResponseCode(500)
@ -166,49 +151,33 @@ class WebRequest(http.Request):
logger.debug("HTTP %s request for %s", self.method, self.client_addr)
self.content.seek(0, 0)
# Work out the application scope and create application
self.application_queue = yield maybeDeferred(
self.server.create_application,
self,
{
"type": "http",
# TODO: Correctly say if it's 1.1 or 1.0
"http_version": self.clientproto.split(b"/")[-1].decode(
"ascii"
),
"method": self.method.decode("ascii"),
"path": unquote(self.path.decode("ascii")),
"raw_path": self.path,
"root_path": self.root_path,
"scheme": self.client_scheme,
"query_string": self.query_string,
"headers": self.clean_headers,
"client": self.client_addr,
"server": self.server_addr,
},
)
self.application_queue = yield maybeDeferred(self.server.create_application, self, {
"type": "http",
# TODO: Correctly say if it's 1.1 or 1.0
"http_version": self.clientproto.split(b"/")[-1].decode("ascii"),
"method": self.method.decode("ascii"),
"path": unquote(self.path.decode("ascii")),
"root_path": self.root_path,
"scheme": self.client_scheme,
"query_string": self.query_string,
"headers": self.clean_headers,
"client": self.client_addr,
"server": self.server_addr,
})
# Check they didn't close an unfinished request
if self.application_queue is None or self.content.closed:
# Not much we can do, the request is prematurely abandoned.
return
# Run application against request
buffer_size = self.server.request_buffer_size
while True:
chunk = self.content.read(buffer_size)
more_body = not (len(chunk) < buffer_size)
payload = {
self.application_queue.put_nowait(
{
"type": "http.request",
"body": chunk,
"more_body": more_body,
}
self.application_queue.put_nowait(payload)
if not more_body:
break
"body": self.content.read(),
},
)
except Exception:
logger.error(traceback.format_exc())
self.basic_error(
500, b"Internal Server Error", "Daphne HTTP processing error"
)
self.basic_error(500, b"Internal Server Error", "Daphne HTTP processing error")
def connectionLost(self, reason):
"""
@ -248,25 +217,16 @@ class WebRequest(http.Request):
raise ValueError("HTTP response has already been started")
self._response_started = True
if "status" not in message:
raise ValueError(
"Specifying a status code is required for a Response message."
)
raise ValueError("Specifying a status code is required for a Response message.")
# Set HTTP status code
self.setResponseCode(message["status"])
# Write headers
for header, value in message.get("headers", {}):
self.responseHeaders.addRawHeader(header, value)
if self.server.server_name and not self.responseHeaders.hasHeader("server"):
self.setHeader(b"server", self.server.server_name.encode())
logger.debug(
"HTTP %s response started for %s", message["status"], self.client_addr
)
logger.debug("HTTP %s response started for %s", message["status"], self.client_addr)
elif message["type"] == "http.response.body":
if not self._response_started:
raise ValueError(
"HTTP response has not yet been started but got %s"
% message["type"]
)
raise ValueError("HTTP response has not yet been started but got %s" % message["type"])
# Write out body
http.Request.write(self, message.get("body", b""))
# End if there's no more content
@ -279,23 +239,15 @@ class WebRequest(http.Request):
# The path is malformed somehow - do our best to log something
uri = repr(self.uri)
try:
self.server.log_action(
"http",
"complete",
{
"path": uri,
"status": self.code,
"method": self.method.decode("ascii", "replace"),
"client": (
"%s:%s" % tuple(self.client_addr)
if self.client_addr
else None
),
"time_taken": self.duration(),
"size": self.sentLength,
},
)
except Exception:
self.server.log_action("http", "complete", {
"path": uri,
"status": self.code,
"method": self.method.decode("ascii", "replace"),
"client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
"time_taken": self.duration(),
"size": self.sentLength,
})
except Exception as e:
logger.error(traceback.format_exc())
else:
logger.debug("HTTP response chunk for %s", self.client_addr)
@ -318,11 +270,7 @@ class WebRequest(http.Request):
logger.warning("Application timed out while sending response")
self.finish()
else:
self.basic_error(
503,
b"Service Unavailable",
"Application failed to respond within time limit.",
)
self.basic_error(503, b"Service Unavailable", "Application failed to respond within time limit.")
### Utility functions
@ -333,7 +281,11 @@ class WebRequest(http.Request):
"""
# If we don't yet have a path, then don't send as we never opened.
if self.path:
self.application_queue.put_nowait({"type": "http.disconnect"})
self.application_queue.put_nowait(
{
"type": "http.disconnect",
},
)
def duration(self):
"""
@ -347,25 +299,20 @@ class WebRequest(http.Request):
"""
Responds with a server-level error page (very basic)
"""
self.handle_reply(
{
"type": "http.response.start",
"status": status,
"headers": [(b"Content-Type", b"text/html; charset=utf-8")],
}
)
self.handle_reply(
{
"type": "http.response.body",
"body": (
self.error_template
% {
"title": str(status) + " " + status_text.decode("ascii"),
"body": body,
}
).encode("utf8"),
}
)
self.handle_reply({
"type": "http.response.start",
"status": status,
"headers": [
(b"Content-Type", b"text/html; charset=utf-8"),
],
})
self.handle_reply({
"type": "http.response.body",
"body": (self.error_template % {
"title": str(status) + " " + status_text.decode("ascii"),
"body": body,
}).encode("utf8"),
})
def __hash__(self):
return hash(id(self))
@ -396,7 +343,7 @@ class HTTPFactory(http.HTTPFactory):
protocol = http.HTTPFactory.buildProtocol(self, addr)
protocol.requestFactory = WebRequest
return protocol
except Exception:
except Exception as e:
logger.error("Cannot build protocol: %s" % traceback.format_exc())
raise

daphne/management/commands/runserver.py

@ -1,204 +0,0 @@
import datetime
import importlib
import logging
import sys
from django.apps import apps
from django.conf import settings
from django.contrib.staticfiles.handlers import ASGIStaticFilesHandler
from django.core.exceptions import ImproperlyConfigured
from django.core.management import CommandError
from django.core.management.commands.runserver import Command as RunserverCommand
from daphne import __version__
from daphne.endpoints import build_endpoint_description_strings
from daphne.server import Server
logger = logging.getLogger("django.channels.server")
def get_default_application():
"""
Gets the default application, set in the ASGI_APPLICATION setting.
"""
try:
path, name = settings.ASGI_APPLICATION.rsplit(".", 1)
except (ValueError, AttributeError):
raise ImproperlyConfigured("Cannot find ASGI_APPLICATION setting.")
try:
module = importlib.import_module(path)
except ImportError:
raise ImproperlyConfigured("Cannot import ASGI_APPLICATION module %r" % path)
try:
value = getattr(module, name)
except AttributeError:
raise ImproperlyConfigured(
f"Cannot find {name!r} in ASGI_APPLICATION module {path}"
)
return value
class Command(RunserverCommand):
protocol = "http"
server_cls = Server
def add_arguments(self, parser):
super().add_arguments(parser)
parser.add_argument(
"--noasgi",
action="store_false",
dest="use_asgi",
default=True,
help="Run the old WSGI-based runserver rather than the ASGI-based one",
)
parser.add_argument(
"--http_timeout",
action="store",
dest="http_timeout",
type=int,
default=None,
help=(
"Specify the daphne http_timeout interval in seconds "
"(default: no timeout)"
),
)
parser.add_argument(
"--websocket_handshake_timeout",
action="store",
dest="websocket_handshake_timeout",
type=int,
default=5,
help=(
"Specify the daphne websocket_handshake_timeout interval in "
"seconds (default: 5)"
),
)
if apps.is_installed("django.contrib.staticfiles"):
parser.add_argument(
"--nostatic",
action="store_false",
dest="use_static_handler",
help="Tells Django to NOT automatically serve static files at STATIC_URL.",
)
parser.add_argument(
"--insecure",
action="store_true",
dest="insecure_serving",
help="Allows serving static files even if DEBUG is False.",
)
def handle(self, *args, **options):
self.http_timeout = options.get("http_timeout", None)
self.websocket_handshake_timeout = options.get("websocket_handshake_timeout", 5)
# Check Channels is installed right
if options["use_asgi"] and not hasattr(settings, "ASGI_APPLICATION"):
raise CommandError(
"You have not set ASGI_APPLICATION, which is needed to run the server."
)
# Dispatch upward
super().handle(*args, **options)
def inner_run(self, *args, **options):
# Maybe they want the wsgi one?
if not options.get("use_asgi", True):
if hasattr(RunserverCommand, "server_cls"):
self.server_cls = RunserverCommand.server_cls
return RunserverCommand.inner_run(self, *args, **options)
# Run checks
self.stdout.write("Performing system checks...\n\n")
self.check(display_num_errors=True)
self.check_migrations()
# Print helpful text
quit_command = "CTRL-BREAK" if sys.platform == "win32" else "CONTROL-C"
now = datetime.datetime.now().strftime("%B %d, %Y - %X")
self.stdout.write(now)
self.stdout.write(
(
"Django version %(version)s, using settings %(settings)r\n"
"Starting ASGI/Daphne version %(daphne_version)s development server"
" at %(protocol)s://%(addr)s:%(port)s/\n"
"Quit the server with %(quit_command)s.\n"
)
% {
"version": self.get_version(),
"daphne_version": __version__,
"settings": settings.SETTINGS_MODULE,
"protocol": self.protocol,
"addr": "[%s]" % self.addr if self._raw_ipv6 else self.addr,
"port": self.port,
"quit_command": quit_command,
}
)
# Launch server in 'main' thread. Signals are disabled as it's still
# actually a subthread under the autoreloader.
logger.debug("Daphne running, listening on %s:%s", self.addr, self.port)
# build the endpoint description string from host/port options
endpoints = build_endpoint_description_strings(host=self.addr, port=self.port)
try:
self.server_cls(
application=self.get_application(options),
endpoints=endpoints,
signal_handlers=not options["use_reloader"],
action_logger=self.log_action,
http_timeout=self.http_timeout,
root_path=getattr(settings, "FORCE_SCRIPT_NAME", "") or "",
websocket_handshake_timeout=self.websocket_handshake_timeout,
).run()
logger.debug("Daphne exited")
except KeyboardInterrupt:
shutdown_message = options.get("shutdown_message", "")
if shutdown_message:
self.stdout.write(shutdown_message)
return
def get_application(self, options):
"""
Returns the static files serving application wrapping the default application,
if static files should be served. Otherwise just returns the default
handler.
"""
staticfiles_installed = apps.is_installed("django.contrib.staticfiles")
use_static_handler = options.get("use_static_handler", staticfiles_installed)
insecure_serving = options.get("insecure_serving", False)
if use_static_handler and (settings.DEBUG or insecure_serving):
return ASGIStaticFilesHandler(get_default_application())
else:
return get_default_application()
def log_action(self, protocol, action, details):
"""
Logs various different kinds of requests to the console.
"""
# HTTP requests
if protocol == "http" and action == "complete":
msg = "HTTP %(method)s %(path)s %(status)s [%(time_taken).2f, %(client)s]"
# Utilize terminal colors, if available
if 200 <= details["status"] < 300:
# Put 2XX first, since it should be the common case
logger.info(self.style.HTTP_SUCCESS(msg), details)
elif 100 <= details["status"] < 200:
logger.info(self.style.HTTP_INFO(msg), details)
elif details["status"] == 304:
logger.info(self.style.HTTP_NOT_MODIFIED(msg), details)
elif 300 <= details["status"] < 400:
logger.info(self.style.HTTP_REDIRECT(msg), details)
elif details["status"] == 404:
logger.warning(self.style.HTTP_NOT_FOUND(msg), details)
elif 400 <= details["status"] < 500:
logger.warning(self.style.HTTP_BAD_REQUEST(msg), details)
else:
# Any 5XX, or any other response
logger.error(self.style.HTTP_SERVER_ERROR(msg), details)
# Websocket requests
elif protocol == "websocket" and action == "connected":
logger.info("WebSocket CONNECT %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "disconnected":
logger.info("WebSocket DISCONNECT %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "connecting":
logger.info("WebSocket HANDSHAKING %(path)s [%(client)s]", details)
elif protocol == "websocket" and action == "rejected":
logger.info("WebSocket REJECT %(path)s [%(client)s]", details)

daphne/server.py

@ -1,37 +1,26 @@
# This has to be done first as Twisted is import-order-sensitive with reactors
import asyncio # isort:skip
import os # isort:skip
import sys # isort:skip
import warnings # isort:skip
from concurrent.futures import ThreadPoolExecutor # isort:skip
from twisted.internet import asyncioreactor # isort:skip
twisted_loop = asyncio.new_event_loop()
if "ASGI_THREADS" in os.environ:
twisted_loop.set_default_executor(
ThreadPoolExecutor(max_workers=int(os.environ["ASGI_THREADS"]))
)
current_reactor = sys.modules.get("twisted.internet.reactor", None)
if current_reactor is not None:
if not isinstance(current_reactor, asyncioreactor.AsyncioSelectorReactor):
warnings.warn(
"Something has already installed a non-asyncio Twisted reactor. Attempting to uninstall it; "
+ "you can fix this warning by importing daphne.server early in your codebase or "
+ "finding the package that imports Twisted and importing it later on.",
"Something has already installed a non-asyncio Twisted reactor. Attempting to uninstall it; " +
"you can fix this warning by importing daphne.server early in your codebase or " +
"finding the package that imports Twisted and importing it later on.",
UserWarning,
stacklevel=2,
)
del sys.modules["twisted.internet.reactor"]
asyncioreactor.install(twisted_loop)
asyncioreactor.install()
else:
asyncioreactor.install(twisted_loop)
asyncioreactor.install()
import asyncio
import logging
import time
import traceback
from concurrent.futures import CancelledError
from functools import partial
from twisted.internet import defer, reactor
from twisted.internet.endpoints import serverFromString
@ -44,7 +33,8 @@ from .ws_protocol import WebSocketFactory
logger = logging.getLogger(__name__)
class Server:
class Server(object):
def __init__(
self,
application,
@ -52,7 +42,6 @@ class Server:
signal_handlers=True,
action_logger=None,
http_timeout=None,
request_buffer_size=8192,
websocket_timeout=86400,
websocket_connect_timeout=20,
ping_interval=20,
@ -65,7 +54,8 @@ class Server:
websocket_handshake_timeout=5,
application_close_timeout=10,
ready_callable=None,
server_name="daphne",
# Deprecated and does not work, remove in version 2.2
ws_protocols=None,
):
self.application = application
self.endpoints = endpoints or []
@ -76,7 +66,6 @@ class Server:
self.http_timeout = http_timeout
self.ping_interval = ping_interval
self.ping_timeout = ping_timeout
self.request_buffer_size = request_buffer_size
self.proxy_forwarded_address_header = proxy_forwarded_address_header
self.proxy_forwarded_port_header = proxy_forwarded_port_header
self.proxy_forwarded_proto_header = proxy_forwarded_proto_header
@ -88,7 +77,6 @@ class Server:
self.verbosity = verbosity
self.abort_start = False
self.ready_callable = ready_callable
self.server_name = server_name
# Check our construction is actually sensible
if not self.endpoints:
logger.error("No endpoints. This server will not listen on anything.")
@ -99,17 +87,15 @@ class Server:
self.connections = {}
# Make the factory
self.http_factory = HTTPFactory(self)
self.ws_factory = WebSocketFactory(self, server=self.server_name)
self.ws_factory = WebSocketFactory(self, server="Daphne")
self.ws_factory.setProtocolOptions(
autoPingTimeout=self.ping_timeout,
allowNullOrigin=True,
openHandshakeTimeout=self.websocket_handshake_timeout,
openHandshakeTimeout=self.websocket_handshake_timeout
)
if self.verbosity <= 1:
# Redirect the Twisted log to nowhere
globalLogBeginner.beginLoggingTo(
[lambda _: None], redirectStandardIO=False, discardBuffer=True
)
globalLogBeginner.beginLoggingTo([lambda _: None], redirectStandardIO=False, discardBuffer=True)
else:
globalLogBeginner.beginLoggingTo([STDLibLogObserver(__name__)])
@ -117,9 +103,7 @@ class Server:
if http.H2_ENABLED:
logger.info("HTTP/2 support enabled")
else:
logger.info(
"HTTP/2 support not enabled (install the http2 and tls Twisted extras)"
)
logger.info("HTTP/2 support not enabled (install the http2 and tls Twisted extras)")
# Kick off the timeout loop
reactor.callLater(1, self.application_checker)
@ -157,11 +141,7 @@ class Server:
host = port.getHost()
if hasattr(host, "host") and hasattr(host, "port"):
self.listening_addresses.append((host.host, host.port))
logger.info(
"Listening on TCP address %s:%s",
port.getHost().host,
port.getHost().port,
)
logger.info("Listening on TCP address %s:%s", port.getHost().host, port.getHost().port)
def listen_error(self, failure):
logger.critical("Listen failure: %s", failure.getErrorMessage())
@ -188,11 +168,7 @@ class Server:
def protocol_disconnected(self, protocol):
# Set its disconnected time (the loops will come and clean it up)
# Do not set it if it is already set. Overwriting it might
# cause it to never be cleaned up.
# See https://github.com/django/channels/issues/1181
if "disconnected" not in self.connections[protocol]:
self.connections[protocol]["disconnected"] = time.time()
self.connections[protocol]["disconnected"] = time.time()
### Internal event/message handling
@ -207,57 +183,26 @@ class Server:
assert "application_instance" not in self.connections[protocol]
# Make an instance of the application
input_queue = asyncio.Queue()
scope.setdefault("asgi", {"version": "3.0"})
application_instance = self.application(
scope=scope,
receive=input_queue.get,
send=partial(self.handle_reply, protocol),
)
application_instance = self.application(scope=scope)
# Run it, and stash the future for later checking
if protocol not in self.connections:
return None
self.connections[protocol]["application_instance"] = asyncio.ensure_future(
application_instance,
loop=asyncio.get_event_loop(),
)
self.connections[protocol]["application_instance"] = asyncio.ensure_future(application_instance(
receive=input_queue.get,
send=lambda message: self.handle_reply(protocol, message),
), loop=asyncio.get_event_loop())
return input_queue
async def handle_reply(self, protocol, message):
"""
Coroutine that jumps the reply message from asyncio to Twisted
"""
# Don't do anything if the connection is closed or does not exist
if protocol not in self.connections or self.connections[protocol].get(
"disconnected", None
):
# Don't do anything if the connection is closed
if self.connections[protocol].get("disconnected", None):
return
try:
self.check_headers_type(message)
except ValueError:
# Ensure to send SOME reply.
protocol.basic_error(500, b"Server Error", "Server Error")
raise
# Let the protocol handle it
protocol.handle_reply(message)
@staticmethod
def check_headers_type(message):
if not message["type"] == "http.response.start":
return
for k, v in message.get("headers", []):
if not isinstance(k, bytes):
raise ValueError(
"Header name '{}' expected to be `bytes`, but got `{}`".format(
k, type(k)
)
)
if not isinstance(v, bytes):
raise ValueError(
"Header value '{}' expected to be `bytes`, but got `{}`".format(
v, type(v)
)
)
### Utility
def application_checker(self):
@ -270,10 +215,7 @@ class Server:
application_instance = details.get("application_instance", None)
# First, see if the protocol disconnected and the app has taken
# too long to close up
if (
disconnected
and time.time() - disconnected > self.application_close_timeout
):
if disconnected and time.time() - disconnected > self.application_close_timeout:
if application_instance and not application_instance.done():
logger.warning(
"Application instance %r for connection %s took too long to shut down and was killed.",
@ -285,7 +227,7 @@ class Server:
if application_instance and application_instance.done():
try:
exception = application_instance.exception()
except (CancelledError, asyncio.CancelledError):
except CancelledError:
# Future cancellation. We can ignore this.
pass
else:
@ -294,10 +236,16 @@ class Server:
# Protocol is asking the server to exit (likely during test)
self.stop()
else:
exception_output = "{}\n{}{}".format(
exception,
"".join(traceback.format_tb(
exception.__traceback__,
)),
" {}".format(exception),
)
logger.error(
"Exception inside application: %s",
exception,
exc_info=exception,
exception_output,
)
if not disconnected:
protocol.handle_exception(exception)

daphne/testing.py

@ -6,8 +6,13 @@ import tempfile
import traceback
from concurrent.futures import CancelledError
from twisted.internet import reactor
class BaseDaphneTestingInstance:
from .endpoints import build_endpoint_description_strings
from .server import Server
class DaphneTestingInstance:
"""
Launches an instance of Daphne in a subprocess, with a host and port
attribute allowing you to call it.
@ -17,23 +22,17 @@ class BaseDaphneTestingInstance:
startup_timeout = 2
def __init__(
self, xff=False, http_timeout=None, request_buffer_size=None, *, application
):
def __init__(self, xff=False, http_timeout=None):
self.xff = xff
self.http_timeout = http_timeout
self.host = "127.0.0.1"
self.request_buffer_size = request_buffer_size
self.application = application
def get_application(self):
return self.application
def __enter__(self):
# Clear result storage
TestApplication.delete_setup()
TestApplication.delete_result()
# Option Daphne features
kwargs = {}
if self.request_buffer_size:
kwargs["request_buffer_size"] = self.request_buffer_size
# Optionally enable X-Forwarded-For support.
if self.xff:
kwargs["proxy_forwarded_address_header"] = "X-Forwarded-For"
@ -44,7 +43,7 @@ class BaseDaphneTestingInstance:
# Start up process
self.process = DaphneProcess(
host=self.host,
get_application=self.get_application,
application=TestApplication,
kwargs=kwargs,
setup=self.process_setup,
teardown=self.process_teardown,
@ -78,21 +77,6 @@ class BaseDaphneTestingInstance:
"""
pass
def get_received(self):
pass
class DaphneTestingInstance(BaseDaphneTestingInstance):
def __init__(self, *args, **kwargs):
self.lock = multiprocessing.Lock()
super().__init__(*args, **kwargs, application=TestApplication(lock=self.lock))
def __enter__(self):
# Clear result storage
TestApplication.delete_setup()
TestApplication.delete_result()
return super().__enter__()
def get_received(self):
"""
Returns the scope and messages the test application has received
@ -103,8 +87,7 @@ class DaphneTestingInstance(BaseDaphneTestingInstance):
raises them.
"""
try:
with self.lock:
inner_result = TestApplication.load_result()
inner_result = TestApplication.load_result()
except FileNotFoundError:
raise ValueError("No results available yet.")
# Check for exception
@ -117,7 +100,9 @@ class DaphneTestingInstance(BaseDaphneTestingInstance):
Adds messages for the application to send back.
The next time it receives an incoming message, it will reply with these.
"""
TestApplication.save_setup(response_messages=messages)
TestApplication.save_setup(
response_messages=messages,
)
class DaphneProcess(multiprocessing.Process):
@ -126,61 +111,40 @@ class DaphneProcess(multiprocessing.Process):
port it ends up listening on back to the parent process.
"""
def __init__(
self, host, get_application, kwargs=None, setup=None, teardown=None, port=None
):
def __init__(self, host, application, kwargs=None, setup=None, teardown=None):
super().__init__()
self.host = host
self.get_application = get_application
self.application = application
self.kwargs = kwargs or {}
self.setup = setup
self.teardown = teardown
self.port = multiprocessing.Value("i", port if port is not None else 0)
self.setup = setup or (lambda: None)
self.teardown = teardown or (lambda: None)
self.port = multiprocessing.Value("i")
self.ready = multiprocessing.Event()
self.errors = multiprocessing.Queue()
def run(self):
# OK, now we are in a forked child process, and want to use the reactor.
# However, FreeBSD systems like MacOS do not fork the underlying Kqueue,
# which asyncio (hence asyncioreactor) is built on.
# Therefore, we should uninstall the broken reactor and install a new one.
_reinstall_reactor()
from twisted.internet import reactor
from .endpoints import build_endpoint_description_strings
from .server import Server
application = self.get_application()
try:
# Create the server class
endpoints = build_endpoint_description_strings(
host=self.host, port=self.port.value
)
endpoints = build_endpoint_description_strings(host=self.host, port=0)
self.server = Server(
application=application,
application=self.application,
endpoints=endpoints,
signal_handlers=False,
**self.kwargs,
**self.kwargs
)
# Set up a poller to look for the port
reactor.callLater(0.1, self.resolve_port)
# Run with setup/teardown
if self.setup is not None:
self.setup()
self.setup()
try:
self.server.run()
finally:
if self.teardown is not None:
self.teardown()
except BaseException as e:
self.teardown()
except Exception as e:
# Put the error on our queue so the parent gets it
self.errors.put((e, traceback.format_exc()))
def resolve_port(self):
from twisted.internet import reactor
if self.server.listening_addresses:
self.port.value = self.server.listening_addresses[0][1]
self.ready.set()
@ -197,22 +161,19 @@ class TestApplication:
setup_storage = os.path.join(tempfile.gettempdir(), "setup.testio")
result_storage = os.path.join(tempfile.gettempdir(), "result.testio")
def __init__(self, lock):
self.lock = lock
def __init__(self, scope):
self.scope = scope
self.messages = []
async def __call__(self, scope, receive, send):
self.scope = scope
async def __call__(self, send, receive):
# Receive input and send output
logging.debug("test app coroutine alive")
try:
while True:
# Receive a message and save it into the result store
self.messages.append(await receive())
self.lock.acquire()
logging.debug("test app received %r", self.messages[-1])
self.save_result(self.scope, self.messages)
self.lock.release()
# See if there are any messages to send back
setup = self.load_setup()
self.delete_setup()
@ -232,7 +193,12 @@ class TestApplication:
Stores setup information.
"""
with open(cls.setup_storage, "wb") as fh:
pickle.dump({"response_messages": response_messages}, fh)
pickle.dump(
{
"response_messages": response_messages,
},
fh,
)
@classmethod
def load_setup(cls):
@ -252,7 +218,13 @@ class TestApplication:
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump({"scope": scope, "messages": messages}, fh)
pickle.dump(
{
"scope": scope,
"messages": messages,
},
fh,
)
@classmethod
def save_exception(cls, exception):
@ -261,7 +233,12 @@ class TestApplication:
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump({"exception": exception}, fh)
pickle.dump(
{
"exception": exception,
},
fh,
)
@classmethod
def load_result(cls):
@ -290,24 +267,3 @@ class TestApplication:
os.unlink(cls.result_storage)
except OSError:
pass
def _reinstall_reactor():
import asyncio
import sys
from twisted.internet import asyncioreactor
# Uninstall the reactor.
if "twisted.internet.reactor" in sys.modules:
del sys.modules["twisted.internet.reactor"]
# The daphne.server module may have already installed the reactor.
# If so, using this module will use uninstalled one, thus we should
# reimport this module too.
if "daphne.server" in sys.modules:
del sys.modules["daphne.server"]
event_loop = asyncio.new_event_loop()
asyncioreactor.install(event_loop)
asyncio.set_event_loop(event_loop)

daphne/twisted/plugins/fd_endpoint.py

@ -7,7 +7,7 @@ from zope.interface import implementer
@implementer(IPlugin, IStreamServerEndpointStringParser)
class _FDParser:
class _FDParser(object):
prefix = "fd"
def _parseServer(self, reactor, fileno, domain=socket.AF_INET):

daphne/utils.py

@ -1,12 +1,7 @@
import importlib
import re
from twisted.web.http_headers import Headers
# Header name regex as per h11.
# https://github.com/python-hyper/h11/blob/a2c68948accadc3876dffcf979d98002e4a4ed27/h11/_abnf.py#L10-L21
HEADER_NAME_RE = re.compile(rb"[-!#$%&'*+.^_`|~0-9a-zA-Z]+")
def import_by_path(path):
"""
@ -27,14 +22,12 @@ def header_value(headers, header_name):
return value.decode("utf-8")
def parse_x_forwarded_for(
headers,
address_header_name="X-Forwarded-For",
port_header_name="X-Forwarded-Port",
proto_header_name="X-Forwarded-Proto",
original_addr=None,
original_scheme=None,
):
def parse_x_forwarded_for(headers,
address_header_name="X-Forwarded-For",
port_header_name="X-Forwarded-Port",
proto_header_name="X-Forwarded-Proto",
original_addr=None,
original_scheme=None):
"""
Parses an X-Forwarded-For header and returns a host/port pair as a list.

daphne/ws_protocol.py

@ -3,11 +3,7 @@ import time
import traceback
from urllib.parse import unquote
from autobahn.twisted.websocket import (
ConnectionDeny,
WebSocketServerFactory,
WebSocketServerProtocol,
)
from autobahn.twisted.websocket import ConnectionDeny, WebSocketServerFactory, WebSocketServerProtocol
from twisted.internet import defer
from .utils import parse_x_forwarded_for
@ -31,21 +27,17 @@ class WebSocketProtocol(WebSocketServerProtocol):
self.server.protocol_connected(self)
self.request = request
self.protocol_to_accept = None
self.root_path = self.server.root_path
self.socket_opened = time.time()
self.last_ping = time.time()
try:
# Sanitize and decode headers, potentially extracting root path
# Sanitize and decode headers
self.clean_headers = []
for name, value in request.headers.items():
name = name.encode("ascii")
# Prevent CVE-2015-0219
if b"_" in name:
continue
if name.lower() == b"daphne-root-path":
self.root_path = unquote(value)
else:
self.clean_headers.append((name.lower(), value.encode("latin1")))
self.clean_headers.append((name.lower(), value.encode("latin1")))
# Get client address if possible
peer = self.transport.getPeer()
host = self.transport.getHost()
@ -62,36 +54,32 @@ class WebSocketProtocol(WebSocketServerProtocol):
self.server.proxy_forwarded_address_header,
self.server.proxy_forwarded_port_header,
self.server.proxy_forwarded_proto_header,
self.client_addr,
self.client_addr
)
# Decode websocket subprotocol options
subprotocols = []
for header, value in self.clean_headers:
if header == b"sec-websocket-protocol":
subprotocols = [
x.strip() for x in unquote(value.decode("ascii")).split(",")
x.strip()
for x in
unquote(value.decode("ascii")).split(",")
]
# Make new application instance with scope
self.path = request.path.encode("ascii")
self.application_deferred = defer.maybeDeferred(
self.server.create_application,
self,
{
"type": "websocket",
"path": unquote(self.path.decode("ascii")),
"raw_path": self.path,
"root_path": self.root_path,
"headers": self.clean_headers,
"query_string": self._raw_query_string, # Passed by HTTP protocol
"client": self.client_addr,
"server": self.server_addr,
"subprotocols": subprotocols,
},
)
self.application_deferred = defer.maybeDeferred(self.server.create_application, self, {
"type": "websocket",
"path": unquote(self.path.decode("ascii")),
"headers": self.clean_headers,
"query_string": self._raw_query_string, # Passed by HTTP protocol
"client": self.client_addr,
"server": self.server_addr,
"subprotocols": subprotocols,
})
if self.application_deferred is not None:
self.application_deferred.addCallback(self.applicationCreateWorked)
self.application_deferred.addErrback(self.applicationCreateFailed)
except Exception:
except Exception as e:
# Exceptions here are not displayed right, just 500.
# Turn them into an ERROR log.
logger.error(traceback.format_exc())
@ -110,16 +98,10 @@ class WebSocketProtocol(WebSocketServerProtocol):
self.application_queue = application_queue
# Send over the connect message
self.application_queue.put_nowait({"type": "websocket.connect"})
self.server.log_action(
"websocket",
"connecting",
{
"path": self.request.path,
"client": (
"%s:%s" % tuple(self.client_addr) if self.client_addr else None
),
},
)
self.server.log_action("websocket", "connecting", {
"path": self.request.path,
"client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
})
def applicationCreateFailed(self, failure):
"""
@ -133,16 +115,10 @@ class WebSocketProtocol(WebSocketServerProtocol):
def onOpen(self):
# Send news that this channel is open
logger.debug("WebSocket %s open and established", self.client_addr)
self.server.log_action(
"websocket",
"connected",
{
"path": self.request.path,
"client": (
"%s:%s" % tuple(self.client_addr) if self.client_addr else None
),
},
)
self.server.log_action("websocket", "connected", {
"path": self.request.path,
"client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
})
def onMessage(self, payload, isBinary):
# If we're muted, do nothing.
@ -152,13 +128,15 @@ class WebSocketProtocol(WebSocketServerProtocol):
logger.debug("WebSocket incoming frame on %s", self.client_addr)
self.last_ping = time.time()
if isBinary:
self.application_queue.put_nowait(
{"type": "websocket.receive", "bytes": payload}
)
self.application_queue.put_nowait({
"type": "websocket.receive",
"bytes": payload,
})
else:
self.application_queue.put_nowait(
{"type": "websocket.receive", "text": payload.decode("utf8")}
)
self.application_queue.put_nowait({
"type": "websocket.receive",
"text": payload.decode("utf8"),
})
def onClose(self, wasClean, code, reason):
"""
@ -167,19 +145,14 @@ class WebSocketProtocol(WebSocketServerProtocol):
self.server.protocol_disconnected(self)
logger.debug("WebSocket closed for %s", self.client_addr)
if not self.muted and hasattr(self, "application_queue"):
self.application_queue.put_nowait(
{"type": "websocket.disconnect", "code": code}
)
self.server.log_action(
"websocket",
"disconnected",
{
"path": self.request.path,
"client": (
"%s:%s" % tuple(self.client_addr) if self.client_addr else None
),
},
)
self.application_queue.put_nowait({
"type": "websocket.disconnect",
"code": code,
})
self.server.log_action("websocket", "disconnected", {
"path": self.request.path,
"client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
})
### Internal event handling
@ -198,8 +171,9 @@ class WebSocketProtocol(WebSocketServerProtocol):
raise ValueError("Socket has not been accepted, so cannot send over it")
if message.get("bytes", None) and message.get("text", None):
raise ValueError(
"Got invalid WebSocket reply message on %s - contains both bytes and text keys"
% (message,)
"Got invalid WebSocket reply message on %s - contains both bytes and text keys" % (
message,
)
)
if message.get("bytes", None):
self.serverSend(message["bytes"], True)
@ -213,9 +187,7 @@ class WebSocketProtocol(WebSocketServerProtocol):
if hasattr(self, "handshake_deferred"):
# If the handshake is still ongoing, we need to emit a HTTP error
# code rather than a WebSocket one.
self.handshake_deferred.errback(
ConnectionDeny(code=500, reason="Internal server error")
)
self.handshake_deferred.errback(ConnectionDeny(code=500, reason="Internal server error"))
else:
self.sendCloseFrame(code=1011)
@ -231,22 +203,14 @@ class WebSocketProtocol(WebSocketServerProtocol):
"""
Called when we get a message saying to reject the connection.
"""
self.handshake_deferred.errback(
ConnectionDeny(code=403, reason="Access denied")
)
self.handshake_deferred.errback(ConnectionDeny(code=403, reason="Access denied"))
del self.handshake_deferred
self.server.protocol_disconnected(self)
logger.debug("WebSocket %s rejected by application", self.client_addr)
self.server.log_action(
"websocket",
"rejected",
{
"path": self.request.path,
"client": (
"%s:%s" % tuple(self.client_addr) if self.client_addr else None
),
},
)
self.server.log_action("websocket", "rejected", {
"path": self.request.path,
"client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
})
def serverSend(self, content, binary=False):
"""
@ -280,10 +244,7 @@ class WebSocketProtocol(WebSocketServerProtocol):
Called periodically to see if we should timeout something
"""
# Web timeout checking
if (
self.duration() > self.server.websocket_timeout
and self.server.websocket_timeout >= 0
):
if self.duration() > self.server.websocket_timeout and self.server.websocket_timeout >= 0:
self.serverClose()
# Ping check
# If we're still connecting, deny the connection
@ -302,7 +263,7 @@ class WebSocketProtocol(WebSocketServerProtocol):
return id(self) == id(other)
def __repr__(self):
return f"<WebSocketProtocol client={self.client_addr!r} path={self.path!r}>"
return "<WebSocketProtocol client=%r path=%r>" % (self.client_addr, self.path)
class WebSocketFactory(WebSocketServerFactory):
@ -323,9 +284,9 @@ class WebSocketFactory(WebSocketServerFactory):
Builds protocol instances. We use this to inject the factory object into the protocol.
"""
try:
protocol = super().buildProtocol(addr)
protocol = super(WebSocketFactory, self).buildProtocol(addr)
protocol.factory = self
return protocol
except Exception:
except Exception as e:
logger.error("Cannot build protocol: %s" % traceback.format_exc())
raise
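For reference, the subprotocol handling in onConnect above amounts to ASCII-decoding the Sec-WebSocket-Protocol header, percent-decoding it, and splitting on commas; both sides of this diff do the same thing and differ only in formatting. A minimal standalone sketch of that step (the header value is a made-up example, not something from this diff):

from urllib.parse import unquote

def decode_subprotocols(value):
    # Mirrors the loop above: ASCII-decode, percent-decode, split on commas, strip whitespace.
    return [x.strip() for x in unquote(value.decode("ascii")).split(",")]

assert decode_subprotocols(b"graphql-ws, chat") == ["graphql-ws", "chat"]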

pyproject.toml

@ -1,81 +0,0 @@
[project]
name = "daphne"
dynamic = ["version"]
description = "Django ASGI (HTTP/WebSocket) server"
requires-python = ">=3.9"
authors = [
{ name = "Django Software Foundation", email = "foundation@djangoproject.com" },
]
license = { text = "BSD" }
classifiers = [
"Development Status :: 4 - Beta",
"Environment :: Web Environment",
"Intended Audience :: Developers",
"License :: OSI Approved :: BSD License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Internet :: WWW/HTTP",
]
dependencies = ["asgiref>=3.5.2,<4", "autobahn>=22.4.2", "twisted[tls]>=22.4"]
[project.optional-dependencies]
tests = [
"django",
"hypothesis",
"pytest",
"pytest-asyncio",
"pytest-cov",
"black",
"tox",
"flake8",
"flake8-bugbear",
"mypy",
]
[project.urls]
homepage = "https://github.com/django/daphne"
documentation = "https://channels.readthedocs.io"
repository = "https://github.com/django/daphne.git"
changelog = "https://github.com/django/daphne/blob/main/CHANGELOG.txt"
issues = "https://github.com/django/daphne/issues"
[project.scripts]
daphne = "daphne.cli:CommandLineInterface.entrypoint"
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
package-dir = { daphne = "daphne", twisted = "daphne/twisted" }
[tool.setuptools.dynamic]
version = { attr = "daphne.__version__" }
readme = { file = "README.rst", content-type = "text/x-rst" }
[tool.isort]
profile = "black"
[tool.pytest]
testpaths = ["tests"]
asyncio_mode = "strict"
filterwarnings = ["ignore::pytest.PytestDeprecationWarning"]
[tool.coverage.run]
omit = ["tests/*"]
concurrency = ["multiprocessing"]
[tool.coverage.report]
show_missing = "true"
skip_covered = "true"
[tool.coverage.html]
directory = "reports/coverage_html_report"
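The [project.scripts] table above is what produces the installed daphne command; a setuptools-generated console script for that entry point is roughly equivalent to this sketch (hand-written here for illustration, not part of the repository):

import sys

from daphne.cli import CommandLineInterface

if __name__ == "__main__":
    sys.exit(CommandLineInterface.entrypoint())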

15
setup.cfg Normal file

@ -0,0 +1,15 @@
[bdist_wheel]
universal=1
[tool:pytest]
addopts = tests/
[isort]
line_length = 120
multi_line_output = 3
known_first_party = channels,daphne,asgiref
[flake8]
exclude = venv/*,tox/*,docs/*,testproject/*,js_client/*,.eggs/*
ignore = E123,E128,E266,E402,W503,E731,W601
max-line-length = 120

55
setup.py Executable file

@ -0,0 +1,55 @@
import os
from setuptools import find_packages, setup
from daphne import __version__
# We use the README as the long_description
readme_path = os.path.join(os.path.dirname(__file__), "README.rst")
with open(readme_path) as fp:
long_description = fp.read()
setup(
name="daphne",
version=__version__,
url="https://github.com/django/daphne",
author="Django Software Foundation",
author_email="foundation@djangoproject.com",
description="Django ASGI (HTTP/WebSocket) server",
long_description=long_description,
license="BSD",
zip_safe=False,
package_dir={"twisted": "daphne/twisted"},
packages=find_packages() + ["twisted.plugins"],
include_package_data=True,
install_requires=[
"twisted>=18.7",
"autobahn>=0.18",
],
setup_requires=[
"pytest-runner",
],
extras_require={
"tests": [
"hypothesis",
"pytest",
"pytest-asyncio~=0.8",
],
},
entry_points={"console_scripts": [
"daphne = daphne.cli:CommandLineInterface.entrypoint",
]},
classifiers=[
"Development Status :: 4 - Beta",
"Environment :: Web Environment",
"Intended Audience :: Developers",
"License :: OSI Approved :: BSD License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Topic :: Internet :: WWW/HTTP",
],
)

tests/http_base.py

@ -19,37 +19,26 @@ class DaphneTestCase(unittest.TestCase):
### Plain HTTP helpers
def run_daphne_http(
self,
method,
path,
params,
body,
responses,
headers=None,
timeout=1,
xff=False,
request_buffer_size=None,
):
def run_daphne_http(self, method, path, params, body, responses, headers=None, timeout=1, xff=False):
"""
Runs Daphne with the given request callback (given the base URL)
and response messages.
"""
with DaphneTestingInstance(
xff=xff, request_buffer_size=request_buffer_size
) as test_app:
with DaphneTestingInstance(xff=xff) as test_app:
# Add the response messages
test_app.add_send_messages(responses)
# Send it the request. We have to do this the long way to allow
# duplicate headers.
conn = HTTPConnection(test_app.host, test_app.port, timeout=timeout)
# Make sure path is urlquoted and add any params
path = parse.quote(path)
if params:
path += "?" + parse.urlencode(params, doseq=True)
conn.putrequest(method, path, skip_accept_encoding=True, skip_host=True)
# Manually send over headers
# Manually send over headers (encoding any non-safe values as best we can)
if headers:
for header_name, header_value in headers:
conn.putheader(header_name, header_value)
conn.putheader(header_name.encode("utf8"), header_value.encode("utf8"))
# Send body if provided.
if body:
conn.putheader("Content-Length", str(len(body)))
@ -61,22 +50,16 @@ class DaphneTestCase(unittest.TestCase):
except socket.timeout:
# See if they left an exception for us to load
test_app.get_received()
raise RuntimeError(
"Daphne timed out handling request, no exception found."
)
raise RuntimeError("Daphne timed out handling request, no exception found.")
# Return scope, messages, response
return test_app.get_received() + (response,)
return test_app.get_received() + (response, )
def run_daphne_raw(self, data, *, responses=None, timeout=1):
def run_daphne_raw(self, data, timeout=1):
"""
Runs Daphne and sends it the given raw bytestring over a socket.
Accepts list of response messages the application will reply with.
Returns what Daphne sends back.
Runs daphne and sends it the given raw bytestring over a socket. Returns what it sends back.
"""
assert isinstance(data, bytes)
with DaphneTestingInstance() as test_app:
if responses is not None:
test_app.add_send_messages(responses)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(timeout)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
@ -85,20 +68,9 @@ class DaphneTestCase(unittest.TestCase):
try:
return s.recv(1000000)
except socket.timeout:
raise RuntimeError(
"Daphne timed out handling raw request, no exception found."
)
raise RuntimeError("Daphne timed out handling raw request, no exception found.")
def run_daphne_request(
self,
method,
path,
params=None,
body=None,
headers=None,
xff=False,
request_buffer_size=None,
):
def run_daphne_request(self, method, path, params=None, body=None, headers=None, xff=False):
"""
Convenience method for just testing request handling.
Returns (scope, messages)
@ -110,7 +82,6 @@ class DaphneTestCase(unittest.TestCase):
body=body,
headers=headers,
xff=xff,
request_buffer_size=request_buffer_size,
responses=[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": b"OK"},
@ -124,21 +95,17 @@ class DaphneTestCase(unittest.TestCase):
Returns (scope, messages)
"""
_, _, response = self.run_daphne_http(
method="GET", path="/", params={}, body=b"", responses=response_messages
method="GET",
path="/",
params={},
body=b"",
responses=response_messages,
)
return response
### WebSocket helpers
def websocket_handshake(
self,
test_app,
path="/",
params=None,
headers=None,
subprotocols=None,
timeout=1,
):
def websocket_handshake(self, test_app, path="/", params=None, headers=None, subprotocols=None, timeout=1):
"""
Runs a WebSocket handshake negotiation and returns the raw socket
object & the selected subprotocol.
@ -149,27 +116,27 @@ class DaphneTestCase(unittest.TestCase):
# Send it the request. We have to do this the long way to allow
# duplicate headers.
conn = HTTPConnection(test_app.host, test_app.port, timeout=timeout)
# Make sure path is urlquoted and add any params
path = parse.quote(path)
if params:
path += "?" + parse.urlencode(params, doseq=True)
conn.putrequest("GET", path, skip_accept_encoding=True, skip_host=True)
# Do WebSocket handshake headers + any other headers
if headers is None:
headers = []
headers.extend(
[
(b"Host", b"example.com"),
(b"Upgrade", b"websocket"),
(b"Connection", b"Upgrade"),
(b"Sec-WebSocket-Key", b"x3JJHMbDL1EzLkh9GBhXDw=="),
(b"Sec-WebSocket-Version", b"13"),
(b"Origin", b"http://example.com"),
]
)
headers.extend([
("Host", "example.com"),
("Upgrade", "websocket"),
("Connection", "Upgrade"),
("Sec-WebSocket-Key", "x3JJHMbDL1EzLkh9GBhXDw=="),
("Sec-WebSocket-Version", "13"),
("Origin", "http://example.com")
])
if subprotocols:
headers.append((b"Sec-WebSocket-Protocol", ", ".join(subprotocols)))
headers.append(("Sec-WebSocket-Protocol", ", ".join(subprotocols)))
if headers:
for header_name, header_value in headers:
conn.putheader(header_name, header_value)
conn.putheader(header_name.encode("utf8"), header_value.encode("utf8"))
conn.endheaders()
# Read out the response
try:
@ -182,7 +149,10 @@ class DaphneTestCase(unittest.TestCase):
if response.status != 101:
raise RuntimeError("WebSocket upgrade did not result in status code 101")
# Prepare headers for subprotocol searching
response_headers = {n.lower(): v for n, v in response.getheaders()}
response_headers = dict(
(n.lower(), v)
for n, v in response.getheaders()
)
response.read()
assert not response.closed
# Return the raw socket and any subprotocol
@ -252,7 +222,7 @@ class DaphneTestCase(unittest.TestCase):
"""
try:
socket.inet_aton(address)
except OSError:
except socket.error:
self.fail("'%s' is not a valid IP address." % address)
def assert_key_sets(self, required_keys, optional_keys, actual_keys):
@ -264,13 +234,17 @@ class DaphneTestCase(unittest.TestCase):
# Make sure all required keys are present
self.assertTrue(required_keys <= present_keys)
# Assert that no other keys are present
self.assertEqual(set(), present_keys - required_keys - optional_keys)
self.assertEqual(
set(),
present_keys - required_keys - optional_keys,
)
def assert_valid_path(self, path):
def assert_valid_path(self, path, request_path):
"""
Checks the path is valid and already url-decoded.
"""
self.assertIsInstance(path, str)
self.assertEqual(path, request_path)
# Assert that it's already url decoded
self.assertEqual(path, parse.unquote(path))
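As a usage sketch, a hypothetical extra test module built on the helpers above could look like this (the path, params and assertions are invented for illustration):

from http_base import DaphneTestCase

class TestSmoke(DaphneTestCase):
    def test_plain_get(self):
        # run_daphne_request starts a Daphne instance, issues the request, and
        # returns the ASGI scope plus the messages the application received.
        scope, messages = self.run_daphne_request("GET", "/hello", params={"q": "1"})
        self.assertEqual(scope["method"], "GET")
        self.assertEqual(scope["path"], "/hello")
        self.assertEqual(messages[0]["type"], "http.request")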

tests/http_strategies.py

@ -6,9 +6,7 @@ from hypothesis import strategies
HTTP_METHODS = ["OPTIONS", "GET", "HEAD", "POST", "PUT", "DELETE", "TRACE", "CONNECT"]
# Unicode characters of the "Letter" category
letters = strategies.characters(
whitelist_categories=("Lu", "Ll", "Lt", "Lm", "Lo", "Nl")
)
letters = strategies.characters(whitelist_categories=("Lu", "Ll", "Lt", "Lm", "Lo", "Nl"))
def http_method():
@ -17,23 +15,25 @@ def http_method():
def _http_path_portion():
alphabet = string.ascii_letters + string.digits + "-._~"
return strategies.text(min_size=1, max_size=128, alphabet=alphabet)
return strategies.text(min_size=1, average_size=10, max_size=128, alphabet=alphabet)
def http_path():
"""
Returns a URL path (not encoded).
"""
return strategies.lists(_http_path_portion(), min_size=0, max_size=10).map(
lambda s: "/" + "/".join(s)
)
return strategies.lists(
_http_path_portion(),
min_size=0,
max_size=10,
).map(lambda s: "/" + "/".join(s))
def http_body():
"""
Returns random binary body data.
"""
return strategies.binary(min_size=0, max_size=1500)
return strategies.binary(min_size=0, average_size=600, max_size=1500)
def valid_bidi(value):
@ -52,22 +52,32 @@ def valid_bidi(value):
def _domain_label():
return strategies.text(alphabet=letters, min_size=1, max_size=63).filter(valid_bidi)
return strategies.text(
alphabet=letters,
min_size=1,
average_size=6,
max_size=63,
).filter(valid_bidi)
def international_domain_name():
"""
Returns a byte string of a domain name, IDNA-encoded.
"""
return strategies.lists(_domain_label(), min_size=2).map(
lambda s: (".".join(s)).encode("idna")
)
return strategies.lists(
_domain_label(),
min_size=2,
average_size=2,
).map(lambda s: (".".join(s)).encode("idna"))
def _query_param():
return strategies.text(alphabet=letters, min_size=1, max_size=255).map(
lambda s: s.encode("utf8")
)
return strategies.text(
alphabet=letters,
min_size=1,
average_size=10,
max_size=255,
).map(lambda s: s.encode("utf8"))
def query_params():
@ -77,7 +87,9 @@ def query_params():
ensures that the total urlencoded query string is not longer than 1500 characters.
"""
return strategies.lists(
strategies.tuples(_query_param(), _query_param()), min_size=0
strategies.tuples(_query_param(), _query_param()),
min_size=0,
average_size=5,
).filter(lambda x: len(parse.urlencode(x)) < 1500)
@ -89,8 +101,10 @@ def header_name():
and 20 characters long
"""
return strategies.text(
alphabet=string.ascii_letters + string.digits + "-", min_size=1, max_size=30
).map(lambda s: s.encode("utf-8"))
alphabet=string.ascii_letters + string.digits + "-",
min_size=1,
max_size=30,
)
def header_value():
@ -100,18 +114,12 @@ def header_value():
"For example, the Apache 2.3 server by default limits the size of each field to 8190 bytes"
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields
"""
return (
strategies.text(
alphabet=string.ascii_letters
+ string.digits
+ string.punctuation.replace(",", "")
+ " /t",
min_size=1,
max_size=8190,
)
.map(lambda s: s.encode("utf-8"))
.filter(lambda s: len(s) < 8190)
)
return strategies.text(
alphabet=string.ascii_letters + string.digits + string.punctuation.replace(",", "") + " /t",
min_size=1,
average_size=40,
max_size=8190,
).filter(lambda s: len(s.encode("utf8")) < 8190)
def headers():
@ -122,5 +130,8 @@ def headers():
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields
"""
return strategies.lists(
strategies.tuples(header_name(), header_value()), min_size=0, max_size=100
strategies.tuples(header_name(), header_value()),
min_size=0,
average_size=10,
max_size=100,
)
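When a hypothesis failure needs reproducing by hand, these strategies can also be sampled interactively; a quick sketch (strategy.example() is intended for exploration only, and the exact value shapes depend on which side of this diff is installed):

import http_strategies

# Draw one illustrative value from each strategy defined above.
print(http_strategies.http_path().example())
print(http_strategies.query_params().example())
print(http_strategies.headers().example())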

tests/test_checks.py

@ -1,21 +0,0 @@
import django
from django.conf import settings
from django.test.utils import override_settings
from daphne.checks import check_daphne_installed
def test_check_daphne_installed():
"""
Test check error is raised if daphne is not listed before staticfiles, and vice versa.
"""
settings.configure(
INSTALLED_APPS=["daphne.apps.DaphneConfig", "django.contrib.staticfiles"]
)
django.setup()
errors = check_daphne_installed(None)
assert len(errors) == 0
with override_settings(INSTALLED_APPS=["django.contrib.staticfiles", "daphne"]):
errors = check_daphne_installed(None)
assert len(errors) == 1
assert errors[0].id == "daphne.E001"

tests/test_cli.py

@ -1,7 +1,7 @@
# coding: utf8
import logging
import os
from argparse import ArgumentError
from unittest import TestCase, skipUnless
from unittest import TestCase
from daphne.cli import CommandLineInterface
from daphne.endpoints import build_endpoint_description_strings as build
@ -18,32 +18,45 @@ class TestEndpointDescriptions(TestCase):
def testTcpPortBindings(self):
self.assertEqual(
build(port=1234, host="example.com"),
["tcp:port=1234:interface=example.com"],
["tcp:port=1234:interface=example.com"]
)
self.assertEqual(
build(port=8000, host="127.0.0.1"), ["tcp:port=8000:interface=127.0.0.1"]
build(port=8000, host="127.0.0.1"),
["tcp:port=8000:interface=127.0.0.1"]
)
self.assertEqual(
build(port=8000, host="[200a::1]"), [r"tcp:port=8000:interface=200a\:\:1"]
build(port=8000, host="[200a::1]"),
[r'tcp:port=8000:interface=200a\:\:1']
)
self.assertEqual(
build(port=8000, host="200a::1"), [r"tcp:port=8000:interface=200a\:\:1"]
build(port=8000, host="200a::1"),
[r'tcp:port=8000:interface=200a\:\:1']
)
# incomplete port/host kwargs raise errors
self.assertRaises(ValueError, build, port=123)
self.assertRaises(ValueError, build, host="example.com")
self.assertRaises(
ValueError,
build, port=123
)
self.assertRaises(
ValueError,
build, host="example.com"
)
def testUnixSocketBinding(self):
self.assertEqual(
build(unix_socket="/tmp/daphne.sock"), ["unix:/tmp/daphne.sock"]
build(unix_socket="/tmp/daphne.sock"),
["unix:/tmp/daphne.sock"]
)
def testFileDescriptorBinding(self):
self.assertEqual(build(file_descriptor=5), ["fd:fileno=5"])
self.assertEqual(
build(file_descriptor=5),
["fd:fileno=5"]
)
def testMultipleEnpoints(self):
self.assertEqual(
@ -52,16 +65,14 @@ class TestEndpointDescriptions(TestCase):
file_descriptor=123,
unix_socket="/tmp/daphne.sock",
port=8080,
host="10.0.0.1",
host="10.0.0.1"
)
),
sorted(
[
"tcp:port=8080:interface=10.0.0.1",
"unix:/tmp/daphne.sock",
"fd:fileno=123",
]
),
sorted([
"tcp:port=8080:interface=10.0.0.1",
"unix:/tmp/daphne.sock",
"fd:fileno=123"
])
)
@ -81,8 +92,6 @@ class TestCLIInterface(TestCase):
Mock server object for testing.
"""
abort_start = False
def __init__(self, **kwargs):
self.init_kwargs = kwargs
@ -103,9 +112,7 @@ class TestCLIInterface(TestCase):
Passes in a fake application automatically.
"""
cli = self.TestedCLI()
cli.run(
args + ["daphne:__version__"]
) # We just pass something importable as app
cli.run(args + ["daphne:__version__"]) # We just pass something importable as app
# Check the server got all arguments as intended
for key, value in server_kwargs.items():
# Get the value and sort it if it's a list (for endpoint checking)
@ -116,30 +123,52 @@ class TestCLIInterface(TestCase):
self.assertEqual(
value,
actual_value,
"Wrong value for server kwarg %s: %r != %r"
% (key, value, actual_value),
"Wrong value for server kwarg %s: %r != %r" % (
key,
value,
actual_value,
),
)
def testCLIBasics(self):
"""
Tests basic endpoint generation.
"""
self.assertCLI([], {"endpoints": ["tcp:port=8000:interface=127.0.0.1"]})
self.assertCLI(
["-p", "123"], {"endpoints": ["tcp:port=123:interface=127.0.0.1"]}
[],
{
"endpoints": ["tcp:port=8000:interface=127.0.0.1"],
},
)
self.assertCLI(
["-b", "10.0.0.1"], {"endpoints": ["tcp:port=8000:interface=10.0.0.1"]}
["-p", "123"],
{
"endpoints": ["tcp:port=123:interface=127.0.0.1"],
},
)
self.assertCLI(
["-b", "200a::1"], {"endpoints": [r"tcp:port=8000:interface=200a\:\:1"]}
["-b", "10.0.0.1"],
{
"endpoints": ["tcp:port=8000:interface=10.0.0.1"],
},
)
self.assertCLI(
["-b", "[200a::1]"], {"endpoints": [r"tcp:port=8000:interface=200a\:\:1"]}
["-b", "200a::1"],
{
"endpoints": [r'tcp:port=8000:interface=200a\:\:1'],
},
)
self.assertCLI(
["-b", "[200a::1]"],
{
"endpoints": [r'tcp:port=8000:interface=200a\:\:1'],
},
)
self.assertCLI(
["-p", "8080", "-b", "example.com"],
{"endpoints": ["tcp:port=8080:interface=example.com"]},
{
"endpoints": ["tcp:port=8080:interface=example.com"],
},
)
def testUnixSockets(self):
@ -149,7 +178,7 @@ class TestCLIInterface(TestCase):
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"unix:/tmp/daphne.sock",
]
],
},
)
self.assertCLI(
@ -158,12 +187,17 @@ class TestCLIInterface(TestCase):
"endpoints": [
"tcp:port=8000:interface=example.com",
"unix:/tmp/daphne.sock",
]
],
},
)
self.assertCLI(
["-u", "/tmp/daphne.sock", "--fd", "5"],
{"endpoints": ["fd:fileno=5", "unix:/tmp/daphne.sock"]},
{
"endpoints": [
"fd:fileno=5",
"unix:/tmp/daphne.sock"
],
},
)
def testMixedCLIEndpointCreation(self):
@ -175,8 +209,8 @@ class TestCLIInterface(TestCase):
{
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"unix:/tmp/daphne.sock",
]
"unix:/tmp/daphne.sock"
],
},
)
self.assertCLI(
@ -185,7 +219,7 @@ class TestCLIInterface(TestCase):
"endpoints": [
"tcp:port=8080:interface=127.0.0.1",
"tcp:port=8080:interface=127.0.0.1",
]
],
},
)
@ -193,77 +227,11 @@ class TestCLIInterface(TestCase):
"""
Tests entirely custom endpoints
"""
self.assertCLI(["-e", "imap:"], {"endpoints": ["imap:"]})
def test_default_proxyheaders(self):
"""
Passing `--proxy-headers` without a parameter will use the
`X-Forwarded-For` header.
"""
self.assertCLI(
["--proxy-headers"], {"proxy_forwarded_address_header": "X-Forwarded-For"}
["-e", "imap:"],
{
"endpoints": [
"imap:",
],
},
)
def test_custom_proxyhost(self):
"""
Passing `--proxy-headers-host` will set the used host header to
the passed one, and `--proxy-headers` is mandatory.
"""
self.assertCLI(
["--proxy-headers", "--proxy-headers-host", "blah"],
{"proxy_forwarded_address_header": "blah"},
)
with self.assertRaises(expected_exception=ArgumentError) as exc:
self.assertCLI(
["--proxy-headers-host", "blah"],
{"proxy_forwarded_address_header": "blah"},
)
self.assertEqual(exc.exception.argument_name, "--proxy-headers-host")
self.assertEqual(
exc.exception.message,
"--proxy-headers has to be passed for this parameter.",
)
def test_custom_proxyport(self):
"""
Passing `--proxy-headers-port` will set the used port header to
the passed one, and `--proxy-headers` is mandatory.
"""
self.assertCLI(
["--proxy-headers", "--proxy-headers-port", "blah2"],
{"proxy_forwarded_port_header": "blah2"},
)
with self.assertRaises(expected_exception=ArgumentError) as exc:
self.assertCLI(
["--proxy-headers-port", "blah2"],
{"proxy_forwarded_address_header": "blah2"},
)
self.assertEqual(exc.exception.argument_name, "--proxy-headers-port")
self.assertEqual(
exc.exception.message,
"--proxy-headers has to be passed for this parameter.",
)
def test_custom_servername(self):
"""
Passing `--server-name` will set the default server header
from 'daphne' to the passed one.
"""
self.assertCLI([], {"server_name": "daphne"})
self.assertCLI(["--server-name", ""], {"server_name": ""})
self.assertCLI(["--server-name", "python"], {"server_name": "python"})
def test_no_servername(self):
"""
Passing `--no-server-name` will set server name to '' (empty string)
"""
self.assertCLI(["--no-server-name"], {"server_name": ""})
@skipUnless(os.getenv("ASGI_THREADS"), "ASGI_THREADS environment variable not set.")
class TestASGIThreads(TestCase):
def test_default_executor(self):
from daphne.server import twisted_loop
executor = twisted_loop._default_executor
self.assertEqual(executor._max_workers, int(os.getenv("ASGI_THREADS")))
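For context, the CLI class exercised by assertCLI above can also be driven directly; a sketch only (the application path "myproject.asgi:application" is made up, and calling run() like this starts a real, blocking server):

from daphne.cli import CommandLineInterface

# Equivalent to: daphne -b 0.0.0.0 -p 8001 myproject.asgi:application
CommandLineInterface().run(
    ["-b", "0.0.0.0", "-p", "8001", "myproject.asgi:application"]
)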

tests/test_http_protocol.py

@ -1,49 +0,0 @@
import unittest
from daphne.http_protocol import WebRequest
class MockServer:
"""
Mock server object for testing.
"""
def protocol_connected(self, *args, **kwargs):
pass
class MockFactory:
"""
Mock factory object for testing.
"""
def __init__(self):
self.server = MockServer()
class MockChannel:
"""
Mock channel object for testing.
"""
def __init__(self):
self.factory = MockFactory()
self.transport = None
def getPeer(self, *args, **kwargs):
return "peer"
def getHost(self, *args, **kwargs):
return "host"
class TestHTTPProtocol(unittest.TestCase):
"""
Tests the HTTP protocol classes.
"""
def test_web_request_initialisation(self):
channel = MockChannel()
request = WebRequest(channel)
self.assertIsNone(request.client_addr)
self.assertIsNone(request.server_addr)

tests/test_http_request.py

@ -1,10 +1,12 @@
# coding: utf8
import collections
from urllib import parse
from hypothesis import assume, given, settings
import http_strategies
from http_base import DaphneTestCase
from hypothesis import assume, given, settings
from hypothesis.strategies import integers
class TestHTTPRequest(DaphneTestCase):
@ -13,7 +15,13 @@ class TestHTTPRequest(DaphneTestCase):
"""
def assert_valid_http_scope(
self, scope, method, path, params=None, headers=None, scheme=None
self,
scope,
method,
path,
params=None,
headers=None,
scheme=None,
):
"""
Checks that the passed scope is a valid ASGI HTTP scope regarding types
@ -21,27 +29,17 @@ class TestHTTPRequest(DaphneTestCase):
"""
# Check overall keys
self.assert_key_sets(
required_keys={
"asgi",
"type",
"http_version",
"method",
"path",
"raw_path",
"query_string",
"headers",
},
required_keys={"type", "http_version", "method", "path", "query_string", "headers"},
optional_keys={"scheme", "root_path", "client", "server"},
actual_keys=scope.keys(),
)
self.assertEqual(scope["asgi"]["version"], "3.0")
# Check that it is the right type
self.assertEqual(scope["type"], "http")
# Method (uppercased unicode string)
self.assertIsInstance(scope["method"], str)
self.assertEqual(scope["method"], method.upper())
# Path
self.assert_valid_path(scope["path"])
self.assert_valid_path(scope["path"], path)
# HTTP version
self.assertIn(scope["http_version"], ["1.0", "1.1", "1.2"])
# Scheme
@ -52,9 +50,7 @@ class TestHTTPRequest(DaphneTestCase):
query_string = scope["query_string"]
self.assertIsInstance(query_string, bytes)
if params:
self.assertEqual(
query_string, parse.urlencode(params or []).encode("ascii")
)
self.assertEqual(query_string, parse.urlencode(params or []).encode("ascii"))
# Ordering of header names is not important, but the order of values for a header
# name is. To assert whether that order is kept, we transform both the request
# headers and the channel message headers into a dictionary
@ -63,9 +59,9 @@ class TestHTTPRequest(DaphneTestCase):
for name, value in scope["headers"]:
transformed_scope_headers[name].append(value)
transformed_request_headers = collections.defaultdict(list)
for name, value in headers or []:
expected_name = name.lower().strip()
expected_value = value.strip()
for name, value in (headers or []):
expected_name = name.lower().strip().encode("ascii")
expected_value = value.strip().encode("ascii")
transformed_request_headers[expected_name].append(expected_value)
for name, value in transformed_request_headers.items():
self.assertIn(name, transformed_scope_headers)
@ -107,75 +103,39 @@ class TestHTTPRequest(DaphneTestCase):
@given(
request_path=http_strategies.http_path(),
request_params=http_strategies.query_params(),
request_params=http_strategies.query_params()
)
@settings(max_examples=5, deadline=5000)
def test_get_request(self, request_path, request_params):
"""
Tests a typical HTTP GET request, with a path and query parameters
"""
scope, messages = self.run_daphne_request(
"GET", request_path, params=request_params
)
scope, messages = self.run_daphne_request("GET", request_path, params=request_params)
self.assert_valid_http_scope(scope, "GET", request_path, params=request_params)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(request_path=http_strategies.http_path(), chunk_size=integers(min_value=1))
@settings(max_examples=5, deadline=5000)
def test_request_body_chunking(self, request_path, chunk_size):
"""
Tests request body chunking logic.
"""
body = b"The quick brown fox jumps over the lazy dog"
_, messages = self.run_daphne_request(
"POST",
request_path,
body=body,
request_buffer_size=chunk_size,
)
# Avoid running those asserts when there's a single "http.disconnect"
if len(messages) > 1:
assert messages[0]["body"].decode() == body.decode()[:chunk_size]
assert not messages[-2]["more_body"]
assert messages[-1] == {"type": "http.disconnect"}
@given(
request_path=http_strategies.http_path(),
request_body=http_strategies.http_body(),
request_body=http_strategies.http_body()
)
@settings(max_examples=5, deadline=5000)
def test_post_request(self, request_path, request_body):
"""
Tests a typical HTTP POST request, with a path and body.
"""
scope, messages = self.run_daphne_request(
"POST", request_path, body=request_body
)
scope, messages = self.run_daphne_request("POST", request_path, body=request_body)
self.assert_valid_http_scope(scope, "POST", request_path)
self.assert_valid_http_request_message(messages[0], body=request_body)
def test_raw_path(self):
"""
Tests that /foo%2Fbar produces raw_path and a decoded path
"""
scope, _ = self.run_daphne_request("GET", "/foo%2Fbar")
self.assertEqual(scope["path"], "/foo/bar")
self.assertEqual(scope["raw_path"], b"/foo%2Fbar")
@given(request_headers=http_strategies.headers())
@settings(max_examples=5, deadline=5000)
def test_headers(self, request_headers):
"""
Tests that HTTP header fields are handled as specified
"""
request_path = parse.quote("/te st-à/")
scope, messages = self.run_daphne_request(
"OPTIONS", request_path, headers=request_headers
)
self.assert_valid_http_scope(
scope, "OPTIONS", request_path, headers=request_headers
)
request_path = "/te st-à/"
scope, messages = self.run_daphne_request("OPTIONS", request_path, headers=request_headers)
self.assert_valid_http_scope(scope, "OPTIONS", request_path, headers=request_headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(request_headers=http_strategies.headers())
@ -189,13 +149,9 @@ class TestHTTPRequest(DaphneTestCase):
header_name = request_headers[0][0]
duplicated_headers = [(header_name, header[1]) for header in request_headers]
# Run the request
request_path = parse.quote("/te st-à/")
scope, messages = self.run_daphne_request(
"OPTIONS", request_path, headers=duplicated_headers
)
self.assert_valid_http_scope(
scope, "OPTIONS", request_path, headers=duplicated_headers
)
request_path = "/te st-à/"
scope, messages = self.run_daphne_request("OPTIONS", request_path, headers=duplicated_headers)
self.assert_valid_http_scope(scope, "OPTIONS", request_path, headers=duplicated_headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@given(
@ -238,7 +194,7 @@ class TestHTTPRequest(DaphneTestCase):
"""
Make sure headers are normalized as the spec says they are.
"""
headers = [(b"MYCUSTOMHEADER", b" foobar ")]
headers = [("MYCUSTOMHEADER", " foobar ")]
scope, messages = self.run_daphne_request("GET", "/", headers=headers)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@ -266,7 +222,10 @@ class TestHTTPRequest(DaphneTestCase):
"""
Make sure that, by default, X-Forwarded-For is ignored.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"], [b"X-Forwarded-Port", b"80"]]
headers = [
["X-Forwarded-For", "10.1.2.3"],
["X-Forwarded-Port", "80"],
]
scope, messages = self.run_daphne_request("GET", "/", headers=headers)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@ -277,7 +236,10 @@ class TestHTTPRequest(DaphneTestCase):
"""
When X-Forwarded-For is enabled, make sure it is respected.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"], [b"X-Forwarded-Port", b"80"]]
headers = [
["X-Forwarded-For", "10.1.2.3"],
["X-Forwarded-Port", "80"],
]
scope, messages = self.run_daphne_request("GET", "/", headers=headers, xff=True)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@ -289,7 +251,9 @@ class TestHTTPRequest(DaphneTestCase):
When X-Forwarded-For is enabled but only the host is passed, make sure
that at least makes it through.
"""
headers = [[b"X-Forwarded-For", b"10.1.2.3"]]
headers = [
["X-Forwarded-For", "10.1.2.3"],
]
scope, messages = self.run_daphne_request("GET", "/", headers=headers, xff=True)
self.assert_valid_http_scope(scope, "GET", "/", headers=headers)
self.assert_valid_http_request_message(messages[0], body=b"")
@ -301,24 +265,8 @@ class TestHTTPRequest(DaphneTestCase):
Tests that requests with invalid (non-ASCII) characters fail.
"""
# Bad path
response = self.run_daphne_raw(
b"GET /\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n"
)
self.assertTrue(b"400 Bad Request" in response)
response = self.run_daphne_raw(b"GET /\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n")
self.assertTrue(response.startswith(b"HTTP/1.0 400 Bad Request"))
# Bad querystring
response = self.run_daphne_raw(
b"GET /?\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n"
)
self.assertTrue(b"400 Bad Request" in response)
def test_invalid_header_name(self):
"""
Tests that requests with invalid header names fail.
"""
# Test cases follow those used by h11
# https://github.com/python-hyper/h11/blob/a2c68948accadc3876dffcf979d98002e4a4ed27/h11/tests/test_headers.py#L24-L35
for header_name in [b"foo bar", b"foo\x00bar", b"foo\xffbar", b"foo\x01bar"]:
response = self.run_daphne_raw(
f"GET / HTTP/1.0\r\n{header_name}: baz\r\n\r\n".encode("ascii")
)
self.assertTrue(b"400 Bad Request" in response)
response = self.run_daphne_raw(b"GET /?\xc3\xa4\xc3\xb6\xc3\xbc HTTP/1.0\r\n\r\n")
self.assertTrue(response.startswith(b"HTTP/1.0 400 Bad Request"))

tests/test_http_response.py

@ -1,6 +1,9 @@
# coding: utf8
from hypothesis import given, settings
import http_strategies
from http_base import DaphneTestCase
from hypothesis import given, settings
class TestHTTPResponse(DaphneTestCase):
@ -12,31 +15,26 @@ class TestHTTPResponse(DaphneTestCase):
"""
Lowercases and sorts headers, and strips transfer-encoding ones.
"""
return sorted(
[(b"server", b"daphne")]
+ [
(name.lower(), value.strip())
for name, value in headers
if name.lower() not in (b"server", b"transfer-encoding")
]
)
def encode_headers(self, headers):
def encode(s):
return s if isinstance(s, bytes) else s.encode("utf-8")
return [[encode(k), encode(v)] for k, v in headers]
return sorted([
(name.lower(), value.strip())
for name, value in headers
if name.lower() != "transfer-encoding"
])
def test_minimal_response(self):
"""
Smallest viable example. Mostly verifies that our response building works.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": b"hello world"},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 200,
},
{
"type": "http.response.body",
"body": b"hello world",
},
])
self.assertEqual(response.status, 200)
self.assertEqual(response.read(), b"hello world")
@ -48,23 +46,30 @@ class TestHTTPResponse(DaphneTestCase):
to make sure it stays required.
"""
with self.assertRaises(ValueError):
self.run_daphne_response(
[
{"type": "http.response.start"},
{"type": "http.response.body", "body": b"hello world"},
]
)
self.run_daphne_response([
{
"type": "http.response.start",
},
{
"type": "http.response.body",
"body": b"hello world",
},
])
def test_custom_status_code(self):
"""
Tries a non-default status code.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"i made a thing!"},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 201,
},
{
"type": "http.response.body",
"body": b"i made a thing!",
},
])
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"i made a thing!")
@ -72,13 +77,21 @@ class TestHTTPResponse(DaphneTestCase):
"""
Tries sending a response in multiple parts.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"chunk 1 ", "more_body": True},
{"type": "http.response.body", "body": b"chunk 2"},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 201,
},
{
"type": "http.response.body",
"body": b"chunk 1 ",
"more_body": True,
},
{
"type": "http.response.body",
"body": b"chunk 2",
},
])
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"chunk 1 chunk 2")
@ -86,14 +99,25 @@ class TestHTTPResponse(DaphneTestCase):
"""
Tries sending a response in multiple parts and an empty end.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 201},
{"type": "http.response.body", "body": b"chunk 1 ", "more_body": True},
{"type": "http.response.body", "body": b"chunk 2", "more_body": True},
{"type": "http.response.body"},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 201,
},
{
"type": "http.response.body",
"body": b"chunk 1 ",
"more_body": True,
},
{
"type": "http.response.body",
"body": b"chunk 2",
"more_body": True,
},
{
"type": "http.response.body",
},
])
self.assertEqual(response.status, 201)
self.assertEqual(response.read(), b"chunk 1 chunk 2")
@ -103,12 +127,16 @@ class TestHTTPResponse(DaphneTestCase):
"""
Tries body variants.
"""
response = self.run_daphne_response(
[
{"type": "http.response.start", "status": 200},
{"type": "http.response.body", "body": body},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 200,
},
{
"type": "http.response.body",
"body": body,
},
])
self.assertEqual(response.status, 200)
self.assertEqual(response.read(), body)
@ -116,72 +144,18 @@ class TestHTTPResponse(DaphneTestCase):
@settings(max_examples=5, deadline=5000)
def test_headers(self, headers):
# The ASGI spec requires us to lowercase our header names
response = self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": self.normalize_headers(headers),
},
{"type": "http.response.body"},
]
)
response = self.run_daphne_response([
{
"type": "http.response.start",
"status": 200,
"headers": self.normalize_headers(headers),
},
{
"type": "http.response.body",
},
])
# Check headers in a sensible way. Ignore transfer-encoding.
self.assertEqual(
self.normalize_headers(self.encode_headers(response.getheaders())),
self.normalize_headers(response.getheaders()),
self.normalize_headers(headers),
)
def test_headers_type(self):
"""
Headers should be `bytes`
"""
with self.assertRaises(ValueError) as context:
self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": [["foo", b"bar"]],
},
{"type": "http.response.body", "body": b""},
]
)
self.assertEqual(
str(context.exception),
"Header name 'foo' expected to be `bytes`, but got `<class 'str'>`",
)
with self.assertRaises(ValueError) as context:
self.run_daphne_response(
[
{
"type": "http.response.start",
"status": 200,
"headers": [[b"foo", True]],
},
{"type": "http.response.body", "body": b""},
]
)
self.assertEqual(
str(context.exception),
"Header value 'True' expected to be `bytes`, but got `<class 'bool'>`",
)
def test_headers_type_raw(self):
"""
Daphne returns a 500 error response if the application sends invalid
headers.
"""
response = self.run_daphne_raw(
b"GET / HTTP/1.0\r\n\r\n",
responses=[
{
"type": "http.response.start",
"status": 200,
"headers": [["foo", b"bar"]],
},
{"type": "http.response.body", "body": b""},
],
)
self.assertTrue(response.startswith(b"HTTP/1.0 500 Internal Server Error"))


@ -1,15 +0,0 @@
import sys
from pathlib import Path
def test_fd_endpoint_plugin_installed():
# Find the site-packages directory
for path in sys.path:
if "site-packages" in path:
site_packages = Path(path)
break
else:
raise AssertionError("Could not find site-packages in sys.path")
plugin_path = site_packages / "twisted" / "plugins" / "fd_endpoint.py"
assert plugin_path.exists(), f"fd_endpoint.py not found at {plugin_path}"

tests/test_utils.py

@ -1,3 +1,5 @@
# coding: utf8
from unittest import TestCase
from twisted.web.http_headers import Headers
@ -11,35 +13,48 @@ class TestXForwardedForHttpParsing(TestCase):
"""
def test_basic(self):
headers = Headers(
{
b"X-Forwarded-For": [b"10.1.2.3"],
b"X-Forwarded-Port": [b"1234"],
b"X-Forwarded-Proto": [b"https"],
}
)
headers = Headers({
b"X-Forwarded-For": [b"10.1.2.3"],
b"X-Forwarded-Port": [b"1234"],
b"X-Forwarded-Proto": [b"https"]
})
result = parse_x_forwarded_for(headers)
self.assertEqual(result, (["10.1.2.3", 1234], "https"))
self.assertIsInstance(result[0][0], str)
self.assertIsInstance(result[1], str)
def test_address_only(self):
headers = Headers({b"X-Forwarded-For": [b"10.1.2.3"]})
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
headers = Headers({
b"X-Forwarded-For": [b"10.1.2.3"],
})
self.assertEqual(
parse_x_forwarded_for(headers),
(["10.1.2.3", 0], None)
)
def test_v6_address(self):
headers = Headers({b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"]})
self.assertEqual(parse_x_forwarded_for(headers), (["1043::a321:0001", 0], None))
headers = Headers({
b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"],
})
self.assertEqual(
parse_x_forwarded_for(headers),
(["1043::a321:0001", 0], None)
)
def test_multiple_proxys(self):
headers = Headers({b"X-Forwarded-For": [b"10.1.2.3, 10.1.2.4"]})
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
headers = Headers({
b"X-Forwarded-For": [b"10.1.2.3, 10.1.2.4"],
})
self.assertEqual(
parse_x_forwarded_for(headers),
(["10.1.2.3", 0], None)
)
def test_original(self):
headers = Headers({})
self.assertEqual(
parse_x_forwarded_for(headers, original_addr=["127.0.0.1", 80]),
(["127.0.0.1", 80], None),
(["127.0.0.1", 80], None)
)
def test_no_original(self):
@ -58,25 +73,43 @@ class TestXForwardedForWsParsing(TestCase):
b"X-Forwarded-Port": b"1234",
b"X-Forwarded-Proto": b"https",
}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 1234], "https"))
self.assertEqual(
parse_x_forwarded_for(headers),
(["10.1.2.3", 1234], "https")
)
def test_address_only(self):
headers = {b"X-Forwarded-For": b"10.1.2.3"}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
headers = {
b"X-Forwarded-For": b"10.1.2.3",
}
self.assertEqual(
parse_x_forwarded_for(headers),
(["10.1.2.3", 0], None)
)
def test_v6_address(self):
headers = {b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"]}
self.assertEqual(parse_x_forwarded_for(headers), (["1043::a321:0001", 0], None))
headers = {
b"X-Forwarded-For": [b"1043::a321:0001, 10.0.5.6"],
}
self.assertEqual(
parse_x_forwarded_for(headers),
(["1043::a321:0001", 0], None)
)
def test_multiple_proxies(self):
headers = {b"X-Forwarded-For": b"10.1.2.3, 10.1.2.4"}
self.assertEqual(parse_x_forwarded_for(headers), (["10.1.2.3", 0], None))
headers = {
b"X-Forwarded-For": b"10.1.2.3, 10.1.2.4",
}
self.assertEqual(
parse_x_forwarded_for(headers),
(["10.1.2.3", 0], None)
)
def test_original(self):
headers = {}
self.assertEqual(
parse_x_forwarded_for(headers, original_addr=["127.0.0.1", 80]),
(["127.0.0.1", 80], None),
(["127.0.0.1", 80], None)
)
def test_no_original(self):
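Both call styles covered above can be reproduced in isolation, assuming the helper's current home in daphne.utils: Twisted Headers objects come from the HTTP path, plain dicts of raw byte strings from the WebSocket handshake. A small sketch:

from twisted.web.http_headers import Headers

from daphne.utils import parse_x_forwarded_for

# HTTP-style call with a Twisted Headers object.
print(parse_x_forwarded_for(Headers({b"X-Forwarded-For": [b"10.1.2.3"]})))  # (["10.1.2.3", 0], None)
# WebSocket-style call with a plain dict of raw headers.
print(parse_x_forwarded_for({b"X-Forwarded-For": b"10.1.2.3"}))  # (["10.1.2.3", 0], None)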

tests/test_websocket.py

@ -1,12 +1,13 @@
# coding: utf8
import collections
import time
from urllib import parse
import http_strategies
from http_base import DaphneTestCase, DaphneTestingInstance
from hypothesis import given, settings
from daphne.testing import BaseDaphneTestingInstance
import http_strategies
from http_base import DaphneTestCase, DaphneTestingInstance
class TestWebsocket(DaphneTestCase):
@ -15,7 +16,13 @@ class TestWebsocket(DaphneTestCase):
"""
def assert_valid_websocket_scope(
self, scope, path="/", params=None, headers=None, scheme=None, subprotocols=None
self,
scope,
path="/",
params=None,
headers=None,
scheme=None,
subprotocols=None,
):
"""
Checks that the passed scope is a valid ASGI HTTP scope regarding types
@ -23,22 +30,14 @@ class TestWebsocket(DaphneTestCase):
"""
# Check overall keys
self.assert_key_sets(
required_keys={
"asgi",
"type",
"path",
"raw_path",
"query_string",
"headers",
},
required_keys={"type", "path", "query_string", "headers"},
optional_keys={"scheme", "root_path", "client", "server", "subprotocols"},
actual_keys=scope.keys(),
)
self.assertEqual(scope["asgi"]["version"], "3.0")
# Check that it is the right type
self.assertEqual(scope["type"], "websocket")
# Path
self.assert_valid_path(scope["path"])
self.assert_valid_path(scope["path"], path)
# Scheme
self.assertIn(scope.get("scheme", "ws"), ["ws", "wss"])
if scheme:
@ -47,9 +46,7 @@ class TestWebsocket(DaphneTestCase):
query_string = scope["query_string"]
self.assertIsInstance(query_string, bytes)
if params:
self.assertEqual(
query_string, parse.urlencode(params or []).encode("ascii")
)
self.assertEqual(query_string, parse.urlencode(params or []).encode("ascii"))
# Ordering of header names is not important, but the order of values for a header
# name is. To assert whether that order is kept, we transform both the request
# headers and the channel message headers into a dictionary
@ -62,9 +59,9 @@ class TestWebsocket(DaphneTestCase):
if bit.strip():
transformed_scope_headers[name].append(bit.strip())
transformed_request_headers = collections.defaultdict(list)
for name, value in headers or []:
expected_name = name.lower().strip()
expected_value = value.strip()
for name, value in (headers or []):
expected_name = name.lower().strip().encode("ascii")
expected_value = value.strip().encode("ascii")
# Make sure to split out any headers collapsed with commas
transformed_request_headers.setdefault(expected_name, [])
for bit in expected_value.split(b","):
@ -95,7 +92,9 @@ class TestWebsocket(DaphneTestCase):
"""
# Check overall keys
self.assert_key_sets(
required_keys={"type"}, optional_keys=set(), actual_keys=message.keys()
required_keys={"type"},
optional_keys=set(),
actual_keys=message.keys(),
)
# Check that it is the right type
self.assertEqual(message["type"], "websocket.connect")
@ -105,7 +104,11 @@ class TestWebsocket(DaphneTestCase):
Tests we can open and accept a socket.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
self.websocket_handshake(test_app)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
@ -117,7 +120,11 @@ class TestWebsocket(DaphneTestCase):
Tests we can reject a socket and it won't complete the handshake.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.close"}])
test_app.add_send_messages([
{
"type": "websocket.close",
}
])
with self.assertRaises(RuntimeError):
self.websocket_handshake(test_app)
@ -127,12 +134,13 @@ class TestWebsocket(DaphneTestCase):
"""
subprotocols = ["proto1", "proto2"]
with DaphneTestingInstance() as test_app:
test_app.add_send_messages(
[{"type": "websocket.accept", "subprotocol": "proto2"}]
)
_, subprotocol = self.websocket_handshake(
test_app, subprotocols=subprotocols
)
test_app.add_send_messages([
{
"type": "websocket.accept",
"subprotocol": "proto2",
}
])
_, subprotocol = self.websocket_handshake(test_app, subprotocols=subprotocols)
# Validate the scope and messages we got
assert subprotocol == "proto2"
scope, messages = test_app.get_received()
@ -143,9 +151,16 @@ class TestWebsocket(DaphneTestCase):
"""
Tests that X-Forwarded-For headers get parsed right
"""
headers = [["X-Forwarded-For", "10.1.2.3"], ["X-Forwarded-Port", "80"]]
headers = [
["X-Forwarded-For", "10.1.2.3"],
["X-Forwarded-Port", "80"],
]
with DaphneTestingInstance(xff=True) as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
self.websocket_handshake(test_app, headers=headers)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
@ -159,87 +174,66 @@ class TestWebsocket(DaphneTestCase):
request_headers=http_strategies.headers(),
)
@settings(max_examples=5, deadline=2000)
def test_http_bits(self, request_path, request_params, request_headers):
def test_http_bits(
self,
request_path,
request_params,
request_headers,
):
"""
Tests that various HTTP-level bits (query string params, path, headers)
carry over into the scope.
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
self.websocket_handshake(
test_app,
path=parse.quote(request_path),
path=request_path,
params=request_params,
headers=request_headers,
)
# Validate the scope and messages we got
scope, messages = test_app.get_received()
self.assert_valid_websocket_scope(
scope, path=request_path, params=request_params, headers=request_headers
scope,
path=request_path,
params=request_params,
headers=request_headers,
)
self.assert_valid_websocket_connect_message(messages[0])
def test_raw_path(self):
"""
Tests that /foo%2Fbar produces raw_path and a decoded path
"""
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(test_app, path="/foo%2Fbar")
# Validate the scope and messages we got
scope, _ = test_app.get_received()
self.assertEqual(scope["path"], "/foo/bar")
self.assertEqual(scope["raw_path"], b"/foo%2Fbar")
@given(daphne_path=http_strategies.http_path())
@settings(max_examples=5, deadline=2000)
def test_root_path(self, *, daphne_path):
"""
Tests root_path handling.
"""
headers = [("Daphne-Root-Path", parse.quote(daphne_path))]
with DaphneTestingInstance() as test_app:
test_app.add_send_messages([{"type": "websocket.accept"}])
self.websocket_handshake(
test_app,
path="/",
headers=headers,
)
# Validate the scope and messages we got
scope, _ = test_app.get_received()
# Daphne-Root-Path is not included in the returned 'headers' section.
self.assertNotIn(
"daphne-root-path", (header[0].lower() for header in scope["headers"])
)
# And what we're looking for, root_path being set.
self.assertEqual(scope["root_path"], daphne_path)
def test_text_frames(self):
"""
Tests we can send and receive text frames.
"""
with DaphneTestingInstance() as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Prep frame for it to send
test_app.add_send_messages(
[{"type": "websocket.send", "text": "here be dragons 🐉"}]
)
test_app.add_send_messages([
{
"type": "websocket.send",
"text": "here be dragons 🐉",
}
])
# Send it a frame
self.websocket_send_frame(sock, "what is here? 🌍")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == "here be dragons 🐉"
# Make sure it got our frame
_, messages = test_app.get_received()
assert messages[1] == {
"type": "websocket.receive",
"text": "what is here? 🌍",
}
assert messages[1] == {"type": "websocket.receive", "text": "what is here? 🌍"}
def test_binary_frames(self):
"""
@ -248,24 +242,28 @@ class TestWebsocket(DaphneTestCase):
"""
with DaphneTestingInstance() as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Prep frame for it to send
test_app.add_send_messages(
[{"type": "websocket.send", "bytes": b"here be \xe2 bytes"}]
)
test_app.add_send_messages([
{
"type": "websocket.send",
"bytes": b"here be \xe2 bytes",
}
])
# Send it a frame
self.websocket_send_frame(sock, b"what is here? \xe2")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == b"here be \xe2 bytes"
# Make sure it got our frame
_, messages = test_app.get_received()
assert messages[1] == {
"type": "websocket.receive",
"bytes": b"what is here? \xe2",
}
assert messages[1] == {"type": "websocket.receive", "bytes": b"what is here? \xe2"}
def test_http_timeout(self):
"""
@ -273,66 +271,24 @@ class TestWebsocket(DaphneTestCase):
"""
with DaphneTestingInstance(http_timeout=1) as test_app:
# Connect
test_app.add_send_messages([{"type": "websocket.accept"}])
test_app.add_send_messages([
{
"type": "websocket.accept",
}
])
sock, _ = self.websocket_handshake(test_app)
_, messages = test_app.get_received()
self.assert_valid_websocket_connect_message(messages[0])
# Wait 2 seconds
time.sleep(2)
# Prep frame for it to send
test_app.add_send_messages([{"type": "websocket.send", "text": "cake"}])
test_app.add_send_messages([
{
"type": "websocket.send",
"text": "cake",
}
])
# Send it a frame
self.websocket_send_frame(sock, "still alive?")
# Receive a frame and make sure it's correct
assert self.websocket_receive_frame(sock) == "cake"
def test_application_checker_handles_asyncio_cancellederror(self):
with CancellingTestingInstance() as app:
# Connect to the websocket app, it will immediately raise
# asyncio.CancelledError
sock, _ = self.websocket_handshake(app)
# Disconnect from the socket
sock.close()
# Wait for application_checker to clean up the applications for
# disconnected clients, and for the server to be stopped.
time.sleep(3)
# Make sure we received either no error, or a ConnectionsNotEmpty
while not app.process.errors.empty():
err, _tb = app.process.errors.get()
if not isinstance(err, ConnectionsNotEmpty):
raise err
self.fail(
"Server connections were not cleaned up after an asyncio.CancelledError was raised"
)
async def cancelling_application(scope, receive, send):
import asyncio
from twisted.internet import reactor
# Stop the server after a short delay so that the teardown is run.
reactor.callLater(2, reactor.stop)
await send({"type": "websocket.accept"})
raise asyncio.CancelledError()
class ConnectionsNotEmpty(Exception):
pass
class CancellingTestingInstance(BaseDaphneTestingInstance):
def __init__(self):
super().__init__(application=cancelling_application)
def process_teardown(self):
import multiprocessing
# Get a hold of the enclosing DaphneProcess (we're currently running in
# the same process as the application).
proc = multiprocessing.current_process()
# By now the (only) socket should have disconnected, and the
# application_checker should have run. If there are any connections
# still, it means that the application_checker did not clean them up.
if proc.server.connections:
raise ConnectionsNotEmpty()
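The accept/send/receive/disconnect sequences these tests drive correspond to an ordinary ASGI WebSocket application; a minimal echo sketch, independent of anything Daphne-specific:

async def echo_application(scope, receive, send):
    # Minimal ASGI WebSocket echo app matching the message flow used above.
    assert scope["type"] == "websocket"
    # The first message delivered to the application is always websocket.connect.
    message = await receive()
    assert message["type"] == "websocket.connect"
    await send({"type": "websocket.accept"})
    while True:
        message = await receive()
        if message["type"] == "websocket.receive":
            if message.get("text") is not None:
                await send({"type": "websocket.send", "text": message["text"]})
            else:
                await send({"type": "websocket.send", "bytes": message["bytes"]})
        elif message["type"] == "websocket.disconnect":
            break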

tox.ini

@ -1,8 +0,0 @@
[tox]
envlist =
py{39,310,311,312,313}
[testenv]
extras = tests
commands =
pytest -v {posargs}