Compare commits


No commits in common. "master" and "v2.1.7" have entirely different histories.

173 changed files with 3696 additions and 6323 deletions

View File

@ -1,34 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: "\U0001F41B bug"
assignees: ''
---
**Note: for support questions, please use stackoverflow**. This repository's issues are reserved for feature requests and bug reports.
* **What is the current behavior?**
* **If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem** via
a github repo, https://repl.it or similar.
* **What is the expected behavior?**
* **What is the motivation / use case for changing the behavior?**
* **Please tell us about your environment:**
- Version:
- Platform:
* **Other information** (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. stackoverflow)

View File

@ -1 +0,0 @@
blank_issues_enabled: false

View File

@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: "✨ enhancement"
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

24
.github/stale.yml vendored
View File

@ -1,24 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: false
# Number of days of inactivity before a stale issue is closed
daysUntilClose: false
# Issues with these labels will never be considered stale
exemptLabels:
- pinned
- security
- 🐛 bug
- 📖 documentation
- 🙋 help wanted
- ✨ enhancement
- good first issue
- work in progress
# Label to use when marking an issue as stale
staleLabel: wontfix
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: false
# markComment: >
# This issue has been automatically marked as stale because it has not had
# recent activity. It will be closed if no further activity occurs. Thank you
# for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false

View File

@ -1,21 +0,0 @@
name: 📦 Build
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build twine
- name: Building package
run: python3 -m build
- name: Check package with Twine
run: twine check dist/*

View File

@ -1,26 +0,0 @@
name: 🚀 Deploy to PyPI
on:
push:
tags:
- 'v*'
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Build wheel and source tarball
run: |
pip install wheel
python setup.py sdist bdist_wheel
- name: Publish a Python distribution to PyPI
uses: pypa/gh-action-pypi-publish@v1.1.0
with:
user: __token__
password: ${{ secrets.pypi_password }}

View File

@ -1,26 +0,0 @@
name: 💅 Lint
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
- name: Run lint
run: tox
env:
TOXENV: pre-commit
- name: Run mypy
run: tox
env:
TOXENV: mypy

View File

@ -1,64 +0,0 @@
name: 📄 Tests
on:
push:
branches:
- master
- '*.x'
paths-ignore:
- 'docs/**'
- '*.md'
- '*.rst'
pull_request:
branches:
- master
- '*.x'
paths-ignore:
- 'docs/**'
- '*.md'
- '*.rst'
jobs:
tests:
# runs the test suite
name: ${{ matrix.name }}
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
include:
- {name: '3.13', python: '3.13', os: ubuntu-latest, tox: py313}
- {name: '3.12', python: '3.12', os: ubuntu-latest, tox: py312}
- {name: '3.11', python: '3.11', os: ubuntu-latest, tox: py311}
- {name: '3.10', python: '3.10', os: ubuntu-latest, tox: py310}
- {name: '3.9', python: '3.9', os: ubuntu-latest, tox: py39}
- {name: '3.8', python: '3.8', os: ubuntu-latest, tox: py38}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
- name: update pip
run: |
python -m pip install --upgrade pip
pip install --upgrade setuptools wheel
- name: get pip cache dir
id: pip-cache
run: echo "dir=$(pip cache dir)" >> $GITHUB_OUTPUT
- name: cache pip dependencies
uses: actions/cache@v3
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: pip|${{ runner.os }}|${{ matrix.python }}|${{ hashFiles('setup.py') }}
- run: pip install tox
- run: tox -e ${{ matrix.tox }}
- name: Upload coverage.xml
if: ${{ matrix.python == '3.10' }}
uses: actions/upload-artifact@v4
with:
name: graphene-coverage
path: coverage.xml
if-no-files-found: error
- name: Upload coverage.xml to codecov
if: ${{ matrix.python == '3.10' }}
uses: codecov/codecov-action@v4

15
.gitignore vendored
View File

@ -10,6 +10,9 @@ __pycache__/
# Distribution / packaging # Distribution / packaging
.Python .Python
env/
venv/
.venv/
build/ build/
develop-eggs/ develop-eggs/
dist/ dist/
@ -44,8 +47,7 @@ htmlcov/
.pytest_cache .pytest_cache
nosetests.xml nosetests.xml
coverage.xml coverage.xml
*.cover *,cover
.pytest_cache/
# Translations # Translations
*.mo *.mo
@ -60,14 +62,6 @@ docs/_build/
# PyBuilder # PyBuilder
target/ target/
# VirtualEnv
.env
.venv
env/
venv/
# Typing
.mypy_cache/
/tests/django.sqlite /tests/django.sqlite
@ -90,4 +84,3 @@ venv/
*.sqlite3 *.sqlite3
.vscode .vscode
.mypy_cache .mypy_cache
.ruff_cache

2
.isort.cfg Normal file
View File

@ -0,0 +1,2 @@
[settings]
known_third_party = aniso8601,graphql,graphql_relay,promise,pytest,pytz,pyutils,setuptools,six,snapshottest,sphinx_graphene_theme

View File

@ -1,9 +1,6 @@
default_language_version:
python: python3.10
repos: repos:
- repo: https://github.com/pre-commit/pre-commit-hooks - repo: git://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0 rev: v2.1.0
hooks: hooks:
- id: check-merge-conflict - id: check-merge-conflict
- id: check-json - id: check-json
@ -17,13 +14,15 @@ repos:
- id: trailing-whitespace - id: trailing-whitespace
exclude: README.md exclude: README.md
- repo: https://github.com/asottile/pyupgrade - repo: https://github.com/asottile/pyupgrade
rev: v2.37.3 rev: v1.12.0
hooks: hooks:
- id: pyupgrade - id: pyupgrade
- repo: https://github.com/astral-sh/ruff-pre-commit - repo: https://github.com/ambv/black
# Ruff version. rev: 18.9b0
rev: v0.5.0
hooks: hooks:
- id: ruff - id: black
- id: ruff-format language_version: python3
args: [ --check ] - repo: https://github.com/PyCQA/flake8
rev: 3.7.7
hooks:
- id: flake8

43
.travis.yml Normal file
View File

@ -0,0 +1,43 @@
language: python
dist: xenial
python:
- "2.7"
- "3.5"
- "3.6"
- "3.7"
install:
- pip install tox tox-travis
script: tox
after_success:
- pip install coveralls
- coveralls
cache:
directories:
- $HOME/.cache/pip
- $HOME/.cache/pre-commit
stages:
- test
- name: deploy
if: tag IS present
jobs:
fast_finish: true
include:
- env: TOXENV=pre-commit
python: 3.7
- env: TOXENV=mypy
python: 3.7
- stage: deploy
python: 3.7
after_success: true
deploy:
provider: pypi
user: syrusakbary
on:
tags: true
password:
secure: LHOp9DvYR+70vj4YVY8+JRNCKUOfYZREEUY3+4lMUpY7Zy5QwDfgEMXG64ybREH9dFldpUqVXRj53eeU3spfudSfh8NHkgqW7qihez2AhSnRc4dK6ooNfB+kLcSoJ4nUFGxdYImABc4V1hJvflGaUkTwDNYVxJF938bPaO797IvSbuI86llwqkvuK2Vegv9q/fy9sVGaF9VZIs4JgXwR5AyDR7FBArl+S84vWww4vTFD33hoE88VR4QvFY3/71BwRtQrnCMm7AOm31P9u29yi3bpzQpiOR2rHsgrsYdm597QzFKVxYwsmf9uAx2bpbSPy2WibunLePIvOFwm8xcfwnz4/J4ONBc5PSFmUytTWpzEnxb0bfUNLuYloIS24V6OZ8BfAhiYZ1AwySeJCQDM4Vk1V8IF6trTtyx5EW/uV9jsHCZ3LFsAD7UnFRTosIgN3SAK3ZWCEk5oF2IvjecsolEfkRXB3q9EjMkkuXRUeFDH2lWJLgNE27BzY6myvZVzPmfwZUsPBlPD/6w+WLSp97Rjgr9zS3T1d4ddqFM4ZYu04f2i7a/UUQqG+itzzuX5DWLPvzuNt37JB45mB9IsvxPyXZ6SkAcLl48NGyKok1f3vQnvphkfkl4lni29woKhaau8xlsuEDrcwOoeAsVcZXiItg+l+z2SlIwM0A06EvQ=
distributions: "sdist bdist_wheel"

3
CODEOWNERS Normal file
View File

@ -0,0 +1,3 @@
* @ekampf @dan98765 @projectcheshire @jkimbo
/docs/ @dvndrsn @phalt @changeling
/examples/ @dvndrsn @phalt @changeling

View File

@ -5,11 +5,10 @@ help:
.PHONY: install-dev ## Install development dependencies .PHONY: install-dev ## Install development dependencies
install-dev: install-dev:
pip install -e ".[dev]" pip install -e ".[test]"
.PHONY: test ## Run tests
test: test:
py.test graphene examples py.test graphene
.PHONY: docs ## Generate docs .PHONY: docs ## Generate docs
docs: install-dev docs: install-dev
@ -18,11 +17,3 @@ docs: install-dev
.PHONY: docs-live ## Generate docs with live reloading .PHONY: docs-live ## Generate docs with live reloading
docs-live: install-dev docs-live: install-dev
cd docs && make install && make livehtml cd docs && make install && make livehtml
.PHONY: format
format:
black graphene examples setup.py
.PHONY: lint
lint:
flake8 graphene examples setup.py

View File

@ -1,16 +1,16 @@
# ![Graphene Logo](http://graphene-python.org/favicon.png) [Graphene](http://graphene-python.org) [![PyPI version](https://badge.fury.io/py/graphene.svg)](https://badge.fury.io/py/graphene) [![Coverage Status](https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github)](https://coveralls.io/github/graphql-python/graphene?branch=master) [![](https://dcbadge.vercel.app/api/server/T6Gp6NFYHe?style=flat)](https://discord.gg/T6Gp6NFYHe) **We are looking for contributors**! Please check the [ROADMAP](https://github.com/graphql-python/graphene/blob/master/ROADMAP.md) to see how you can help ❤️
[💬 Join the community on Discord](https://discord.gg/T6Gp6NFYHe) ---
**We are looking for contributors**! Please check the current issues to see how you can help ❤️ # ![Graphene Logo](http://graphene-python.org/favicon.png) [Graphene](http://graphene-python.org) [![Build Status](https://travis-ci.org/graphql-python/graphene.svg?branch=master)](https://travis-ci.org/graphql-python/graphene) [![PyPI version](https://badge.fury.io/py/graphene.svg)](https://badge.fury.io/py/graphene) [![Coverage Status](https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github)](https://coveralls.io/github/graphql-python/graphene?branch=master)
## Introduction ## Introduction
[Graphene](http://graphene-python.org) is an opinionated Python library for building GraphQL schemas/types fast and easily. [Graphene](http://graphene-python.org) is a Python library for building GraphQL schemas/types fast and easily.
- **Easy to use:** Graphene helps you use GraphQL in Python without effort. - **Easy to use:** Graphene helps you use GraphQL in Python without effort.
- **Relay:** Graphene has builtin support for Relay. - **Relay:** Graphene has builtin support for Relay.
- **Data agnostic:** Graphene supports any kind of data source: SQL (Django, SQLAlchemy), Mongo, custom Python objects, etc. - **Data agnostic:** Graphene supports any kind of data source: SQL (Django, SQLAlchemy), NoSQL, custom Python objects, etc.
We believe that by providing a complete API you could plug Graphene anywhere your data lives and make your data available We believe that by providing a complete API you could plug Graphene anywhere your data lives and make your data available
through GraphQL. through GraphQL.
@ -20,28 +20,30 @@ Graphene has multiple integrations with different frameworks:
| integration | Package | | integration | Package |
| ----------------- | --------------------------------------------------------------------------------------- | | ----------------- | --------------------------------------------------------------------------------------- |
| SQLAlchemy | [graphene-sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy/) |
| Mongo | [graphene-mongo](https://github.com/graphql-python/graphene-mongo/) |
| Apollo Federation | [graphene-federation](https://github.com/graphql-python/graphene-federation/) |
| Django | [graphene-django](https://github.com/graphql-python/graphene-django/) | | Django | [graphene-django](https://github.com/graphql-python/graphene-django/) |
| SQLAlchemy | [graphene-sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy/) |
| Google App Engine | [graphene-gae](https://github.com/graphql-python/graphene-gae/) |
| Peewee | _In progress_ ([Tracking Issue](https://github.com/graphql-python/graphene/issues/289)) |
Also, Graphene is fully compatible with the GraphQL spec, working seamlessly with all GraphQL clients, such as [Relay](https://github.com/facebook/relay), [Apollo](https://github.com/apollographql/apollo-client) and [gql](https://github.com/graphql-python/gql). Also, Graphene is fully compatible with the GraphQL spec, working seamlessly with all GraphQL clients, such as [Relay](https://github.com/facebook/relay), [Apollo](https://github.com/apollographql/apollo-client) and [gql](https://github.com/graphql-python/gql).
## Installation ## Installation
To install `graphene`, just run this command in your shell For instaling graphene, just run this command in your shell
```bash ```bash
pip install "graphene>=3.1" pip install "graphene>=2.0"
``` ```
## 2.0 Upgrade Guide
Please read [UPGRADE-v2.0.md](/UPGRADE-v2.0.md) to learn how to upgrade.
## Examples ## Examples
Here is one example for you to get started: Here is one example for you to get started:
```python ```python
import graphene
class Query(graphene.ObjectType): class Query(graphene.ObjectType):
hello = graphene.String(description='A typical hello world') hello = graphene.String(description='A typical hello world')
@ -85,24 +87,18 @@ pip install -e ".[test]"
Well-written tests and maintaining good test coverage is important to this project. While developing, run new and existing tests with: Well-written tests and maintaining good test coverage is important to this project. While developing, run new and existing tests with:
```sh ```sh
pytest graphene/relay/tests/test_node.py # Single file py.test graphene/relay/tests/test_node.py # Single file
pytest graphene/relay # All tests in directory py.test graphene/relay # All tests in directory
``` ```
Add the `-s` flag if you have introduced breakpoints into the code for debugging. Add the `-s` flag if you have introduced breakpoints into the code for debugging.
Add the `-v` ("verbose") flag to get more detailed test output. For even more detailed output, use `-vv`. Add the `-v` ("verbose") flag to get more detailed test output. For even more detailed output, use `-vv`.
Check out the [pytest documentation](https://docs.pytest.org/en/latest/) for more options and test running controls. Check out the [pytest documentation](https://docs.pytest.org/en/latest/) for more options and test running controls.
Regularly ensure your `pre-commit` hooks are up to date and enabled:
```sh
pre-commit install
```
You can also run the benchmarks with: You can also run the benchmarks with:
```sh ```sh
pytest graphene --benchmark-only py.test graphene --benchmark-only
``` ```
Graphene supports several versions of Python. To make sure that changes do not break compatibility with any of those versions, we use `tox` to create virtualenvs for each Python version and run tests with that version. To run against all Python versions defined in the `tox.ini` config file, just run: Graphene supports several versions of Python. To make sure that changes do not break compatibility with any of those versions, we use `tox` to create virtualenvs for each Python version and run tests with that version. To run against all Python versions defined in the `tox.ini` config file, just run:
@ -114,10 +110,10 @@ tox
If you wish to run against a specific version defined in the `tox.ini` file: If you wish to run against a specific version defined in the `tox.ini` file:
```sh ```sh
tox -e py39 tox -e py36
``` ```
Tox can only use whatever versions of Python are installed on your system. When you create a pull request, GitHub Actions pipelines will also be running the same tests and report the results, so there is no need for potential contributors to try to install every single version of Python on their own system ahead of time. We appreciate opening issues and pull requests to make graphene even more stable & useful! Tox can only use whatever versions of Python are installed on your system. When you create a pull request, Travis will also be running the same tests and report the results, so there is no need for potential contributors to try to install every single version of Python on their own system ahead of time. We appreciate opening issues and pull requests to make graphene even more stable & useful!
### Building Documentation ### Building Documentation

177
README.rst Normal file
View File

@ -0,0 +1,177 @@
**We are looking for contributors**! Please check the
`ROADMAP <https://github.com/graphql-python/graphene/blob/master/ROADMAP.md>`__
to see how you can help ❤️
--------------
|Graphene Logo| `Graphene <http://graphene-python.org>`__ |Build Status| |PyPI version| |Coverage Status|
=========================================================================================================
Introduction
------------
`Graphene <http://graphene-python.org>`__ is a Python library for
building GraphQL schemas/types fast and easily.
- **Easy to use:** Graphene helps you use GraphQL in Python without
effort.
- **Relay:** Graphene has builtin support for Relay.
- **Data agnostic:** Graphene supports any kind of data source: SQL
(Django, SQLAlchemy), NoSQL, custom Python objects, etc. We believe
that by providing a complete API you could plug Graphene anywhere
your data lives and make your data available through GraphQL.
Integrations
------------
Graphene has multiple integrations with different frameworks:
+---------------------+----------------------------------------------------------------------------------------------+
| integration | Package |
+=====================+==============================================================================================+
| Django | `graphene-django <https://github.com/graphql-python/graphene-django/>`__ |
+---------------------+----------------------------------------------------------------------------------------------+
| SQLAlchemy | `graphene-sqlalchemy <https://github.com/graphql-python/graphene-sqlalchemy/>`__ |
+---------------------+----------------------------------------------------------------------------------------------+
| Google App Engine | `graphene-gae <https://github.com/graphql-python/graphene-gae/>`__ |
+---------------------+----------------------------------------------------------------------------------------------+
| Peewee | *In progress* (`Tracking Issue <https://github.com/graphql-python/graphene/issues/289>`__) |
+---------------------+----------------------------------------------------------------------------------------------+
Also, Graphene is fully compatible with the GraphQL spec, working
seamlessly with all GraphQL clients, such as
`Relay <https://github.com/facebook/relay>`__,
`Apollo <https://github.com/apollographql/apollo-client>`__ and
`gql <https://github.com/graphql-python/gql>`__.
Installation
------------
To install graphene, just run this command in your shell
.. code:: bash
pip install "graphene>=2.0"
2.0 Upgrade Guide
-----------------
Please read `UPGRADE-v2.0.md </UPGRADE-v2.0.md>`__ to learn how to
upgrade.
Examples
--------
Here is one example for you to get started:
.. code:: python
class Query(graphene.ObjectType):
hello = graphene.String(description='A typical hello world')
def resolve_hello(self, info):
return 'World'
schema = graphene.Schema(query=Query)
Then Querying ``graphene.Schema`` is as simple as:
.. code:: python
query = '''
query SayHello {
hello
}
'''
result = schema.execute(query)
If you want to learn even more, you can also check the following
`examples <examples/>`__:
- **Basic Schema**: `Starwars example <examples/starwars>`__
- **Relay Schema**: `Starwars Relay
example <examples/starwars_relay>`__
Documentation
-------------
Documentation and links to additional resources are available at
https://docs.graphene-python.org/en/latest/
Contributing
------------
After cloning this repo, create a
`virtualenv <https://virtualenv.pypa.io/en/stable/>`__ and ensure
dependencies are installed by running:
.. code:: sh
virtualenv venv
source venv/bin/activate
pip install -e ".[test]"
Well-written tests and maintaining good test coverage is important to
this project. While developing, run new and existing tests with:
.. code:: sh
py.test graphene/relay/tests/test_node.py # Single file
py.test graphene/relay # All tests in directory
Add the ``-s`` flag if you have introduced breakpoints into the code for
debugging. Add the ``-v`` ("verbose") flag to get more detailed test
output. For even more detailed output, use ``-vv``. Check out the
`pytest documentation <https://docs.pytest.org/en/latest/>`__ for more
options and test running controls.
You can also run the benchmarks with:
.. code:: sh
py.test graphene --benchmark-only
Graphene supports several versions of Python. To make sure that changes
do not break compatibility with any of those versions, we use ``tox`` to
create virtualenvs for each Python version and run tests with that
version. To run against all Python versions defined in the ``tox.ini``
config file, just run:
.. code:: sh
tox
If you wish to run against a specific version defined in the ``tox.ini``
file:
.. code:: sh
tox -e py36
Tox can only use whatever versions of Python are installed on your
system. When you create a pull request, Travis will also be running the
same tests and report the results, so there is no need for potential
contributors to try to install every single version of Python on their
own system ahead of time. We appreciate opening issues and pull requests
to make graphene even more stable & useful!
Building Documentation
~~~~~~~~~~~~~~~~~~~~~~
The documentation is generated using the excellent
`Sphinx <http://www.sphinx-doc.org/>`__ and a custom theme.
An HTML version of the documentation is produced by running:
.. code:: sh
make docs
.. |Graphene Logo| image:: http://graphene-python.org/favicon.png
.. |Build Status| image:: https://travis-ci.org/graphql-python/graphene.svg?branch=master
:target: https://travis-ci.org/graphql-python/graphene
.. |PyPI version| image:: https://badge.fury.io/py/graphene.svg
:target: https://badge.fury.io/py/graphene
.. |Coverage Status| image:: https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github
:target: https://coveralls.io/github/graphql-python/graphene?branch=master

54
ROADMAP.md Normal file
View File

@ -0,0 +1,54 @@
# GraphQL Python Roadmap
In order to move Graphene and the GraphQL Python ecosystem forward it's essential to be clear with the community on next steps, so we can move uniformly.
_👋 If you have more ideas on how to move the Graphene ecosystem forward, don't hesitate to [open a PR](https://github.com/graphql-python/graphene/edit/master/ROADMAP.md)_
## Now
- [ ] Continue to support v2.x with security releases
- [ ] Last major/feature release is cut and graphene-* libraries should pin to that version number
## Next
New features will only be developed on version 3 of ecosystem libraries.
### [Core-Next](https://github.com/graphql-python/graphql-core-next)
Targeted as v3 of [graphql-core](https://pypi.org/project/graphql-core/), Python 3 only
### Graphene
- [ ] Integrate with the core-next API and resolve all breaking changes
- [ ] GraphQL types from type annotations - [See issue](https://github.com/graphql-python/graphene/issues/729)
- [ ] Add support for coroutines in Connection, Mutation (abstracting out Promise requirement) - [See PR](https://github.com/graphql-python/graphene/pull/824)
### Graphene-*
- [ ] Integrate with the graphene core-next API and resolve all breaking changes
### *-graphql
- [ ] Integrate with the graphql core-next API and resolve all breaking changes
## Ongoing Initiatives
- [ ] Improve documentation, especially for new users to the library
- [ ] Recipes for “quick start” that people can ideally use/run
## Dependent Libraries
| Repo | Release Manager | CODEOWNERS | Pinned | next/master created | Labels Standardized |
| ---------------------------------------------------------------------------- | --------------- | ---------- | ---------- | ------------------- | ------------------- |
| [graphene](https://github.com/graphql-python/graphene) | ekampf | ✅ | | ✅ | |
| [graphql-core](https://github.com/graphql-python/graphql-core) | Cito | ✅ | N/A | N/A | |
| [graphql-core-next](https://github.com/graphql-python/graphql-core-next) | Cito | ✅ | N/A | N/A | |
| [graphql-server-core](https://github.com/graphql-python/graphql-server-core) | Cito | | ✅ | ✅ | |
| [gql](https://github.com/graphql-python/gql) | ekampf | | | | |
| [gql-next](https://github.com/graphql-python/gql-next) | ekampf | | N/A | N/A | |
| ...[aiohttp](https://github.com/graphql-python/aiohttp-graphql) | | | | | |
| ...[django](https://github.com/graphql-python/graphene-django) | mvanlonden | | ✅ | ✅ | |
| ...[sanic](https://github.com/graphql-python/sanic-graphql) | ekampf | | | | |
| ...[flask](https://github.com/graphql-python/flask-graphql) | | | | | |
| ...[webob](https://github.com/graphql-python/webob-graphql) | | | | | |
| ...[tornado](https://github.com/graphql-python/graphene-tornado) | ewhauser | | PR created | ✅ | |
| ...[ws](https://github.com/graphql-python/graphql-ws) | Cito/dfee | | ✅ | ✅ | |
| ...[gae](https://github.com/graphql-python/graphene-gae) | ekampf | | PR created | ✅ | |
| ...[sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy) | jnak/Nabell | ✅ | ✅ | ✅ | |
| ...[mongo](https://github.com/graphql-python/graphene-mongo) | | | ✅ | ✅ | |
| ...[relay-py](https://github.com/graphql-python/graphql-relay-py) | Cito | | | | |
| ...[wsgi](https://github.com/moritzmhmk/wsgi-graphql) | | | | | |

View File

@ -1,15 +0,0 @@
# Security Policy
## Supported Versions
Support for security issues is currently provided for Graphene 3.0 and above. Support on earlier versions cannot be guaranteed by the maintainers of this library, but community PRs may be accepted in critical cases.
The preferred mitigation strategy is via an upgrade to Graphene 3.
| Version | Supported |
| ------- | ------------------ |
| 3.x | :white_check_mark: |
| <3.x | :x: |
## Reporting a Vulnerability
Please use responsible disclosure by contacting a core maintainer via Discord or E-Mail.

View File

@ -153,7 +153,7 @@ class Query(ObjectType):
``` ```
Also, if you wanted to create an `ObjectType` that implements `Node`, you have to do it Also, if you wanted to create an `ObjectType` that implements `Node`, you have to do it
explicitly. explicity.
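For readers skimming this hunk, here is a minimal hedged sketch of what that explicit declaration looks like, using a made-up `Ship` type and graphene's relay `Node` interface:

```python
from graphene import ObjectType, String
from graphene.relay import Node

class Ship(ObjectType):
    class Meta:
        # the Node interface must now be declared explicitly
        interfaces = (Node,)

    name = String()
```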
## Django ## Django

View File

@ -123,7 +123,7 @@ def resolve_my_field(root, info, my_arg):
return ... return ...
``` ```
**PS.: Take care with receiving args like `my_arg` as above. This doesn't work for optional (non-required) arguments as standard `Connection`'s arguments (first, last, after, before).** **PS.: Take care with receiving args like `my_arg` as above. This doesn't work for optional (non-required) arguments as stantard `Connection`'s arguments (first, before, after, before).**
You may need something like this: You may need something like this:
```python ```python
@ -377,7 +377,10 @@ class Base(ObjectType):
id = ID() id = ID()
def resolve_id(root, info): def resolve_id(root, info):
return f"{root.__class__.__name__}_{root.id}" return "{type}_{id}".format(
type=root.__class__.__name__,
id=root.id
)
``` ```
### UUID Scalar ### UUID Scalar

7
bin/autolinter Executable file
View File

@ -0,0 +1,7 @@
#!/bin/bash
# Install the required scripts with
# pip install autoflake autopep8 isort
autoflake ./examples/ ./graphene/ -r --remove-unused-variables --remove-all-unused-imports --in-place
autopep8 ./examples/ ./graphene/ -r --in-place --experimental --aggressive --max-line-length 120
isort -rc ./examples/ ./graphene/

View File

@ -20,8 +20,6 @@ Object types
.. autoclass:: graphene.Mutation .. autoclass:: graphene.Mutation
:members: :members:
.. _fields-mounted-types:
Fields (Mounted Types) Fields (Mounted Types)
---------------------- ----------------------
@ -64,8 +62,6 @@ Graphene Scalars
.. autoclass:: graphene.JSONString() .. autoclass:: graphene.JSONString()
.. autoclass:: graphene.Base64()
Enum Enum
---- ----
@ -92,7 +88,7 @@ Execution Metadata
.. autoclass:: graphene.Context .. autoclass:: graphene.Context
.. autoclass:: graphql.ExecutionResult .. autoclass:: graphql.execution.base.ExecutionResult
.. Relay .. Relay
.. ----- .. -----

View File

@ -1,5 +1,4 @@
import os import os
import sys
import sphinx_graphene_theme import sphinx_graphene_theme
@ -23,6 +22,8 @@ on_rtd = os.environ.get("READTHEDOCS", None) == "True"
# add these directories to sys.path here. If the directory is relative to the # add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here. # documentation root, use os.path.abspath to make it absolute, like shown here.
# #
import os
import sys
sys.path.insert(0, os.path.abspath("..")) sys.path.insert(0, os.path.abspath(".."))
@ -63,25 +64,25 @@ source_suffix = ".rst"
master_doc = "index" master_doc = "index"
# General information about the project. # General information about the project.
project = "Graphene" project = u"Graphene"
copyright = "Graphene 2016" copyright = u"Graphene 2016"
author = "Syrus Akbary" author = u"Syrus Akbary"
# The version info for the project you're documenting, acts as replacement for # The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the # |version| and |release|, also used in various other places throughout the
# built documents. # built documents.
# #
# The short X.Y version. # The short X.Y version.
version = "1.0" version = u"1.0"
# The full version, including alpha/beta/rc tags. # The full version, including alpha/beta/rc tags.
release = "1.0" release = u"1.0"
# The language for content autogenerated by Sphinx. Refer to documentation # The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages. # for a list of supported languages.
# #
# This is also used if you do content translation via gettext catalogs. # This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases. # Usually you set "language" from the command line for these cases.
# language = None language = None
# There are two options for replacing |today|: either, you set today to some # There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used: # non-false value, then it is used:
@ -277,7 +278,7 @@ latex_elements = {
# (source start file, target name, title, # (source start file, target name, title,
# author, documentclass [howto, manual, or own class]). # author, documentclass [howto, manual, or own class]).
latex_documents = [ latex_documents = [
(master_doc, "Graphene.tex", "Graphene Documentation", "Syrus Akbary", "manual") (master_doc, "Graphene.tex", u"Graphene Documentation", u"Syrus Akbary", "manual")
] ]
# The name of an image file (relative to this directory) to place at the top of # The name of an image file (relative to this directory) to place at the top of
@ -317,7 +318,7 @@ latex_documents = [
# One entry per manual page. List of tuples # One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section). # (source start file, name, description, authors, manual section).
man_pages = [(master_doc, "graphene", "Graphene Documentation", [author], 1)] man_pages = [(master_doc, "graphene", u"Graphene Documentation", [author], 1)]
# If true, show URL addresses after external links. # If true, show URL addresses after external links.
# #
@ -333,7 +334,7 @@ texinfo_documents = [
( (
master_doc, master_doc,
"Graphene", "Graphene",
"Graphene Documentation", u"Graphene Documentation",
author, author,
"Graphene", "Graphene",
"One line description of project.", "One line description of project.",
@ -455,4 +456,5 @@ intersphinx_mapping = {
"http://docs.graphene-python.org/projects/sqlalchemy/en/latest/", "http://docs.graphene-python.org/projects/sqlalchemy/en/latest/",
None, None,
), ),
"graphene_gae": ("http://docs.graphene-python.org/projects/gae/en/latest/", None),
} }

View File

@ -4,7 +4,7 @@ Dataloader
DataLoader is a generic utility to be used as part of your application's DataLoader is a generic utility to be used as part of your application's
data fetching layer to provide a simplified and consistent API over data fetching layer to provide a simplified and consistent API over
various remote data sources such as databases or web services via batching various remote data sources such as databases or web services via batching
and caching. It is provided by a separate package `aiodataloader <https://pypi.org/project/aiodataloader/>`. and caching.
Batching Batching
@ -15,31 +15,32 @@ Create loaders by providing a batch loading function.
.. code:: python .. code:: python
from aiodataloader import DataLoader from promise import Promise
from promise.dataloader import DataLoader
class UserLoader(DataLoader): class UserLoader(DataLoader):
async def batch_load_fn(self, keys): def batch_load_fn(self, keys):
# Here we call a function to return a user for each key in keys # Here we return a promise that will result on the
return [get_user(id=key) for key in keys] # corresponding user for each key in keys
return Promise.resolve([get_user(id=key) for key in keys])
A batch loading async function accepts a list of keys, and returns a list of ``values``. A batch loading function accepts a list of keys, and returns a ``Promise``
which resolves to a list of ``values``.
Then load individual values from the loader. ``DataLoader`` will coalesce all
``DataLoader`` will coalesce all individual loads which occur within a individual loads which occur within a single frame of execution (executed once
single frame of execution (executed once the wrapping event loop is resolved) the wrapping promise is resolved) and then call your batch function with all
and then call your batch function with all requested keys. requested keys.
.. code:: python .. code:: python
user_loader = UserLoader() user_loader = UserLoader()
user1 = await user_loader.load(1) user_loader.load(1).then(lambda user: user_loader.load(user.best_friend_id))
user1_best_friend = await user_loader.load(user1.best_friend_id)
user2 = await user_loader.load(2) user_loader.load(2).then(lambda user: user_loader.load(user.best_friend_id))
user2_best_friend = await user_loader.load(user2.best_friend_id)
A naive application may have issued *four* round-trips to a backend for the A naive application may have issued *four* round-trips to a backend for the
@ -53,9 +54,9 @@ make sure that you then order the query result for the results to match the keys
.. code:: python .. code:: python
class UserLoader(DataLoader): class UserLoader(DataLoader):
async def batch_load_fn(self, keys): def batch_load_fn(self, keys):
users = {user.id: user for user in User.objects.filter(id__in=keys)} users = {user.id: user for user in User.objects.filter(id__in=keys)}
return [users.get(user_id) for user_id in keys] return Promise.resolve([users.get(user_id) for user_id in keys])
``DataLoader`` allows you to decouple unrelated parts of your application without ``DataLoader`` allows you to decouple unrelated parts of your application without
@ -95,7 +96,7 @@ Consider the following GraphQL request:
} }
If ``me``, ``bestFriend`` and ``friends`` each need to send a request to the backend, Naively, if ``me``, ``bestFriend`` and ``friends`` each need to request the backend,
there could be at most 13 database requests! there could be at most 13 database requests!
@ -110,8 +111,8 @@ leaner code and at most 4 database requests, and possibly fewer if there are cac
best_friend = graphene.Field(lambda: User) best_friend = graphene.Field(lambda: User)
friends = graphene.List(lambda: User) friends = graphene.List(lambda: User)
async def resolve_best_friend(root, info): def resolve_best_friend(root, info):
return await user_loader.load(root.best_friend_id) return user_loader.load(root.best_friend_id)
async def resolve_friends(root, info): def resolve_friends(root, info):
return await user_loader.load_many(root.friend_ids) return user_loader.load_many(root.friend_ids)
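For orientation, here is a self-contained sketch of the async ``DataLoader`` flow the master-side text above describes, assuming the ``aiodataloader`` package and a hypothetical ``get_user`` helper:

.. code:: python

    import asyncio
    from aiodataloader import DataLoader

    def get_user(id):
        # hypothetical data-access helper; stands in for a real database lookup
        return {"id": id, "name": f"user {id}"}

    class UserLoader(DataLoader):
        async def batch_load_fn(self, keys):
            # called once with every key requested in the current execution frame
            return [get_user(id=key) for key in keys]

    async def main():
        loader = UserLoader()
        # both loads are coalesced into a single batch_load_fn call
        user1, user2 = await asyncio.gather(loader.load(1), loader.load(2))
        print(user1, user2)

    asyncio.run(main())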

View File

@ -3,7 +3,8 @@
Executing a query Executing a query
================= =================
For executing a query against a schema, you can directly call the ``execute`` method on it.
For executing a query a schema, you can directly call the ``execute`` method on it.
.. code:: python .. code:: python
@ -85,7 +86,7 @@ Value used for :ref:`ResolverParamParent` in root queries and mutations can be o
return {'id': root.id, 'firstName': root.name} return {'id': root.id, 'firstName': root.name}
schema = Schema(Query) schema = Schema(Query)
user_root = User(id=12, name='bob') user_root = User(id=12, name='bob'}
result = schema.execute( result = schema.execute(
''' '''
query getUser { query getUser {
@ -110,7 +111,7 @@ If there are multiple operations defined in a query string, ``operation_name`` s
from graphene import ObjectType, Field, Schema from graphene import ObjectType, Field, Schema
class Query(ObjectType): class Query(ObjectType):
user = Field(User) me = Field(User)
def resolve_user(root, info): def resolve_user(root, info):
return get_user_by_id(12) return get_user_by_id(12)
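A minimal sketch of the ``operation_name`` behaviour described above, assuming current graphene and a made-up ``Query`` type:

.. code:: python

    from graphene import ObjectType, String, Schema

    class Query(ObjectType):
        hello = String()

        def resolve_hello(root, info):
            return "World"

    schema = Schema(query=Query)

    document = '''
        query SayHello { hello }
        query JustTypename { __typename }
    '''
    # with more than one operation in the document, operation_name selects which one runs
    result = schema.execute(document, operation_name="SayHello")
    print(result.data["hello"])  # World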

View File

@ -1,8 +0,0 @@
File uploading
==============
File uploading is not part of the official GraphQL spec yet and is not natively
implemented in Graphene.
If your server needs to support file uploading then you can use the library: `graphene-file-upload <https://github.com/lmcgartland/graphene-file-upload>`_ which enhances Graphene to add file
uploads and conforms to the unofficial GraphQL `multipart request spec <https://github.com/jaydenseric/graphql-multipart-request-spec>`_.

View File

@ -8,6 +8,3 @@ Execution
execute execute
middleware middleware
dataloader dataloader
fileuploading
subscriptions
queryvalidation

View File

@ -29,7 +29,7 @@ This middleware only continues evaluation if the ``field_name`` is not ``'user'`
.. code:: python .. code:: python
class AuthorizationMiddleware(object): class AuthorizationMiddleware(object):
def resolve(self, next, root, info, **args): def resolve(next, root, info, **args):
if info.field_name == 'user': if info.field_name == 'user':
return None return None
return next(root, info, **args) return next(root, info, **args)
@ -41,14 +41,12 @@ And then execute it with:
result = schema.execute('THE QUERY', middleware=[AuthorizationMiddleware()]) result = schema.execute('THE QUERY', middleware=[AuthorizationMiddleware()])
If the ``middleware`` argument includes multiple middlewares,
these middlewares will be executed bottom-up, i.e. from last to first.
Functional example Functional example
------------------ ------------------
Middleware can also be defined as a function. Here we define a middleware that Middleware can also be defined as a function. Here we define a middleware that
logs the time it takes to resolve each field: logs the time it takes to resolve each field
.. code:: python .. code:: python
@ -57,9 +55,12 @@ logs the time it takes to resolve each field:
def timing_middleware(next, root, info, **args): def timing_middleware(next, root, info, **args):
start = timer() start = timer()
return_value = next(root, info, **args) return_value = next(root, info, **args)
duration = round((timer() - start) * 1000, 2) duration = timer() - start
parent_type_name = root._meta.name if root and hasattr(root, '_meta') else '' logger.debug("{parent_type}.{field_name}: {duration} ms".format(
logger.debug(f"{parent_type_name}.{info.field_name}: {duration} ms") parent_type=root._meta.name if root and hasattr(root, '_meta') else '',
field_name=info.field_name,
duration=round(duration * 1000, 2)
))
return return_value return return_value
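To make the hook concrete, here is a runnable sketch (current graphene assumed) of the class-based middleware described above applied to a small schema:

.. code:: python

    from graphene import ObjectType, String, Schema

    class Query(ObjectType):
        user = String()
        greeting = String()

        def resolve_user(root, info):
            return "secret"

        def resolve_greeting(root, info):
            return "hi"

    class AuthorizationMiddleware:
        def resolve(self, next, root, info, **args):
            # short-circuit resolution of the "user" field
            if info.field_name == "user":
                return None
            return next(root, info, **args)

    schema = Schema(query=Query)
    result = schema.execute("{ user greeting }", middleware=[AuthorizationMiddleware()])
    print(result.data)  # {'user': None, 'greeting': 'hi'}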

View File

@ -1,123 +0,0 @@
Query Validation
================
GraphQL uses query validators to check whether a query AST is valid and can be executed. Every GraphQL server implements
standard query validators. For example, there is a validator that tests whether a queried field exists on the queried type and
makes the query fail with a "Cannot query field on type" error if it doesn't.
To help with common use cases, graphene provides a few validation rules out of the box.
Depth limit Validator
---------------------
The depth limit validator helps to prevent execution of malicious
queries. It takes in the following arguments.
- ``max_depth`` is the maximum allowed depth for any operation in a GraphQL document.
- ``ignore`` Stops recursive depth checking based on a field name. Either a string or regexp to match the name, or a function that returns a boolean
- ``callback`` Called each time validation runs. Receives an Object which is a map of the depths for each operation.
Usage
-----
Here is how you would implement depth-limiting on your schema.
.. code:: python
from graphql import validate, parse
from graphene import ObjectType, Schema, String
from graphene.validation import depth_limit_validator
class MyQuery(ObjectType):
name = String(required=True)
schema = Schema(query=MyQuery)
# queries which have a depth more than 20
# will not be executed.
validation_errors = validate(
schema=schema.graphql_schema,
document_ast=parse('THE QUERY'),
rules=(
depth_limit_validator(
max_depth=20
),
)
)
Disable Introspection
---------------------
The disable introspection validation rule ensures that your schema cannot be introspected.
This is a useful security measure in production environments.
Usage
-----
Here is how you would disable introspection for your schema.
.. code:: python
from graphql import validate, parse
from graphene import ObjectType, Schema, String
from graphene.validation import DisableIntrospection
class MyQuery(ObjectType):
name = String(required=True)
schema = Schema(query=MyQuery)
# introspection queries will not be executed.
validation_errors = validate(
schema=schema.graphql_schema,
document_ast=parse('THE QUERY'),
rules=(
DisableIntrospection,
)
)
Implementing custom validators
------------------------------
All custom query validators should extend the `ValidationRule <https://github.com/graphql-python/graphql-core/blob/v3.0.5/src/graphql/validation/rules/__init__.py#L37>`_
base class importable from the graphql.validation.rules module. Query validators are visitor classes. They are
instantiated at the time of query validation with one required argument (context: ASTValidationContext). In order to
perform validation, your validator class should define one or more of enter_* and leave_* methods. For possible
enter/leave items as well as details on function documentation, please see contents of the visitor module. To make
validation fail, you should call the validator's report_error method with an instance of GraphQLError describing the failure
reason. Here is an example query validator that visits field definitions in a GraphQL query and fails query validation
if any of those fields are blacklisted:
.. code:: python
from graphql import GraphQLError
from graphql.language import FieldNode
from graphql.validation import ValidationRule
my_blacklist = (
"disallowed_field",
)
def is_blacklisted_field(field_name: str):
return field_name.lower() in my_blacklist
class BlackListRule(ValidationRule):
def enter_field(self, node: FieldNode, *_args):
field_name = node.name.value
if not is_blacklisted_field(field_name):
return
self.report_error(
GraphQLError(
f"Cannot query '{field_name}': field is blacklisted.", node,
)
)
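A short usage sketch for the ``BlackListRule`` defined above (graphene 3 / graphql-core 3 assumed); note that only the custom rule is passed, so the standard field-existence checks do not run here:

.. code:: python

    from graphql import parse, validate
    from graphene import ObjectType, Schema, String

    class MyQuery(ObjectType):
        name = String(required=True)

    schema = Schema(query=MyQuery)

    validation_errors = validate(
        schema=schema.graphql_schema,
        document_ast=parse("{ disallowed_field }"),
        rules=(BlackListRule,),
    )
    # contains the "Cannot query 'disallowed_field': field is blacklisted." error
    print(validation_errors)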

View File

@ -1,40 +0,0 @@
.. _SchemaSubscription:
Subscriptions
=============
To create a subscription, you can directly call the ``subscribe`` method on the
schema. This method is async and must be awaited.
.. code:: python
import asyncio
from datetime import datetime
from graphene import ObjectType, String, Schema, Field
# Every schema requires a query.
class Query(ObjectType):
hello = String()
def resolve_hello(root, info):
return "Hello, world!"
class Subscription(ObjectType):
time_of_day = String()
async def subscribe_time_of_day(root, info):
while True:
yield datetime.now().isoformat()
await asyncio.sleep(1)
schema = Schema(query=Query, subscription=Subscription)
async def main(schema):
subscription = 'subscription { timeOfDay }'
result = await schema.subscribe(subscription)
async for item in result:
print(item.data['timeOfDay'])
asyncio.run(main(schema))
The ``result`` is an async iterator which yields items in the same manner as a query.

View File

@ -21,6 +21,7 @@ Integrations
* `Graphene-Django <http://docs.graphene-python.org/projects/django/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-django/>`_) * `Graphene-Django <http://docs.graphene-python.org/projects/django/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-django/>`_)
* Flask-Graphql (`source <https://github.com/graphql-python/flask-graphql>`_) * Flask-Graphql (`source <https://github.com/graphql-python/flask-graphql>`_)
* `Graphene-SQLAlchemy <http://docs.graphene-python.org/projects/sqlalchemy/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-sqlalchemy/>`_) * `Graphene-SQLAlchemy <http://docs.graphene-python.org/projects/sqlalchemy/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-sqlalchemy/>`_)
* `Graphene-GAE <http://docs.graphene-python.org/projects/gae/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-gae/>`_)
* `Graphene-Mongo <http://graphene-mongo.readthedocs.io/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-mongo>`_) * `Graphene-Mongo <http://graphene-mongo.readthedocs.io/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-mongo>`_)
* `Starlette <https://www.starlette.io/graphql/>`_ (`source <https://github.com/encode/starlette>`_) * `Starlette <https://www.starlette.io/graphql/>`_ (`source <https://github.com/encode/starlette>`_)
* `FastAPI <https://fastapi.tiangolo.com/advanced/graphql/>`_ (`source <https://github.com/tiangolo/fastapi>`_) * `FastAPI <https://fastapi.tiangolo.com/tutorial/graphql/>`_ (`source <https://github.com/tiangolo/fastapi>`_)

View File

@ -24,25 +24,25 @@ What is Graphene?
Graphene is a library that provides tools to implement a GraphQL API in Python using a *code-first* approach. Graphene is a library that provides tools to implement a GraphQL API in Python using a *code-first* approach.
Compare Graphene's *code-first* approach to building a GraphQL API with *schema-first* approaches like `Apollo Server`_ (JavaScript) or Ariadne_ (Python). Instead of writing GraphQL **Schema Definition Language (SDL)**, we write Python code to describe the data provided by your server. Compare Graphene's *code-first* approach to building a GraphQL API with *schema-first* approaches like `Apollo Server`_ (JavaScript) or Ariadne_ (Python). Instead of writing GraphQL **Schema Definition Langauge (SDL)**, we write Python code to describe the data provided by your server.
.. _Apollo Server: https://www.apollographql.com/docs/apollo-server/ .. _Apollo Server: https://www.apollographql.com/docs/apollo-server/
.. _Ariadne: https://ariadnegraphql.org/ .. _Ariadne: https://ariadne.readthedocs.io
Graphene is fully featured with integrations for the most popular web frameworks and ORMs. Graphene produces schemas that are fully compliant with the GraphQL spec and provides tools and patterns for building a Relay-Compliant API as well. Graphene is fully featured with integrations for the most popular web frameworks and ORMs. Graphene produces schemas tha are fully compliant with the GraphQL spec and provides tools and patterns for building a Relay-Compliant API as well.
An example in Graphene An example in Graphene
---------------------- ----------------------
Lets build a basic GraphQL schema to say "hello" and "goodbye" in Graphene. Lets build a basic GraphQL schema to say "hello" and "goodbye" in Graphene.
When we send a **Query** requesting only one **Field**, ``hello``, and specify a value for the ``firstName`` **Argument**... When we send a **Query** requesting only one **Field**, ``hello``, and specify a value for the ``name`` **Argument**...
.. code:: .. code::
{ {
hello(firstName: "friend") hello(name: "friend")
} }
...we would expect the following Response containing only the data requested (the ``goodbye`` field is not resolved). ...we would expect the following Response containing only the data requested (the ``goodbye`` field is not resolved).
@ -59,15 +59,15 @@ When we send a **Query** requesting only one **Field**, ``hello``, and specify a
Requirements Requirements
~~~~~~~~~~~~ ~~~~~~~~~~~~
- Python (3.8, 3.9, 3.10, 3.11, 3.12, pypy) - Python (2.7, 3.4, 3.5, 3.6, pypy)
- Graphene (3.0) - Graphene (2.0)
Project setup Project setup
~~~~~~~~~~~~~ ~~~~~~~~~~~~~
.. code:: bash .. code:: bash
pip install "graphene>=3.0" pip install "graphene>=2.0"
Creating a basic Schema Creating a basic Schema
~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~
@ -79,15 +79,14 @@ In Graphene, we can define a simple schema using the following code:
from graphene import ObjectType, String, Schema from graphene import ObjectType, String, Schema
class Query(ObjectType): class Query(ObjectType):
# this defines a Field `hello` in our Schema with a single Argument `first_name` # this defines a Field `hello` in our Schema with a single Argument `name`
# By default, the argument name will automatically be camel-based into firstName in the generated schema hello = String(name=String(default_value="stranger"))
hello = String(first_name=String(default_value="stranger"))
goodbye = String() goodbye = String()
# our Resolver method takes the GraphQL context (root, info) as well as # our Resolver method takes the GraphQL context (root, info) as well as
# Argument (first_name) for the Field and returns data for the query Response # Argument (name) for the Field and returns data for the query Response
def resolve_hello(root, info, first_name): def resolve_hello(root, info, name):
return f'Hello {first_name}!' return f'Hello {name}!'
def resolve_goodbye(root, info): def resolve_goodbye(root, info):
return 'See ya!' return 'See ya!'
@ -104,14 +103,14 @@ For each **Field** in our **Schema**, we write a **Resolver** method to fetch da
Schema Definition Language (SDL) Schema Definition Language (SDL)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In the `GraphQL Schema Definition Language`_, we could describe the fields defined by our example code as shown below. In the `GraphQL Schema Definition Language`_, we could describe the fields defined by our example code as show below.
.. _GraphQL Schema Definition Language: https://graphql.org/learn/schema/ .. _GraphQL Schema Definition Language: https://graphql.org/learn/schema/
.. code:: .. code::
type Query { type Query {
hello(firstName: String = "stranger"): String hello(name: String = "stranger"): String
goodbye: String goodbye: String
} }
@ -128,10 +127,10 @@ Then we can start querying our **Schema** by passing a GraphQL query string to `
query_string = '{ hello }' query_string = '{ hello }'
result = schema.execute(query_string) result = schema.execute(query_string)
print(result.data['hello']) print(result.data['hello'])
# "Hello stranger!" # "Hello stranger"
# or passing the argument in the query # or passing the argument in the query
query_with_argument = '{ hello(firstName: "GraphQL") }' query_with_argument = '{ hello(name: "GraphQL") }'
result = schema.execute(query_with_argument) result = schema.execute(query_with_argument)
print(result.data['hello']) print(result.data['hello'])
# "Hello GraphQL!" # "Hello GraphQL!"
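Read side by side, the master-side quickstart above amounts to the following runnable sketch (graphene 3 assumed):

.. code:: python

    from graphene import ObjectType, String, Schema

    class Query(ObjectType):
        # `first_name` is auto camel-cased to `firstName` in the generated schema
        hello = String(first_name=String(default_value="stranger"))
        goodbye = String()

        def resolve_hello(root, info, first_name):
            return f"Hello {first_name}!"

        def resolve_goodbye(root, info):
            return "See ya!"

    schema = Schema(query=Query)

    print(schema.execute("{ hello }").data["hello"])
    # "Hello stranger!"
    print(schema.execute('{ hello(firstName: "GraphQL") }').data["hello"])
    # "Hello GraphQL!"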

View File

@ -19,8 +19,11 @@ Useful links
- `Getting started with Relay`_ - `Getting started with Relay`_
- `Relay Global Identification Specification`_ - `Relay Global Identification Specification`_
- `Relay Cursor Connection Specification`_ - `Relay Cursor Connection Specification`_
- `Relay input Object Mutation`_
.. _Relay: https://relay.dev/docs/guides/graphql-server-specification/ .. _Relay: https://facebook.github.io/relay/docs/en/graphql-server-specification.html
.. _Getting started with Relay: https://relay.dev/docs/getting-started/step-by-step-guide/ .. _Relay specification: https://facebook.github.io/relay/graphql/objectidentification.htm#sec-Node-root-field
.. _Relay Global Identification Specification: https://relay.dev/graphql/objectidentification.htm .. _Getting started with Relay: https://facebook.github.io/relay/docs/en/quick-start-guide.html
.. _Relay Cursor Connection Specification: https://relay.dev/graphql/connections.htm .. _Relay Global Identification Specification: https://facebook.github.io/relay/graphql/objectidentification.htm
.. _Relay Cursor Connection Specification: https://facebook.github.io/relay/graphql/connections.htm
.. _Relay input Object Mutation: https://facebook.github.io/relay/graphql/mutations.htm

View File

@ -51,20 +51,20 @@ Example of a custom node:
name = 'Node' name = 'Node'
@staticmethod @staticmethod
def to_global_id(type_, id): def to_global_id(type, id):
return f"{type_}:{id}" return '{}:{}'.format(type, id)
@staticmethod @staticmethod
def get_node_from_global_id(info, global_id, only_type=None): def get_node_from_global_id(info, global_id, only_type=None):
type_, id = global_id.split(':') type, id = global_id.split(':')
if only_type: if only_type:
# We assure that the node type that we want to retrieve # We assure that the node type that we want to retrieve
# is the same that was indicated in the field type # is the same that was indicated in the field type
assert type_ == only_type._meta.name, 'Received not compatible node.' assert type == only_type._meta.name, 'Received not compatible node.'
if type_ == 'User': if type == 'User':
return get_user(id) return get_user(id)
elif type_ == 'Photo': elif type == 'Photo':
return get_photo(id) return get_photo(id)

View File

@ -1,5 +1,5 @@
# Required library # Required library
Sphinx==6.1.3 Sphinx==1.5.3
sphinx-autobuild==2021.3.14 sphinx-autobuild==0.7.1
# Docs template # Docs template
http://graphene-python.org/sphinx_graphene_theme.zip http://graphene-python.org/sphinx_graphene_theme.zip

View File

@ -69,3 +69,43 @@ You can also add extra keyword arguments to the ``execute`` method, such as
'hey': 'hello Peter!' 'hey': 'hello Peter!'
} }
} }
Snapshot testing
~~~~~~~~~~~~~~~~
As our APIs evolve, we need to know when our changes introduce any breaking changes that might break
some of the clients of our GraphQL app.
However, writing tests that replicate the exact response we expect from our GraphQL application can be a
tedious and repetitive task, and sometimes it's easier to skip this process.
Because of that, we recommend the usage of `SnapshotTest <https://github.com/syrusakbary/snapshottest/>`_.
SnapshotTest lets us write all these tests in a breeze, as it automatically creates the ``snapshots`` for us
the first time the test is executed.
Here is a simple example on how our tests will look if we use ``pytest``:
.. code:: python
def test_hey(snapshot):
client = Client(my_schema)
# This will create a snapshot dir and a snapshot file
# the first time the test is executed, with the response
# of the execution.
snapshot.assert_match(client.execute('''{ hey }'''))
If we are using ``unittest``:
.. code:: python
from snapshottest import TestCase
class APITestCase(TestCase):
def test_api_me(self):
"""Testing the API for /me"""
client = Client(my_schema)
self.assertMatchSnapshot(client.execute('''{ hey }'''))

View File

@ -0,0 +1,43 @@
AbstractTypes
=============
An AbstractType contains fields that can be shared among
``graphene.ObjectType``, ``graphene.Interface``,
``graphene.InputObjectType`` or other ``graphene.AbstractType``.
The basics:
- Each AbstractType is a Python class that inherits from ``graphene.AbstractType``.
- Each attribute of the AbstractType represents a field (a ``graphene.Field`` or
``graphene.InputField`` depending on where it is mounted)
Quick example
-------------
In this example UserFields is an ``AbstractType`` with a name. ``User`` and
``UserInput`` are two types that have their own fields
plus the ones defined in ``UserFields``.
.. code:: python
import graphene
class UserFields(graphene.AbstractType):
name = graphene.String()
class User(graphene.ObjectType, UserFields):
pass
class UserInput(graphene.InputObjectType, UserFields):
pass
.. code::
type User {
name: String
}
input UserInput {
name: String
}
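Note that later in this comparison the master-side ``graphene/__init__.py`` lists ``AbstractType`` under a ``# Deprecated`` comment. In those newer releases the same field sharing can be achieved with a plain Python mixin; a minimal sketch reusing the names above:

.. code:: python

    import graphene

    class UserFields:
        # plain mixin class, no AbstractType required
        name = graphene.String()

    class User(graphene.ObjectType, UserFields):
        pass

    class UserInput(graphene.InputObjectType, UserFields):
        pass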

View File

@ -61,8 +61,7 @@ you can add description etc. to your enum without changing the original:
graphene.Enum.from_enum( graphene.Enum.from_enum(
AlreadyExistingPyEnum, AlreadyExistingPyEnum,
description=lambda v: 'foo' if v == AlreadyExistingPyEnum.Foo else 'bar' description=lambda v: 'foo' if v == AlreadyExistingPyEnum.Foo else 'bar')
)
Notes Notes
@ -77,7 +76,6 @@ In the Python ``Enum`` implementation you can access a member by initing the Enu
.. code:: python .. code:: python
from enum import Enum from enum import Enum
class Color(Enum): class Color(Enum):
RED = 1 RED = 1
GREEN = 2 GREEN = 2
@ -86,12 +84,11 @@ In the Python ``Enum`` implementation you can access a member by initing the Enu
assert Color(1) == Color.RED assert Color(1) == Color.RED
However, in Graphene ``Enum`` you need to call `.get` to have the same effect: However, in Graphene ``Enum`` you need to call get to have the same effect:
.. code:: python .. code:: python
from graphene import Enum from graphene import Enum
class Color(Enum): class Color(Enum):
RED = 1 RED = 1
GREEN = 2 GREEN = 2
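The assertion that closes this example falls outside the hunk; the behaviour being described is roughly:

.. code:: python

    from graphene import Enum

    class Color(Enum):
        RED = 1
        GREEN = 2
        BLUE = 3

    # Graphene's Enum looks up a member by value via .get
    assert Color.get(1) == Color.RED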

View File

@ -15,3 +15,4 @@ Types Reference
interfaces interfaces
unions unions
mutations mutations
abstracttypes

View File

@ -44,7 +44,7 @@ Both of these types have all of the fields from the ``Character`` interface,
but also bring in extra fields, ``home_planet``, ``starships`` and but also bring in extra fields, ``home_planet``, ``starships`` and
``primary_function``, that are specific to that particular type of character. ``primary_function``, that are specific to that particular type of character.
The full GraphQL schema definition will look like this: The full GraphQL schema defition will look like this:
.. code:: .. code::

View File

@ -85,9 +85,9 @@ We should receive:
InputFields and InputObjectTypes InputFields and InputObjectTypes
---------------------------------- ----------------------------------
InputFields are used in mutations to allow nested input data for mutations. InputFields are used in mutations to allow nested input data for mutations
To use an InputField you define an InputObjectType that specifies the structure of your input data: To use an InputField you define an InputObjectType that specifies the structure of your input data
.. code:: python .. code:: python
@ -104,6 +104,7 @@ To use an InputField you define an InputObjectType that specifies the structure
person = graphene.Field(Person) person = graphene.Field(Person)
@staticmethod
def mutate(root, info, person_data=None): def mutate(root, info, person_data=None):
person = Person( person = Person(
name=person_data.name, name=person_data.name,
@ -112,7 +113,7 @@ To use an InputField you define an InputObjectType that specifies the structure
return CreatePerson(person=person) return CreatePerson(person=person)
Note that **name** and **age** are part of **person_data** now. Note that **name** and **age** are part of **person_data** now
Using the above mutation your new query would look like this: Using the above mutation your new query would look like this:
@ -128,7 +129,7 @@ Using the above mutation your new query would look like this:
} }
InputObjectTypes can also be fields of InputObjectTypes allowing you to have InputObjectTypes can also be fields of InputObjectTypes allowing you to have
as complex of input data as you need: as complex of input data as you need
.. code:: python .. code:: python
@ -160,7 +161,7 @@ To return an existing ObjectType instead of a mutation-specific type, set the **
def mutate(root, info, name): def mutate(root, info, name):
return Person(name=name) return Person(name=name)
Then, if we query (``schema.execute(query_str)``) with the following: Then, if we query (``schema.execute(query_str)``) the following:
.. code:: .. code::
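Pulling the fragments above together, a self-contained sketch of the pattern being described (field names follow the surrounding text; this is a reconstruction, not part of the diff):

.. code:: python

    import graphene

    class PersonInput(graphene.InputObjectType):
        name = graphene.String(required=True)
        age = graphene.Int(required=True)

    class Person(graphene.ObjectType):
        name = graphene.String()
        age = graphene.Int()

    class CreatePerson(graphene.Mutation):
        class Arguments:
            person_data = PersonInput(required=True)

        person = graphene.Field(Person)

        @staticmethod
        def mutate(root, info, person_data=None):
            # name and age arrive nested inside person_data
            person = Person(name=person_data.name, age=person_data.age)
            return CreatePerson(person=person)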

View File

@ -52,7 +52,6 @@ Resolvers are lazily executed, so if a field is not included in a query, its res
Each field on an *ObjectType* in Graphene should have a corresponding resolver method to fetch data. This resolver method should match the field name. For example, in the ``Person`` type above, the ``full_name`` field is resolved by the method ``resolve_full_name``. Each field on an *ObjectType* in Graphene should have a corresponding resolver method to fetch data. This resolver method should match the field name. For example, in the ``Person`` type above, the ``full_name`` field is resolved by the method ``resolve_full_name``.
Each resolver method takes the parameters: Each resolver method takes the parameters:
* :ref:`ResolverParamParent` for the value object use to resolve most fields * :ref:`ResolverParamParent` for the value object use to resolve most fields
* :ref:`ResolverParamInfo` for query and schema meta information and per-request context * :ref:`ResolverParamInfo` for query and schema meta information and per-request context
* :ref:`ResolverParamGraphQLArguments` as defined on the **Field**. * :ref:`ResolverParamGraphQLArguments` as defined on the **Field**.
@ -80,10 +79,6 @@ If we have a schema with Person type and one field on the root query.
from graphene import ObjectType, String, Field from graphene import ObjectType, String, Field
def get_human(name):
first_name, last_name = name.split()
return Person(first_name, last_name)
class Person(ObjectType): class Person(ObjectType):
full_name = String() full_name = String()
@ -106,7 +101,7 @@ When we execute a query against that schema.
query_string = "{ me { fullName } }" query_string = "{ me { fullName } }"
result = schema.execute(query_string) result = schema.execute(query_string)
assert result.data["me"] == {"fullName": "Luke Skywalker"} assert result["data"]["me"] == {"fullName": "Luke Skywalker")
Then we go through the following steps to resolve this query: Then we go through the following steps to resolve this query:
@ -163,22 +158,6 @@ You can then execute the following query:
} }
} }
*Note:* There are several arguments to a field that are "reserved" by Graphene
(see :ref:`fields-mounted-types`).
You can still define an argument that clashes with one of these fields by using
the ``args`` parameter like so:
.. code:: python
from graphene import ObjectType, Field, String
class Query(ObjectType):
answer = String(args={'description': String()})
def resolve_answer(parent, info, description):
return description
Convenience Features of Graphene Resolvers Convenience Features of Graphene Resolvers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -216,7 +195,7 @@ The two resolvers in this example are effectively the same.
# ... # ...
If you prefer your code to be more explicit, feel free to use ``@staticmethod`` decorators. Otherwise, your code may be cleaner without them! If you prefer your code to be more explict, feel free to use ``@staticmethod`` decorators. Otherwise, your code may be cleaner without them!
.. _DefaultResolver: .. _DefaultResolver:
@ -233,7 +212,7 @@ If the :ref:`ResolverParamParent` is a dictionary, the resolver will look for a
from graphene import ObjectType, String, Field, Schema from graphene import ObjectType, String, Field, Schema
PersonValueObject = namedtuple("Person", ["first_name", "last_name"]) PersonValueObject = namedtuple('Person', 'first_name', 'last_name')
class Person(ObjectType): class Person(ObjectType):
first_name = String() first_name = String()
@ -245,7 +224,7 @@ If the :ref:`ResolverParamParent` is a dictionary, the resolver will look for a
def resolve_me(parent, info): def resolve_me(parent, info):
# always pass an object for `me` field # always pass an object for `me` field
return PersonValueObject(first_name="Luke", last_name="Skywalker") return PersonValueObject(first_name='Luke', last_name='Skywalker')
def resolve_my_best_friend(parent, info): def resolve_my_best_friend(parent, info):
# always pass a dictionary for `my_best_fiend_field` # always pass a dictionary for `my_best_fiend_field`
@ -259,10 +238,10 @@ If the :ref:`ResolverParamParent` is a dictionary, the resolver will look for a
} }
''') ''')
# With default resolvers we can resolve attributes from an object.. # With default resolvers we can resolve attributes from an object..
assert result.data["me"] == {"firstName": "Luke", "lastName": "Skywalker"} assert result['data']['me'] == {"firstName": "Luke", "lastName": "Skywalker"}
# With default resolvers, we can also resolve keys from a dictionary.. # With default resolvers, we can also resolve keys from a dictionary..
assert result.data["myBestFriend"] == {"firstName": "R2", "lastName": "D2"} assert result['data']['my_best_friend'] == {"firstName": "R2", "lastName": "D2"}
Advanced Advanced
~~~~~~~~ ~~~~~~~~
@ -272,7 +251,7 @@ GraphQL Argument defaults
If you define an argument for a field that is not required (and in a query If you define an argument for a field that is not required (and in a query
execution it is not provided as an argument) it will not be passed to the execution it is not provided as an argument) it will not be passed to the
resolver function at all. This is so that the developer can differentiate resolver function at all. This is so that the developer can differenciate
between a ``undefined`` value for an argument and an explicit ``null`` value. between a ``undefined`` value for an argument and an explicit ``null`` value.
For example, given this schema: For example, given this schema:
@ -301,7 +280,7 @@ An error will be thrown:
TypeError: resolve_hello() missing 1 required positional argument: 'name' TypeError: resolve_hello() missing 1 required positional argument: 'name'
You can fix this error in several ways. Either by combining all keyword arguments You can fix this error in serveral ways. Either by combining all keyword arguments
into a dict: into a dict:
.. code:: python .. code:: python
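The body of that example is elided from the hunk; a sketch of the ``**kwargs`` variant the text refers to, reconstructed from the surrounding section:

.. code:: python

    from graphene import ObjectType, String

    class Query(ObjectType):
        hello = String(required=True, name=String())

        def resolve_hello(parent, info, **kwargs):
            # name is only present in kwargs when the client supplied it
            name = kwargs.get("name", "World")
            return f"Hello, {name}"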
@ -352,7 +331,7 @@ A field can use a custom resolver from outside the class:
from graphene import ObjectType, String from graphene import ObjectType, String
def resolve_full_name(person, info): def resolve_full_name(person, info):
return f"{person.first_name} {person.last_name}" return '{} {}'.format(person.first_name, person.last_name)
class Person(ObjectType): class Person(ObjectType):
first_name = String() first_name = String()
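The rest of the example is cut off by the hunk; in the docs it finishes by mounting the external resolver on the field, roughly:

.. code:: python

    class Person(ObjectType):
        first_name = String()
        last_name = String()
        full_name = String(resolver=resolve_full_name)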

View File

@ -3,11 +3,6 @@
Scalars Scalars
======= =======
Scalar types represent concrete values at the leaves of a query. There are
several built in types that Graphene provides out of the box which represent common
values in Python. You can also create your own Scalar types to better express
values that you might have in your data model.
All Scalar types accept the following arguments. All are optional: All Scalar types accept the following arguments. All are optional:
``name``: *string* ``name``: *string*
@ -32,39 +27,34 @@ All Scalar types accept the following arguments. All are optional:
Built in scalars Base scalars
---------------- ------------
Graphene defines the following base Scalar Types that match the default `GraphQL types <https://graphql.org/learn/schema/#scalar-types>`_: Graphene defines the following base Scalar Types:
``graphene.String`` ``graphene.String``
^^^^^^^^^^^^^^^^^^^
Represents textual data, represented as UTF-8 Represents textual data, represented as UTF-8
character sequences. The String type is most often used by GraphQL to character sequences. The String type is most often used by GraphQL to
represent free-form human-readable text. represent free-form human-readable text.
``graphene.Int`` ``graphene.Int``
^^^^^^^^^^^^^^^^
Represents non-fractional signed whole numeric Represents non-fractional signed whole numeric
values. Int is a signed 32bit integer per the values. Int is a signed 32bit integer per the
`GraphQL spec <https://facebook.github.io/graphql/June2018/#sec-Int>`_ `GraphQL spec <https://facebook.github.io/graphql/June2018/#sec-Int>`_
``graphene.Float`` ``graphene.Float``
^^^^^^^^^^^^^^^^^^
Represents signed double-precision fractional Represents signed double-precision fractional
values as specified by values as specified by
`IEEE 754 <http://en.wikipedia.org/wiki/IEEE_floating_point>`_. `IEEE 754 <http://en.wikipedia.org/wiki/IEEE_floating_point>`_.
``graphene.Boolean`` ``graphene.Boolean``
^^^^^^^^^^^^^^^^^^^^
Represents `true` or `false`. Represents `true` or `false`.
``graphene.ID`` ``graphene.ID``
^^^^^^^^^^^^^^^
Represents a unique identifier, often used to Represents a unique identifier, often used to
refetch an object or as key for a cache. The ID type appears in a JSON refetch an object or as key for a cache. The ID type appears in a JSON
@ -72,183 +62,24 @@ Graphene defines the following base Scalar Types that match the default `GraphQL
When expected as an input type, any string (such as `"4"`) or integer When expected as an input type, any string (such as `"4"`) or integer
(such as `4`) input value will be accepted as an ID. (such as `4`) input value will be accepted as an ID.
---- Graphene also provides custom scalars for Dates, Times, and JSON:
Graphene also provides custom scalars for common values: ``graphene.types.datetime.Date``
``graphene.Date``
^^^^^^^^^^^^^^^^^
Represents a Date value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_. Represents a Date value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_.
.. code:: python ``graphene.types.datetime.DateTime``
import datetime
from graphene import Schema, ObjectType, Date
class Query(ObjectType):
one_week_from = Date(required=True, date_input=Date(required=True))
def resolve_one_week_from(root, info, date_input):
assert date_input == datetime.date(2006, 1, 2)
return date_input + datetime.timedelta(weeks=1)
schema = Schema(query=Query)
results = schema.execute("""
query {
oneWeekFrom(dateInput: "2006-01-02")
}
""")
assert results.data == {"oneWeekFrom": "2006-01-09"}
``graphene.DateTime``
^^^^^^^^^^^^^^^^^^^^^
Represents a DateTime value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_. Represents a DateTime value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_.
.. code:: python ``graphene.types.datetime.Time``
import datetime
from graphene import Schema, ObjectType, DateTime
class Query(ObjectType):
one_hour_from = DateTime(required=True, datetime_input=DateTime(required=True))
def resolve_one_hour_from(root, info, datetime_input):
assert datetime_input == datetime.datetime(2006, 1, 2, 15, 4, 5)
return datetime_input + datetime.timedelta(hours=1)
schema = Schema(query=Query)
results = schema.execute("""
query {
oneHourFrom(datetimeInput: "2006-01-02T15:04:05")
}
""")
assert results.data == {"oneHourFrom": "2006-01-02T16:04:05"}
``graphene.Time``
^^^^^^^^^^^^^^^^^
Represents a Time value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_. Represents a Time value as specified by `iso8601 <https://en.wikipedia.org/wiki/ISO_8601>`_.
.. code:: python ``graphene.types.json.JSONString``
import datetime
from graphene import Schema, ObjectType, Time
class Query(ObjectType):
one_hour_from = Time(required=True, time_input=Time(required=True))
def resolve_one_hour_from(root, info, time_input):
assert time_input == datetime.time(15, 4, 5)
tmp_time_input = datetime.datetime.combine(datetime.date(1, 1, 1), time_input)
return (tmp_time_input + datetime.timedelta(hours=1)).time()
schema = Schema(query=Query)
results = schema.execute("""
query {
oneHourFrom(timeInput: "15:04:05")
}
""")
assert results.data == {"oneHourFrom": "16:04:05"}
``graphene.Decimal``
^^^^^^^^^^^^^^^^^^^^
Represents a Python Decimal value.
.. code:: python
import decimal
from graphene import Schema, ObjectType, Decimal
class Query(ObjectType):
add_one_to = Decimal(required=True, decimal_input=Decimal(required=True))
def resolve_add_one_to(root, info, decimal_input):
assert decimal_input == decimal.Decimal("10.50")
return decimal_input + decimal.Decimal("1")
schema = Schema(query=Query)
results = schema.execute("""
query {
addOneTo(decimalInput: "10.50")
}
""")
assert results.data == {"addOneTo": "11.50"}
``graphene.JSONString``
^^^^^^^^^^^^^^^^^^^^^^^
Represents a JSON string. Represents a JSON string.
.. code:: python
from graphene import Schema, ObjectType, JSONString, String
class Query(ObjectType):
update_json_key = JSONString(
required=True,
json_input=JSONString(required=True),
key=String(required=True),
value=String(required=True)
)
def resolve_update_json_key(root, info, json_input, key, value):
assert json_input == {"name": "Jane"}
json_input[key] = value
return json_input
schema = Schema(query=Query)
results = schema.execute("""
query {
updateJsonKey(jsonInput: "{\\"name\\": \\"Jane\\"}", key: "name", value: "Beth")
}
""")
assert results.data == {"updateJsonKey": "{\"name\": \"Beth\"}"}
``graphene.Base64``
^^^^^^^^^^^^^^^^^^^
Represents a Base64 encoded string.
.. code:: python
from graphene import Schema, ObjectType, Base64
class Query(ObjectType):
increment_encoded_id = Base64(
required=True,
base64_input=Base64(required=True),
)
def resolve_increment_encoded_id(root, info, base64_input):
assert base64_input == "4"
return int(base64_input) + 1
schema = Schema(query=Query)
results = schema.execute("""
query {
incrementEncodedId(base64Input: "NA==")
}
""")
assert results.data == {"incrementEncodedId": "NQ=="}
Custom scalars Custom scalars
-------------- --------------
@ -270,8 +101,8 @@ The following is an example for creating a DateTime scalar:
return dt.isoformat() return dt.isoformat()
@staticmethod @staticmethod
def parse_literal(node, _variables=None): def parse_literal(node):
if isinstance(node, ast.StringValueNode): if isinstance(node, ast.StringValue):
return datetime.datetime.strptime( return datetime.datetime.strptime(
node.value, "%Y-%m-%dT%H:%M:%S.%f") node.value, "%Y-%m-%dT%H:%M:%S.%f")
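Only the tail of the custom ``DateTime`` example fits into this hunk; the full shape of such a scalar is approximately the following sketch (the AST class name matches the master side of the diff, which targets graphql-core 3):

.. code:: python

    import datetime

    from graphene import Scalar
    from graphql.language import ast

    class DateTime(Scalar):
        """Custom DateTime scalar, serialized as an ISO 8601 string."""

        @staticmethod
        def serialize(dt):
            return dt.isoformat()

        @staticmethod
        def parse_literal(node, _variables=None):
            if isinstance(node, ast.StringValueNode):
                return datetime.datetime.strptime(node.value, "%Y-%m-%dT%H:%M:%S.%f")

        @staticmethod
        def parse_value(value):
            return datetime.datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f")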

View File

@ -1,11 +1,11 @@
Schema Schema
====== ======
A GraphQL **Schema** defines the types and relationships between **Fields** in your API. A GraphQL **Schema** defines the types and relationship between **Fields** in your API.
A Schema is created by supplying the root :ref:`ObjectType` of each operation, query (mandatory), mutation and subscription. A Schema is created by supplying the root :ref:`ObjectType` of each operation, query (mandatory), mutation and subscription.
Schema will collect all type definitions related to the root operations and then supply them to the validator and executor. Schema will collect all type definitions related to the root operations and then supplied to the validator and executor.
.. code:: python .. code:: python
@ -15,11 +15,11 @@ Schema will collect all type definitions related to the root operations and then
subscription=MyRootSubscription subscription=MyRootSubscription
) )
A Root Query is just a special :ref:`ObjectType` that defines the fields that are the entrypoint for your API. Root Mutation and Root Subscription are similar to Root Query, but for different operation types: A Root Query is just a special :ref:`ObjectType` that :ref:`defines the fields <Scalars>` that are the entrypoint for your API. Root Mutation and Root Subscription are similar to Root Query, but for different operation types:
* Query fetches data * Query fetches data
* Mutation changes data and retrieves the changes * Mutation to changes data and retrieve the changes
* Subscription sends changes to clients in real-time * Subscription to sends changes to clients in real time
Review the `GraphQL documentation on Schema`_ for a brief overview of fields, schema and operations. Review the `GraphQL documentation on Schema`_ for a brief overview of fields, schema and operations.
@ -44,7 +44,7 @@ There are some cases where the schema cannot access all of the types that we pla
For example, when a field returns an ``Interface``, the schema doesn't know about any of the For example, when a field returns an ``Interface``, the schema doesn't know about any of the
implementations. implementations.
In this case, we need to use the ``types`` argument when creating the Schema: In this case, we need to use the ``types`` argument when creating the Schema.
.. code:: python .. code:: python
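The code block itself is elided from the hunk; the documented call is approximately:

.. code:: python

    my_schema = Schema(
        query=MyRootQuery,
        types=[SomeExtraObjectType, ]
    )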
@ -56,14 +56,14 @@ In this case, we need to use the ``types`` argument when creating the Schema:
.. _SchemaAutoCamelCase: .. _SchemaAutoCamelCase:
Auto camelCase field names Auto CamelCase field names
-------------------------- --------------------------
By default all field and argument names (that are not By default all field and argument names (that are not
explicitly set with the ``name`` arg) will be converted from explicitly set with the ``name`` arg) will be converted from
``snake_case`` to ``camelCase`` (as the API is usually being consumed by a js/mobile client) ``snake_case`` to ``camelCase`` (as the API is usually being consumed by a js/mobile client)
For example with the ObjectType the ``last_name`` field name is converted to ``lastName``: For example with the ObjectType
.. code:: python .. code:: python
@ -71,10 +71,12 @@ For example with the ObjectType the ``last_name`` field name is converted to ``l
last_name = graphene.String() last_name = graphene.String()
other_name = graphene.String(name='_other_Name') other_name = graphene.String(name='_other_Name')
the ``last_name`` field name is converted to ``lastName``.
In case you don't want to apply this transformation, provide a ``name`` argument to the field constructor. In case you don't want to apply this transformation, provide a ``name`` argument to the field constructor.
``other_name`` converts to ``_other_Name`` (without further transformations). ``other_name`` converts to ``_other_Name`` (without further transformations).
Your query should look like: Your query should look like
.. code:: .. code::
@ -84,7 +86,7 @@ Your query should look like:
} }
To disable this behavior, set the ``auto_camelcase`` to ``False`` upon schema instantiation: To disable this behavior, set the ``auto_camelcase`` to ``False`` upon schema instantiation.
.. code:: python .. code:: python
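The elided block simply passes the flag at schema construction time, roughly:

.. code:: python

    my_schema = Schema(
        query=MyRootQuery,
        auto_camelcase=False,
    )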

View File

@ -7,7 +7,7 @@ to specify any common fields between the types.
The basics: The basics:
- Each Union is a Python class that inherits from ``graphene.Union``. - Each Union is a Python class that inherits from ``graphene.Union``.
- Unions don't have any fields on it, just links to the possible ObjectTypes. - Unions don't have any fields on it, just links to the possible objecttypes.
Quick example Quick example
------------- -------------
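The quick example body falls outside this hunk; the documented pattern looks roughly like this (``Human`` and ``Droid`` stand in for the concrete ObjectTypes used in the docs):

.. code:: python

    import graphene

    class Human(graphene.ObjectType):
        name = graphene.String()

    class Droid(graphene.ObjectType):
        name = graphene.String()

    class SearchResult(graphene.Union):
        class Meta:
            types = (Human, Droid)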

View File

@ -7,7 +7,7 @@ class GeoInput(graphene.InputObjectType):
@property @property
def latlng(self): def latlng(self):
return f"({self.lat},{self.lng})" return "({},{})".format(self.lat, self.lng)
class Address(graphene.ObjectType): class Address(graphene.ObjectType):
@ -17,7 +17,7 @@ class Address(graphene.ObjectType):
class Query(graphene.ObjectType): class Query(graphene.ObjectType):
address = graphene.Field(Address, geo=GeoInput(required=True)) address = graphene.Field(Address, geo=GeoInput(required=True))
def resolve_address(root, info, geo): def resolve_address(self, info, geo):
return Address(latlng=geo.latlng) return Address(latlng=geo.latlng)
@ -27,7 +27,7 @@ class CreateAddress(graphene.Mutation):
Output = Address Output = Address
def mutate(root, info, geo): def mutate(self, info, geo):
return Address(latlng=geo.latlng) return Address(latlng=geo.latlng)
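# A hedged usage sketch for the example above; the Mutation root is an
# assumption (it is elided from this hunk), the rest mirrors the code shown.
class Mutation(graphene.ObjectType):
    create_address = CreateAddress.Field()

schema = graphene.Schema(query=Query, mutation=Mutation)

result = schema.execute("{ address(geo: {lat: 32.2, lng: 12.2}) { latlng } }")
# Expected shape: {"address": {"latlng": "(32.2,12.2)"}}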

View File

@ -9,7 +9,7 @@ class User(graphene.ObjectType):
class Query(graphene.ObjectType): class Query(graphene.ObjectType):
me = graphene.Field(User) me = graphene.Field(User)
def resolve_me(root, info): def resolve_me(self, info):
return info.context["user"] return info.context["user"]
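# A hedged sketch of executing this schema with a context value; the User
# field names are assumptions (the class body is elided above), and `context=`
# is the keyword accepted by recent Graphene releases.
schema = graphene.Schema(query=Query)

result = schema.execute(
    "{ me { name } }",
    context={"user": User(id="1", name="Syrus")},
)
# Expected: result.data == {"me": {"name": "Syrus"}}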

View File

@ -8,9 +8,10 @@ class Patron(graphene.ObjectType):
class Query(graphene.ObjectType): class Query(graphene.ObjectType):
patron = graphene.Field(Patron) patron = graphene.Field(Patron)
def resolve_patron(root, info): def resolve_patron(self, info):
return Patron(id=1, name="Syrus", age=27) return Patron(id=1, name="Syrus", age=27)

View File

@ -39,13 +39,13 @@ class Query(graphene.ObjectType):
human = graphene.Field(Human, id=graphene.String()) human = graphene.Field(Human, id=graphene.String())
droid = graphene.Field(Droid, id=graphene.String()) droid = graphene.Field(Droid, id=graphene.String())
def resolve_hero(root, info, episode=None): def resolve_hero(self, info, episode=None):
return get_hero(episode) return get_hero(episode)
def resolve_human(root, info, id): def resolve_human(self, info, id):
return get_human(id) return get_human(id)
def resolve_droid(root, info, id): def resolve_droid(self, info, id):
return get_droid(id) return get_droid(id)

View File

@ -0,0 +1,100 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_hero_name_query 1"] = {"data": {"hero": {"name": "R2-D2"}}}
snapshots["test_hero_name_and_friends_query 1"] = {
"data": {
"hero": {
"id": "2001",
"name": "R2-D2",
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "Leia Organa"},
],
}
}
}
snapshots["test_nested_query 1"] = {
"data": {
"hero": {
"name": "R2-D2",
"friends": [
{
"name": "Luke Skywalker",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Han Solo"},
{"name": "Leia Organa"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
{
"name": "Han Solo",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Leia Organa"},
{"name": "R2-D2"},
],
},
{
"name": "Leia Organa",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
],
}
}
}
snapshots["test_fetch_luke_query 1"] = {"data": {"human": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_some_id_query 1"] = {
"data": {"human": {"name": "Luke Skywalker"}}
}
snapshots["test_fetch_some_id_query2 1"] = {"data": {"human": {"name": "Han Solo"}}}
snapshots["test_invalid_id_query 1"] = {"data": {"human": None}}
snapshots["test_fetch_luke_aliased 1"] = {"data": {"luke": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_luke_and_leia_aliased 1"] = {
"data": {"luke": {"name": "Luke Skywalker"}, "leia": {"name": "Leia Organa"}}
}
snapshots["test_duplicate_fields 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots["test_use_fragment 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots["test_check_type_of_r2 1"] = {
"data": {"hero": {"__typename": "Droid", "name": "R2-D2"}}
}
snapshots["test_check_type_of_luke 1"] = {
"data": {"hero": {"__typename": "Human", "name": "Luke Skywalker"}}
}

View File

@ -8,19 +8,19 @@ setup()
client = Client(schema) client = Client(schema)
def test_hero_name_query(): def test_hero_name_query(snapshot):
result = client.execute(""" query = """
query HeroNameQuery { query HeroNameQuery {
hero { hero {
name name
} }
} }
""") """
assert result == {"data": {"hero": {"name": "R2-D2"}}} snapshot.assert_match(client.execute(query))
def test_hero_name_and_friends_query(): def test_hero_name_and_friends_query(snapshot):
result = client.execute(""" query = """
query HeroNameAndFriendsQuery { query HeroNameAndFriendsQuery {
hero { hero {
id id
@ -30,24 +30,12 @@ def test_hero_name_and_friends_query():
} }
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"hero": {
"id": "2001",
"name": "R2-D2",
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "Leia Organa"},
],
}
}
}
def test_nested_query(): def test_nested_query(snapshot):
result = client.execute(""" query = """
query NestedQuery { query NestedQuery {
hero { hero {
name name
@ -60,113 +48,70 @@ def test_nested_query():
} }
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"hero": {
"name": "R2-D2",
"friends": [
{
"name": "Luke Skywalker",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Han Solo"},
{"name": "Leia Organa"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
{
"name": "Han Solo",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Leia Organa"},
{"name": "R2-D2"},
],
},
{
"name": "Leia Organa",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
],
}
}
}
def test_fetch_luke_query(): def test_fetch_luke_query(snapshot):
result = client.execute(""" query = """
query FetchLukeQuery { query FetchLukeQuery {
human(id: "1000") { human(id: "1000") {
name name
} }
} }
""") """
assert result == {"data": {"human": {"name": "Luke Skywalker"}}} snapshot.assert_match(client.execute(query))
def test_fetch_some_id_query(): def test_fetch_some_id_query(snapshot):
result = client.execute( query = """
"""
query FetchSomeIDQuery($someId: String!) { query FetchSomeIDQuery($someId: String!) {
human(id: $someId) { human(id: $someId) {
name name
} }
} }
""", """
variables={"someId": "1000"}, params = {"someId": "1000"}
) snapshot.assert_match(client.execute(query, variables=params))
assert result == {"data": {"human": {"name": "Luke Skywalker"}}}
def test_fetch_some_id_query2(): def test_fetch_some_id_query2(snapshot):
result = client.execute( query = """
"""
query FetchSomeIDQuery($someId: String!) { query FetchSomeIDQuery($someId: String!) {
human(id: $someId) { human(id: $someId) {
name name
} }
} }
""", """
variables={"someId": "1002"}, params = {"someId": "1002"}
) snapshot.assert_match(client.execute(query, variables=params))
assert result == {"data": {"human": {"name": "Han Solo"}}}
def test_invalid_id_query(): def test_invalid_id_query(snapshot):
result = client.execute( query = """
"""
query humanQuery($id: String!) { query humanQuery($id: String!) {
human(id: $id) { human(id: $id) {
name name
} }
} }
""", """
variables={"id": "not a valid id"}, params = {"id": "not a valid id"}
) snapshot.assert_match(client.execute(query, variables=params))
assert result == {"data": {"human": None}}
def test_fetch_luke_aliased(): def test_fetch_luke_aliased(snapshot):
result = client.execute(""" query = """
query FetchLukeAliased { query FetchLukeAliased {
luke: human(id: "1000") { luke: human(id: "1000") {
name name
} }
} }
""") """
assert result == {"data": {"luke": {"name": "Luke Skywalker"}}} snapshot.assert_match(client.execute(query))
def test_fetch_luke_and_leia_aliased(): def test_fetch_luke_and_leia_aliased(snapshot):
result = client.execute(""" query = """
query FetchLukeAndLeiaAliased { query FetchLukeAndLeiaAliased {
luke: human(id: "1000") { luke: human(id: "1000") {
name name
@ -175,14 +120,12 @@ def test_fetch_luke_and_leia_aliased():
name name
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {"luke": {"name": "Luke Skywalker"}, "leia": {"name": "Leia Organa"}}
}
def test_duplicate_fields(): def test_duplicate_fields(snapshot):
result = client.execute(""" query = """
query DuplicateFields { query DuplicateFields {
luke: human(id: "1000") { luke: human(id: "1000") {
name name
@ -193,17 +136,12 @@ def test_duplicate_fields():
homePlanet homePlanet
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
def test_use_fragment(): def test_use_fragment(snapshot):
result = client.execute(""" query = """
query UseFragment { query UseFragment {
luke: human(id: "1000") { luke: human(id: "1000") {
...HumanFragment ...HumanFragment
@ -216,36 +154,29 @@ def test_use_fragment():
name name
homePlanet homePlanet
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
def test_check_type_of_r2(): def test_check_type_of_r2(snapshot):
result = client.execute(""" query = """
query CheckTypeOfR2 { query CheckTypeOfR2 {
hero { hero {
__typename __typename
name name
} }
} }
""") """
assert result == {"data": {"hero": {"__typename": "Droid", "name": "R2-D2"}}} snapshot.assert_match(client.execute(query))
def test_check_type_of_luke(): def test_check_type_of_luke(snapshot):
result = client.execute(""" query = """
query CheckTypeOfLuke { query CheckTypeOfLuke {
hero(episode: EMPIRE) { hero(episode: EMPIRE) {
__typename __typename
name name
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {"hero": {"__typename": "Human", "name": "Luke Skywalker"}}
}

View File

@ -14,7 +14,7 @@ def setup():
# Yeah, technically it's Corellian. But it flew in the service of the rebels, # Yeah, technically it's Corellian. But it flew in the service of the rebels,
# so for the purposes of this demo it's a rebel ship. # so for the purposes of this demo it's a rebel ship.
falcon = Ship(id="4", name="Millennium Falcon") falcon = Ship(id="4", name="Millenium Falcon")
homeOne = Ship(id="5", name="Home One") homeOne = Ship(id="5", name="Home One")

View File

@ -64,10 +64,10 @@ class Query(graphene.ObjectType):
empire = graphene.Field(Faction) empire = graphene.Field(Faction)
node = relay.Node.Field() node = relay.Node.Field()
def resolve_rebels(root, info): def resolve_rebels(self, info):
return get_rebels() return get_rebels()
def resolve_empire(root, info): def resolve_empire(self, info):
return get_empire() return get_empire()

View File

@ -0,0 +1,26 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_correct_fetch_first_ship_rebels 1"] = {
"data": {
"rebels": {
"name": "Alliance to Restore the Republic",
"ships": {
"pageInfo": {
"startCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"endCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"hasNextPage": True,
"hasPreviousPage": False,
},
"edges": [
{"cursor": "YXJyYXljb25uZWN0aW9uOjA=", "node": {"name": "X-Wing"}}
],
},
}
}
}

View File

@ -0,0 +1,28 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_mutations 1"] = {
"data": {
"introduceShip": {
"ship": {"id": "U2hpcDo5", "name": "Peter"},
"faction": {
"name": "Alliance to Restore the Republic",
"ships": {
"edges": [
{"node": {"id": "U2hpcDox", "name": "X-Wing"}},
{"node": {"id": "U2hpcDoy", "name": "Y-Wing"}},
{"node": {"id": "U2hpcDoz", "name": "A-Wing"}},
{"node": {"id": "U2hpcDo0", "name": "Millenium Falcon"}},
{"node": {"id": "U2hpcDo1", "name": "Home One"}},
{"node": {"id": "U2hpcDo5", "name": "Peter"}},
]
},
},
}
}
}

View File

@ -0,0 +1,91 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_correctly_fetches_id_name_rebels 1"] = {
"data": {
"rebels": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
snapshots["test_correctly_refetches_rebels 1"] = {
"data": {"node": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}}
}
snapshots["test_correctly_fetches_id_name_empire 1"] = {
"data": {"empire": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots["test_correctly_refetches_empire 1"] = {
"data": {"node": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots["test_correctly_refetches_xwing 1"] = {
"data": {"node": {"id": "U2hpcDox", "name": "X-Wing"}}
}
snapshots[
"test_str_schema 1"
] = """schema {
query: Query
mutation: Mutation
}
type Faction implements Node {
id: ID!
name: String
ships(before: String, after: String, first: Int, last: Int): ShipConnection
}
input IntroduceShipInput {
shipName: String!
factionId: String!
clientMutationId: String
}
type IntroduceShipPayload {
ship: Ship
faction: Faction
clientMutationId: String
}
type Mutation {
introduceShip(input: IntroduceShipInput!): IntroduceShipPayload
}
interface Node {
id: ID!
}
type PageInfo {
hasNextPage: Boolean!
hasPreviousPage: Boolean!
startCursor: String
endCursor: String
}
type Query {
rebels: Faction
empire: Faction
node(id: ID!): Node
}
type Ship implements Node {
id: ID!
name: String
}
type ShipConnection {
pageInfo: PageInfo!
edges: [ShipEdge]!
}
type ShipEdge {
node: Ship
cursor: String!
}
"""

View File

@ -8,46 +8,26 @@ setup()
client = Client(schema) client = Client(schema)
def test_correct_fetch_first_ship_rebels(): def test_correct_fetch_first_ship_rebels(snapshot):
result = client.execute(""" query = """
query RebelsShipsQuery { query RebelsShipsQuery {
rebels { rebels {
name, name,
ships(first: 1) { ships(first: 1) {
pageInfo { pageInfo {
startCursor startCursor
endCursor endCursor
hasNextPage hasNextPage
hasPreviousPage hasPreviousPage
} }
edges { edges {
cursor cursor
node { node {
name name
}
}
} }
} }
} }
""") }
assert result == {
"data": {
"rebels": {
"name": "Alliance to Restore the Republic",
"ships": {
"pageInfo": {
"startCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"endCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"hasNextPage": True,
"hasPreviousPage": False,
},
"edges": [
{
"cursor": "YXJyYXljb25uZWN0aW9uOjA=",
"node": {"name": "X-Wing"},
}
],
},
}
}
} }
"""
snapshot.assert_match(client.execute(query))

View File

@ -8,45 +8,26 @@ setup()
client = Client(schema) client = Client(schema)
def test_mutations(): def test_mutations(snapshot):
result = client.execute(""" query = """
mutation MyMutation { mutation MyMutation {
introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) { introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) {
ship { ship {
id id
name name
} }
faction { faction {
name name
ships { ships {
edges { edges {
node { node {
id id
name name
}
}
} }
} }
} }
} }
""") }
assert result == {
"data": {
"introduceShip": {
"ship": {"id": "U2hpcDo5", "name": "Peter"},
"faction": {
"name": "Alliance to Restore the Republic",
"ships": {
"edges": [
{"node": {"id": "U2hpcDox", "name": "X-Wing"}},
{"node": {"id": "U2hpcDoy", "name": "Y-Wing"}},
{"node": {"id": "U2hpcDoz", "name": "A-Wing"}},
{"node": {"id": "U2hpcDo0", "name": "Millennium Falcon"}},
{"node": {"id": "U2hpcDo1", "name": "Home One"}},
{"node": {"id": "U2hpcDo5", "name": "Peter"}},
]
},
},
}
}
} }
"""
snapshot.assert_match(client.execute(query))

View File

@ -1,5 +1,3 @@
import textwrap
from graphene.test import Client from graphene.test import Client
from ..data import setup from ..data import setup
@ -10,115 +8,24 @@ setup()
client = Client(schema) client = Client(schema)
def test_str_schema(): def test_str_schema(snapshot):
assert str(schema).strip() == textwrap.dedent( snapshot.assert_match(str(schema))
'''\
type Query {
rebels: Faction
empire: Faction
node(
"""The ID of the object"""
id: ID!
): Node
}
"""A faction in the Star Wars saga"""
type Faction implements Node {
"""The ID of the object"""
id: ID!
"""The name of the faction."""
name: String
"""The ships used by the faction."""
ships(before: String, after: String, first: Int, last: Int): ShipConnection
}
"""An object with an ID"""
interface Node {
"""The ID of the object"""
id: ID!
}
type ShipConnection {
"""Pagination data for this connection."""
pageInfo: PageInfo!
"""Contains the nodes in this connection."""
edges: [ShipEdge]!
}
"""
The Relay compliant `PageInfo` type, containing data necessary to paginate this connection.
"""
type PageInfo {
"""When paginating forwards, are there more items?"""
hasNextPage: Boolean!
"""When paginating backwards, are there more items?"""
hasPreviousPage: Boolean!
"""When paginating backwards, the cursor to continue."""
startCursor: String
"""When paginating forwards, the cursor to continue."""
endCursor: String
}
"""A Relay edge containing a `Ship` and its cursor."""
type ShipEdge {
"""The item at the end of the edge"""
node: Ship
"""A cursor for use in pagination"""
cursor: String!
}
"""A ship in the Star Wars saga"""
type Ship implements Node {
"""The ID of the object"""
id: ID!
"""The name of the ship."""
name: String
}
type Mutation {
introduceShip(input: IntroduceShipInput!): IntroduceShipPayload
}
type IntroduceShipPayload {
ship: Ship
faction: Faction
clientMutationId: String
}
input IntroduceShipInput {
shipName: String!
factionId: String!
clientMutationId: String
}'''
)
def test_correctly_fetches_id_name_rebels(): def test_correctly_fetches_id_name_rebels(snapshot):
result = client.execute(""" query = """
query RebelsQuery { query RebelsQuery {
rebels { rebels {
id id
name name
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"rebels": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
def test_correctly_refetches_rebels(): def test_correctly_refetches_rebels(snapshot):
result = client.execute(""" query = """
query RebelsRefetchQuery { query RebelsRefetchQuery {
node(id: "RmFjdGlvbjox") { node(id: "RmFjdGlvbjox") {
id id
@ -127,30 +34,24 @@ def test_correctly_refetches_rebels():
} }
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {
"node": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
def test_correctly_fetches_id_name_empire(): def test_correctly_fetches_id_name_empire(snapshot):
result = client.execute(""" query = """
query EmpireQuery { query EmpireQuery {
empire { empire {
id id
name name
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {"empire": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
def test_correctly_refetches_empire(): def test_correctly_refetches_empire(snapshot):
result = client.execute(""" query = """
query EmpireRefetchQuery { query EmpireRefetchQuery {
node(id: "RmFjdGlvbjoy") { node(id: "RmFjdGlvbjoy") {
id id
@ -159,14 +60,12 @@ def test_correctly_refetches_empire():
} }
} }
} }
""") """
assert result == { snapshot.assert_match(client.execute(query))
"data": {"node": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
def test_correctly_refetches_xwing(): def test_correctly_refetches_xwing(snapshot):
result = client.execute(""" query = """
query XWingRefetchQuery { query XWingRefetchQuery {
node(id: "U2hpcDox") { node(id: "U2hpcDox") {
id id
@ -175,5 +74,5 @@ def test_correctly_refetches_xwing():
} }
} }
} }
""") """
assert result == {"data": {"node": {"id": "U2hpcDox", "name": "X-Wing"}}} snapshot.assert_match(client.execute(query))

View File

@ -1,98 +1,90 @@
from .pyutils.version import get_version from .pyutils.version import get_version
from .types import (
AbstractType,
ObjectType,
InputObjectType,
Interface,
Mutation,
Field,
InputField,
Schema,
Scalar,
String,
ID,
Int,
Float,
Boolean,
Date,
DateTime,
Time,
Decimal,
JSONString,
UUID,
List,
NonNull,
Enum,
Argument,
Dynamic,
Union,
Context,
ResolveInfo,
)
from .relay import ( from .relay import (
BaseGlobalIDType, Node,
is_node,
GlobalID,
ClientIDMutation, ClientIDMutation,
Connection, Connection,
ConnectionField, ConnectionField,
DefaultGlobalIDType,
GlobalID,
Node,
PageInfo, PageInfo,
SimpleGlobalIDType,
UUIDGlobalIDType,
is_node,
) )
from .types import (
ID,
UUID,
Argument,
Base64,
BigInt,
Boolean,
Context,
Date,
DateTime,
Decimal,
Dynamic,
Enum,
Field,
Float,
InputField,
InputObjectType,
Int,
Interface,
JSONString,
List,
Mutation,
NonNull,
ObjectType,
ResolveInfo,
Scalar,
Schema,
String,
Time,
Union,
)
from .utils.module_loading import lazy_import
from .utils.resolve_only_args import resolve_only_args from .utils.resolve_only_args import resolve_only_args
from .utils.module_loading import lazy_import
VERSION = (3, 4, 3, "final", 0)
VERSION = (2, 1, 7, "final", 0)
__version__ = get_version(VERSION) __version__ = get_version(VERSION)
__all__ = [ __all__ = [
"__version__", "__version__",
"Argument", "ObjectType",
"Base64", "InputObjectType",
"BigInt", "Interface",
"BaseGlobalIDType", "Mutation",
"Field",
"InputField",
"Schema",
"Scalar",
"String",
"ID",
"Int",
"Float",
"Enum",
"Boolean", "Boolean",
"Date",
"DateTime",
"Time",
"Decimal",
"JSONString",
"UUID",
"List",
"NonNull",
"Argument",
"Dynamic",
"Union",
"resolve_only_args",
"Node",
"is_node",
"GlobalID",
"ClientIDMutation", "ClientIDMutation",
"Connection", "Connection",
"ConnectionField", "ConnectionField",
"Context",
"Date",
"DateTime",
"Decimal",
"DefaultGlobalIDType",
"Dynamic",
"Enum",
"Field",
"Float",
"GlobalID",
"ID",
"InputField",
"InputObjectType",
"Int",
"Interface",
"JSONString",
"List",
"Mutation",
"Node",
"NonNull",
"ObjectType",
"PageInfo", "PageInfo",
"ResolveInfo",
"Scalar",
"Schema",
"SimpleGlobalIDType",
"String",
"Time",
"Union",
"UUID",
"UUIDGlobalIDType",
"is_node",
"lazy_import", "lazy_import",
"resolve_only_args", "Context",
"ResolveInfo",
# Deprecated
"AbstractType",
] ]

View File

@ -0,0 +1,21 @@
from __future__ import absolute_import
import six
from graphql.pyutils.compat import Enum
try:
from inspect import signature
except ImportError:
from .signature import signature
if six.PY2:
def func_name(func):
return func.func_name
else:
def func_name(func):
return func.__name__

View File

@ -0,0 +1,23 @@
is_init_subclass_available = hasattr(object, "__init_subclass__")
if not is_init_subclass_available:
class InitSubclassMeta(type):
"""Metaclass that implements PEP 487 protocol"""
def __new__(cls, name, bases, ns, **kwargs):
__init_subclass__ = ns.pop("__init_subclass__", None)
if __init_subclass__:
__init_subclass__ = classmethod(__init_subclass__)
ns["__init_subclass__"] = __init_subclass__
return super(InitSubclassMeta, cls).__new__(cls, name, bases, ns, **kwargs)
def __init__(cls, name, bases, ns, **kwargs):
super(InitSubclassMeta, cls).__init__(name, bases, ns)
super_class = super(cls, cls)
if hasattr(super_class, "__init_subclass__"):
super_class.__init_subclass__.__func__(cls, **kwargs)
else:
InitSubclassMeta = type # type: ignore
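# An illustrative, hedged usage sketch of the backport above. The plugin
# registry below is invented for the example; six is only needed so the same
# snippet runs on Python 2, where InitSubclassMeta does the work that
# Python 3 performs natively.
import six

class PluginBase(six.with_metaclass(InitSubclassMeta)):
    plugins = []

    def __init_subclass__(cls, **kwargs):
        # Called once for every subclass that gets defined.
        PluginBase.plugins.append(cls)

class JSONPlugin(PluginBase):
    pass

assert PluginBase.plugins == [JSONPlugin]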

View File

@ -0,0 +1,850 @@
# Copyright 2001-2013 Python Software Foundation; All Rights Reserved
"""Function signature objects for callables
Back port of Python 3.3's function signature tools from the inspect module,
modified to be compatible with Python 2.7 and 3.2+.
"""
from __future__ import absolute_import, division, print_function
import functools
import itertools
import re
import types
from collections import OrderedDict
__version__ = "0.4"
__all__ = ["BoundArguments", "Parameter", "Signature", "signature"]
_WrapperDescriptor = type(type.__call__)
_MethodWrapper = type(all.__call__)
_NonUserDefinedCallables = (
_WrapperDescriptor,
_MethodWrapper,
types.BuiltinFunctionType,
)
def formatannotation(annotation, base_module=None):
if isinstance(annotation, type):
if annotation.__module__ in ("builtins", "__builtin__", base_module):
return annotation.__name__
return annotation.__module__ + "." + annotation.__name__
return repr(annotation)
def _get_user_defined_method(cls, method_name, *nested):
try:
if cls is type:
return
meth = getattr(cls, method_name)
for name in nested:
meth = getattr(meth, name, meth)
except AttributeError:
return
else:
if not isinstance(meth, _NonUserDefinedCallables):
# Once '__signature__' will be added to 'C'-level
# callables, this check won't be necessary
return meth
def signature(obj):
"""Get a signature object for the passed callable."""
if not callable(obj):
raise TypeError("{!r} is not a callable object".format(obj))
if isinstance(obj, types.MethodType):
sig = signature(obj.__func__)
if obj.__self__ is None:
# Unbound method: the first parameter becomes positional-only
if sig.parameters:
first = sig.parameters.values()[0].replace(kind=_POSITIONAL_ONLY)
return sig.replace(
parameters=(first,) + tuple(sig.parameters.values())[1:]
)
else:
return sig
else:
# In this case we skip the first parameter of the underlying
# function (usually `self` or `cls`).
return sig.replace(parameters=tuple(sig.parameters.values())[1:])
try:
sig = obj.__signature__
except AttributeError:
pass
else:
if sig is not None:
return sig
try:
# Was this function wrapped by a decorator?
wrapped = obj.__wrapped__
except AttributeError:
pass
else:
return signature(wrapped)
if isinstance(obj, types.FunctionType):
return Signature.from_function(obj)
if isinstance(obj, functools.partial):
sig = signature(obj.func)
new_params = OrderedDict(sig.parameters.items())
partial_args = obj.args or ()
partial_keywords = obj.keywords or {}
try:
ba = sig.bind_partial(*partial_args, **partial_keywords)
except TypeError as ex:
msg = "partial object {!r} has incorrect arguments".format(obj)
raise ValueError(msg)
for arg_name, arg_value in ba.arguments.items():
param = new_params[arg_name]
if arg_name in partial_keywords:
# We set a new default value, because the following code
# is correct:
#
# >>> def foo(a): print(a)
# >>> print(partial(partial(foo, a=10), a=20)())
# 20
# >>> print(partial(partial(foo, a=10), a=20)(a=30))
# 30
#
# So, with 'partial' objects, passing a keyword argument is
# like setting a new default value for the corresponding
# parameter
#
# We also mark this parameter with '_partial_kwarg'
# flag. Later, in '_bind', the 'default' value of this
# parameter will be added to 'kwargs', to simulate
# the 'functools.partial' real call.
new_params[arg_name] = param.replace(
default=arg_value, _partial_kwarg=True
)
elif (
param.kind not in (_VAR_KEYWORD, _VAR_POSITIONAL)
and not param._partial_kwarg
):
new_params.pop(arg_name)
return sig.replace(parameters=new_params.values())
sig = None
if isinstance(obj, type):
# obj is a class or a metaclass
# First, let's see if it has an overloaded __call__ defined
# in its metaclass
call = _get_user_defined_method(type(obj), "__call__")
if call is not None:
sig = signature(call)
else:
# Now we check if the 'obj' class has a '__new__' method
new = _get_user_defined_method(obj, "__new__")
if new is not None:
sig = signature(new)
else:
# Finally, we should have at least __init__ implemented
init = _get_user_defined_method(obj, "__init__")
if init is not None:
sig = signature(init)
elif not isinstance(obj, _NonUserDefinedCallables):
# An object with __call__
# We also check that the 'obj' is not an instance of
# _WrapperDescriptor or _MethodWrapper to avoid
# infinite recursion (and even potential segfault)
call = _get_user_defined_method(type(obj), "__call__", "im_func")
if call is not None:
sig = signature(call)
if sig is not None:
# For classes and objects we skip the first parameter of their
# __call__, __new__, or __init__ methods
return sig.replace(parameters=tuple(sig.parameters.values())[1:])
if isinstance(obj, types.BuiltinFunctionType):
# Raise a nicer error message for builtins
msg = "no signature found for builtin function {!r}".format(obj)
raise ValueError(msg)
raise ValueError("callable {!r} is not supported by signature".format(obj))
class _void(object):
"""A private marker - used in Parameter & Signature"""
class _empty(object):
pass
class _ParameterKind(int):
def __new__(self, *args, **kwargs):
obj = int.__new__(self, *args)
obj._name = kwargs["name"]
return obj
def __str__(self):
return self._name
def __repr__(self):
return "<_ParameterKind: {!r}>".format(self._name)
_POSITIONAL_ONLY = _ParameterKind(0, name="POSITIONAL_ONLY")
_POSITIONAL_OR_KEYWORD = _ParameterKind(1, name="POSITIONAL_OR_KEYWORD")
_VAR_POSITIONAL = _ParameterKind(2, name="VAR_POSITIONAL")
_KEYWORD_ONLY = _ParameterKind(3, name="KEYWORD_ONLY")
_VAR_KEYWORD = _ParameterKind(4, name="VAR_KEYWORD")
class Parameter(object):
"""Represents a parameter in a function signature.
Has the following public attributes:
* name : str
The name of the parameter as a string.
* default : object
The default value for the parameter if specified. If the
parameter has no default value, this attribute is not set.
* annotation
The annotation for the parameter if specified. If the
parameter has no annotation, this attribute is not set.
* kind : str
Describes how argument values are bound to the parameter.
Possible values: `Parameter.POSITIONAL_ONLY`,
`Parameter.POSITIONAL_OR_KEYWORD`, `Parameter.VAR_POSITIONAL`,
`Parameter.KEYWORD_ONLY`, `Parameter.VAR_KEYWORD`.
"""
__slots__ = ("_name", "_kind", "_default", "_annotation", "_partial_kwarg")
POSITIONAL_ONLY = _POSITIONAL_ONLY
POSITIONAL_OR_KEYWORD = _POSITIONAL_OR_KEYWORD
VAR_POSITIONAL = _VAR_POSITIONAL
KEYWORD_ONLY = _KEYWORD_ONLY
VAR_KEYWORD = _VAR_KEYWORD
empty = _empty
def __init__(
self, name, kind, default=_empty, annotation=_empty, _partial_kwarg=False
):
if kind not in (
_POSITIONAL_ONLY,
_POSITIONAL_OR_KEYWORD,
_VAR_POSITIONAL,
_KEYWORD_ONLY,
_VAR_KEYWORD,
):
raise ValueError("invalid value for 'Parameter.kind' attribute")
self._kind = kind
if default is not _empty:
if kind in (_VAR_POSITIONAL, _VAR_KEYWORD):
msg = "{} parameters cannot have default values".format(kind)
raise ValueError(msg)
self._default = default
self._annotation = annotation
if name is None:
if kind != _POSITIONAL_ONLY:
raise ValueError(
"None is not a valid name for a " "non-positional-only parameter"
)
self._name = name
else:
name = str(name)
if kind != _POSITIONAL_ONLY and not re.match(r"[a-z_]\w*$", name, re.I):
msg = "{!r} is not a valid parameter name".format(name)
raise ValueError(msg)
self._name = name
self._partial_kwarg = _partial_kwarg
@property
def name(self):
return self._name
@property
def default(self):
return self._default
@property
def annotation(self):
return self._annotation
@property
def kind(self):
return self._kind
def replace(
self,
name=_void,
kind=_void,
annotation=_void,
default=_void,
_partial_kwarg=_void,
):
"""Creates a customized copy of the Parameter."""
if name is _void:
name = self._name
if kind is _void:
kind = self._kind
if annotation is _void:
annotation = self._annotation
if default is _void:
default = self._default
if _partial_kwarg is _void:
_partial_kwarg = self._partial_kwarg
return type(self)(
name,
kind,
default=default,
annotation=annotation,
_partial_kwarg=_partial_kwarg,
)
def __str__(self):
kind = self.kind
formatted = self._name
if kind == _POSITIONAL_ONLY:
if formatted is None:
formatted = ""
formatted = "<{}>".format(formatted)
# Add annotation and default value
if self._annotation is not _empty:
formatted = "{}:{}".format(formatted, formatannotation(self._annotation))
if self._default is not _empty:
formatted = "{}={}".format(formatted, repr(self._default))
if kind == _VAR_POSITIONAL:
formatted = "*" + formatted
elif kind == _VAR_KEYWORD:
formatted = "**" + formatted
return formatted
def __repr__(self):
return "<{} at {:#x} {!r}>".format(self.__class__.__name__, id(self), self.name)
def __hash__(self):
msg = "unhashable type: '{}'".format(self.__class__.__name__)
raise TypeError(msg)
def __eq__(self, other):
return (
issubclass(other.__class__, Parameter)
and self._name == other._name
and self._kind == other._kind
and self._default == other._default
and self._annotation == other._annotation
)
def __ne__(self, other):
return not self.__eq__(other)
class BoundArguments(object):
"""Result of `Signature.bind` call. Holds the mapping of arguments
to the function's parameters.
Has the following public attributes:
* arguments : OrderedDict
An ordered mutable mapping of parameters' names to arguments' values.
Does not contain arguments' default values.
* signature : Signature
The Signature object that created this instance.
* args : tuple
Tuple of positional arguments values.
* kwargs : dict
Dict of keyword arguments values.
"""
def __init__(self, signature, arguments):
self.arguments = arguments
self._signature = signature
@property
def signature(self):
return self._signature
@property
def args(self):
args = []
for param_name, param in self._signature.parameters.items():
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or param._partial_kwarg:
# Keyword arguments mapped by 'functools.partial'
# (Parameter._partial_kwarg is True) are mapped
# in 'BoundArguments.kwargs', along with VAR_KEYWORD &
# KEYWORD_ONLY
break
try:
arg = self.arguments[param_name]
except KeyError:
# We're done here. Other arguments
# will be mapped in 'BoundArguments.kwargs'
break
else:
if param.kind == _VAR_POSITIONAL:
# *args
args.extend(arg)
else:
# plain argument
args.append(arg)
return tuple(args)
@property
def kwargs(self):
kwargs = {}
kwargs_started = False
for param_name, param in self._signature.parameters.items():
if not kwargs_started:
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or param._partial_kwarg:
kwargs_started = True
else:
if param_name not in self.arguments:
kwargs_started = True
continue
if not kwargs_started:
continue
try:
arg = self.arguments[param_name]
except KeyError:
pass
else:
if param.kind == _VAR_KEYWORD:
# **kwargs
kwargs.update(arg)
else:
# plain keyword argument
kwargs[param_name] = arg
return kwargs
def __hash__(self):
msg = "unhashable type: '{}'".format(self.__class__.__name__)
raise TypeError(msg)
def __eq__(self, other):
return (
issubclass(other.__class__, BoundArguments)
and self.signature == other.signature
and self.arguments == other.arguments
)
def __ne__(self, other):
return not self.__eq__(other)
class Signature(object):
"""A Signature object represents the overall signature of a function.
It stores a Parameter object for each parameter accepted by the
function, as well as information specific to the function itself.
A Signature object has the following public attributes and methods:
* parameters : OrderedDict
An ordered mapping of parameters' names to the corresponding
Parameter objects (keyword-only arguments are in the same order
as listed in `code.co_varnames`).
* return_annotation : object
The annotation for the return type of the function if specified.
If the function has no annotation for its return type, this
attribute is not set.
* bind(*args, **kwargs) -> BoundArguments
Creates a mapping from positional and keyword arguments to
parameters.
* bind_partial(*args, **kwargs) -> BoundArguments
Creates a partial mapping from positional and keyword arguments
to parameters (simulating 'functools.partial' behavior.)
"""
__slots__ = ("_return_annotation", "_parameters")
_parameter_cls = Parameter
_bound_arguments_cls = BoundArguments
empty = _empty
def __init__(
self, parameters=None, return_annotation=_empty, __validate_parameters__=True
):
"""Constructs Signature from the given list of Parameter
objects and 'return_annotation'. All arguments are optional.
"""
if parameters is None:
params = OrderedDict()
else:
if __validate_parameters__:
params = OrderedDict()
top_kind = _POSITIONAL_ONLY
for idx, param in enumerate(parameters):
kind = param.kind
if kind < top_kind:
msg = "wrong parameter order: {0} before {1}"
msg = msg.format(top_kind, param.kind)
raise ValueError(msg)
else:
top_kind = kind
name = param.name
if name is None:
name = str(idx)
param = param.replace(name=name)
if name in params:
msg = "duplicate parameter name: {!r}".format(name)
raise ValueError(msg)
params[name] = param
else:
params = OrderedDict(((param.name, param) for param in parameters))
self._parameters = params
self._return_annotation = return_annotation
@classmethod
def from_function(cls, func):
"""Constructs Signature for the given python function"""
if not isinstance(func, types.FunctionType):
raise TypeError("{!r} is not a Python function".format(func))
Parameter = cls._parameter_cls
# Parameter information.
func_code = func.__code__
pos_count = func_code.co_argcount
arg_names = func_code.co_varnames
positional = tuple(arg_names[:pos_count])
keyword_only_count = getattr(func_code, "co_kwonlyargcount", 0)
keyword_only = arg_names[pos_count : (pos_count + keyword_only_count)]
annotations = getattr(func, "__annotations__", {})
defaults = func.__defaults__
kwdefaults = getattr(func, "__kwdefaults__", None)
if defaults:
pos_default_count = len(defaults)
else:
pos_default_count = 0
parameters = []
# Non-keyword-only parameters w/o defaults.
non_default_count = pos_count - pos_default_count
for name in positional[:non_default_count]:
annotation = annotations.get(name, _empty)
parameters.append(
Parameter(name, annotation=annotation, kind=_POSITIONAL_OR_KEYWORD)
)
# ... w/ defaults.
for offset, name in enumerate(positional[non_default_count:]):
annotation = annotations.get(name, _empty)
parameters.append(
Parameter(
name,
annotation=annotation,
kind=_POSITIONAL_OR_KEYWORD,
default=defaults[offset],
)
)
# *args
if func_code.co_flags & 0x04:
name = arg_names[pos_count + keyword_only_count]
annotation = annotations.get(name, _empty)
parameters.append(
Parameter(name, annotation=annotation, kind=_VAR_POSITIONAL)
)
# Keyword-only parameters.
for name in keyword_only:
default = _empty
if kwdefaults is not None:
default = kwdefaults.get(name, _empty)
annotation = annotations.get(name, _empty)
parameters.append(
Parameter(
name, annotation=annotation, kind=_KEYWORD_ONLY, default=default
)
)
# **kwargs
if func_code.co_flags & 0x08:
index = pos_count + keyword_only_count
if func_code.co_flags & 0x04:
index += 1
name = arg_names[index]
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation, kind=_VAR_KEYWORD))
return cls(
parameters,
return_annotation=annotations.get("return", _empty),
__validate_parameters__=False,
)
@property
def parameters(self):
try:
return types.MappingProxyType(self._parameters)
except AttributeError:
return OrderedDict(self._parameters.items())
@property
def return_annotation(self):
return self._return_annotation
def replace(self, parameters=_void, return_annotation=_void):
"""Creates a customized copy of the Signature.
Pass 'parameters' and/or 'return_annotation' arguments
to override them in the new copy.
"""
if parameters is _void:
parameters = self.parameters.values()
if return_annotation is _void:
return_annotation = self._return_annotation
return type(self)(parameters, return_annotation=return_annotation)
def __hash__(self):
msg = "unhashable type: '{}'".format(self.__class__.__name__)
raise TypeError(msg)
def __eq__(self, other):
if (
not issubclass(type(other), Signature)
or self.return_annotation != other.return_annotation
or len(self.parameters) != len(other.parameters)
):
return False
other_positions = {
param: idx for idx, param in enumerate(other.parameters.keys())
}
for idx, (param_name, param) in enumerate(self.parameters.items()):
if param.kind == _KEYWORD_ONLY:
try:
other_param = other.parameters[param_name]
except KeyError:
return False
else:
if param != other_param:
return False
else:
try:
other_idx = other_positions[param_name]
except KeyError:
return False
else:
if idx != other_idx or param != other.parameters[param_name]:
return False
return True
def __ne__(self, other):
return not self.__eq__(other)
def _bind(self, args, kwargs, partial=False):
"""Private method. Don't use directly."""
arguments = OrderedDict()
parameters = iter(self.parameters.values())
parameters_ex = ()
arg_vals = iter(args)
if partial:
# Support for binding arguments to 'functools.partial' objects.
# See 'functools.partial' case in 'signature()' implementation
# for details.
for param_name, param in self.parameters.items():
if param._partial_kwarg and param_name not in kwargs:
# Simulating 'functools.partial' behavior
kwargs[param_name] = param.default
while True:
# Let's iterate through the positional arguments and corresponding
# parameters
try:
arg_val = next(arg_vals)
except StopIteration:
# No more positional arguments
try:
param = next(parameters)
except StopIteration:
# No more parameters. That's it. Just need to check that
# we have no `kwargs` after this while loop
break
else:
if param.kind == _VAR_POSITIONAL:
# That's OK, just empty *args. Let's start parsing
# kwargs
break
elif param.name in kwargs:
if param.kind == _POSITIONAL_ONLY:
msg = (
"{arg!r} parameter is positional only, "
"but was passed as a keyword"
)
msg = msg.format(arg=param.name)
raise TypeError(msg)
parameters_ex = (param,)
break
elif param.kind == _VAR_KEYWORD or param.default is not _empty:
# That's fine too - we have a default value for this
# parameter. So, lets start parsing `kwargs`, starting
# with the current parameter
parameters_ex = (param,)
break
else:
if partial:
parameters_ex = (param,)
break
else:
msg = "{arg!r} parameter lacking default value"
msg = msg.format(arg=param.name)
raise TypeError(msg)
else:
# We have a positional argument to process
try:
param = next(parameters)
except StopIteration:
raise TypeError("too many positional arguments")
else:
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY):
# Looks like we have no parameter for this positional
# argument
raise TypeError("too many positional arguments")
if param.kind == _VAR_POSITIONAL:
# We have an '*args'-like argument, let's fill it with
# all positional arguments we have left and move on to
# the next phase
values = [arg_val]
values.extend(arg_vals)
arguments[param.name] = tuple(values)
break
if param.name in kwargs:
raise TypeError(
"multiple values for argument "
"{arg!r}".format(arg=param.name)
)
arguments[param.name] = arg_val
# Now, we iterate through the remaining parameters to process
# keyword arguments
kwargs_param = None
for param in itertools.chain(parameters_ex, parameters):
if param.kind == _POSITIONAL_ONLY:
# This should never happen in case of a properly built
# Signature object (but let's have this check here
# to ensure correct behaviour just in case)
raise TypeError(
"{arg!r} parameter is positional only, "
"but was passed as a keyword".format(arg=param.name)
)
if param.kind == _VAR_KEYWORD:
# Memorize that we have a '**kwargs'-like parameter
kwargs_param = param
continue
param_name = param.name
try:
arg_val = kwargs.pop(param_name)
except KeyError:
# We have no value for this parameter. It's fine though,
# if it has a default value, or it is an '*args'-like
# parameter, left alone by the processing of positional
# arguments.
if (
not partial
and param.kind != _VAR_POSITIONAL
and param.default is _empty
):
raise TypeError(
"{arg!r} parameter lacking default value".format(arg=param_name)
)
else:
arguments[param_name] = arg_val
if kwargs:
if kwargs_param is not None:
# Process our '**kwargs'-like parameter
arguments[kwargs_param.name] = kwargs
else:
raise TypeError("too many keyword arguments")
return self._bound_arguments_cls(self, arguments)
def bind(self, *args, **kwargs):
"""Get a BoundArguments object, that maps the passed `args`
and `kwargs` to the function's signature. Raises `TypeError`
if the passed arguments can not be bound.
"""
return self._bind(args, kwargs)
def bind_partial(self, *args, **kwargs):
"""Get a BoundArguments object, that partially maps the
passed `args` and `kwargs` to the function's signature.
Raises `TypeError` if the passed arguments can not be bound.
"""
return self._bind(args, kwargs, partial=True)
def __str__(self):
result = []
render_kw_only_separator = True
for idx, param in enumerate(self.parameters.values()):
formatted = str(param)
kind = param.kind
if kind == _VAR_POSITIONAL:
# OK, we have an '*args'-like parameter, so we won't need
# a '*' to separate keyword-only arguments
render_kw_only_separator = False
elif kind == _KEYWORD_ONLY and render_kw_only_separator:
# We have a keyword-only parameter to render and we haven't
# rendered an '*args'-like parameter before, so add a '*'
# separator to the parameters list ("foo(arg1, *, arg2)" case)
result.append("*")
# This condition should be only triggered once, so
# reset the flag
render_kw_only_separator = False
result.append(formatted)
rendered = "({})".format(", ".join(result))
if self.return_annotation is not _empty:
anno = formatannotation(self.return_annotation)
rendered += " -> {}".format(anno)
return rendered
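A minimal usage sketch of the vendored classes above (not part of the module itself); it assumes only the names defined in this file and mirrors the behaviour of `inspect.Signature` on Python 3:

def _demo(a, b=10, *args, **kwargs):
    return a

sig = Signature.from_function(_demo)
print(str(sig))                 # (a, b=10, *args, **kwargs)

bound = sig.bind(1, 2, 3, extra=4)
print(bound.args)               # (1, 2, 3)
print(bound.kwargs)             # {'extra': 4}

partial_bound = sig.bind_partial(1)
print(partial_bound.arguments)  # OrderedDict([('a', 1)])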

View File

@ -1,3 +1,5 @
+from __future__ import unicode_literals
+
 import datetime
 import os
 import subprocess
@ -17,7 +19,10 @ def get_version(version=None):
     sub = ""
     if version[3] == "alpha" and version[4] == 0:
         git_changeset = get_git_changeset()
-        sub = ".dev%s" % git_changeset if git_changeset else ".dev"
+        if git_changeset:
+            sub = ".dev%s" % git_changeset
+        else:
+            sub = ".dev"
     elif version[3] != "final":
         mapping = {"alpha": "a", "beta": "b", "rc": "rc"}
         sub = mapping[version[3]] + str(version[4])
@ -71,6 +76,6 @ def get_git_changeset():
         )
         timestamp = git_log.communicate()[0]
         timestamp = datetime.datetime.utcfromtimestamp(int(timestamp))
-    except Exception:
+    except:
         return None
     return timestamp.strftime("%Y%m%d%H%M%S")

View File

@ -1,23 +1,13 @
 from .node import Node, is_node, GlobalID
 from .mutation import ClientIDMutation
 from .connection import Connection, ConnectionField, PageInfo
-from .id_type import (
-    BaseGlobalIDType,
-    DefaultGlobalIDType,
-    SimpleGlobalIDType,
-    UUIDGlobalIDType,
-)

 __all__ = [
-    "BaseGlobalIDType",
+    "Node",
+    "is_node",
+    "GlobalID",
     "ClientIDMutation",
     "Connection",
     "ConnectionField",
-    "DefaultGlobalIDType",
-    "GlobalID",
-    "Node",
     "PageInfo",
-    "SimpleGlobalIDType",
-    "UUIDGlobalIDType",
-    "is_node",
 ]

View File

@ -1,42 +1,14 @@
import re import re
from collections.abc import Iterable from collections import Iterable, OrderedDict
from functools import partial from functools import partial
from typing import Type
from graphql_relay import connection_from_array from graphql_relay import connection_from_list
from ..types import Boolean, Enum, Int, Interface, List, NonNull, Scalar, String, Union from ..types import Boolean, Enum, Int, Interface, List, NonNull, Scalar, String, Union
from ..types.field import Field from ..types.field import Field
from ..types.objecttype import ObjectType, ObjectTypeOptions from ..types.objecttype import ObjectType, ObjectTypeOptions
from ..utils.thenables import maybe_thenable from ..utils.thenables import maybe_thenable
from .node import is_node, AbstractNode from .node import is_node
def get_edge_class(
connection_class: Type["Connection"],
_node: Type[AbstractNode],
base_name: str,
strict_types: bool = False,
):
edge_class = getattr(connection_class, "Edge", None)
class EdgeBase:
node = Field(
NonNull(_node) if strict_types else _node,
description="The item at the end of the edge",
)
cursor = String(required=True, description="A cursor for use in pagination")
class EdgeMeta:
description = f"A Relay edge containing a `{base_name}` and its cursor."
edge_name = f"{base_name}Edge"
edge_bases = [edge_class, EdgeBase] if edge_class else [EdgeBase]
if not isinstance(edge_class, ObjectType):
edge_bases = [*edge_bases, ObjectType]
return type(edge_name, tuple(edge_bases), {"Meta": EdgeMeta})
class PageInfo(ObjectType): class PageInfo(ObjectType):
@ -69,17 +41,6 @@ class PageInfo(ObjectType):
) )
# noinspection PyPep8Naming
def page_info_adapter(startCursor, endCursor, hasPreviousPage, hasNextPage):
"""Adapter for creating PageInfo instances"""
return PageInfo(
start_cursor=startCursor,
end_cursor=endCursor,
has_previous_page=hasPreviousPage,
has_next_page=hasNextPage,
)
class ConnectionOptions(ObjectTypeOptions): class ConnectionOptions(ObjectTypeOptions):
node = None node = None
@ -89,68 +50,81 @@ class Connection(ObjectType):
abstract = True abstract = True
@classmethod @classmethod
def __init_subclass_with_meta__( def __init_subclass_with_meta__(cls, node=None, name=None, **options):
cls, node=None, name=None, strict_types=False, _meta=None, **options _meta = ConnectionOptions(cls)
): assert node, "You have to provide a node in {}.Meta".format(cls.__name__)
if not _meta:
_meta = ConnectionOptions(cls)
assert node, f"You have to provide a node in {cls.__name__}.Meta"
assert isinstance(node, NonNull) or issubclass( assert isinstance(node, NonNull) or issubclass(
node, (Scalar, Enum, ObjectType, Interface, Union, NonNull) node, (Scalar, Enum, ObjectType, Interface, Union, NonNull)
), f'Received incompatible node "{node}" for Connection {cls.__name__}.' ), ('Received incompatible node "{}" for Connection {}.').format(
node, cls.__name__
)
base_name = re.sub("Connection$", "", name or cls.__name__) or node._meta.name base_name = re.sub("Connection$", "", name or cls.__name__) or node._meta.name
if not name: if not name:
name = f"{base_name}Connection" name = "{}Connection".format(base_name)
edge_class = getattr(cls, "Edge", None)
_node = node
class EdgeBase(object):
node = Field(_node, description="The item at the end of the edge")
cursor = String(required=True, description="A cursor for use in pagination")
class EdgeMeta:
description = "A Relay edge containing a `{}` and its cursor.".format(
base_name
)
edge_name = "{}Edge".format(base_name)
if edge_class:
edge_bases = (edge_class, EdgeBase, ObjectType)
else:
edge_bases = (EdgeBase, ObjectType)
edge = type(edge_name, edge_bases, {"Meta": EdgeMeta})
cls.Edge = edge
options["name"] = name options["name"] = name
_meta.node = node _meta.node = node
_meta.fields = OrderedDict(
if not _meta.fields: [
_meta.fields = {} (
"page_info",
if "page_info" not in _meta.fields: Field(
_meta.fields["page_info"] = Field( PageInfo,
PageInfo, name="pageInfo",
name="pageInfo", required=True,
required=True, description="Pagination data for this connection.",
description="Pagination data for this connection.", ),
) ),
(
if "edges" not in _meta.fields: "edges",
edge_class = get_edge_class(cls, node, base_name, strict_types) # type: ignore Field(
cls.Edge = edge_class NonNull(List(edge)),
_meta.fields["edges"] = Field( description="Contains the nodes in this connection.",
NonNull(List(NonNull(edge_class) if strict_types else edge_class)), ),
description="Contains the nodes in this connection.", ),
) ]
)
return super(Connection, cls).__init_subclass_with_meta__( return super(Connection, cls).__init_subclass_with_meta__(
_meta=_meta, **options _meta=_meta, **options
) )
# noinspection PyPep8Naming
def connection_adapter(cls, edges, pageInfo):
"""Adapter for creating Connection instances"""
return cls(edges=edges, page_info=pageInfo)
class IterableConnectionField(Field): class IterableConnectionField(Field):
def __init__(self, type_, *args, **kwargs): def __init__(self, type, *args, **kwargs):
kwargs.setdefault("before", String()) kwargs.setdefault("before", String())
kwargs.setdefault("after", String()) kwargs.setdefault("after", String())
kwargs.setdefault("first", Int()) kwargs.setdefault("first", Int())
kwargs.setdefault("last", Int()) kwargs.setdefault("last", Int())
super(IterableConnectionField, self).__init__(type_, *args, **kwargs) super(IterableConnectionField, self).__init__(type, *args, **kwargs)
@property @property
def type(self): def type(self):
type_ = super(IterableConnectionField, self).type type = super(IterableConnectionField, self).type
connection_type = type_ connection_type = type
if isinstance(type_, NonNull): if isinstance(type, NonNull):
connection_type = type_.of_type connection_type = type.of_type
if is_node(connection_type): if is_node(connection_type):
raise Exception( raise Exception(
@ -158,10 +132,10 @@ class IterableConnectionField(Field):
"Read more: https://github.com/graphql-python/graphene/blob/v2.0.0/UPGRADE-v2.0.md#node-connections" "Read more: https://github.com/graphql-python/graphene/blob/v2.0.0/UPGRADE-v2.0.md#node-connections"
) )
assert issubclass( assert issubclass(connection_type, Connection), (
connection_type, Connection '{} type have to be a subclass of Connection. Received "{}".'
), f'{self.__class__.__name__} type has to be a subclass of Connection. Received "{connection_type}".' ).format(self.__class__.__name__, connection_type)
return type_ return type
@classmethod @classmethod
def resolve_connection(cls, connection_type, args, resolved): def resolve_connection(cls, connection_type, args, resolved):
@ -169,15 +143,15 @@ class IterableConnectionField(Field):
return resolved return resolved
assert isinstance(resolved, Iterable), ( assert isinstance(resolved, Iterable), (
f"Resolved value from the connection field has to be an iterable or instance of {connection_type}. " "Resolved value from the connection field have to be iterable or instance of {}. "
f'Received "{resolved}"' 'Received "{}"'
) ).format(connection_type, resolved)
connection = connection_from_array( connection = connection_from_list(
resolved, resolved,
args, args,
connection_type=partial(connection_adapter, connection_type), connection_type=connection_type,
edge_type=connection_type.Edge, edge_type=connection_type.Edge,
page_info_type=page_info_adapter, pageinfo_type=PageInfo,
) )
connection.iterable = resolved connection.iterable = resolved
return connection return connection
@ -192,8 +166,8 @@ class IterableConnectionField(Field):
on_resolve = partial(cls.resolve_connection, connection_type, args) on_resolve = partial(cls.resolve_connection, connection_type, args)
return maybe_thenable(resolved, on_resolve) return maybe_thenable(resolved, on_resolve)
def wrap_resolve(self, parent_resolver): def get_resolver(self, parent_resolver):
resolver = super(IterableConnectionField, self).wrap_resolve(parent_resolver) resolver = super(IterableConnectionField, self).get_resolver(parent_resolver)
return partial(self.connection_resolver, resolver, self.type) return partial(self.connection_resolver, resolver, self.type)
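The user-facing declaration is the same for both versions of the file above: a Connection subclass names its node type in Meta, and a ConnectionField exposes it on a query. A minimal sketch, where `Ship` and `get_ship_list` are hypothetical stand-ins:

class Ship(ObjectType):
    name = String()


class ShipConnection(Connection):
    class Meta:
        node = Ship


class Query(ObjectType):
    ships = ConnectionField(ShipConnection)

    def resolve_ships(root, info, **kwargs):
        # Any iterable works; the field slices it according to first/last/before/after.
        return get_ship_list()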

View File

@ -1,87 +0,0 @@
from graphql_relay import from_global_id, to_global_id
from ..types import ID, UUID
from ..types.base import BaseType
from typing import Type
class BaseGlobalIDType:
"""
Base class that defines the required attributes/methods for a global ID type.
"""
graphene_type: Type[BaseType] = ID
@classmethod
def resolve_global_id(cls, info, global_id):
# return _type, _id
raise NotImplementedError
@classmethod
def to_global_id(cls, _type, _id):
# return _id
raise NotImplementedError
class DefaultGlobalIDType(BaseGlobalIDType):
"""
Default global ID type: base64 encoded version of "<node type name>: <node id>".
"""
graphene_type = ID
@classmethod
def resolve_global_id(cls, info, global_id):
try:
_type, _id = from_global_id(global_id)
if not _type:
raise ValueError("Invalid Global ID")
return _type, _id
except Exception as e:
raise Exception(
f'Unable to parse global ID "{global_id}". '
'Make sure it is a base64 encoded string in the format: "TypeName:id". '
f"Exception message: {e}"
)
@classmethod
def to_global_id(cls, _type, _id):
return to_global_id(_type, _id)
class SimpleGlobalIDType(BaseGlobalIDType):
"""
Simple global ID type: simply the id of the object.
To be used carefully as the user is responsible for ensuring that the IDs are indeed global
(otherwise it could cause request caching issues).
"""
graphene_type = ID
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id
class UUIDGlobalIDType(BaseGlobalIDType):
"""
UUID global ID type.
By definition, UUIDs are global, so they are used as-is.
"""
graphene_type = UUID
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id
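All three implementations above follow the same two-classmethod contract. As an illustration only (not part of the module), a hypothetical `PrefixedGlobalIDType` that keeps the type name and raw id visible as "TypeName:id" could look like this:

class PrefixedGlobalIDType(BaseGlobalIDType):
    """Hypothetical example: global IDs of the form "<TypeName>:<id>" in clear text."""

    graphene_type = ID

    @classmethod
    def resolve_global_id(cls, info, global_id):
        _type, _, _id = global_id.partition(":")
        return _type, _id

    @classmethod
    def to_global_id(cls, _type, _id):
        return f"{_type}:{_id}"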

View File

@ -1,4 +1,5 @@
import re import re
from collections import OrderedDict
from ..types import Field, InputObjectType, String from ..types import Field, InputObjectType, String
from ..types.mutation import Mutation from ..types.mutation import Mutation
@ -27,24 +28,26 @@ class ClientIDMutation(Mutation):
input_fields = {} input_fields = {}
cls.Input = type( cls.Input = type(
f"{base_name}Input", "{}Input".format(base_name),
bases, bases,
dict(input_fields, client_mutation_id=String(name="clientMutationId")), OrderedDict(
input_fields, client_mutation_id=String(name="clientMutationId")
),
) )
arguments = dict( arguments = OrderedDict(
input=cls.Input(required=True) input=cls.Input(required=True)
# 'client_mutation_id': String(name='clientMutationId') # 'client_mutation_id': String(name='clientMutationId')
) )
mutate_and_get_payload = getattr(cls, "mutate_and_get_payload", None) mutate_and_get_payload = getattr(cls, "mutate_and_get_payload", None)
if cls.mutate and cls.mutate.__func__ == ClientIDMutation.mutate.__func__: if cls.mutate and cls.mutate.__func__ == ClientIDMutation.mutate.__func__:
assert mutate_and_get_payload, ( assert mutate_and_get_payload, (
f"{name or cls.__name__}.mutate_and_get_payload method is required" "{name}.mutate_and_get_payload method is required"
" in a ClientIDMutation." " in a ClientIDMutation."
) ).format(name=name or cls.__name__)
if not name: if not name:
name = f"{base_name}Payload" name = "{}Payload".format(base_name)
super(ClientIDMutation, cls).__init_subclass_with_meta__( super(ClientIDMutation, cls).__init_subclass_with_meta__(
output=None, arguments=arguments, name=name, **options output=None, arguments=arguments, name=name, **options
@ -58,7 +61,9 @@ class ClientIDMutation(Mutation):
payload.client_mutation_id = input.get("client_mutation_id") payload.client_mutation_id = input.get("client_mutation_id")
except Exception: except Exception:
raise Exception( raise Exception(
f"Cannot set client_mutation_id in the payload object {repr(payload)}" ("Cannot set client_mutation_id in the payload object {}").format(
repr(payload)
)
) )
return payload return payload
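From user code, both versions of ClientIDMutation shown above are declared the same way: an inner Input class, output fields, and a mutate_and_get_payload that returns an instance of the mutation (the same pattern the tests later in this diff use). A minimal sketch; `Ship` and `create_ship` are hypothetical:

class IntroduceShip(ClientIDMutation):
    class Input:
        ship_name = String(required=True)

    ship = Field(Ship)

    @staticmethod
    def mutate_and_get_payload(root, info, ship_name, client_mutation_id=None):
        # create_ship stands in for whatever persists the new object.
        return IntroduceShip(ship=create_ship(ship_name))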

View File

@ -1,10 +1,12 @@
from collections import OrderedDict
from functools import partial from functools import partial
from inspect import isclass from inspect import isclass
from ..types import Field, Interface, ObjectType from graphql_relay import from_global_id, to_global_id
from ..types import ID, Field, Interface, ObjectType
from ..types.interface import InterfaceOptions from ..types.interface import InterfaceOptions
from ..types.utils import get_type from ..types.utils import get_type
from .id_type import BaseGlobalIDType, DefaultGlobalIDType
def is_node(objecttype): def is_node(objecttype):
@ -17,22 +19,16 @@ def is_node(objecttype):
if not issubclass(objecttype, ObjectType): if not issubclass(objecttype, ObjectType):
return False return False
return any(issubclass(i, Node) for i in objecttype._meta.interfaces) for i in objecttype._meta.interfaces:
if issubclass(i, Node):
return True
return False
class GlobalID(Field): class GlobalID(Field):
def __init__( def __init__(self, node=None, parent_type=None, required=True, *args, **kwargs):
self, super(GlobalID, self).__init__(ID, required=required, *args, **kwargs)
node=None,
parent_type=None,
required=True,
global_id_type=DefaultGlobalIDType,
*args,
**kwargs,
):
super(GlobalID, self).__init__(
global_id_type.graphene_type, required=required, *args, **kwargs
)
self.node = node or Node self.node = node or Node
self.parent_type_name = parent_type._meta.name if parent_type else None self.parent_type_name = parent_type._meta.name if parent_type else None
@ -42,7 +38,7 @@ class GlobalID(Field):
parent_type_name = parent_type_name or info.parent_type.name parent_type_name = parent_type_name or info.parent_type.name
return node.to_global_id(parent_type_name, type_id) # root._meta.name return node.to_global_id(parent_type_name, type_id) # root._meta.name
def wrap_resolve(self, parent_resolver): def get_resolver(self, parent_resolver):
return partial( return partial(
self.id_resolver, self.id_resolver,
parent_resolver, parent_resolver,
@ -52,22 +48,20 @@ class GlobalID(Field):
class NodeField(Field): class NodeField(Field):
def __init__(self, node, type_=False, **kwargs): def __init__(self, node, type=False, deprecation_reason=None, name=None, **kwargs):
assert issubclass(node, Node), "NodeField can only operate in Nodes" assert issubclass(node, Node), "NodeField can only operate in Nodes"
self.node_type = node self.node_type = node
self.field_type = type_ self.field_type = type
global_id_type = node._meta.global_id_type
super(NodeField, self).__init__( super(NodeField, self).__init__(
# If we don't specify a type, the field type will be the node interface # If we don's specify a type, the field type will be the node
type_ or node, # interface
id=global_id_type.graphene_type( type or node,
required=True, description="The ID of the object" description="The ID of the object",
), id=ID(required=True),
**kwargs,
) )
def wrap_resolve(self, parent_resolver): def get_resolver(self, parent_resolver):
return partial(self.node_type.node_resolver, get_type(self.field_type)) return partial(self.node_type.node_resolver, get_type(self.field_type))
@ -76,23 +70,13 @@ class AbstractNode(Interface):
abstract = True abstract = True
@classmethod @classmethod
def __init_subclass_with_meta__(cls, global_id_type=DefaultGlobalIDType, **options): def __init_subclass_with_meta__(cls, **options):
assert issubclass(
global_id_type, BaseGlobalIDType
), "Custom ID type need to be implemented as a subclass of BaseGlobalIDType."
_meta = InterfaceOptions(cls) _meta = InterfaceOptions(cls)
_meta.global_id_type = global_id_type _meta.fields = OrderedDict(
_meta.fields = { id=GlobalID(cls, description="The ID of the object.")
"id": GlobalID( )
cls, global_id_type=global_id_type, description="The ID of the object"
)
}
super(AbstractNode, cls).__init_subclass_with_meta__(_meta=_meta, **options) super(AbstractNode, cls).__init_subclass_with_meta__(_meta=_meta, **options)
@classmethod
def resolve_global_id(cls, info, global_id):
return cls._meta.global_id_type.resolve_global_id(info, global_id)
class Node(AbstractNode): class Node(AbstractNode):
"""An object with an ID""" """An object with an ID"""
@ -107,29 +91,29 @@ class Node(AbstractNode):
@classmethod @classmethod
def get_node_from_global_id(cls, info, global_id, only_type=None): def get_node_from_global_id(cls, info, global_id, only_type=None):
_type, _id = cls.resolve_global_id(info, global_id) try:
_type, _id = cls.from_global_id(global_id)
graphene_type = info.schema.get_type(_type) graphene_type = info.schema.get_type(_type).graphene_type
if graphene_type is None: except Exception:
raise Exception(f'Relay Node "{_type}" not found in schema') return None
graphene_type = graphene_type.graphene_type
if only_type: if only_type:
assert ( assert graphene_type == only_type, ("Must receive a {} id.").format(
graphene_type == only_type only_type._meta.name
), f"Must receive a {only_type._meta.name} id." )
# We make sure the ObjectType implements the "Node" interface # We make sure the ObjectType implements the "Node" interface
if cls not in graphene_type._meta.interfaces: if cls not in graphene_type._meta.interfaces:
raise Exception( return None
f'ObjectType "{_type}" does not implement the "{cls}" interface.'
)
get_node = getattr(graphene_type, "get_node", None) get_node = getattr(graphene_type, "get_node", None)
if get_node: if get_node:
return get_node(info, _id) return get_node(info, _id)
@classmethod @classmethod
def to_global_id(cls, type_, id): def from_global_id(cls, global_id):
return cls._meta.global_id_type.to_global_id(type_, id) return from_global_id(global_id)
@classmethod
def to_global_id(cls, type, id):
return to_global_id(type, id)
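On the user side, node resolution works the same way in both versions of the file above: an ObjectType lists Node (or a Node subclass) among its Meta interfaces and provides get_node. A minimal sketch, with `Ship` and `get_ship` as hypothetical stand-ins:

class Ship(ObjectType):
    class Meta:
        interfaces = (Node,)

    name = String()

    @classmethod
    def get_node(cls, info, id):
        # Called by Node.get_node_from_global_id with the decoded per-type id.
        return get_ship(id)


class Query(ObjectType):
    node = Node.Field()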

View File

@ -1,15 +1,7 @@
import re import pytest
from pytest import raises
from ...types import Argument, Field, Int, List, NonNull, ObjectType, Schema, String from ...types import Argument, Field, Int, List, NonNull, ObjectType, Schema, String
from ..connection import ( from ..connection import Connection, ConnectionField, PageInfo
Connection,
ConnectionField,
PageInfo,
ConnectionOptions,
get_edge_class,
)
from ..node import Node from ..node import Node
@ -32,7 +24,7 @@ def test_connection():
assert MyObjectConnection._meta.name == "MyObjectConnection" assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields fields = MyObjectConnection._meta.fields
assert list(fields) == ["page_info", "edges", "extra"] assert list(fields.keys()) == ["page_info", "edges", "extra"]
edge_field = fields["edges"] edge_field = fields["edges"]
pageinfo_field = fields["page_info"] pageinfo_field = fields["page_info"]
@ -47,7 +39,7 @@ def test_connection():
def test_connection_inherit_abstracttype(): def test_connection_inherit_abstracttype():
class BaseConnection: class BaseConnection(object):
extra = String() extra = String()
class MyObjectConnection(BaseConnection, Connection): class MyObjectConnection(BaseConnection, Connection):
@ -56,118 +48,13 @@ def test_connection_inherit_abstracttype():
assert MyObjectConnection._meta.name == "MyObjectConnection" assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields fields = MyObjectConnection._meta.fields
assert list(fields) == ["page_info", "edges", "extra"] assert list(fields.keys()) == ["page_info", "edges", "extra"]
def test_connection_extra_abstract_fields():
class ConnectionWithNodes(Connection):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options):
_meta = ConnectionOptions(cls)
_meta.fields = {
"nodes": Field(
NonNull(List(node)),
description="Contains all the nodes in this connection.",
),
}
return super(ConnectionWithNodes, cls).__init_subclass_with_meta__(
node=node, name=name, _meta=_meta, **options
)
class MyObjectConnection(ConnectionWithNodes):
class Meta:
node = MyObject
class Edge:
other = String()
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields) == ["nodes", "page_info", "edges"]
edge_field = fields["edges"]
pageinfo_field = fields["page_info"]
nodes_field = fields["nodes"]
assert isinstance(edge_field, Field)
assert isinstance(edge_field.type, NonNull)
assert isinstance(edge_field.type.of_type, List)
assert edge_field.type.of_type.of_type == MyObjectConnection.Edge
assert isinstance(pageinfo_field, Field)
assert isinstance(pageinfo_field.type, NonNull)
assert pageinfo_field.type.of_type == PageInfo
assert isinstance(nodes_field, Field)
assert isinstance(nodes_field.type, NonNull)
assert isinstance(nodes_field.type.of_type, List)
assert nodes_field.type.of_type.of_type == MyObject
def test_connection_override_fields():
class ConnectionWithNodes(Connection):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options):
_meta = ConnectionOptions(cls)
base_name = (
re.sub("Connection$", "", name or cls.__name__) or node._meta.name
)
edge_class = get_edge_class(cls, node, base_name)
_meta.fields = {
"page_info": Field(
NonNull(
PageInfo,
name="pageInfo",
required=True,
description="Pagination data for this connection.",
)
),
"edges": Field(
NonNull(List(NonNull(edge_class))),
description="Contains the nodes in this connection.",
),
}
return super(ConnectionWithNodes, cls).__init_subclass_with_meta__(
node=node, name=name, _meta=_meta, **options
)
class MyObjectConnection(ConnectionWithNodes):
class Meta:
node = MyObject
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields) == ["page_info", "edges"]
edge_field = fields["edges"]
pageinfo_field = fields["page_info"]
assert isinstance(edge_field, Field)
assert isinstance(edge_field.type, NonNull)
assert isinstance(edge_field.type.of_type, List)
assert isinstance(edge_field.type.of_type.of_type, NonNull)
assert edge_field.type.of_type.of_type.of_type.__name__ == "MyObjectEdge"
# This page info is NonNull
assert isinstance(pageinfo_field, Field)
assert isinstance(edge_field.type, NonNull)
assert pageinfo_field.type.of_type == PageInfo
def test_connection_name(): def test_connection_name():
custom_name = "MyObjectCustomNameConnection" custom_name = "MyObjectCustomNameConnection"
class BaseConnection: class BaseConnection(object):
extra = String() extra = String()
class MyObjectConnection(BaseConnection, Connection): class MyObjectConnection(BaseConnection, Connection):
@ -189,7 +76,7 @@ def test_edge():
Edge = MyObjectConnection.Edge Edge = MyObjectConnection.Edge
assert Edge._meta.name == "MyObjectEdge" assert Edge._meta.name == "MyObjectEdge"
edge_fields = Edge._meta.fields edge_fields = Edge._meta.fields
assert list(edge_fields) == ["node", "cursor", "other"] assert list(edge_fields.keys()) == ["node", "cursor", "other"]
assert isinstance(edge_fields["node"], Field) assert isinstance(edge_fields["node"], Field)
assert edge_fields["node"].type == MyObject assert edge_fields["node"].type == MyObject
@ -199,7 +86,7 @@ def test_edge():
def test_edge_with_bases(): def test_edge_with_bases():
class BaseEdge: class BaseEdge(object):
extra = String() extra = String()
class MyObjectConnection(Connection): class MyObjectConnection(Connection):
@ -212,7 +99,7 @@ def test_edge_with_bases():
Edge = MyObjectConnection.Edge Edge = MyObjectConnection.Edge
assert Edge._meta.name == "MyObjectEdge" assert Edge._meta.name == "MyObjectEdge"
edge_fields = Edge._meta.fields edge_fields = Edge._meta.fields
assert list(edge_fields) == ["node", "cursor", "extra", "other"] assert list(edge_fields.keys()) == ["node", "cursor", "extra", "other"]
assert isinstance(edge_fields["node"], Field) assert isinstance(edge_fields["node"], Field)
assert edge_fields["node"].type == MyObject assert edge_fields["node"].type == MyObject
@ -235,7 +122,7 @@ def test_edge_with_nonnull_node():
def test_pageinfo(): def test_pageinfo():
assert PageInfo._meta.name == "PageInfo" assert PageInfo._meta.name == "PageInfo"
fields = PageInfo._meta.fields fields = PageInfo._meta.fields
assert list(fields) == [ assert list(fields.keys()) == [
"has_next_page", "has_next_page",
"has_previous_page", "has_previous_page",
"start_cursor", "start_cursor",
@ -259,7 +146,7 @@ def test_connectionfield():
def test_connectionfield_node_deprecated(): def test_connectionfield_node_deprecated():
field = ConnectionField(MyObject) field = ConnectionField(MyObject)
with raises(Exception) as exc_info: with pytest.raises(Exception) as exc_info:
field.type field.type
assert "ConnectionFields now need a explicit ConnectionType for Nodes." in str( assert "ConnectionFields now need a explicit ConnectionType for Nodes." in str(
@ -299,20 +186,3 @@ def test_connectionfield_required():
executed = schema.execute("{ testConnection { edges { cursor } } }") executed = schema.execute("{ testConnection { edges { cursor } } }")
assert not executed.errors assert not executed.errors
assert executed.data == {"testConnection": {"edges": []}} assert executed.data == {"testConnection": {"edges": []}}
def test_connectionfield_strict_types():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
strict_types = True
connection_field = ConnectionField(MyObjectConnection)
edges_field_type = connection_field.type._meta.fields["edges"].type
assert isinstance(edges_field_type, NonNull)
edges_list_element_type = edges_field_type.of_type.of_type
assert isinstance(edges_list_element_type, NonNull)
node_field = edges_list_element_type.of_type._meta.fields["node"]
assert isinstance(node_field.type, NonNull)

View File

@ -1,6 +1,7 @@
from pytest import mark from collections import OrderedDict
from graphql_relay.utils import base64 from graphql_relay.utils import base64
from promise import Promise
from ...types import ObjectType, Schema, String from ...types import ObjectType, Schema, String
from ..connection import Connection, ConnectionField, PageInfo from ..connection import Connection, ConnectionField, PageInfo
@ -24,15 +25,15 @@ class LetterConnection(Connection):
class Query(ObjectType): class Query(ObjectType):
letters = ConnectionField(LetterConnection) letters = ConnectionField(LetterConnection)
connection_letters = ConnectionField(LetterConnection) connection_letters = ConnectionField(LetterConnection)
async_letters = ConnectionField(LetterConnection) promise_letters = ConnectionField(LetterConnection)
node = Node.Field() node = Node.Field()
def resolve_letters(self, info, **args): def resolve_letters(self, info, **args):
return list(letters.values()) return list(letters.values())
async def resolve_async_letters(self, info, **args): def resolve_promise_letters(self, info, **args):
return list(letters.values()) return Promise.resolve(list(letters.values()))
def resolve_connection_letters(self, info, **args): def resolve_connection_letters(self, info, **args):
return LetterConnection( return LetterConnection(
@ -45,16 +46,18 @@ class Query(ObjectType):
schema = Schema(Query) schema = Schema(Query)
letters = {letter: Letter(id=i, letter=letter) for i, letter in enumerate(letter_chars)} letters = OrderedDict()
for i, letter in enumerate(letter_chars):
letters[letter] = Letter(id=i, letter=letter)
def edges(selected_letters): def edges(selected_letters):
return [ return [
{ {
"node": {"id": base64("Letter:%s" % letter.id), "letter": letter.letter}, "node": {"id": base64("Letter:%s" % l.id), "letter": l.letter},
"cursor": base64("arrayconnection:%s" % letter.id), "cursor": base64("arrayconnection:%s" % l.id),
} }
for letter in [letters[i] for i in selected_letters] for l in [letters[i] for i in selected_letters]
] ]
@ -63,10 +66,11 @@ def cursor_for(ltr):
return base64("arrayconnection:%s" % letter.id) return base64("arrayconnection:%s" % letter.id)
async def execute(args=""): def execute(args=""):
if args: if args:
args = "(" + args + ")" args = "(" + args + ")"
return await schema.execute_async(
return schema.execute(
""" """
{ {
letters%s { letters%s {
@ -90,8 +94,8 @@ async def execute(args=""):
) )
async def check(args, letters, has_previous_page=False, has_next_page=False): def check(args, letters, has_previous_page=False, has_next_page=False):
result = await execute(args) result = execute(args)
expected_edges = edges(letters) expected_edges = edges(letters)
expected_page_info = { expected_page_info = {
"hasPreviousPage": has_previous_page, "hasPreviousPage": has_previous_page,
@ -106,126 +110,114 @@ async def check(args, letters, has_previous_page=False, has_next_page=False):
} }
@mark.asyncio def test_returns_all_elements_without_filters():
async def test_returns_all_elements_without_filters(): check("", "ABCDE")
await check("", "ABCDE")
@mark.asyncio def test_respects_a_smaller_first():
async def test_respects_a_smaller_first(): check("first: 2", "AB", has_next_page=True)
await check("first: 2", "AB", has_next_page=True)
@mark.asyncio def test_respects_an_overly_large_first():
async def test_respects_an_overly_large_first(): check("first: 10", "ABCDE")
await check("first: 10", "ABCDE")
@mark.asyncio def test_respects_a_smaller_last():
async def test_respects_a_smaller_last(): check("last: 2", "DE", has_previous_page=True)
await check("last: 2", "DE", has_previous_page=True)
@mark.asyncio def test_respects_an_overly_large_last():
async def test_respects_an_overly_large_last(): check("last: 10", "ABCDE")
await check("last: 10", "ABCDE")
@mark.asyncio def test_respects_first_and_after():
async def test_respects_first_and_after(): check('first: 2, after: "{}"'.format(cursor_for("B")), "CD", has_next_page=True)
await check(f'first: 2, after: "{cursor_for("B")}"', "CD", has_next_page=True)
@mark.asyncio def test_respects_first_and_after_with_long_first():
async def test_respects_first_and_after_with_long_first(): check('first: 10, after: "{}"'.format(cursor_for("B")), "CDE")
await check(f'first: 10, after: "{cursor_for("B")}"', "CDE")
@mark.asyncio def test_respects_last_and_before():
async def test_respects_last_and_before(): check('last: 2, before: "{}"'.format(cursor_for("D")), "BC", has_previous_page=True)
await check(f'last: 2, before: "{cursor_for("D")}"', "BC", has_previous_page=True)
@mark.asyncio def test_respects_last_and_before_with_long_last():
async def test_respects_last_and_before_with_long_last(): check('last: 10, before: "{}"'.format(cursor_for("D")), "ABC")
await check(f'last: 10, before: "{cursor_for("D")}"', "ABC")
@mark.asyncio def test_respects_first_and_after_and_before_too_few():
async def test_respects_first_and_after_and_before_too_few(): check(
await check( 'first: 2, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'first: 2, after: "{cursor_for("A")}", before: "{cursor_for("E")}"',
"BC", "BC",
has_next_page=True, has_next_page=True,
) )
@mark.asyncio def test_respects_first_and_after_and_before_too_many():
async def test_respects_first_and_after_and_before_too_many(): check(
await check( 'first: 4, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'first: 4, after: "{cursor_for("A")}", before: "{cursor_for("E")}"', "BCD" "BCD",
) )
@mark.asyncio def test_respects_first_and_after_and_before_exactly_right():
async def test_respects_first_and_after_and_before_exactly_right(): check(
await check( 'first: 3, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'first: 3, after: "{cursor_for("A")}", before: "{cursor_for("E")}"', "BCD" "BCD",
) )
@mark.asyncio def test_respects_last_and_after_and_before_too_few():
async def test_respects_last_and_after_and_before_too_few(): check(
await check( 'last: 2, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'last: 2, after: "{cursor_for("A")}", before: "{cursor_for("E")}"',
"CD", "CD",
has_previous_page=True, has_previous_page=True,
) )
@mark.asyncio def test_respects_last_and_after_and_before_too_many():
async def test_respects_last_and_after_and_before_too_many(): check(
await check( 'last: 4, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'last: 4, after: "{cursor_for("A")}", before: "{cursor_for("E")}"', "BCD" "BCD",
) )
@mark.asyncio def test_respects_last_and_after_and_before_exactly_right():
async def test_respects_last_and_after_and_before_exactly_right(): check(
await check( 'last: 3, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
f'last: 3, after: "{cursor_for("A")}", before: "{cursor_for("E")}"', "BCD" "BCD",
) )
@mark.asyncio def test_returns_no_elements_if_first_is_0():
async def test_returns_no_elements_if_first_is_0(): check("first: 0", "", has_next_page=True)
await check("first: 0", "", has_next_page=True)
@mark.asyncio def test_returns_all_elements_if_cursors_are_invalid():
async def test_returns_all_elements_if_cursors_are_invalid(): check('before: "invalid" after: "invalid"', "ABCDE")
await check('before: "invalid" after: "invalid"', "ABCDE")
@mark.asyncio def test_returns_all_elements_if_cursors_are_on_the_outside():
async def test_returns_all_elements_if_cursors_are_on_the_outside(): check(
await check( 'before: "{}" after: "{}"'.format(
f'before: "{base64("arrayconnection:%s" % 6)}" after: "{base64("arrayconnection:%s" % -1)}"', base64("arrayconnection:%s" % 6), base64("arrayconnection:%s" % -1)
),
"ABCDE", "ABCDE",
) )
@mark.asyncio def test_returns_no_elements_if_cursors_cross():
async def test_returns_no_elements_if_cursors_cross(): check(
await check( 'before: "{}" after: "{}"'.format(
f'before: "{base64("arrayconnection:%s" % 2)}" after: "{base64("arrayconnection:%s" % 4)}"', base64("arrayconnection:%s" % 2), base64("arrayconnection:%s" % 4)
),
"", "",
) )
@mark.asyncio def test_connection_type_nodes():
async def test_connection_type_nodes(): result = schema.execute(
result = await schema.execute_async(
""" """
{ {
connectionLetters { connectionLetters {
@ -256,12 +248,11 @@ async def test_connection_type_nodes():
} }
@mark.asyncio def test_connection_promise():
async def test_connection_async(): result = schema.execute(
result = await schema.execute_async(
""" """
{ {
asyncLetters(first:1) { promiseLetters(first:1) {
edges { edges {
node { node {
id id
@ -279,7 +270,7 @@ async def test_connection_async():
assert not result.errors assert not result.errors
assert result.data == { assert result.data == {
"asyncLetters": { "promiseLetters": {
"edges": [{"node": {"id": "TGV0dGVyOjA=", "letter": "A"}}], "edges": [{"node": {"id": "TGV0dGVyOjA=", "letter": "A"}}],
"pageInfo": {"hasPreviousPage": False, "hasNextPage": True}, "pageInfo": {"hasPreviousPage": False, "hasNextPage": True},
} }

View File

@ -1,325 +0,0 @@
import re
from uuid import uuid4
from graphql import graphql_sync
from ..id_type import BaseGlobalIDType, SimpleGlobalIDType, UUIDGlobalIDType
from ..node import Node
from ...types import Int, ObjectType, Schema, String
class TestUUIDGlobalID:
def setup_method(self):
self.user_list = [
{"id": uuid4(), "name": "First"},
{"id": uuid4(), "name": "Second"},
{"id": uuid4(), "name": "Third"},
{"id": uuid4(), "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomNode(Node):
class Meta:
global_id_type = UUIDGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
Check that the schema exposes the custom node interface and the user type, and that both use UUIDs
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: UUID!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: UUID!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query userById($id: UUID!) {
user(id: $id) {
id
name
}
}"""
# UUID need to be converted to string for serialization
result = graphql_sync(
self.graphql_schema,
query,
variable_values={"id": str(self.user_list[0]["id"])},
)
assert not result.errors
assert result.data["user"]["id"] == str(self.user_list[0]["id"])
assert result.data["user"]["name"] == self.user_list[0]["name"]
class TestSimpleGlobalID:
def setup_method(self):
self.user_list = [
{"id": "my global primary key in clear 1", "name": "First"},
{"id": "my global primary key in clear 2", "name": "Second"},
{"id": "my global primary key in clear 3", "name": "Third"},
{"id": "my global primary key in clear 4", "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomNode(Node):
class Meta:
global_id_type = SimpleGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
Check that the schema exposes the custom node interface and the user type, and that both use plain string IDs
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: ID!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: ID!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query {
user(id: "my global primary key in clear 3") {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["id"] == self.user_list[2]["id"]
assert result.data["user"]["name"] == self.user_list[2]["name"]
class TestCustomGlobalID:
def setup_method(self):
self.user_list = [
{"id": 1, "name": "First"},
{"id": 2, "name": "Second"},
{"id": 3, "name": "Third"},
{"id": 4, "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomGlobalIDType(BaseGlobalIDType):
"""
Global ID that is simply an integer in the clear.
"""
graphene_type = Int
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
Check that the schema exposes the custom node interface and the user type, and that both use Int IDs
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: Int!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: Int!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["id"] == self.user_list[1]["id"]
assert result.data["user"]["name"] == self.user_list[1]["name"]
class TestIncompleteCustomGlobalID:
def setup_method(self):
self.user_list = [
{"id": 1, "name": "First"},
{"id": 2, "name": "Second"},
{"id": 3, "name": "Third"},
{"id": 4, "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
def test_must_define_to_global_id(self):
"""
Test that if the `to_global_id` method is not defined, we can query the object, but we can't request its ID.
"""
class CustomGlobalIDType(BaseGlobalIDType):
graphene_type = Int
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
query = """query {
user(id: 2) {
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["name"] == self.user_list[1]["name"]
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert result.errors is not None
assert len(result.errors) == 1
assert result.errors[0].path == ["user", "id"]
def test_must_define_resolve_global_id(self):
"""
Test that if the `resolve_global_id` method is not defined, we can't query the object by ID.
"""
class CustomGlobalIDType(BaseGlobalIDType):
graphene_type = Int
@classmethod
def to_global_id(cls, _type, _id):
return _id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert result.errors is not None
assert len(result.errors) == 1
assert result.errors[0].path == ["user"]

View File

@ -17,7 +17,7 @@ class User(ObjectType):
name = String() name = String()
class Info: class Info(object):
def __init__(self, parent_type): def __init__(self, parent_type):
self.parent_type = GrapheneObjectType( self.parent_type = GrapheneObjectType(
graphene_type=parent_type, graphene_type=parent_type,
@ -45,7 +45,7 @@ def test_global_id_allows_overriding_of_node_and_required():
def test_global_id_defaults_to_info_parent_type(): def test_global_id_defaults_to_info_parent_type():
my_id = "1" my_id = "1"
gid = GlobalID() gid = GlobalID()
id_resolver = gid.wrap_resolve(lambda *_: my_id) id_resolver = gid.get_resolver(lambda *_: my_id)
my_global_id = id_resolver(None, Info(User)) my_global_id = id_resolver(None, Info(User))
assert my_global_id == to_global_id(User._meta.name, my_id) assert my_global_id == to_global_id(User._meta.name, my_id)
@ -53,6 +53,6 @@ def test_global_id_defaults_to_info_parent_type():
def test_global_id_allows_setting_customer_parent_type(): def test_global_id_allows_setting_customer_parent_type():
my_id = "1" my_id = "1"
gid = GlobalID(parent_type=User) gid = GlobalID(parent_type=User)
id_resolver = gid.wrap_resolve(lambda *_: my_id) id_resolver = gid.get_resolver(lambda *_: my_id)
my_global_id = id_resolver(None, None) my_global_id = id_resolver(None, None)
assert my_global_id == to_global_id(User._meta.name, my_id) assert my_global_id == to_global_id(User._meta.name, my_id)

View File

@ -1,4 +1,5 @@
from pytest import mark, raises import pytest
from promise import Promise
from ...types import ( from ...types import (
ID, ID,
@ -14,7 +15,7 @@ from ...types.scalars import String
from ..mutation import ClientIDMutation from ..mutation import ClientIDMutation
class SharedFields: class SharedFields(object):
shared = String() shared = String()
@ -36,7 +37,7 @@ class SaySomething(ClientIDMutation):
return SaySomething(phrase=str(what)) return SaySomething(phrase=str(what))
class FixedSaySomething: class FixedSaySomething(object):
__slots__ = ("phrase",) __slots__ = ("phrase",)
def __init__(self, phrase): def __init__(self, phrase):
@ -54,15 +55,15 @@ class SaySomethingFixed(ClientIDMutation):
return FixedSaySomething(phrase=str(what)) return FixedSaySomething(phrase=str(what))
class SaySomethingAsync(ClientIDMutation): class SaySomethingPromise(ClientIDMutation):
class Input: class Input:
what = String() what = String()
phrase = String() phrase = String()
@staticmethod @staticmethod
async def mutate_and_get_payload(self, info, what, client_mutation_id=None): def mutate_and_get_payload(self, info, what, client_mutation_id=None):
return SaySomething(phrase=str(what)) return Promise.resolve(SaySomething(phrase=str(what)))
# MyEdge = MyNode.Connection.Edge # MyEdge = MyNode.Connection.Edge
@ -96,7 +97,7 @@ class RootQuery(ObjectType):
class Mutation(ObjectType): class Mutation(ObjectType):
say = SaySomething.Field() say = SaySomething.Field()
say_fixed = SaySomethingFixed.Field() say_fixed = SaySomethingFixed.Field()
say_async = SaySomethingAsync.Field() say_promise = SaySomethingPromise.Field()
other = OtherMutation.Field() other = OtherMutation.Field()
@ -104,7 +105,7 @@ schema = Schema(query=RootQuery, mutation=Mutation)
def test_no_mutate_and_get_payload(): def test_no_mutate_and_get_payload():
with raises(AssertionError) as excinfo: with pytest.raises(AssertionError) as excinfo:
class MyMutation(ClientIDMutation): class MyMutation(ClientIDMutation):
pass pass
@ -117,12 +118,12 @@ def test_no_mutate_and_get_payload():
def test_mutation(): def test_mutation():
fields = SaySomething._meta.fields fields = SaySomething._meta.fields
assert list(fields) == ["phrase", "client_mutation_id"] assert list(fields.keys()) == ["phrase", "client_mutation_id"]
assert SaySomething._meta.name == "SaySomethingPayload" assert SaySomething._meta.name == "SaySomethingPayload"
assert isinstance(fields["phrase"], Field) assert isinstance(fields["phrase"], Field)
field = SaySomething.Field() field = SaySomething.Field()
assert field.type == SaySomething assert field.type == SaySomething
assert list(field.args) == ["input"] assert list(field.args.keys()) == ["input"]
assert isinstance(field.args["input"], Argument) assert isinstance(field.args["input"], Argument)
assert isinstance(field.args["input"].type, NonNull) assert isinstance(field.args["input"].type, NonNull)
assert field.args["input"].type.of_type == SaySomething.Input assert field.args["input"].type.of_type == SaySomething.Input
@ -135,7 +136,7 @@ def test_mutation_input():
Input = SaySomething.Input Input = SaySomething.Input
assert issubclass(Input, InputObjectType) assert issubclass(Input, InputObjectType)
fields = Input._meta.fields fields = Input._meta.fields
assert list(fields) == ["what", "client_mutation_id"] assert list(fields.keys()) == ["what", "client_mutation_id"]
assert isinstance(fields["what"], InputField) assert isinstance(fields["what"], InputField)
assert fields["what"].type == String assert fields["what"].type == String
assert isinstance(fields["client_mutation_id"], InputField) assert isinstance(fields["client_mutation_id"], InputField)
@ -144,11 +145,11 @@ def test_mutation_input():
def test_subclassed_mutation(): def test_subclassed_mutation():
fields = OtherMutation._meta.fields fields = OtherMutation._meta.fields
assert list(fields) == ["name", "my_node_edge", "client_mutation_id"] assert list(fields.keys()) == ["name", "my_node_edge", "client_mutation_id"]
assert isinstance(fields["name"], Field) assert isinstance(fields["name"], Field)
field = OtherMutation.Field() field = OtherMutation.Field()
assert field.type == OtherMutation assert field.type == OtherMutation
assert list(field.args) == ["input"] assert list(field.args.keys()) == ["input"]
assert isinstance(field.args["input"], Argument) assert isinstance(field.args["input"], Argument)
assert isinstance(field.args["input"].type, NonNull) assert isinstance(field.args["input"].type, NonNull)
assert field.args["input"].type.of_type == OtherMutation.Input assert field.args["input"].type.of_type == OtherMutation.Input
@ -158,7 +159,7 @@ def test_subclassed_mutation_input():
Input = OtherMutation.Input Input = OtherMutation.Input
assert issubclass(Input, InputObjectType) assert issubclass(Input, InputObjectType)
fields = Input._meta.fields fields = Input._meta.fields
assert list(fields) == ["shared", "additional_field", "client_mutation_id"] assert list(fields.keys()) == ["shared", "additional_field", "client_mutation_id"]
assert isinstance(fields["shared"], InputField) assert isinstance(fields["shared"], InputField)
assert fields["shared"].type == String assert fields["shared"].type == String
assert isinstance(fields["additional_field"], InputField) assert isinstance(fields["additional_field"], InputField)
@ -184,13 +185,12 @@ def test_node_query_fixed():
) )
@mark.asyncio def test_node_query_promise():
async def test_node_query_async(): executed = schema.execute(
executed = await schema.execute_async( 'mutation a { sayPromise(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
'mutation a { sayAsync(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
) )
assert not executed.errors assert not executed.errors
assert executed.data == {"sayAsync": {"phrase": "hello"}} assert executed.data == {"sayPromise": {"phrase": "hello"}}
def test_edge_query(): def test_edge_query():

View File

@ -1,5 +1,4 @@
import re from collections import OrderedDict
from textwrap import dedent
from graphql_relay import to_global_id from graphql_relay import to_global_id
@ -7,7 +6,8 @@ from ...types import ObjectType, Schema, String
from ..node import Node, is_node from ..node import Node, is_node
class SharedNodeFields: class SharedNodeFields(object):
shared = String() shared = String()
something_else = String() something_else = String()
@ -54,7 +54,6 @@ def test_node_good():
assert "id" in MyNode._meta.fields assert "id" in MyNode._meta.fields
assert is_node(MyNode) assert is_node(MyNode)
assert not is_node(object) assert not is_node(object)
assert not is_node("node")
def test_node_query(): def test_node_query():
@ -71,33 +70,23 @@ def test_subclassed_node_query():
% to_global_id("MyOtherNode", 1) % to_global_id("MyOtherNode", 1)
) )
assert not executed.errors assert not executed.errors
assert executed.data == { assert executed.data == OrderedDict(
"node": { {
"shared": "1", "node": OrderedDict(
"extraField": "extra field info.", [
"somethingElse": "----", ("shared", "1"),
("extraField", "extra field info."),
("somethingElse", "----"),
]
)
} }
} )
def test_node_requesting_non_node(): def test_node_requesting_non_node():
executed = schema.execute( executed = schema.execute(
'{ node(id:"%s") { __typename } } ' % Node.to_global_id("RootQuery", 1) '{ node(id:"%s") { __typename } } ' % Node.to_global_id("RootQuery", 1)
) )
assert executed.errors
assert re.match(
r"ObjectType .* does not implement the .* interface.",
executed.errors[0].message,
)
assert executed.data == {"node": None}
def test_node_requesting_unknown_type():
executed = schema.execute(
'{ node(id:"%s") { __typename } } ' % Node.to_global_id("UnknownType", 1)
)
assert executed.errors
assert re.match(r"Relay Node .* not found in schema", executed.errors[0].message)
assert executed.data == {"node": None} assert executed.data == {"node": None}
@ -105,8 +94,7 @@ def test_node_query_incorrect_id():
executed = schema.execute( executed = schema.execute(
'{ node(id:"%s") { ... on MyNode { name } } }' % "something:2" '{ node(id:"%s") { ... on MyNode { name } } }' % "something:2"
) )
assert executed.errors assert not executed.errors
assert re.match(r"Unable to parse global ID .*", executed.errors[0].message)
assert executed.data == {"node": None} assert executed.data == {"node": None}
@ -122,17 +110,6 @@ def test_node_field_custom():
assert node_field.node_type == Node assert node_field.node_type == Node
def test_node_field_args():
field_args = {
"name": "my_custom_name",
"description": "my_custom_description",
"deprecation_reason": "my_custom_deprecation_reason",
}
node_field = Node.Field(**field_args)
for field_arg, value in field_args.items():
assert getattr(node_field, field_arg) == value
def test_node_field_only_type(): def test_node_field_only_type():
executed = schema.execute( executed = schema.execute(
'{ onlyNode(id:"%s") { __typename, name } } ' % Node.to_global_id("MyNode", 1) '{ onlyNode(id:"%s") { __typename, name } } ' % Node.to_global_id("MyNode", 1)
@ -147,7 +124,7 @@ def test_node_field_only_type_wrong():
% Node.to_global_id("MyOtherNode", 1) % Node.to_global_id("MyOtherNode", 1)
) )
assert len(executed.errors) == 1 assert len(executed.errors) == 1
assert str(executed.errors[0]).startswith("Must receive a MyNode id.") assert str(executed.errors[0]) == "Must receive a MyNode id."
assert executed.data == {"onlyNode": None} assert executed.data == {"onlyNode": None}
@ -166,54 +143,39 @@ def test_node_field_only_lazy_type_wrong():
% Node.to_global_id("MyOtherNode", 1) % Node.to_global_id("MyOtherNode", 1)
) )
assert len(executed.errors) == 1 assert len(executed.errors) == 1
assert str(executed.errors[0]).startswith("Must receive a MyNode id.") assert str(executed.errors[0]) == "Must receive a MyNode id."
assert executed.data == {"onlyNodeLazy": None} assert executed.data == {"onlyNodeLazy": None}
def test_str_schema(): def test_str_schema():
assert ( assert (
str(schema).strip() str(schema)
== dedent( == """
''' schema {
schema { query: RootQuery
query: RootQuery }
}
type MyNode implements Node { type MyNode implements Node {
"""The ID of the object""" id: ID!
id: ID! name: String
name: String }
}
"""An object with an ID""" type MyOtherNode implements Node {
interface Node { id: ID!
"""The ID of the object""" shared: String
id: ID! somethingElse: String
} extraField: String
}
type MyOtherNode implements Node { interface Node {
"""The ID of the object""" id: ID!
id: ID! }
shared: String
somethingElse: String
extraField: String
}
type RootQuery { type RootQuery {
first: String first: String
node( node(id: ID!): Node
"""The ID of the object""" onlyNode(id: ID!): MyNode
id: ID! onlyNodeLazy(id: ID!): MyNode
): Node }
onlyNode( """.lstrip()
"""The ID of the object"""
id: ID!
): MyNode
onlyNodeLazy(
"""The ID of the object"""
id: ID!
): MyNode
}
'''
).strip()
) )

View File

@ -1,6 +1,4 @@
from textwrap import dedent from graphql import graphql
from graphql import graphql_sync
from ...types import Interface, ObjectType, Schema from ...types import Interface, ObjectType, Schema
from ...types.scalars import Int, String from ...types.scalars import Int, String
@ -12,12 +10,12 @@ class CustomNode(Node):
name = "Node" name = "Node"
@staticmethod @staticmethod
def to_global_id(type_, id): def to_global_id(type, id):
return id return id
@staticmethod @staticmethod
def get_node_from_global_id(info, id, only_type=None): def get_node_from_global_id(info, id, only_type=None):
assert info.schema is graphql_schema assert info.schema == schema
if id in user_data: if id in user_data:
return user_data.get(id) return user_data.get(id)
else: else:
@ -25,14 +23,14 @@ class CustomNode(Node):
class BasePhoto(Interface): class BasePhoto(Interface):
width = Int(description="The width of the photo in pixels") width = Int()
class User(ObjectType): class User(ObjectType):
class Meta: class Meta:
interfaces = [CustomNode] interfaces = [CustomNode]
name = String(description="The full name of the user") name = String()
class Photo(ObjectType): class Photo(ObjectType):
@ -50,52 +48,37 @@ class RootQuery(ObjectType):
schema = Schema(query=RootQuery, types=[User, Photo]) schema = Schema(query=RootQuery, types=[User, Photo])
graphql_schema = schema.graphql_schema
def test_str_schema_correct(): def test_str_schema_correct():
assert ( assert (
str(schema).strip() str(schema)
== dedent( == """schema {
''' query: RootQuery
schema { }
query: RootQuery
}
type User implements Node { interface BasePhoto {
"""The ID of the object""" width: Int
id: ID! }
"""The full name of the user""" interface Node {
name: String id: ID!
} }
interface Node { type Photo implements Node, BasePhoto {
"""The ID of the object""" id: ID!
id: ID! width: Int
} }
type Photo implements Node & BasePhoto { type RootQuery {
"""The ID of the object""" node(id: ID!): Node
id: ID! }
"""The width of the photo in pixels""" type User implements Node {
width: Int id: ID!
} name: String
}
interface BasePhoto { """
"""The width of the photo in pixels"""
width: Int
}
type RootQuery {
node(
"""The ID of the object"""
id: ID!
): Node
}
'''
).strip()
) )
@ -108,7 +91,7 @@ def test_gets_the_correct_id_for_users():
} }
""" """
expected = {"node": {"id": "1"}} expected = {"node": {"id": "1"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -122,7 +105,7 @@ def test_gets_the_correct_id_for_photos():
} }
""" """
expected = {"node": {"id": "4"}} expected = {"node": {"id": "4"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -139,7 +122,7 @@ def test_gets_the_correct_name_for_users():
} }
""" """
expected = {"node": {"id": "1", "name": "John Doe"}} expected = {"node": {"id": "1", "name": "John Doe"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -156,7 +139,7 @@ def test_gets_the_correct_width_for_photos():
} }
""" """
expected = {"node": {"id": "4", "width": 400}} expected = {"node": {"id": "4", "width": 400}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -171,7 +154,7 @@ def test_gets_the_correct_typename_for_users():
} }
""" """
expected = {"node": {"id": "1", "__typename": "User"}} expected = {"node": {"id": "1", "__typename": "User"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -186,7 +169,7 @@ def test_gets_the_correct_typename_for_photos():
} }
""" """
expected = {"node": {"id": "4", "__typename": "Photo"}} expected = {"node": {"id": "4", "__typename": "Photo"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -203,7 +186,7 @@ def test_ignores_photo_fragments_on_user():
} }
""" """
expected = {"node": {"id": "1"}} expected = {"node": {"id": "1"}}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -217,7 +200,7 @@ def test_returns_null_for_bad_ids():
} }
""" """
expected = {"node": None} expected = {"node": None}
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -256,7 +239,7 @@ def test_have_correct_node_interface():
], ],
} }
} }
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected
@ -308,6 +291,6 @@ def test_has_correct_node_root_field():
} }
} }
} }
result = graphql_sync(graphql_schema, query) result = graphql(schema, query)
assert not result.errors assert not result.errors
assert result.data == expected assert result.data == expected

View File

@ -1,3 +1,6 @@
from promise import Promise, is_thenable
import six
from graphql.error import format_error as format_graphql_error
from graphql.error import GraphQLError from graphql.error import GraphQLError
from graphene.types.schema import Schema from graphene.types.schema import Schema
@ -5,20 +8,25 @@ from graphene.types.schema import Schema
def default_format_error(error): def default_format_error(error):
if isinstance(error, GraphQLError): if isinstance(error, GraphQLError):
return error.formatted return format_graphql_error(error)
return {"message": str(error)}
return {"message": six.text_type(error)}
def format_execution_result(execution_result, format_error): def format_execution_result(execution_result, format_error):
if execution_result: if execution_result:
response = {} response = {}
if execution_result.errors: if execution_result.errors:
response["errors"] = [format_error(e) for e in execution_result.errors] response["errors"] = [format_error(e) for e in execution_result.errors]
response["data"] = execution_result.data
if not execution_result.invalid:
response["data"] = execution_result.data
return response return response
class Client: class Client(object):
def __init__(self, schema, format_error=None, **execute_options): def __init__(self, schema, format_error=None, **execute_options):
assert isinstance(schema, Schema) assert isinstance(schema, Schema)
self.schema = schema self.schema = schema
@ -30,10 +38,7 @@ class Client:
def execute(self, *args, **kwargs): def execute(self, *args, **kwargs):
executed = self.schema.execute(*args, **dict(self.execute_options, **kwargs)) executed = self.schema.execute(*args, **dict(self.execute_options, **kwargs))
return self.format_result(executed) if is_thenable(executed):
return Promise.resolve(executed).then(self.format_result)
async def execute_async(self, *args, **kwargs):
executed = await self.schema.execute_async(
*args, **dict(self.execute_options, **kwargs)
)
return self.format_result(executed) return self.format_result(executed)
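
Based on the `Client` methods shown above on the master side, a short usage sketch; the schema and field are invented for illustration.

from graphene import ObjectType, Schema, String
from graphene.test import Client


class Query(ObjectType):
    hello = String()

    def resolve_hello(root, info):
        return "world"


client = Client(Schema(query=Query))

# execute() returns the formatted dict; "errors" is added only when present.
assert client.execute("{ hello }") == {"data": {"hello": "world"}}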

View File

@ -1,41 +0,0 @@
# https://github.com/graphql-python/graphene/issues/1293
from datetime import datetime, timezone
import graphene
from graphql.utilities import print_schema
class Filters(graphene.InputObjectType):
datetime_after = graphene.DateTime(
required=False,
default_value=datetime.fromtimestamp(1434549820.776, timezone.utc),
)
datetime_before = graphene.DateTime(
required=False,
default_value=datetime.fromtimestamp(1444549820.776, timezone.utc),
)
class SetDatetime(graphene.Mutation):
class Arguments:
filters = Filters(required=True)
ok = graphene.Boolean()
def mutate(root, info, filters):
return SetDatetime(ok=True)
class Query(graphene.ObjectType):
goodbye = graphene.String()
class Mutations(graphene.ObjectType):
set_datetime = SetDatetime.Field()
def test_schema_printable_with_default_datetime_value():
schema = graphene.Schema(query=Query, mutation=Mutations)
schema_str = print_schema(schema.graphql_schema)
assert schema_str, "empty schema printed"

View File

@ -1,36 +0,0 @@
from ...types import ObjectType, Schema, String, NonNull
class Query(ObjectType):
hello = String(input=NonNull(String))
def resolve_hello(self, info, input):
if input == "nothing":
return None
return f"Hello {input}!"
schema = Schema(query=Query)
def test_required_input_provided():
"""
Test that a required argument works when provided.
"""
input_value = "Potato"
result = schema.execute('{ hello(input: "%s") }' % input_value)
assert not result.errors
assert result.data == {"hello": "Hello Potato!"}
def test_required_input_missing():
"""
Test that a required argument raises an error if not provided.
"""
result = schema.execute("{ hello }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== "Field 'hello' argument 'input' of type 'String!' is required, but it was not provided."
)

View File

@ -1,53 +0,0 @@
import pytest
from ...types.base64 import Base64
from ...types.datetime import Date, DateTime
from ...types.decimal import Decimal
from ...types.generic import GenericScalar
from ...types.json import JSONString
from ...types.objecttype import ObjectType
from ...types.scalars import ID, BigInt, Boolean, Float, Int, String
from ...types.schema import Schema
from ...types.uuid import UUID
@pytest.mark.parametrize(
"input_type,input_value",
[
(Date, '"2022-02-02"'),
(GenericScalar, '"foo"'),
(Int, "1"),
(BigInt, "12345678901234567890"),
(Float, "1.1"),
(String, '"foo"'),
(Boolean, "true"),
(ID, "1"),
(DateTime, '"2022-02-02T11:11:11"'),
(UUID, '"cbebbc62-758e-4f75-a890-bc73b5017d81"'),
(Decimal, '"1.1"'),
(JSONString, '"{\\"key\\":\\"foo\\",\\"value\\":\\"bar\\"}"'),
(Base64, '"Q2hlbG8gd29ycmxkCg=="'),
],
)
def test_parse_literal_with_variables(input_type, input_value):
# input_b needs to be evaluated as literal while the variable dict for
# input_a is passed along.
class Query(ObjectType):
generic = GenericScalar(input_a=GenericScalar(), input_b=input_type())
def resolve_generic(self, info, input_a=None, input_b=None):
return input
schema = Schema(query=Query)
query = f"""
query Test($a: GenericScalar){{
generic(inputA: $a, inputB: {input_value})
}}
"""
result = schema.execute(
query,
variables={"a": "bar"},
)
assert not result.errors

View File

@ -21,7 +21,7 @@ class CreatePostResult(graphene.Union):
class CreatePost(graphene.Mutation): class CreatePost(graphene.Mutation):
class Arguments: class Input:
text = graphene.String(required=True) text = graphene.String(required=True)
result = graphene.Field(CreatePostResult) result = graphene.Field(CreatePostResult)

View File

@ -1,6 +1,6 @@
# https://github.com/graphql-python/graphene/issues/356 # https://github.com/graphql-python/graphene/issues/356
from pytest import raises import pytest
import graphene import graphene
from graphene import relay from graphene import relay
@ -23,11 +23,10 @@ def test_issue():
class Query(graphene.ObjectType): class Query(graphene.ObjectType):
things = relay.ConnectionField(MyUnion) things = relay.ConnectionField(MyUnion)
with raises(Exception) as exc_info: with pytest.raises(Exception) as exc_info:
graphene.Schema(query=Query) graphene.Schema(query=Query)
assert str(exc_info.value) == ( assert str(exc_info.value) == (
"Query fields cannot be resolved." "IterableConnectionField type have to be a subclass of Connection. "
" IterableConnectionField type has to be a subclass of Connection." 'Received "MyUnion".'
' Received "MyUnion".'
) )

View File

@ -1,27 +0,0 @@
import pickle
from ...types.enum import Enum
class PickleEnum(Enum):
# defined outside of the test because pickle is unable to dump a class defined inside a pytest function
A = "a"
B = 1
def test_enums_pickling():
a = PickleEnum.A
pickled = pickle.dumps(a)
restored = pickle.loads(pickled)
assert type(a) is type(restored)
assert a == restored
assert a.value == restored.value
assert a.name == restored.name
b = PickleEnum.B
pickled = pickle.dumps(b)
restored = pickle.loads(pickled)
assert type(a) is type(restored)
assert b == restored
assert b.value == restored.value
assert b.name == restored.name

View File

@ -1,53 +1,57 @@
from graphql import GraphQLResolveInfo as ResolveInfo # flake8: noqa
from graphql import ResolveInfo
from .argument import Argument from .objecttype import ObjectType
from .base64 import Base64 from .interface import Interface
from .context import Context from .mutation import Mutation
from .scalars import Scalar, String, ID, Int, Float, Boolean
from .datetime import Date, DateTime, Time from .datetime import Date, DateTime, Time
from .decimal import Decimal from .decimal import Decimal
from .dynamic import Dynamic from .json import JSONString
from .uuid import UUID
from .schema import Schema
from .structures import List, NonNull
from .enum import Enum from .enum import Enum
from .field import Field from .field import Field
from .inputfield import InputField from .inputfield import InputField
from .argument import Argument
from .inputobjecttype import InputObjectType from .inputobjecttype import InputObjectType
from .interface import Interface from .dynamic import Dynamic
from .json import JSONString
from .mutation import Mutation
from .objecttype import ObjectType
from .scalars import ID, BigInt, Boolean, Float, Int, Scalar, String
from .schema import Schema
from .structures import List, NonNull
from .union import Union from .union import Union
from .uuid import UUID from .context import Context
# Deprecated
from .abstracttype import AbstractType
__all__ = [ __all__ = [
"Argument", "ObjectType",
"Base64", "InputObjectType",
"BigInt", "Interface",
"Boolean", "Mutation",
"Context",
"Date",
"DateTime",
"Decimal",
"Dynamic",
"Enum", "Enum",
"Field", "Field",
"Float",
"ID",
"InputField", "InputField",
"InputObjectType",
"Int",
"Interface",
"JSONString",
"List",
"Mutation",
"NonNull",
"ObjectType",
"ResolveInfo",
"Scalar",
"Schema", "Schema",
"Scalar",
"String", "String",
"ID",
"Int",
"Float",
"Date",
"DateTime",
"Time", "Time",
"Decimal",
"JSONString",
"UUID", "UUID",
"Boolean",
"List",
"NonNull",
"Argument",
"Dynamic",
"Union", "Union",
"Context",
"ResolveInfo",
# Deprecated
"AbstractType",
] ]

View File

@ -0,0 +1,11 @@
from ..utils.deprecated import warn_deprecation
from ..utils.subclass_with_meta import SubclassWithMeta
class AbstractType(SubclassWithMeta):
def __init_subclass__(cls, *args, **kwargs):
warn_deprecation(
"Abstract type is deprecated, please use normal object inheritance instead.\n"
"See more: https://github.com/graphql-python/graphene/blob/master/UPGRADE-v2.0.md#deprecations"
)
super(AbstractType, cls).__init_subclass__(*args, **kwargs)

View File

@ -1,5 +1,5 @@
from collections import OrderedDict
from itertools import chain from itertools import chain
from graphql import Undefined
from .dynamic import Dynamic from .dynamic import Dynamic
from .mountedtype import MountedType from .mountedtype import MountedType
@ -31,22 +31,18 @@ class Argument(MountedType):
type (class for a graphene.UnmountedType): must be a class (not an instance) of an type (class for a graphene.UnmountedType): must be a class (not an instance) of an
unmounted graphene type (ex. scalar or object) which is used for the type of this unmounted graphene type (ex. scalar or object) which is used for the type of this
argument in the GraphQL schema. argument in the GraphQL schema.
required (optional, bool): indicates this argument as not null in the graphql schema. Same behavior required (bool): indicates this argument as not null in the graphql scehma. Same behavior
as graphene.NonNull. Default False. as graphene.NonNull. Default False.
name (optional, str): the name of the GraphQL argument. Defaults to parameter name. name (str): the name of the GraphQL argument. Defaults to parameter name.
description (optional, str): the description of the GraphQL argument in the schema. description (str): the description of the GraphQL argument in the schema.
default_value (optional, Any): The value to be provided if the user does not set this argument in default_value (Any): The value to be provided if the user does not set this argument in
the operation. the operation.
deprecation_reason (optional, str): Setting this value indicates that the argument is
deprecated and may provide instructions or a reason on how clients should proceed. Cannot be
set if the argument is required (see spec).
""" """
def __init__( def __init__(
self, self,
type_, type,
default_value=Undefined, default_value=None,
deprecation_reason=None,
description=None, description=None,
name=None, name=None,
required=False, required=False,
@ -55,16 +51,12 @@ class Argument(MountedType):
super(Argument, self).__init__(_creation_counter=_creation_counter) super(Argument, self).__init__(_creation_counter=_creation_counter)
if required: if required:
assert ( type = NonNull(type)
deprecation_reason is None
), f"Argument {name} is required, cannot deprecate it."
type_ = NonNull(type_)
self.name = name self.name = name
self._type = type_ self._type = type
self.default_value = default_value self.default_value = default_value
self.description = description self.description = description
self.deprecation_reason = deprecation_reason
@property @property
def type(self): def type(self):
@ -76,7 +68,6 @@ class Argument(MountedType):
and self.type == other.type and self.type == other.type
and self.default_value == other.default_value and self.default_value == other.default_value
and self.description == other.description and self.description == other.description
and self.deprecation_reason == other.deprecation_reason
) )
@ -90,7 +81,7 @@ def to_arguments(args, extra_args=None):
else: else:
extra_args = [] extra_args = []
iter_arguments = chain(args.items(), extra_args) iter_arguments = chain(args.items(), extra_args)
arguments = {} arguments = OrderedDict()
for default_name, arg in iter_arguments: for default_name, arg in iter_arguments:
if isinstance(arg, Dynamic): if isinstance(arg, Dynamic):
arg = arg.get_type() arg = arg.get_type()
@ -104,17 +95,18 @@ def to_arguments(args, extra_args=None):
if isinstance(arg, (InputField, Field)): if isinstance(arg, (InputField, Field)):
raise ValueError( raise ValueError(
f"Expected {default_name} to be Argument, " "Expected {} to be Argument, but received {}. Try using Argument({}).".format(
f"but received {type(arg).__name__}. Try using Argument({arg.type})." default_name, type(arg).__name__, arg.type
)
) )
if not isinstance(arg, Argument): if not isinstance(arg, Argument):
raise ValueError(f'Unknown argument "{default_name}".') raise ValueError('Unknown argument "{}".'.format(default_name))
arg_name = default_name or arg.name arg_name = default_name or arg.name
assert ( assert (
arg_name not in arguments arg_name not in arguments
), f'More than one Argument have same name "{arg_name}".' ), 'More than one Argument have same name "{}".'.format(arg_name)
arguments[arg_name] = arg arguments[arg_name] = arg
return arguments return arguments

View File

@ -1,17 +1,19 @@
from typing import Type, Optional from ..utils.subclass_with_meta import SubclassWithMeta
from ..utils.subclass_with_meta import SubclassWithMeta, SubclassWithMeta_Meta
from ..utils.trim_docstring import trim_docstring from ..utils.trim_docstring import trim_docstring
import six
if six.PY3:
from typing import Type
class BaseOptions: class BaseOptions(object):
name: Optional[str] = None name = None # type: str
description: Optional[str] = None description = None # type: str
_frozen: bool = False _frozen = False # type: bool
def __init__(self, class_type: Type): def __init__(self, class_type):
self.class_type: Type = class_type self.class_type = class_type # type: Type
def freeze(self): def freeze(self):
self._frozen = True self._frozen = True
@ -20,13 +22,10 @@ class BaseOptions:
if not self._frozen: if not self._frozen:
super(BaseOptions, self).__setattr__(name, value) super(BaseOptions, self).__setattr__(name, value)
else: else:
raise Exception(f"Can't modify frozen Options {self}") raise Exception("Can't modify frozen Options {}".format(self))
def __repr__(self): def __repr__(self):
return f"<{self.__class__.__name__} name={repr(self.name)}>" return "<{} name={}>".format(self.__class__.__name__, repr(self.name))
BaseTypeMeta = SubclassWithMeta_Meta
class BaseType(SubclassWithMeta): class BaseType(SubclassWithMeta):
@ -38,7 +37,7 @@ class BaseType(SubclassWithMeta):
def __init_subclass_with_meta__( def __init_subclass_with_meta__(
cls, name=None, description=None, _meta=None, **_kwargs cls, name=None, description=None, _meta=None, **_kwargs
): ):
assert "_meta" not in cls.__dict__, "Can't assign meta directly" assert "_meta" not in cls.__dict__, "Can't assign directly meta"
if not _meta: if not _meta:
return return
_meta.name = name or cls.__name__ _meta.name = name or cls.__name__

View File

@ -1,43 +0,0 @@
from binascii import Error as _Error
from base64 import b64decode, b64encode
from graphql.error import GraphQLError
from graphql.language import StringValueNode, print_ast
from .scalars import Scalar
class Base64(Scalar):
"""
The `Base64` scalar type represents a base64-encoded String.
"""
@staticmethod
def serialize(value):
if not isinstance(value, bytes):
if isinstance(value, str):
value = value.encode("utf-8")
else:
value = str(value).encode("utf-8")
return b64encode(value).decode("utf-8")
@classmethod
def parse_literal(cls, node, _variables=None):
if not isinstance(node, StringValueNode):
raise GraphQLError(
f"Base64 cannot represent non-string value: {print_ast(node)}"
)
return cls.parse_value(node.value)
@staticmethod
def parse_value(value):
if not isinstance(value, bytes):
if not isinstance(value, str):
raise GraphQLError(
f"Base64 cannot represent non-string value: {repr(value)}"
)
value = value.encode("utf-8")
try:
return b64decode(value, validate=True).decode("utf-8")
except _Error:
raise GraphQLError(f"Base64 cannot decode value: {repr(value)}")

View File

@ -1,4 +1,4 @@
class Context: class Context(object):
""" """
Context can be used to make a convenient container for attributes to provide Context can be used to make a convenient container for attributes to provide
for execution for resolvers of a GraphQL operation like a query. for execution for resolvers of a GraphQL operation like a query.

View File

@ -1,9 +1,10 @@
from __future__ import absolute_import
import datetime import datetime
from dateutil.parser import isoparse from aniso8601 import parse_date, parse_datetime, parse_time
from graphql.language import ast
from graphql.error import GraphQLError from six import string_types
from graphql.language import StringValueNode, print_ast
from .scalars import Scalar from .scalars import Scalar
@ -19,28 +20,25 @@ class Date(Scalar):
def serialize(date): def serialize(date):
if isinstance(date, datetime.datetime): if isinstance(date, datetime.datetime):
date = date.date() date = date.date()
if not isinstance(date, datetime.date): assert isinstance(
raise GraphQLError(f"Date cannot represent value: {repr(date)}") date, datetime.date
), 'Received not compatible date "{}"'.format(repr(date))
return date.isoformat() return date.isoformat()
@classmethod @classmethod
def parse_literal(cls, node, _variables=None): def parse_literal(cls, node):
if not isinstance(node, StringValueNode): if isinstance(node, ast.StringValue):
raise GraphQLError( return cls.parse_value(node.value)
f"Date cannot represent non-string value: {print_ast(node)}"
)
return cls.parse_value(node.value)
@staticmethod @staticmethod
def parse_value(value): def parse_value(value):
if isinstance(value, datetime.date):
return value
if not isinstance(value, str):
raise GraphQLError(f"Date cannot represent non-string value: {repr(value)}")
try: try:
return datetime.date.fromisoformat(value) if isinstance(value, datetime.date):
return value
elif isinstance(value, string_types):
return parse_date(value)
except ValueError: except ValueError:
raise GraphQLError(f"Date cannot represent value: {repr(value)}") return None
class DateTime(Scalar): class DateTime(Scalar):
@ -52,30 +50,25 @@ class DateTime(Scalar):
@staticmethod @staticmethod
def serialize(dt): def serialize(dt):
if not isinstance(dt, (datetime.datetime, datetime.date)): assert isinstance(
raise GraphQLError(f"DateTime cannot represent value: {repr(dt)}") dt, (datetime.datetime, datetime.date)
), 'Received not compatible datetime "{}"'.format(repr(dt))
return dt.isoformat() return dt.isoformat()
@classmethod @classmethod
def parse_literal(cls, node, _variables=None): def parse_literal(cls, node):
if not isinstance(node, StringValueNode): if isinstance(node, ast.StringValue):
raise GraphQLError( return cls.parse_value(node.value)
f"DateTime cannot represent non-string value: {print_ast(node)}"
)
return cls.parse_value(node.value)
@staticmethod @staticmethod
def parse_value(value): def parse_value(value):
if isinstance(value, datetime.datetime):
return value
if not isinstance(value, str):
raise GraphQLError(
f"DateTime cannot represent non-string value: {repr(value)}"
)
try: try:
return isoparse(value) if isinstance(value, datetime.datetime):
return value
elif isinstance(value, string_types):
return parse_datetime(value)
except ValueError: except ValueError:
raise GraphQLError(f"DateTime cannot represent value: {repr(value)}") return None
class Time(Scalar): class Time(Scalar):
@ -87,25 +80,22 @@ class Time(Scalar):
@staticmethod @staticmethod
def serialize(time): def serialize(time):
if not isinstance(time, datetime.time): assert isinstance(
raise GraphQLError(f"Time cannot represent value: {repr(time)}") time, datetime.time
), 'Received not compatible time "{}"'.format(repr(time))
return time.isoformat() return time.isoformat()
@classmethod @classmethod
def parse_literal(cls, node, _variables=None): def parse_literal(cls, node):
if not isinstance(node, StringValueNode): if isinstance(node, ast.StringValue):
raise GraphQLError( return cls.parse_value(node.value)
f"Time cannot represent non-string value: {print_ast(node)}"
)
return cls.parse_value(node.value)
@classmethod @classmethod
def parse_value(cls, value): def parse_value(cls, value):
if isinstance(value, datetime.time):
return value
if not isinstance(value, str):
raise GraphQLError(f"Time cannot represent non-string value: {repr(value)}")
try: try:
return datetime.time.fromisoformat(value) if isinstance(value, datetime.time):
return value
elif isinstance(value, string_types):
return parse_time(value)
except ValueError: except ValueError:
raise GraphQLError(f"Time cannot represent value: {repr(value)}") return None

View File

@ -1,7 +1,8 @@
from __future__ import absolute_import
from decimal import Decimal as _Decimal from decimal import Decimal as _Decimal
from graphql import Undefined from graphql.language import ast
from graphql.language.ast import StringValueNode, IntValueNode
from .scalars import Scalar from .scalars import Scalar
@ -15,20 +16,19 @@ class Decimal(Scalar):
def serialize(dec): def serialize(dec):
if isinstance(dec, str): if isinstance(dec, str):
dec = _Decimal(dec) dec = _Decimal(dec)
assert isinstance( assert isinstance(dec, _Decimal), 'Received not compatible Decimal "{}"'.format(
dec, _Decimal repr(dec)
), f'Received not compatible Decimal "{repr(dec)}"' )
return str(dec) return str(dec)
@classmethod @classmethod
def parse_literal(cls, node, _variables=None): def parse_literal(cls, node):
if isinstance(node, (StringValueNode, IntValueNode)): if isinstance(node, ast.StringValue):
return cls.parse_value(node.value) return cls.parse_value(node.value)
return Undefined
@staticmethod @staticmethod
def parse_value(value): def parse_value(value):
try: try:
return _Decimal(value) return _Decimal(value)
except Exception: except ValueError:
return Undefined return None

View File

@ -1,5 +1,3 @@
from enum import Enum as PyEnum
from graphql import ( from graphql import (
GraphQLEnumType, GraphQLEnumType,
GraphQLInputObjectType, GraphQLInputObjectType,
@ -10,7 +8,7 @@ from graphql import (
) )
class GrapheneGraphQLType: class GrapheneGraphQLType(object):
""" """
A class for extending the base GraphQLType with the related A class for extending the base GraphQLType with the related
graphene_type graphene_type
@ -20,11 +18,6 @@ class GrapheneGraphQLType:
self.graphene_type = kwargs.pop("graphene_type") self.graphene_type = kwargs.pop("graphene_type")
super(GrapheneGraphQLType, self).__init__(*args, **kwargs) super(GrapheneGraphQLType, self).__init__(*args, **kwargs)
def __copy__(self):
result = GrapheneGraphQLType(graphene_type=self.graphene_type)
result.__dict__.update(self.__dict__)
return result
class GrapheneInterfaceType(GrapheneGraphQLType, GraphQLInterfaceType): class GrapheneInterfaceType(GrapheneGraphQLType, GraphQLInterfaceType):
pass pass
@ -43,19 +36,7 @@ class GrapheneScalarType(GrapheneGraphQLType, GraphQLScalarType):
class GrapheneEnumType(GrapheneGraphQLType, GraphQLEnumType): class GrapheneEnumType(GrapheneGraphQLType, GraphQLEnumType):
def serialize(self, value): pass
if not isinstance(value, PyEnum):
enum = self.graphene_type._meta.enum
try:
# Try and get enum by value
value = enum(value)
except ValueError:
# Try and get enum by name
try:
value = enum[value]
except KeyError:
pass
return super(GrapheneEnumType, self).serialize(value)
class GrapheneInputObjectType(GrapheneGraphQLType, GraphQLInputObjectType): class GrapheneInputObjectType(GrapheneGraphQLType, GraphQLInputObjectType):

View File

@ -10,10 +10,10 @@ class Dynamic(MountedType):
the schema. So we can have lazy fields. the schema. So we can have lazy fields.
""" """
def __init__(self, type_, with_schema=False, _creation_counter=None): def __init__(self, type, with_schema=False, _creation_counter=None):
super(Dynamic, self).__init__(_creation_counter=_creation_counter) super(Dynamic, self).__init__(_creation_counter=_creation_counter)
assert inspect.isfunction(type_) or isinstance(type_, partial) assert inspect.isfunction(type) or isinstance(type, partial)
self.type = type_ self.type = type
self.with_schema = with_schema self.with_schema = with_schema
def get_type(self, schema=None): def get_type(self, schema=None):

View File

@ -1,7 +1,10 @@
from enum import Enum as PyEnum from collections import OrderedDict
import six
from graphene.utils.subclass_with_meta import SubclassWithMeta_Meta from graphene.utils.subclass_with_meta import SubclassWithMeta_Meta
from ..pyutils.compat import Enum as PyEnum
from .base import BaseOptions, BaseType from .base import BaseOptions, BaseType
from .unmountedtype import UnmountedType from .unmountedtype import UnmountedType
@ -12,10 +15,6 @@ def eq_enum(self, other):
return self.value is other return self.value is other
def hash_enum(self):
return hash(self.name)
EnumType = type(PyEnum) EnumType = type(PyEnum)
@ -25,17 +24,15 @@ class EnumOptions(BaseOptions):
class EnumMeta(SubclassWithMeta_Meta): class EnumMeta(SubclassWithMeta_Meta):
def __new__(cls, name_, bases, classdict, **options): def __new__(cls, name, bases, classdict, **options):
enum_members = dict(classdict, __eq__=eq_enum, __hash__=hash_enum) enum_members = OrderedDict(classdict, __eq__=eq_enum)
# We remove the Meta attribute from the class to not collide # We remove the Meta attribute from the class to not collide
# with the enum values. # with the enum values.
enum_members.pop("Meta", None) enum_members.pop("Meta", None)
enum = PyEnum(cls.__name__, enum_members) enum = PyEnum(cls.__name__, enum_members)
obj = SubclassWithMeta_Meta.__new__( return SubclassWithMeta_Meta.__new__(
cls, name_, bases, dict(classdict, __enum__=enum), **options cls, name, bases, OrderedDict(classdict, __enum__=enum), **options
) )
globals()[name_] = obj.__enum__
return obj
def get(cls, value): def get(cls, value):
return cls._meta.enum(value) return cls._meta.enum(value)
@ -44,7 +41,7 @@ class EnumMeta(SubclassWithMeta_Meta):
return cls._meta.enum[value] return cls._meta.enum[value]
def __prepare__(name, bases, **kwargs): # noqa: N805 def __prepare__(name, bases, **kwargs): # noqa: N805
return {} return OrderedDict()
def __call__(cls, *args, **kwargs): # noqa: N805 def __call__(cls, *args, **kwargs): # noqa: N805
if cls is Enum: if cls is Enum:
@ -58,22 +55,18 @@ class EnumMeta(SubclassWithMeta_Meta):
return super(EnumMeta, cls).__call__(*args, **kwargs) return super(EnumMeta, cls).__call__(*args, **kwargs)
# return cls._meta.enum(*args, **kwargs) # return cls._meta.enum(*args, **kwargs)
def __iter__(cls): def from_enum(cls, enum, description=None, deprecation_reason=None): # noqa: N805
return cls._meta.enum.__iter__() description = description or enum.__doc__
def from_enum(cls, enum, name=None, description=None, deprecation_reason=None): # noqa: N805
name = name or enum.__name__
description = description or enum.__doc__ or "An enumeration."
meta_dict = { meta_dict = {
"enum": enum, "enum": enum,
"description": description, "description": description,
"deprecation_reason": deprecation_reason, "deprecation_reason": deprecation_reason,
} }
meta_class = type("Meta", (object,), meta_dict) meta_class = type("Meta", (object,), meta_dict)
return type(name, (Enum,), {"Meta": meta_class}) return type(meta_class.enum.__name__, (Enum,), {"Meta": meta_class})
class Enum(UnmountedType, BaseType, metaclass=EnumMeta): class Enum(six.with_metaclass(EnumMeta, UnmountedType, BaseType)):
""" """
Enum type definition Enum type definition
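
The master side of the `from_enum` hunk above accepts a `name` override. A hedged sketch of wrapping an existing Python enum; the enum and names are invented.

from enum import Enum as PyEnum

import graphene


class Color(PyEnum):
    RED = 1
    GREEN = 2


# `name` overrides the generated GraphQL type name (master side only).
ColorEnum = graphene.Enum.from_enum(Color, name="ColorEnum", description="Basic colors")
assert ColorEnum._meta.name == "ColorEnum"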

View File

@ -1,20 +1,18 @@
import inspect import inspect
from collections.abc import Mapping from collections import Mapping, OrderedDict
from functools import partial from functools import partial
from .argument import Argument, to_arguments from .argument import Argument, to_arguments
from .mountedtype import MountedType from .mountedtype import MountedType
from .resolver import default_resolver
from .structures import NonNull from .structures import NonNull
from .unmountedtype import UnmountedType from .unmountedtype import UnmountedType
from .utils import get_type from .utils import get_type
from ..utils.deprecated import warn_deprecation
base_type = type base_type = type
def source_resolver(source, root, info, **args): def source_resolver(source, root, info, **args):
resolved = default_resolver(source, None, root, info, **args) resolved = getattr(root, source, None)
if inspect.isfunction(resolved) or inspect.ismethod(resolved): if inspect.isfunction(resolved) or inspect.ismethod(resolved):
return resolved() return resolved()
return resolved return resolved
@ -41,20 +39,18 @@ class Field(MountedType):
last_name = graphene.Field(String, description='Surname') # explicitly mounted as Field last_name = graphene.Field(String, description='Surname') # explicitly mounted as Field
args: args:
type (class for a graphene.UnmountedType): Must be a class (not an instance) of an type (class for a graphene.UnmountedType): must be a class (not an instance) of an
unmounted graphene type (ex. scalar or object) which is used for the type of this unmounted graphene type (ex. scalar or object) which is used for the type of this
field in the GraphQL schema. You can provide a dotted module import path (string) field in the GraphQL schema.
to the class instead of the class itself (e.g. to avoid circular import issues). args (optional, Dict[str, graphene.Argument]): arguments that can be input to the field.
args (optional, Dict[str, graphene.Argument]): Arguments that can be input to the field. Prefer to use **extra_args.
Prefer to use ``**extra_args``, unless you use an argument name that clashes with one
of the Field arguments presented here (see :ref:`example<ResolverParamGraphQLArguments>`).
resolver (optional, Callable): A function to get the value for a Field from the parent resolver (optional, Callable): A function to get the value for a Field from the parent
value object. If not set, the default resolver method for the schema is used. value object. If not set, the default resolver method for the schema is used.
source (optional, str): attribute name to resolve for this field from the parent value source (optional, str): attribute name to resolve for this field from the parent value
object. Alternative to resolver (cannot set both source and resolver). object. Alternative to resolver (cannot set both source and resolver).
deprecation_reason (optional, str): Setting this value indicates that the field is deprecation_reason (optional, str): Setting this value indicates that the field is
depreciated and may provide instruction or reason on how for clients to proceed. depreciated and may provide instruction or reason on how for clients to proceed.
required (optional, bool): indicates this field as not null in the graphql schema. Same behavior as required (optional, bool): indicates this field as not null in the graphql scehma. Same behavior as
graphene.NonNull. Default False. graphene.NonNull. Default False.
name (optional, str): the name of the GraphQL field (must be unique in a type). Defaults to attribute name (optional, str): the name of the GraphQL field (must be unique in a type). Defaults to attribute
name. name.
@ -66,7 +62,7 @@ class Field(MountedType):
def __init__( def __init__(
self, self,
type_, type,
args=None, args=None,
resolver=None, resolver=None,
source=None, source=None,
@ -76,21 +72,21 @@ class Field(MountedType):
required=False, required=False,
_creation_counter=None, _creation_counter=None,
default_value=None, default_value=None,
**extra_args, **extra_args
): ):
super(Field, self).__init__(_creation_counter=_creation_counter) super(Field, self).__init__(_creation_counter=_creation_counter)
assert not args or isinstance( assert not args or isinstance(args, Mapping), (
args, Mapping 'Arguments in a field have to be a mapping, received "{}".'
), f'Arguments in a field have to be a mapping, received "{args}".' ).format(args)
assert not ( assert not (
source and resolver source and resolver
), "A Field cannot have a source and a resolver in at the same time." ), "A Field cannot have a source and a resolver in at the same time."
assert not callable( assert not callable(default_value), (
default_value 'The default value can not be a function but received "{}".'
), f'The default value can not be a function but received "{base_type(default_value)}".' ).format(base_type(default_value))
if required: if required:
type_ = NonNull(type_) type = NonNull(type)
# Check if name is actually an argument of the field # Check if name is actually an argument of the field
if isinstance(name, (Argument, UnmountedType)): if isinstance(name, (Argument, UnmountedType)):
@ -103,8 +99,8 @@ class Field(MountedType):
source = None source = None
self.name = name self.name = name
self._type = type_ self._type = type
self.args = to_arguments(args or {}, extra_args) self.args = to_arguments(args or OrderedDict(), extra_args)
if source: if source:
resolver = partial(source_resolver, source) resolver = partial(source_resolver, source)
self.resolver = resolver self.resolver = resolver
@ -116,24 +112,5 @@ class Field(MountedType):
def type(self): def type(self):
return get_type(self._type) return get_type(self._type)
get_resolver = None def get_resolver(self, parent_resolver):
def wrap_resolve(self, parent_resolver):
"""
Wraps a function resolver, using the ObjectType resolve_{FIELD_NAME}
(parent_resolver) if the Field definition has no resolver.
"""
if self.get_resolver is not None:
warn_deprecation(
"The get_resolver method is being deprecated, please rename it to wrap_resolve."
)
return self.get_resolver(parent_resolver)
return self.resolver or parent_resolver return self.resolver or parent_resolver
def wrap_subscribe(self, parent_subscribe):
"""
Wraps a function subscribe, using the ObjectType subscribe_{FIELD_NAME}
(parent_subscribe) if the Field definition has no subscribe.
"""
return parent_subscribe
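
Given master's `wrap_resolve` versus v2.1.7's `get_resolver` shown above, a hedged sketch of a custom Field subclass targeting the master side; the subclass and its behavior are invented for illustration.

from graphene import Field


class UppercaseField(Field):
    """Field that upper-cases whatever the underlying resolver returns."""

    def wrap_resolve(self, parent_resolver):
        resolver = super().wrap_resolve(parent_resolver)

        def wrapped_resolver(root, info, **args):
            value = resolver(root, info, **args)
            return value.upper() if isinstance(value, str) else value

        return wrapped_resolver

Subclasses that still define `get_resolver` keep working on the master side, but trigger the deprecation warning added in the Field hunk above.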

View File

@ -1,10 +1,12 @@
from __future__ import unicode_literals
from graphql.language.ast import ( from graphql.language.ast import (
BooleanValueNode, BooleanValue,
FloatValueNode, FloatValue,
IntValueNode, IntValue,
ListValueNode, ListValue,
ObjectValueNode, ObjectValue,
StringValueNode, StringValue,
) )
from graphene.types.scalars import MAX_INT, MIN_INT from graphene.types.scalars import MAX_INT, MIN_INT
@ -27,18 +29,18 @@ class GenericScalar(Scalar):
parse_value = identity parse_value = identity
@staticmethod @staticmethod
def parse_literal(ast, _variables=None): def parse_literal(ast):
if isinstance(ast, (StringValueNode, BooleanValueNode)): if isinstance(ast, (StringValue, BooleanValue)):
return ast.value return ast.value
elif isinstance(ast, IntValueNode): elif isinstance(ast, IntValue):
num = int(ast.value) num = int(ast.value)
if MIN_INT <= num <= MAX_INT: if MIN_INT <= num <= MAX_INT:
return num return num
elif isinstance(ast, FloatValueNode): elif isinstance(ast, FloatValue):
return float(ast.value) return float(ast.value)
elif isinstance(ast, ListValueNode): elif isinstance(ast, ListValue):
return [GenericScalar.parse_literal(value) for value in ast.values] return [GenericScalar.parse_literal(value) for value in ast.values]
elif isinstance(ast, ObjectValueNode): elif isinstance(ast, ObjectValue):
return { return {
field.name.value: GenericScalar.parse_literal(field.value) field.name.value: GenericScalar.parse_literal(field.value)
for field in ast.fields for field in ast.fields

View File

@ -1,5 +1,3 @@
from graphql import Undefined
from .mountedtype import MountedType from .mountedtype import MountedType
from .structures import NonNull from .structures import NonNull
from .utils import get_type from .utils import get_type
@ -40,7 +38,7 @@ class InputField(MountedType):
deprecation_reason (optional, str): Setting this value indicates that the field is deprecation_reason (optional, str): Setting this value indicates that the field is
depreciated and may provide instruction or reason on how for clients to proceed. depreciated and may provide instruction or reason on how for clients to proceed.
description (optional, str): Description of the GraphQL field in the schema. description (optional, str): Description of the GraphQL field in the schema.
required (optional, bool): Indicates this input field as not null in the graphql schema. required (optional, bool): Indicates this input field as not null in the graphql scehma.
Raises a validation error if argument not provided. Same behavior as graphene.NonNull. Raises a validation error if argument not provided. Same behavior as graphene.NonNull.
Default False. Default False.
**extra_args (optional, Dict): Not used. **extra_args (optional, Dict): Not used.
@ -48,23 +46,20 @@ class InputField(MountedType):
def __init__( def __init__(
self, self,
type_, type,
name=None, name=None,
default_value=Undefined, default_value=None,
deprecation_reason=None, deprecation_reason=None,
description=None, description=None,
required=False, required=False,
_creation_counter=None, _creation_counter=None,
**extra_args, **extra_args
): ):
super(InputField, self).__init__(_creation_counter=_creation_counter) super(InputField, self).__init__(_creation_counter=_creation_counter)
self.name = name self.name = name
if required: if required:
assert ( type = NonNull(type)
deprecation_reason is None self._type = type
), f"InputField {name} is required, cannot deprecate it."
type_ = NonNull(type_)
self._type = type_
self.deprecation_reason = deprecation_reason self.deprecation_reason = deprecation_reason
self.default_value = default_value self.default_value = default_value
self.description = description self.description = description

View File

@ -1,12 +1,13 @@
from typing import TYPE_CHECKING from collections import OrderedDict
from .base import BaseOptions, BaseType from .base import BaseOptions, BaseType
from .inputfield import InputField from .inputfield import InputField
from .unmountedtype import UnmountedType from .unmountedtype import UnmountedType
from .utils import yank_fields_from_attrs from .utils import yank_fields_from_attrs
# For static type checking with type checker # For static type checking with Mypy
if TYPE_CHECKING: MYPY = False
if MYPY:
from typing import Dict, Callable # NOQA from typing import Dict, Callable # NOQA
@ -15,39 +16,14 @@ class InputObjectTypeOptions(BaseOptions):
container = None # type: InputObjectTypeContainer container = None # type: InputObjectTypeContainer
# Currently in Graphene, we get a `None` whenever we access an (optional) field that was not set in an InputObjectType class InputObjectTypeContainer(dict, BaseType):
# using the InputObjectType.<attribute> dot access syntax. This is ambiguous, because in this current (Graphene
# historical) arrangement, we cannot distinguish between a field not being set and a field being set to None.
# At the same time, we shouldn't break existing code that expects a `None` when accessing a field that was not set.
_INPUT_OBJECT_TYPE_DEFAULT_VALUE = None
# To mitigate this, we provide the function `set_input_object_type_default_value` to allow users to change the default
# value returned in non-specified fields in InputObjectType to another meaningful sentinel value (e.g. Undefined)
# if they want to. This way, we can keep code that expects a `None` working while we figure out a better solution (or
# a well-documented breaking change) for this issue.
def set_input_object_type_default_value(default_value):
"""
Change the sentinel value returned by non-specified fields in an InputObjectType
Useful to differentiate between a field not being set and a field being set to None by using a sentinel value
(e.g. Undefined is a good sentinel value for this purpose)
This function should be called at the beginning of the app or in some other place where it is guaranteed to
be called before any InputObjectType is defined.
"""
global _INPUT_OBJECT_TYPE_DEFAULT_VALUE
_INPUT_OBJECT_TYPE_DEFAULT_VALUE = default_value
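
A hedged sketch, not part of the file shown, of opting in to the sentinel described in the comments above; `Undefined` is a natural choice because it is distinct from `None`, and the call must happen before any InputObjectType is defined.

from graphql import Undefined

from graphene.types.inputobjecttype import set_input_object_type_default_value

# After this call, unset optional input fields read as Undefined instead of None.
set_input_object_type_default_value(Undefined)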
class InputObjectTypeContainer(dict, BaseType): # type: ignore
class Meta: class Meta:
abstract = True abstract = True
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
dict.__init__(self, *args, **kwargs) dict.__init__(self, *args, **kwargs)
for key in self._meta.fields: for key in self._meta.fields.keys():
setattr(self, key, self.get(key, _INPUT_OBJECT_TYPE_DEFAULT_VALUE)) setattr(self, key, self.get(key, None))
def __init_subclass__(cls, *args, **kwargs): def __init_subclass__(cls, *args, **kwargs):
pass pass
@ -94,7 +70,7 @@ class InputObjectType(UnmountedType, BaseType):
if not _meta: if not _meta:
_meta = InputObjectTypeOptions(cls) _meta = InputObjectTypeOptions(cls)
fields = {} fields = OrderedDict()
for base in reversed(cls.__mro__): for base in reversed(cls.__mro__):
fields.update(yank_fields_from_attrs(base.__dict__, _as=InputField)) fields.update(yank_fields_from_attrs(base.__dict__, _as=InputField))

Some files were not shown because too many files have changed in this diff.