Merge pull request #2 from graphql-python/master

pull latest master
Commit 799613970e by Dan, 2019-03-07 21:25:48 -08:00 (committed by GitHub)
134 changed files with 3372 additions and 3526 deletions


@ -2,11 +2,6 @@ repos:
- repo: git://github.com/pre-commit/pre-commit-hooks
rev: v1.3.0
hooks:
- id: autopep8-wrapper
args:
- -i
- --ignore=E128,E309,E501
exclude: ^docs/.*$
- id: check-json
- id: check-yaml
- id: debug-statements
@ -19,14 +14,11 @@ repos:
- --autofix
- id: flake8
- repo: https://github.com/asottile/pyupgrade
rev: v1.2.0
rev: v1.4.0
hooks:
- id: pyupgrade
- repo: https://github.com/asottile/seed-isort-config
rev: v1.0.1
- repo: https://github.com/ambv/black
rev: 18.6b4
hooks:
- id: seed-isort-config
- repo: https://github.com/pre-commit/mirrors-isort
rev: v4.3.4
hooks:
- id: isort
- id: black
language_version: python3


@ -1,20 +1,24 @@
language: python
matrix:
include:
- env: TOXENV=py27
python: 2.7
- env: TOXENV=py34
python: 3.4
- env: TOXENV=py35
python: 3.5
- env: TOXENV=py36
python: 3.6
- env: TOXENV=pypy
python: pypy-5.7.1
- env: TOXENV=pre-commit
python: 3.6
- env: TOXENV=mypy
python: 3.6
- env: TOXENV=py27
python: 2.7
- env: TOXENV=py34
python: 3.4
- env: TOXENV=py35
python: 3.5
- env: TOXENV=py36
python: 3.6
- env: TOXENV=py37
python: 3.7
dist: xenial
sudo: true
- env: TOXENV=pypy
python: pypy-5.7.1
- env: TOXENV=pre-commit
python: 3.6
- env: TOXENV=mypy
python: 3.6
install:
- pip install coveralls tox
script: tox

BACKERS.md (new file, 97 lines)

@ -0,0 +1,97 @@
<h1 align="center">Sponsors &amp; Backers</h1>
Graphene is an MIT-licensed open source project. It's an independent project with its ongoing development made possible entirely thanks to the support by these awesome [backers](https://github.com/graphql-python/graphene/blob/master/BACKERS.md). If you'd like to join them, please consider:
- [Become a backer or sponsor on Patreon](https://www.patreon.com/syrusakbary).
- [One-time donation via PayPal.](https://graphene-python.org/support-graphene/)
<br><br>
<!--<h2 align="center">Special Sponsors</h2>
<p align="center">
<a href="https://stdlib.com" target="_blank">
<img width="260px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</p>
<!--special end-->
<h2 align="center">Platinum via Patreon</h2>
<!--platinum start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="222px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<h2 align="center">Gold via Patreon</h2>
<!--gold start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<!--gold end-->
<h2 align="center">Silver via Patreon</h2>
<!--silver start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<!--silver end-->
<h2 align="center">Bronze via Patreon</h2>
<!--bronze start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<!--bronze end-->
<h2 align="center">Generous Backers via Patreon ($50+)</h2>
<!--50 start-->
- [Lee Benson](https://github.com/leebenson)
- [Become a Patron](https://www.patreon.com/join/syrusakbary)
<!--50 end-->
<h2 align="center">Backers via Patreon</h2>
<!--10 start-->
- [Become a Patron](https://www.patreon.com/join/syrusakbary)
<!--10 end-->

CODEOWNERS (new file, 1 line)

@ -0,0 +1 @@
/ @syrusakbary @ekampf @dan98765

Makefile (new file, 11 lines)

@ -0,0 +1,11 @@
.PHONY: help
help:
@echo "Please use \`make <target>' where <target> is one of"
@grep -E '^\.PHONY: [a-zA-Z_-]+ .*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = "(: |##)"}; {printf "\033[36m%-30s\033[0m %s\n", $$2, $$3}'
.PHONY: docs ## Generate docs
docs:
@cd docs &&\
pip install -r requirements.txt &&\
make html &&\
cd -

README.md (110 lines changed)

@ -1,9 +1,77 @@
Please read [UPGRADE-v2.0.md](/UPGRADE-v2.0.md) to learn how to upgrade to Graphene `2.0`.
**We are looking for contributors**! Please check the [ROADMAP](https://github.com/graphql-python/graphene/blob/master/ROADMAP.md) to see how you can help ❤️
---
# ![Graphene Logo](http://graphene-python.org/favicon.png) [Graphene](http://graphene-python.org) [![Build Status](https://travis-ci.org/graphql-python/graphene.svg?branch=master)](https://travis-ci.org/graphql-python/graphene) [![PyPI version](https://badge.fury.io/py/graphene.svg)](https://badge.fury.io/py/graphene) [![Coverage Status](https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github)](https://coveralls.io/github/graphql-python/graphene?branch=master)
<h1 align="center">Supporting Graphene Python</h1>
Graphene is an MIT-licensed open source project. It's an independent project with its ongoing development made possible entirely thanks to the support by these awesome [backers](https://github.com/graphql-python/graphene/blob/master/BACKERS.md). If you'd like to join them, please consider:
- [Become a backer or sponsor on Patreon](https://www.patreon.com/syrusakbary).
- [One-time donation via PayPal.](https://graphene-python.org/support-graphene/)
<!--<h2 align="center">Special Sponsors</h2>
<p align="center">
<a href="https://stdlib.com" target="_blank">
<img width="260px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</p>
<!--special end-->
<h2 align="center">Platinum via Patreon</h2>
<!--platinum start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="222px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<h2 align="center">Gold via Patreon</h2>
<!--gold start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<!--gold end-->
<h2 align="center">Silver via Patreon</h2>
<!--silver start-->
<table>
<tbody>
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
</tbody>
</table>
<!--silver end-->
---
## Introduction
[Graphene](http://graphene-python.org) is a Python library for building GraphQL schemas/types fast and easily.
@ -13,17 +81,16 @@ Please read [UPGRADE-v2.0.md](/UPGRADE-v2.0.md) to learn how to upgrade to Graph
We believe that by providing a complete API you could plug Graphene anywhere your data lives and make your data available
through GraphQL.
## Integrations
Graphene has multiple integrations with different frameworks:
| integration | Package |
|---------------|-------------------|
| Django | [graphene-django](https://github.com/graphql-python/graphene-django/) |
| SQLAlchemy | [graphene-sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy/) |
| Google App Engine | [graphene-gae](https://github.com/graphql-python/graphene-gae/) |
| Peewee | *In progress* ([Tracking Issue](https://github.com/graphql-python/graphene/issues/289)) |
| integration | Package |
| ----------------- | --------------------------------------------------------------------------------------- |
| Django | [graphene-django](https://github.com/graphql-python/graphene-django/) |
| SQLAlchemy | [graphene-sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy/) |
| Google App Engine | [graphene-gae](https://github.com/graphql-python/graphene-gae/) |
| Peewee | _In progress_ ([Tracking Issue](https://github.com/graphql-python/graphene/issues/289)) |
Also, Graphene is fully compatible with the GraphQL spec, working seamlessly with all GraphQL clients, such as [Relay](https://github.com/facebook/relay), [Apollo](https://github.com/apollographql/apollo-client) and [gql](https://github.com/graphql-python/gql).
@ -39,7 +106,6 @@ pip install "graphene>=2.0"
Please read [UPGRADE-v2.0.md](/UPGRADE-v2.0.md) to learn how to upgrade.
## Examples
Here is one example for you to get started:
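A sketch of such a starter schema (mirroring the quickstart example from the docs; the `hello` field and its `argument` parameter are illustrative):

```python
import graphene

class Query(graphene.ObjectType):
    # "argument" becomes a GraphQL argument with a default value
    hello = graphene.String(argument=graphene.String(default_value="stranger"))

    def resolve_hello(self, info, argument):
        return 'Hello ' + argument

schema = graphene.Schema(query=Query)

# Run a query against the schema and print the result
result = schema.execute('{ hello }')
print(result.data['hello'])  # "Hello stranger"
```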
@ -67,9 +133,8 @@ result = schema.execute(query)
If you want to learn even more, you can also check the following [examples](examples/):
* **Basic Schema**: [Starwars example](examples/starwars)
* **Relay Schema**: [Starwars Relay example](examples/starwars_relay)
- **Basic Schema**: [Starwars example](examples/starwars)
- **Relay Schema**: [Starwars Relay example](examples/starwars_relay)
## Contributing
@ -84,13 +149,13 @@ pip install -e ".[test]"
Well-written tests and maintaining good test coverage is important to this project. While developing, run new and existing tests with:
```sh
py.test PATH/TO/MY/DIR/test_test.py # Single file
py.test PATH/TO/MY/DIR/ # All tests in directory
py.test graphene/relay/tests/test_node.py # Single file
py.test graphene/relay # All tests in directory
```
Add the `-s` flag if you have introduced breakpoints into the code for debugging.
Add the `-v` ("verbose") flag to get more detailed test output. For even more detailed output, use `-vv`.
Check out the [pytest documentation](https://docs.pytest.org/en/latest/) for more options and test running controls.
Check out the [pytest documentation](https://docs.pytest.org/en/latest/) for more options and test running controls.
You can also run the benchmarks with:
@ -99,28 +164,25 @@ py.test graphene --benchmark-only
```
Graphene supports several versions of Python. To make sure that changes do not break compatibility with any of those versions, we use `tox` to create virtualenvs for each python version and run tests with that version. To run against all python versions defined in the `tox.ini` config file, just run:
```sh
tox
```
If you wish to run against a specific version defined in the `tox.ini` file:
```sh
tox -e py36
```
Tox can only use whatever versions of python are installed on your system. When you create a pull request, Travis will also be running the same tests and report the results, so there is no need for potential contributors to try to install every single version of python on their own system ahead of time. We appreciate opening issues and pull requests to make graphene even more stable & useful!
### Documentation
The documentation is generated using the excellent [Sphinx](http://www.sphinx-doc.org/) and a custom theme.
The documentation dependencies are installed by running:
An HTML version of the documentation is produced by running:
```sh
cd docs
pip install -r requirements.txt
```
Then to produce a HTML version of the documentation:
```sh
make html
make docs
```


@ -1,11 +1,190 @@
Please read `UPGRADE-v2.0.md </UPGRADE-v2.0.md>`__ to learn how to
upgrade to Graphene ``2.0``.
**We are looking for contributors**! Please check the
`ROADMAP <https://github.com/graphql-python/graphene/blob/master/ROADMAP.md>`__
to see how you can help ❤️
--------------
|Graphene Logo| `Graphene <http://graphene-python.org>`__ |Build Status| |PyPI version| |Coverage Status|
=========================================================================================================
.. raw:: html
<h1 align="center">
Supporting Graphene Python
.. raw:: html
</h1>
Graphene is an MIT-licensed open source project. It's an independent
project with its ongoing development made possible entirely thanks to
the support by these awesome
`backers <https://github.com/graphql-python/graphene/blob/master/BACKERS.md>`__.
If you'd like to join them, please consider:
- `Become a backer or sponsor on
Patreon <https://www.patreon.com/syrusakbary>`__.
- `One-time donation via
PayPal. <https://graphene-python.org/support-graphene/>`__
<!--
.. raw:: html
<h2 align="center">
Special Sponsors
.. raw:: html
</h2>
.. raw:: html
<p align="center">
.. raw:: html
</p>
.. raw:: html
<!--special end-->
.. raw:: html
<h2 align="center">
Platinum via Patreon
.. raw:: html
</h2>
.. raw:: html
<!--platinum start-->
.. raw:: html
<table>
.. raw:: html
<tbody>
::
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="222px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
.. raw:: html
</tbody>
.. raw:: html
</table>
.. raw:: html
<h2 align="center">
Gold via Patreon
.. raw:: html
</h2>
.. raw:: html
<!--gold start-->
.. raw:: html
<table>
.. raw:: html
<tbody>
::
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
.. raw:: html
</tbody>
.. raw:: html
</table>
.. raw:: html
<!--gold end-->
.. raw:: html
<h2 align="center">
Silver via Patreon
.. raw:: html
</h2>
.. raw:: html
<!--silver start-->
.. raw:: html
<table>
.. raw:: html
<tbody>
::
<tr>
<td align="center" valign="middle">
<a href="https://www.patreon.com/join/syrusakbary" target="_blank">
<img width="148px" src="https://raw.githubusercontent.com/graphql-python/graphene-python.org/master/src/pages/sponsors/generic-logo.png">
</a>
</td>
</tr>
.. raw:: html
</tbody>
.. raw:: html
</table>
.. raw:: html
<!--silver end-->
--------------
Introduction
------------
`Graphene <http://graphene-python.org>`__ is a Python library for
building GraphQL schemas/types fast and easily.
@ -91,17 +270,29 @@ If you want to learn even more, you can also check the following
Contributing
------------
After cloning this repo, ensure dependencies are installed by running:
After cloning this repo, create a
`virtualenv <https://virtualenv.pypa.io/en/stable/>`__ and ensure
dependencies are installed by running:
.. code:: sh
virtualenv venv
source venv/bin/activate
pip install -e ".[test]"
After developing, the full test suite can be evaluated by running:
Well-written tests and maintaining good test coverage is important to
this project. While developing, run new and existing tests with:
.. code:: sh
py.test graphene --cov=graphene --benchmark-skip # Use -v -s for verbose mode
py.test graphene/relay/tests/test_node.py # Single file
py.test graphene/relay # All tests in directory
Add the ``-s`` flag if you have introduced breakpoints into the code for
debugging. Add the ``-v`` ("verbose") flag to get more detailed test
output. For even more detailed output, use ``-vv``. Check out the
`pytest documentation <https://docs.pytest.org/en/latest/>`__ for more
options and test running controls.
You can also run the benchmarks with:
@ -109,24 +300,41 @@ You can also run the benchmarks with:
py.test graphene --benchmark-only
Graphene supports several versions of Python. To make sure that changes
do not break compatibility with any of those versions, we use ``tox`` to
create virtualenvs for each python version and run tests with that
version. To run against all python versions defined in the ``tox.ini``
config file, just run:
.. code:: sh
tox
If you wish to run against a specific version defined in the ``tox.ini``
file:
.. code:: sh
tox -e py36
Tox can only use whatever versions of python are installed on your
system. When you create a pull request, Travis will also be running the
same tests and report the results, so there is no need for potential
contributors to try to install every single version of python on their
own system ahead of time. We appreciate opening issues and pull requests
to make graphene even more stable & useful!
Documentation
~~~~~~~~~~~~~
The documentation is generated using the excellent
`Sphinx <http://www.sphinx-doc.org/>`__ and a custom theme.
The documentation dependencies are installed by running:
An HTML version of the documentation is produced by running:
.. code:: sh
cd docs
pip install -r requirements.txt
Then to produce a HTML version of the documentation:
.. code:: sh
make html
make docs
.. |Graphene Logo| image:: http://graphene-python.org/favicon.png
.. |Build Status| image:: https://travis-ci.org/graphql-python/graphene.svg?branch=master

ROADMAP.md (new file, 33 lines)

@ -0,0 +1,33 @@
# Graphene Roadmap
In order to move Graphene and the GraphQL Python ecosystem forward, I realized it is essential to be clear with the community on next steps, so we can move uniformly.
There are a few key points that need to happen in the short/mid term, divided into two main sections:
- [Community](#community)
- [Graphene 3](#graphene-3)
_👋 If you have more ideas on how to move the Graphene ecosystem forward, don't hesitate to [open a PR](https://github.com/graphql-python/graphene/edit/master/ROADMAP.md)_
## Community
The goal is to improve adoption and sustainability of the project.
- 💎 Add Commercial Support for Graphene - [See issue](https://github.com/graphql-python/graphene/issues/813)
- Create [Patreon page](https://www.patreon.com/syrusakbary)
- Add [/support-graphene page](https://graphene-python.org/support-graphene/) in Graphene website
- 📘 Vastly improve documentation - [See issue](https://github.com/graphql-python/graphene/issues/823)
- ~~💰 Apply for [Mozilla MOSS](https://www.mozilla.org/en-US/moss/) sponsorship~~ (not for now)
## Graphene 3
The goal is to summarize the different improvements that Graphene will need to accomplish for version 3.
In a nutshell, Graphene 3 should take the Python 3 integration one step forward while still maintaining compatibility with Python 2.
- 🚀 [graphql-core-next](https://github.com/graphql-python/graphql-core-next) GraphQL engine support (almost same API as graphql-core)
- 🔸 GraphQL types from type annotations - [See issue](https://github.com/graphql-python/graphene/issues/729)
- 📄 Schema creation from SDL (API TBD)
- ✨ Improve connections structure
- 📗 Improve function documentation
- 🔀 Add support for coroutines in Connection, Mutation (abstracting out Promise requirement) - [See PR](https://github.com/graphql-python/graphene/pull/824)


@ -17,7 +17,7 @@ developer has to write to use them.
**New Features!**
* [`InputObjectType`](#inputobjecttype)
* [`Meta as Class arguments`](#meta-ass-class-arguments) (_only available for Python 3_)
* [`Meta as Class arguments`](#meta-as-class-arguments) (_only available for Python 3_)
> The type metaclasses are now deleted as they are no longer necessary. If your code was depending


@ -17,75 +17,50 @@ I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " epub3 to make an epub3"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
@echo " dummy to check syntax errors of document sources"
@grep -E '^\.PHONY: [a-zA-Z_-]+ .*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = "(: |##)"}; {printf "\033[36m%-30s\033[0m %s\n", $$2, $$3}'
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/*
.PHONY: html
.PHONY: html ## to make standalone HTML files
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
.PHONY: dirhtml ## to make HTML files named index.html in directories
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
.PHONY: singlehtml ## to make a single large HTML file
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
.PHONY: pickle ## to make pickle files
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
.PHONY: json ## to make JSON files
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
.PHONY: htmlhelp ## to make HTML files and a HTML help project
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
.PHONY: qthelp ## to make HTML files and a qthelp project
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@ -95,7 +70,7 @@ qthelp:
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Graphene.qhc"
.PHONY: applehelp
.PHONY: applehelp ## to make an Apple Help Book
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@ -104,7 +79,7 @@ applehelp:
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
.PHONY: devhelp ## to make HTML files and a Devhelp project
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@ -114,19 +89,19 @@ devhelp:
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Graphene"
@echo "# devhelp"
.PHONY: epub
.PHONY: epub ## to make an epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: epub3
.PHONY: epub3 ## to make an epub3
epub3:
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
@echo
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
.PHONY: latex
.PHONY: latex ## to make LaTeX files, you can set PAPER=a4 or PAPER=letter
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@ -134,33 +109,33 @@ latex:
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
.PHONY: latexpdf ## to make LaTeX files and run them through pdflatex
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
.PHONY: latexpdfja ## to make LaTeX files and run them through platex/dvipdfmx
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
.PHONY: text ## to make text files
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
.PHONY: man ## to make manual pages
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
.PHONY: texinfo ## to make Texinfo files
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@ -168,57 +143,57 @@ texinfo:
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
.PHONY: info ## to make Texinfo files and run them through makeinfo
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
.PHONY: gettext ## to make PO message catalogs
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
.PHONY: changes ## to make an overview of all changed/added/deprecated items
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
.PHONY: linkcheck ## to check all external links for integrity
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
.PHONY: doctest ## to run all doctests embedded in the documentation (if enabled)
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
.PHONY: coverage ## to run coverage check of the documentation (if enabled)
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
.PHONY: xml ## to make Docutils-native XML files
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
.PHONY: pseudoxml ## to make pseudoxml-XML files for display purposes
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: dummy
.PHONY: dummy ## to check syntax errors of document sources
dummy:
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
@echo


@ -2,7 +2,7 @@ import os
import sphinx_graphene_theme
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
on_rtd = os.environ.get("READTHEDOCS", None) == "True"
# -*- coding: utf-8 -*-
#
@ -36,46 +36,44 @@ on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode',
"sphinx.ext.autodoc",
"sphinx.ext.intersphinx",
"sphinx.ext.todo",
"sphinx.ext.coverage",
"sphinx.ext.viewcode",
]
if not on_rtd:
extensions += [
'sphinx.ext.githubpages',
]
extensions += ["sphinx.ext.githubpages"]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
templates_path = ["_templates"]
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
source_suffix = ".rst"
# The encoding of source files.
#
# source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
master_doc = "index"
# General information about the project.
project = u'Graphene'
copyright = u'Graphene 2016'
author = u'Syrus Akbary'
project = u"Graphene"
copyright = u"Graphene 2016"
author = u"Syrus Akbary"
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = u'1.0'
version = u"1.0"
# The full version, including alpha/beta/rc tags.
release = u'1.0'
release = u"1.0"
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@ -96,7 +94,7 @@ language = None
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
# The reST default role (used for this markup: `text`) to use for all
# documents.
@ -118,7 +116,7 @@ exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
pygments_style = "sphinx"
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
@ -175,7 +173,7 @@ html_theme_path = [sphinx_graphene_theme.get_html_theme_path()]
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
html_static_path = ["_static"]
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
@ -255,34 +253,30 @@ html_static_path = ['_static']
# html_search_scorer = 'scorer.js'
# Output file base name for HTML help builder.
htmlhelp_basename = 'Graphenedoc'
htmlhelp_basename = "Graphenedoc"
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'Graphene.tex', u'Graphene Documentation',
u'Syrus Akbary', 'manual'),
(master_doc, "Graphene.tex", u"Graphene Documentation", u"Syrus Akbary", "manual")
]
# The name of an image file (relative to this directory) to place at the top of
@ -322,10 +316,7 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'graphene', u'Graphene Documentation',
[author], 1)
]
man_pages = [(master_doc, "graphene", u"Graphene Documentation", [author], 1)]
# If true, show URL addresses after external links.
#
@ -338,9 +329,15 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'Graphene', u'Graphene Documentation',
author, 'Graphene', 'One line description of project.',
'Miscellaneous'),
(
master_doc,
"Graphene",
u"Graphene Documentation",
author,
"Graphene",
"One line description of project.",
"Miscellaneous",
)
]
# Documents to append as an appendix to all manuals.
@ -414,7 +411,7 @@ epub_copyright = copyright
# epub_post_files = []
# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']
epub_exclude_files = ["search.html"]
# The depth of the table of contents in toc.ncx.
#
@ -447,9 +444,15 @@ epub_exclude_files = ['search.html']
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'https://docs.python.org/': None,
'python': ('https://docs.python.org/', None),
'graphene_django': ('http://docs.graphene-python.org/projects/django/en/latest/', None),
'graphene_sqlalchemy': ('http://docs.graphene-python.org/projects/sqlalchemy/en/latest/', None),
'graphene_gae': ('http://docs.graphene-python.org/projects/gae/en/latest/', None),
"https://docs.python.org/": None,
"python": ("https://docs.python.org/", None),
"graphene_django": (
"http://docs.graphene-python.org/projects/django/en/latest/",
None,
),
"graphene_sqlalchemy": (
"http://docs.graphene-python.org/projects/sqlalchemy/en/latest/",
None,
),
"graphene_gae": ("http://docs.graphene-python.org/projects/gae/en/latest/", None),
}


@ -25,8 +25,8 @@ Create loaders by providing a batch loading function.
return Promise.resolve([get_user(id=key) for key in keys])
A batch loading function accepts an list of keys, and returns a ``Promise``
which resolves to an list of ``values``.
A batch loading function accepts a list of keys, and returns a ``Promise``
which resolves to a list of ``values``.
Then load individual values from the loader. ``DataLoader`` will coalesce all
individual loads which occur within a single frame of execution (executed once
@ -34,7 +34,6 @@ the wrapping promise is resolved) and then call your batch function with all
requested keys.
.. code:: python
user_loader = UserLoader()
@ -47,6 +46,19 @@ requested keys.
A naive application may have issued *four* round-trips to a backend for the
required information, but with ``DataLoader`` this application will make at most *two*.
Note that loaded values are one-to-one with the keys and must have the same
order. This means that if you load all values from a single query, you must
make sure that you then order the query result for the results to match the keys:
.. code:: python
class UserLoader(DataLoader):
def batch_load_fn(self, keys):
users = {user.id: user for user in User.objects.filter(id__in=keys)}
return Promise.resolve([users.get(user_id) for user_id in keys])
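As a rough usage sketch (assuming the ``UserLoader`` above; ``best_friend_id`` is an illustrative attribute of the loaded user objects):

.. code:: python

    user_loader = UserLoader()

    # Both loads below are collected within the same frame of execution, so
    # batch_load_fn is called once with the keys [1, 2]; the follow-up loads
    # for the best friends are then batched into (at most) one more call.
    user_loader.load(1).then(lambda user: user_loader.load(user.best_friend_id))
    user_loader.load(2).then(lambda user: user_loader.load(user.best_friend_id))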
``DataLoader`` allows you to decouple unrelated parts of your application without
sacrificing the performance of batch data-loading. While the loader presents
an API that loads individual values, all concurrent requests will be coalesced

View File

@ -16,7 +16,7 @@ For executing a query a schema, you can directly call the ``execute`` method on
Context
_______
You can pass context to a query via ``context_value``.
You can pass context to a query via ``context``.
.. code:: python
@ -28,14 +28,14 @@ You can pass context to a query via ``context_value``.
return info.context.get('name')
schema = graphene.Schema(Query)
result = schema.execute('{ name }', context_value={'name': 'Syrus'})
result = schema.execute('{ name }', context={'name': 'Syrus'})
Variables
_______
You can pass variables to a query via ``variable_values``.
You can pass variables to a query via ``variables``.
.. code:: python
@ -55,5 +55,5 @@ You can pass variables to a query via ``variable_values``.
lastName
}
}''',
variable_values={'id': 12},
variables={'id': 12},
)
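Putting both together, a minimal sketch (the ``Query`` type and its ``person`` argument are illustrative):

.. code:: python

    import graphene

    class Query(graphene.ObjectType):
        greeting = graphene.String(person=graphene.String(required=True))

        def resolve_greeting(root, info, person):
            # combine a value from the execution context with the query argument
            prefix = info.context.get('prefix', 'Hello')
            return '{} {}'.format(prefix, person)

    schema = graphene.Schema(query=Query)
    result = schema.execute(
        'query Greet($person: String!) { greeting(person: $person) }',
        context={'prefix': 'Hi'},
        variables={'person': 'Syrus'},
    )
    assert result.data['greeting'] == 'Hi Syrus'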


@ -30,17 +30,18 @@ server with an associated set of resolve methods that know how to fetch
data.
We are going to create a very simple schema, with a ``Query`` with only
one field: ``hello`` and an input name. And when we query it, it should return ``"Hello {name}"``.
one field: ``hello`` and an input name. And when we query it, it should return ``"Hello
{argument}"``.
.. code:: python
import graphene
class Query(graphene.ObjectType):
hello = graphene.String(name=graphene.String(default_value="stranger"))
hello = graphene.String(argument=graphene.String(default_value="stranger"))
def resolve_hello(self, info, name):
return 'Hello ' + name
def resolve_hello(self, info, argument):
return 'Hello ' + argument
schema = graphene.Schema(query=Query)
@ -54,4 +55,8 @@ Then we can start querying our schema:
result = schema.execute('{ hello }')
print(result.data['hello']) # "Hello stranger"
# or passing the argument in the query
result = schema.execute('{ hello (argument: "graph") }')
print(result.data['hello']) # "Hello graph"
Congrats! You got your first graphene schema working!


@ -54,7 +54,7 @@ Execute parameters
~~~~~~~~~~~~~~~~~~
You can also add extra keyword arguments to the ``execute`` method, such as
``context_value``, ``root_value``, ``variable_values``, ...:
``context``, ``root``, ``variables``, ...:
.. code:: python
@ -63,7 +63,7 @@ You can also add extra keyword arguments to the ``execute`` method, such as
def test_hey():
client = Client(my_schema)
executed = client.execute('''{ hey }''', context_value={'user': 'Peter'})
executed = client.execute('''{ hey }''', context={'user': 'Peter'})
assert executed == {
'data': {
'hey': 'hello Peter!'


@ -1,7 +1,7 @@
Enums
=====
A ``Enum`` is a special ``GraphQL`` type that represents a set of
An ``Enum`` is a special ``GraphQL`` type that represents a set of
symbolic names (members) bound to unique, constant values.
Definition
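For instance, a minimal definition sketch (the ``Episode`` enum and its values are illustrative):

.. code:: python

    import graphene

    class Episode(graphene.Enum):
        NEWHOPE = 4
        EMPIRE = 5
        JEDI = 6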


@ -91,7 +91,7 @@ For example, you can define a field ``hero`` that resolves to any
schema = graphene.Schema(query=Query, types=[Human, Droid])
This allows you to directly query for fields that exist on the Character interface
as well as selecting specific fields on any type that implments the interface
as well as selecting specific fields on any type that implements the interface
using `inline fragments <https://graphql.org/learn/queries/#inline-fragments>`_.
For example, the following query:
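(a sketch, run against the schema above; the inline-fragment selections on ``Human`` and ``Droid`` are illustrative)

.. code:: python

    result = schema.execute('''
        query {
            hero {
                name
                ... on Human {
                    homePlanet
                }
                ... on Droid {
                    primaryFunction
                }
            }
        }
    ''')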


@ -27,7 +27,7 @@ This example defines a Mutation:
**person** and **ok** are the output fields of the Mutation when it is
resolved.
**Input** attributes are the arguments that the Mutation
**Arguments** attributes are the arguments that the Mutation
``CreatePerson`` needs for resolving, in this case **name** will be the
only argument for the mutation.
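For reference, a minimal sketch of the mutation being described (the ``Person`` type and the ``mutate`` body are illustrative):

.. code:: python

    import graphene

    class Person(graphene.ObjectType):
        name = graphene.String()

    class CreatePerson(graphene.Mutation):
        class Arguments:
            name = graphene.String()

        # output fields of the mutation
        ok = graphene.Boolean()
        person = graphene.Field(Person)

        def mutate(root, info, name):
            person = Person(name=name)
            return CreatePerson(person=person, ok=True)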


@ -25,8 +25,8 @@ This example model defines a Person, with a first and a last name:
last_name = graphene.String()
full_name = graphene.String()
def resolve_full_name(self, info):
return '{} {}'.format(self.first_name, self.last_name)
def resolve_full_name(root, info):
return '{} {}'.format(root.first_name, root.last_name)
**first\_name** and **last\_name** are fields of the ObjectType. Each
field is specified as a class attribute, and each attribute maps to a
@ -46,33 +46,158 @@ The above ``Person`` ObjectType has the following schema representation:
Resolvers
---------
A resolver is a method that resolves certain fields within a
``ObjectType``. If not specififed otherwise, the resolver of a
A resolver is a method that resolves certain fields within an
``ObjectType``. If not specified otherwise, the resolver of a
field is the ``resolve_{field_name}`` method on the ``ObjectType``.
By default resolvers take the arguments ``info`` and ``*args``.
NOTE: The resolvers on a ``ObjectType`` are always treated as ``staticmethod``\ s,
NOTE: The resolvers on an ``ObjectType`` are always treated as ``staticmethod``\ s,
so the first argument to the resolver method ``self`` (or ``root``) need
not be an actual instance of the ``ObjectType``.
If an explicit resolver is not defined on the ``ObjectType`` then Graphene will
attempt to use a property with the same name on the object that is passed to the
``ObjectType``.
Quick example
~~~~~~~~~~~~~
.. code:: python
This example model defines a ``Query`` type, which has a reverse field
that reverses the given ``word`` argument using the ``resolve_reverse``
method in the class.
import graphene
class Person(graphene.ObjectType):
first_name = graphene.String()
last_name = graphene.String()
class Query(graphene.ObjectType):
me = graphene.Field(Person)
def resolve_me(_, info):
# returns an object that represents a Person
return get_human(name='Luke Skywalker')
If you are passing a dict instead of an object to your ``ObjectType`` you can
change the default resolver in the ``Meta`` class like this:
.. code:: python
import graphene
from graphene.types.resolver import dict_resolver
class Person(graphene.ObjectType):
class Meta:
default_resolver = dict_resolver
first_name = graphene.String()
last_name = graphene.String()
class Query(graphene.ObjectType):
me = graphene.Field(Person)
def resolve_me(_, info):
return {
"first_name": "Luke",
"last_name": "Skywalker",
}
Or you can change the default resolver globally by calling ``set_default_resolver``
before executing a query.
.. code:: python
import graphene
from graphene.types.resolver import dict_resolver, set_default_resolver
set_default_resolver(dict_resolver)
schema = graphene.Schema(query=Query)
result = schema.execute('''
query {
me {
firstName
}
}
''')
Resolvers with arguments
~~~~~~~~~~~~~~~~~~~~~~~~
Any arguments that a field defines get passed to the resolver function as
kwargs. For example:
.. code:: python
import graphene
class Query(graphene.ObjectType):
reverse = graphene.String(word=graphene.String())
human_by_name = graphene.Field(Human, name=graphene.String(required=True))
def resolve_human_by_name(_, info, name):
return get_human(name=name)
You can then execute the following query:
.. code::
query {
humanByName(name: "Luke Skywalker") {
firstName
lastName
}
}
NOTE: if you define an argument for a field that is not required (and in a query
execution it is not provided as an argument) it will not be passed to the
resolver function at all. This is so that the developer can differentiate
between an ``undefined`` value for an argument and an explicit ``null`` value.
For example, given this schema:
.. code:: python
import graphene
class Query(graphene.ObjectType):
hello = graphene.String(required=True, name=graphene.String())
def resolve_hello(_, info, name):
return name if name else 'World'
And this query:
.. code::
query {
hello
}
An error will be thrown:
.. code::
TypeError: resolve_hello() missing 1 required positional argument: 'name'
You can fix this error in 2 ways. Either by combining all keyword arguments
into a dict:
.. code:: python
class Query(graphene.ObjectType):
hello = graphene.String(required=True, name=graphene.String())
def resolve_hello(_, info, **args):
return args.get('name', 'World')
Or by setting a default value for the keyword argument:
.. code:: python
class Query(graphene.ObjectType):
hello = graphene.String(required=True, name=graphene.String())
def resolve_hello(_, info, name='World'):
return name
def resolve_reverse(self, info, word):
return word[::-1]
Resolvers outside the class
~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -83,11 +208,13 @@ A field can use a custom resolver from outside the class:
import graphene
def reverse(root, info, word):
return word[::-1]
def resolve_full_name(person, info):
return '{} {}'.format(person.first_name, person.last_name)
class Query(graphene.ObjectType):
reverse = graphene.String(word=graphene.String(), resolver=reverse)
class Person(graphene.ObjectType):
first_name = graphene.String()
last_name = graphene.String()
full_name = graphene.String(resolver=resolve_full_name)
Instances as data containers


@ -22,7 +22,6 @@ class Query(graphene.ObjectType):
class CreateAddress(graphene.Mutation):
class Arguments:
geo = GeoInput(required=True)
@ -37,42 +36,34 @@ class Mutation(graphene.ObjectType):
schema = graphene.Schema(query=Query, mutation=Mutation)
query = '''
query = """
query something{
address(geo: {lat:32.2, lng:12}) {
latlng
}
}
'''
mutation = '''
"""
mutation = """
mutation addAddress{
createAddress(geo: {lat:32.2, lng:12}) {
latlng
}
}
'''
"""
def test_query():
result = schema.execute(query)
assert not result.errors
assert result.data == {
'address': {
'latlng': "(32.2,12.0)",
}
}
assert result.data == {"address": {"latlng": "(32.2,12.0)"}}
def test_mutation():
result = schema.execute(mutation)
assert not result.errors
assert result.data == {
'createAddress': {
'latlng': "(32.2,12.0)",
}
}
assert result.data == {"createAddress": {"latlng": "(32.2,12.0)"}}
if __name__ == '__main__':
if __name__ == "__main__":
result = schema.execute(query)
print(result.data['address']['latlng'])
print(result.data["address"]["latlng"])


@ -10,31 +10,26 @@ class Query(graphene.ObjectType):
me = graphene.Field(User)
def resolve_me(self, info):
return info.context['user']
return info.context["user"]
schema = graphene.Schema(query=Query)
query = '''
query = """
query something{
me {
id
name
}
}
'''
"""
def test_query():
result = schema.execute(query, context_value={'user': User(id='1', name='Syrus')})
result = schema.execute(query, context={"user": User(id="1", name="Syrus")})
assert not result.errors
assert result.data == {
'me': {
'id': '1',
'name': 'Syrus',
}
}
assert result.data == {"me": {"id": "1", "name": "Syrus"}}
if __name__ == '__main__':
result = schema.execute(query, context_value={'user': User(id='X', name='Console')})
print(result.data['me'])
if __name__ == "__main__":
result = schema.execute(query, context={"user": User(id="X", name="Console")})
print(result.data["me"])


@ -12,11 +12,11 @@ class Query(graphene.ObjectType):
patron = graphene.Field(Patron)
def resolve_patron(self, info):
return Patron(id=1, name='Syrus', age=27)
return Patron(id=1, name="Syrus", age=27)
schema = graphene.Schema(query=Query)
query = '''
query = """
query something{
patron {
id
@ -24,21 +24,15 @@ query = '''
age
}
}
'''
"""
def test_query():
result = schema.execute(query)
assert not result.errors
assert result.data == {
'patron': {
'id': '1',
'name': 'Syrus',
'age': 27,
}
}
assert result.data == {"patron": {"id": "1", "name": "Syrus", "age": 27}}
if __name__ == '__main__':
if __name__ == "__main__":
result = schema.execute(query)
print(result.data['patron'])
print(result.data["patron"])


@ -4,75 +4,73 @@ droid_data = {}
def setup():
from .schema import Human, Droid
global human_data, droid_data
luke = Human(
id='1000',
name='Luke Skywalker',
friends=['1002', '1003', '2000', '2001'],
id="1000",
name="Luke Skywalker",
friends=["1002", "1003", "2000", "2001"],
appears_in=[4, 5, 6],
home_planet='Tatooine',
home_planet="Tatooine",
)
vader = Human(
id='1001',
name='Darth Vader',
friends=['1004'],
id="1001",
name="Darth Vader",
friends=["1004"],
appears_in=[4, 5, 6],
home_planet='Tatooine',
home_planet="Tatooine",
)
han = Human(
id='1002',
name='Han Solo',
friends=['1000', '1003', '2001'],
id="1002",
name="Han Solo",
friends=["1000", "1003", "2001"],
appears_in=[4, 5, 6],
home_planet=None,
)
leia = Human(
id='1003',
name='Leia Organa',
friends=['1000', '1002', '2000', '2001'],
id="1003",
name="Leia Organa",
friends=["1000", "1002", "2000", "2001"],
appears_in=[4, 5, 6],
home_planet='Alderaan',
home_planet="Alderaan",
)
tarkin = Human(
id='1004',
name='Wilhuff Tarkin',
friends=['1001'],
id="1004",
name="Wilhuff Tarkin",
friends=["1001"],
appears_in=[4],
home_planet=None,
)
human_data = {
'1000': luke,
'1001': vader,
'1002': han,
'1003': leia,
'1004': tarkin,
"1000": luke,
"1001": vader,
"1002": han,
"1003": leia,
"1004": tarkin,
}
c3po = Droid(
id='2000',
name='C-3PO',
friends=['1000', '1002', '1003', '2001'],
id="2000",
name="C-3PO",
friends=["1000", "1002", "1003", "2001"],
appears_in=[4, 5, 6],
primary_function='Protocol',
primary_function="Protocol",
)
r2d2 = Droid(
id='2001',
name='R2-D2',
friends=['1000', '1002', '1003'],
id="2001",
name="R2-D2",
friends=["1000", "1002", "1003"],
appears_in=[4, 5, 6],
primary_function='Astromech',
primary_function="Astromech",
)
droid_data = {
'2000': c3po,
'2001': r2d2,
}
droid_data = {"2000": c3po, "2001": r2d2}
def get_character(id):
@ -85,8 +83,8 @@ def get_friends(character):
def get_hero(episode):
if episode == 5:
return human_data['1000']
return droid_data['2001']
return human_data["1000"]
return droid_data["2001"]
def get_human(id):


@ -21,29 +21,23 @@ class Character(graphene.Interface):
class Human(graphene.ObjectType):
class Meta:
interfaces = (Character, )
interfaces = (Character,)
home_planet = graphene.String()
class Droid(graphene.ObjectType):
class Meta:
interfaces = (Character, )
interfaces = (Character,)
primary_function = graphene.String()
class Query(graphene.ObjectType):
hero = graphene.Field(Character,
episode=Episode()
)
human = graphene.Field(Human,
id=graphene.String()
)
droid = graphene.Field(Droid,
id=graphene.String()
)
hero = graphene.Field(Character, episode=Episode())
human = graphene.Field(Human, id=graphene.String())
droid = graphene.Field(Droid, id=graphene.String())
def resolve_hero(self, info, episode=None):
return get_hero(episode)


@ -6,196 +6,95 @@ from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_hero_name_query 1'] = {
'data': {
'hero': {
'name': 'R2-D2'
snapshots["test_hero_name_query 1"] = {"data": {"hero": {"name": "R2-D2"}}}
snapshots["test_hero_name_and_friends_query 1"] = {
"data": {
"hero": {
"id": "2001",
"name": "R2-D2",
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "Leia Organa"},
],
}
}
}
snapshots['test_hero_name_and_friends_query 1'] = {
'data': {
'hero': {
'id': '2001',
'name': 'R2-D2',
'friends': [
snapshots["test_nested_query 1"] = {
"data": {
"hero": {
"name": "R2-D2",
"friends": [
{
'name': 'Luke Skywalker'
},
{
'name': 'Han Solo'
},
{
'name': 'Leia Organa'
}
]
}
}
}
snapshots['test_nested_query 1'] = {
'data': {
'hero': {
'name': 'R2-D2',
'friends': [
{
'name': 'Luke Skywalker',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
"name": "Luke Skywalker",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Han Solo"},
{"name": "Leia Organa"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
'friends': [
{
'name': 'Han Solo'
},
{
'name': 'Leia Organa'
},
{
'name': 'C-3PO'
},
{
'name': 'R2-D2'
}
]
},
{
'name': 'Han Solo',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
"name": "Han Solo",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Leia Organa"},
{"name": "R2-D2"},
],
'friends': [
{
'name': 'Luke Skywalker'
},
{
'name': 'Leia Organa'
},
{
'name': 'R2-D2'
}
]
},
{
'name': 'Leia Organa',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
"name": "Leia Organa",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
'friends': [
{
'name': 'Luke Skywalker'
},
{
'name': 'Han Solo'
},
{
'name': 'C-3PO'
},
{
'name': 'R2-D2'
}
]
}
]
},
],
}
}
}
snapshots['test_fetch_luke_query 1'] = {
'data': {
'human': {
'name': 'Luke Skywalker'
}
snapshots["test_fetch_luke_query 1"] = {"data": {"human": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_some_id_query 1"] = {
"data": {"human": {"name": "Luke Skywalker"}}
}
snapshots["test_fetch_some_id_query2 1"] = {"data": {"human": {"name": "Han Solo"}}}
snapshots["test_invalid_id_query 1"] = {"data": {"human": None}}
snapshots["test_fetch_luke_aliased 1"] = {"data": {"luke": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_luke_and_leia_aliased 1"] = {
"data": {"luke": {"name": "Luke Skywalker"}, "leia": {"name": "Leia Organa"}}
}
snapshots["test_duplicate_fields 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots['test_fetch_some_id_query 1'] = {
'data': {
'human': {
'name': 'Luke Skywalker'
}
snapshots["test_use_fragment 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots['test_fetch_some_id_query2 1'] = {
'data': {
'human': {
'name': 'Han Solo'
}
}
snapshots["test_check_type_of_r2 1"] = {
"data": {"hero": {"__typename": "Droid", "name": "R2-D2"}}
}
snapshots['test_invalid_id_query 1'] = {
'data': {
'human': None
}
}
snapshots['test_fetch_luke_aliased 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker'
}
}
}
snapshots['test_fetch_luke_and_leia_aliased 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker'
},
'leia': {
'name': 'Leia Organa'
}
}
}
snapshots['test_duplicate_fields 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine'
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan'
}
}
}
snapshots['test_use_fragment 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine'
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan'
}
}
}
snapshots['test_check_type_of_r2 1'] = {
'data': {
'hero': {
'__typename': 'Droid',
'name': 'R2-D2'
}
}
}
snapshots['test_check_type_of_luke 1'] = {
'data': {
'hero': {
'__typename': 'Human',
'name': 'Luke Skywalker'
}
}
snapshots["test_check_type_of_luke 1"] = {
"data": {"hero": {"__typename": "Human", "name": "Luke Skywalker"}}
}


@ -9,18 +9,18 @@ client = Client(schema)
def test_hero_name_query(snapshot):
query = '''
query = """
query HeroNameQuery {
hero {
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_hero_name_and_friends_query(snapshot):
query = '''
query = """
query HeroNameAndFriendsQuery {
hero {
id
@ -30,12 +30,12 @@ def test_hero_name_and_friends_query(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_nested_query(snapshot):
query = '''
query = """
query NestedQuery {
hero {
name
@ -48,76 +48,70 @@ def test_nested_query(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_fetch_luke_query(snapshot):
query = '''
query = """
query FetchLukeQuery {
human(id: "1000") {
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_fetch_some_id_query(snapshot):
query = '''
query = """
query FetchSomeIDQuery($someId: String!) {
human(id: $someId) {
name
}
}
'''
params = {
'someId': '1000',
}
snapshot.assert_match(client.execute(query, variable_values=params))
"""
params = {"someId": "1000"}
snapshot.assert_match(client.execute(query, variables=params))
def test_fetch_some_id_query2(snapshot):
query = '''
query = """
query FetchSomeIDQuery($someId: String!) {
human(id: $someId) {
name
}
}
'''
params = {
'someId': '1002',
}
snapshot.assert_match(client.execute(query, variable_values=params))
"""
params = {"someId": "1002"}
snapshot.assert_match(client.execute(query, variables=params))
def test_invalid_id_query(snapshot):
query = '''
query = """
query humanQuery($id: String!) {
human(id: $id) {
name
}
}
'''
params = {
'id': 'not a valid id',
}
snapshot.assert_match(client.execute(query, variable_values=params))
"""
params = {"id": "not a valid id"}
snapshot.assert_match(client.execute(query, variables=params))
def test_fetch_luke_aliased(snapshot):
query = '''
query = """
query FetchLukeAliased {
luke: human(id: "1000") {
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_fetch_luke_and_leia_aliased(snapshot):
query = '''
query = """
query FetchLukeAndLeiaAliased {
luke: human(id: "1000") {
name
@ -126,12 +120,12 @@ def test_fetch_luke_and_leia_aliased(snapshot):
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_duplicate_fields(snapshot):
query = '''
query = """
query DuplicateFields {
luke: human(id: "1000") {
name
@ -142,12 +136,12 @@ def test_duplicate_fields(snapshot):
homePlanet
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_use_fragment(snapshot):
query = '''
query = """
query UseFragment {
luke: human(id: "1000") {
...HumanFragment
@ -160,29 +154,29 @@ def test_use_fragment(snapshot):
name
homePlanet
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_check_type_of_r2(snapshot):
query = '''
query = """
query CheckTypeOfR2 {
hero {
__typename
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_check_type_of_luke(snapshot):
query = '''
query = """
query CheckTypeOfLuke {
hero(episode: EMPIRE) {
__typename
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))


@ -5,101 +5,67 @@ def setup():
global data
from .schema import Ship, Faction
xwing = Ship(
id='1',
name='X-Wing',
)
ywing = Ship(
id='2',
name='Y-Wing',
)
xwing = Ship(id="1", name="X-Wing")
awing = Ship(
id='3',
name='A-Wing',
)
ywing = Ship(id="2", name="Y-Wing")
awing = Ship(id="3", name="A-Wing")
# Yeah, technically it's Corellian. But it flew in the service of the rebels,
# so for the purposes of this demo it's a rebel ship.
falcon = Ship(
id='4',
name='Millenium Falcon',
)
falcon = Ship(id="4", name="Millenium Falcon")
homeOne = Ship(
id='5',
name='Home One',
)
homeOne = Ship(id="5", name="Home One")
tieFighter = Ship(
id='6',
name='TIE Fighter',
)
tieFighter = Ship(id="6", name="TIE Fighter")
tieInterceptor = Ship(
id='7',
name='TIE Interceptor',
)
tieInterceptor = Ship(id="7", name="TIE Interceptor")
executor = Ship(
id='8',
name='Executor',
)
executor = Ship(id="8", name="Executor")
rebels = Faction(
id='1',
name='Alliance to Restore the Republic',
ships=['1', '2', '3', '4', '5']
id="1", name="Alliance to Restore the Republic", ships=["1", "2", "3", "4", "5"]
)
empire = Faction(
id='2',
name='Galactic Empire',
ships=['6', '7', '8']
)
empire = Faction(id="2", name="Galactic Empire", ships=["6", "7", "8"])
data = {
'Faction': {
'1': rebels,
'2': empire
"Faction": {"1": rebels, "2": empire},
"Ship": {
"1": xwing,
"2": ywing,
"3": awing,
"4": falcon,
"5": homeOne,
"6": tieFighter,
"7": tieInterceptor,
"8": executor,
},
'Ship': {
'1': xwing,
'2': ywing,
'3': awing,
'4': falcon,
'5': homeOne,
'6': tieFighter,
'7': tieInterceptor,
'8': executor
}
}
def create_ship(ship_name, faction_id):
from .schema import Ship
next_ship = len(data['Ship'].keys()) + 1
new_ship = Ship(
id=str(next_ship),
name=ship_name
)
data['Ship'][new_ship.id] = new_ship
data['Faction'][faction_id].ships.append(new_ship.id)
next_ship = len(data["Ship"].keys()) + 1
new_ship = Ship(id=str(next_ship), name=ship_name)
data["Ship"][new_ship.id] = new_ship
data["Faction"][faction_id].ships.append(new_ship.id)
return new_ship
def get_ship(_id):
return data['Ship'][_id]
return data["Ship"][_id]
def get_faction(_id):
return data['Faction'][_id]
return data["Faction"][_id]
def get_rebels():
return get_faction('1')
return get_faction("1")
def get_empire():
return get_faction('2')
return get_faction("2")

View File

@ -5,12 +5,12 @@ from .data import create_ship, get_empire, get_faction, get_rebels, get_ship
class Ship(graphene.ObjectType):
'''A ship in the Star Wars saga'''
"""A ship in the Star Wars saga"""
class Meta:
interfaces = (relay.Node, )
interfaces = (relay.Node,)
name = graphene.String(description='The name of the ship.')
name = graphene.String(description="The name of the ship.")
@classmethod
def get_node(cls, info, id):
@ -18,19 +18,20 @@ class Ship(graphene.ObjectType):
class ShipConnection(relay.Connection):
class Meta:
node = Ship
class Faction(graphene.ObjectType):
'''A faction in the Star Wars saga'''
"""A faction in the Star Wars saga"""
class Meta:
interfaces = (relay.Node, )
interfaces = (relay.Node,)
name = graphene.String(description='The name of the faction.')
ships = relay.ConnectionField(ShipConnection, description='The ships used by the faction.')
name = graphene.String(description="The name of the faction.")
ships = relay.ConnectionField(
ShipConnection, description="The ships used by the faction."
)
def resolve_ships(self, info, **args):
# Transform the instance ship_ids into real instances
@ -42,7 +43,6 @@ class Faction(graphene.ObjectType):
class IntroduceShip(relay.ClientIDMutation):
class Input:
ship_name = graphene.String(required=True)
faction_id = graphene.String(required=True)
@ -51,7 +51,9 @@ class IntroduceShip(relay.ClientIDMutation):
faction = graphene.Field(Faction)
@classmethod
def mutate_and_get_payload(cls, root, info, ship_name, faction_id, client_mutation_id=None):
def mutate_and_get_payload(
cls, root, info, ship_name, faction_id, client_mutation_id=None
):
ship = create_ship(ship_name, faction_id)
faction = get_faction(faction_id)
return IntroduceShip(ship=ship, faction=faction)

View File

@ -6,26 +6,21 @@ from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_correct_fetch_first_ship_rebels 1'] = {
'data': {
'rebels': {
'name': 'Alliance to Restore the Republic',
'ships': {
'pageInfo': {
'startCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'endCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'hasNextPage': True,
'hasPreviousPage': False
snapshots["test_correct_fetch_first_ship_rebels 1"] = {
"data": {
"rebels": {
"name": "Alliance to Restore the Republic",
"ships": {
"pageInfo": {
"startCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"endCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"hasNextPage": True,
"hasPreviousPage": False,
},
'edges': [
{
'cursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'node': {
'name': 'X-Wing'
}
}
]
}
"edges": [
{"cursor": "YXJyYXljb25uZWN0aW9uOjA=", "node": {"name": "X-Wing"}}
],
},
}
}
}

View File

@ -6,56 +6,23 @@ from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_mutations 1'] = {
'data': {
'introduceShip': {
'ship': {
'id': 'U2hpcDo5',
'name': 'Peter'
},
'faction': {
'name': 'Alliance to Restore the Republic',
'ships': {
'edges': [
{
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
},
{
'node': {
'id': 'U2hpcDoy',
'name': 'Y-Wing'
}
},
{
'node': {
'id': 'U2hpcDoz',
'name': 'A-Wing'
}
},
{
'node': {
'id': 'U2hpcDo0',
'name': 'Millenium Falcon'
}
},
{
'node': {
'id': 'U2hpcDo1',
'name': 'Home One'
}
},
{
'node': {
'id': 'U2hpcDo5',
'name': 'Peter'
}
}
snapshots["test_mutations 1"] = {
"data": {
"introduceShip": {
"ship": {"id": "U2hpcDo5", "name": "Peter"},
"faction": {
"name": "Alliance to Restore the Republic",
"ships": {
"edges": [
{"node": {"id": "U2hpcDox", "name": "X-Wing"}},
{"node": {"id": "U2hpcDoy", "name": "Y-Wing"}},
{"node": {"id": "U2hpcDoz", "name": "A-Wing"}},
{"node": {"id": "U2hpcDo0", "name": "Millenium Falcon"}},
{"node": {"id": "U2hpcDo1", "name": "Home One"}},
{"node": {"id": "U2hpcDo5", "name": "Peter"}},
]
}
}
},
},
}
}
}

View File

@ -6,52 +6,31 @@ from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_correctly_fetches_id_name_rebels 1'] = {
'data': {
'rebels': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
snapshots["test_correctly_fetches_id_name_rebels 1"] = {
"data": {
"rebels": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
snapshots['test_correctly_refetches_rebels 1'] = {
'data': {
'node': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
}
snapshots["test_correctly_refetches_rebels 1"] = {
"data": {"node": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}}
}
snapshots['test_correctly_fetches_id_name_empire 1'] = {
'data': {
'empire': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
snapshots["test_correctly_fetches_id_name_empire 1"] = {
"data": {"empire": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots['test_correctly_refetches_empire 1'] = {
'data': {
'node': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
snapshots["test_correctly_refetches_empire 1"] = {
"data": {"node": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots['test_correctly_refetches_xwing 1'] = {
'data': {
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
}
snapshots["test_correctly_refetches_xwing 1"] = {
"data": {"node": {"id": "U2hpcDox", "name": "X-Wing"}}
}
snapshots['test_str_schema 1'] = '''schema {
snapshots[
"test_str_schema 1"
] = """schema {
query: Query
mutation: Mutation
}
@ -109,4 +88,4 @@ type ShipEdge {
node: Ship
cursor: String!
}
'''
"""

View File

@ -9,7 +9,7 @@ client = Client(schema)
def test_correct_fetch_first_ship_rebels(snapshot):
query = '''
query = """
query RebelsShipsQuery {
rebels {
name,
@ -29,5 +29,5 @@ def test_correct_fetch_first_ship_rebels(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))

View File

@ -9,7 +9,7 @@ client = Client(schema)
def test_mutations(snapshot):
query = '''
query = """
mutation MyMutation {
introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) {
ship {
@ -29,5 +29,5 @@ def test_mutations(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))

View File

@ -13,19 +13,19 @@ def test_str_schema(snapshot):
def test_correctly_fetches_id_name_rebels(snapshot):
query = '''
query = """
query RebelsQuery {
rebels {
id
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_rebels(snapshot):
query = '''
query = """
query RebelsRefetchQuery {
node(id: "RmFjdGlvbjox") {
id
@ -34,24 +34,24 @@ def test_correctly_refetches_rebels(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_correctly_fetches_id_name_empire(snapshot):
query = '''
query = """
query EmpireQuery {
empire {
id
name
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_empire(snapshot):
query = '''
query = """
query EmpireRefetchQuery {
node(id: "RmFjdGlvbjoy") {
id
@ -60,12 +60,12 @@ def test_correctly_refetches_empire(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_xwing(snapshot):
query = '''
query = """
query XWingRefetchQuery {
node(id: "U2hpcDox") {
id
@ -74,5 +74,5 @@ def test_correctly_refetches_xwing(snapshot):
}
}
}
'''
"""
snapshot.assert_match(client.execute(query))

View File

@ -10,17 +10,25 @@ from .types import (
InputField,
Schema,
Scalar,
String, ID, Int, Float, Boolean,
Date, DateTime, Time,
String,
ID,
Int,
Float,
Boolean,
Date,
DateTime,
Time,
Decimal,
JSONString,
UUID,
List, NonNull,
List,
NonNull,
Enum,
Argument,
Dynamic,
Union,
Context,
ResolveInfo
ResolveInfo,
)
from .relay import (
Node,
@ -29,54 +37,54 @@ from .relay import (
ClientIDMutation,
Connection,
ConnectionField,
PageInfo
PageInfo,
)
from .utils.resolve_only_args import resolve_only_args
from .utils.module_loading import lazy_import
VERSION = (2, 1, 2, 'final', 0)
VERSION = (2, 1, 3, "final", 0)
__version__ = get_version(VERSION)
__all__ = [
'__version__',
'ObjectType',
'InputObjectType',
'Interface',
'Mutation',
'Field',
'InputField',
'Schema',
'Scalar',
'String',
'ID',
'Int',
'Float',
'Enum',
'Boolean',
'Date',
'DateTime',
'Time',
'JSONString',
'UUID',
'List',
'NonNull',
'Argument',
'Dynamic',
'Union',
'resolve_only_args',
'Node',
'is_node',
'GlobalID',
'ClientIDMutation',
'Connection',
'ConnectionField',
'PageInfo',
'lazy_import',
'Context',
'ResolveInfo',
"__version__",
"ObjectType",
"InputObjectType",
"Interface",
"Mutation",
"Field",
"InputField",
"Schema",
"Scalar",
"String",
"ID",
"Int",
"Float",
"Enum",
"Boolean",
"Date",
"DateTime",
"Time",
"Decimal",
"JSONString",
"UUID",
"List",
"NonNull",
"Argument",
"Dynamic",
"Union",
"resolve_only_args",
"Node",
"is_node",
"GlobalID",
"ClientIDMutation",
"Connection",
"ConnectionField",
"PageInfo",
"lazy_import",
"Context",
"ResolveInfo",
# Deprecated
'AbstractType',
"AbstractType",
]
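Beyond the quote style, the functional changes in this file are the addition of "Decimal" to __all__ and the version bump to 2.1.3. For reference, the exported names are used roughly as in graphene's documented hello-world example (a sketch, not part of the diff):

import graphene

class Query(graphene.ObjectType):
    hello = graphene.String(name=graphene.String(default_value="stranger"))

    def resolve_hello(self, info, name):
        return "Hello " + name

schema = graphene.Schema(query=Query)
result = schema.execute("{ hello }")
assert result.data["hello"] == "Hello stranger"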

View File

@ -2,10 +2,7 @@ from __future__ import absolute_import
import six
try:
from enum import Enum
except ImportError:
from .enum import Enum
from graphql.pyutils.compat import Enum
try:
from inspect import signature
@ -13,8 +10,12 @@ except ImportError:
from .signature import signature
if six.PY2:
def func_name(func):
return func.func_name
else:
def func_name(func):
return func.__name__
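The vendored enum fallback is dropped here in favour of graphql.pyutils.compat, and the PY2/PY3 branch above exists because Python 2 exposes a function's name as func_name while Python 3 uses __name__. A trivial sketch of the helper in use (not part of the diff):

def resolve_hero(root, info):
    return None

assert func_name(resolve_hero) == "resolve_hero"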

View File

@ -1,873 +0,0 @@
"""Python Enumerations"""
import sys as _sys
__all__ = ['Enum', 'IntEnum', 'unique']
version = 1, 1, 6
pyver = float('%s.%s' % _sys.version_info[:2])
try:
any
except NameError:
def any(iterable):
for element in iterable:
if element:
return True
return False
try:
from collections import OrderedDict
except ImportError:
OrderedDict = None
try:
basestring
except NameError:
# In Python 2 basestring is the ancestor of both str and unicode
# in Python 3 it's just str, but was missing in 3.1
basestring = str
try:
unicode
except NameError:
# In Python 3 unicode no longer exists (it's just str)
unicode = str
class _RouteClassAttributeToGetattr(object):
"""Route attribute access on a class to __getattr__.
This is a descriptor, used to define attributes that act differently when
accessed through an instance and through a class. Instance access remains
normal, but access to an attribute through a class will be routed to the
class's __getattr__ method; this is done by raising AttributeError.
"""
def __init__(self, fget=None):
self.fget = fget
def __get__(self, instance, ownerclass=None):
if instance is None:
raise AttributeError()
return self.fget(instance)
def __set__(self, instance, value):
raise AttributeError("can't set attribute")
def __delete__(self, instance):
raise AttributeError("can't delete attribute")
def _is_descriptor(obj):
"""Returns True if obj is a descriptor, False otherwise."""
return (
hasattr(obj, '__get__') or
hasattr(obj, '__set__') or
hasattr(obj, '__delete__'))
def _is_dunder(name):
"""Returns True if a __dunder__ name, False otherwise."""
return (len(name) > 4 and
name[:2] == name[-2:] == '__' and
name[2:3] != '_' and
name[-3:-2] != '_')
def _is_sunder(name):
"""Returns True if a _sunder_ name, False otherwise."""
return (len(name) > 2 and
name[0] == name[-1] == '_' and
name[1:2] != '_' and
name[-2:-1] != '_')
def _make_class_unpicklable(cls):
"""Make the given class un-picklable."""
def _break_on_call_reduce(self, protocol=None):
raise TypeError('%r cannot be pickled' % self)
cls.__reduce_ex__ = _break_on_call_reduce
cls.__module__ = '<unknown>'
class _EnumDict(OrderedDict):
"""Track enum member order and ensure member names are not reused.
EnumMeta will use the names found in self._member_names as the
enumeration member names.
"""
def __init__(self):
super(_EnumDict, self).__init__()
self._member_names = []
def __setitem__(self, key, value):
"""Changes anything not dundered or not a descriptor.
If a descriptor is added with the same name as an enum member, the name
is removed from _member_names (this may leave a hole in the numerical
sequence of values).
If an enum member name is used twice, an error is raised; duplicate
values are not checked for.
Single underscore (sunder) names are reserved.
Note: in 3.x __order__ is simply discarded as an unnecessary leftover
from 2.x
"""
if pyver >= 3.0 and key in ('_order_', '__order__'):
return
elif key == '__order__':
key = '_order_'
if _is_sunder(key):
if key != '_order_':
raise ValueError('_names_ are reserved for future Enum use')
elif _is_dunder(key):
pass
elif key in self._member_names:
# descriptor overwriting an enum?
raise TypeError('Attempted to reuse key: %r' % key)
elif not _is_descriptor(value):
if key in self:
# enum overwriting a descriptor?
raise TypeError('Key already defined as: %r' % self[key])
self._member_names.append(key)
super(_EnumDict, self).__setitem__(key, value)
# Dummy value for Enum as EnumMeta explicitly checks for it, but of course until
# EnumMeta finishes running the first time the Enum class doesn't exist. This
# is also why there are checks in EnumMeta like `if Enum is not None`
Enum = None
class EnumMeta(type):
"""Metaclass for Enum"""
@classmethod
def __prepare__(metacls, cls, bases):
return _EnumDict()
def __new__(metacls, cls, bases, classdict):
# an Enum class is final once enumeration items have been defined; it
# cannot be mixed with other types (int, float, etc.) if it has an
# inherited __new__ unless a new __new__ is defined (or the resulting
# class will fail).
if isinstance(classdict, dict):
original_dict = classdict
classdict = _EnumDict()
for k, v in original_dict.items():
classdict[k] = v
member_type, first_enum = metacls._get_mixins_(bases)
__new__, save_new, use_args = metacls._find_new_(classdict, member_type,
first_enum)
# save enum items into separate mapping so they don't get baked into
# the new class
members = {k: classdict[k] for k in classdict._member_names}
for name in classdict._member_names:
del classdict[name]
# py2 support for definition order
_order_ = classdict.get('_order_')
if _order_ is None:
if pyver < 3.0:
try:
_order_ = [name for (name, value) in sorted(members.items(), key=lambda item: item[1])]
except TypeError:
_order_ = [name for name in sorted(members.keys())]
else:
_order_ = classdict._member_names
else:
del classdict['_order_']
if pyver < 3.0:
_order_ = _order_.replace(',', ' ').split()
aliases = [name for name in members if name not in _order_]
_order_ += aliases
# check for illegal enum names (any others?)
invalid_names = set(members) & {'mro'}
if invalid_names:
raise ValueError('Invalid enum member name(s): %s' % (
', '.join(invalid_names), ))
# save attributes from super classes so we know if we can take
# the shortcut of storing members in the class dict
base_attributes = {a for b in bases for a in b.__dict__}
# create our new Enum type
enum_class = super(EnumMeta, metacls).__new__(metacls, cls, bases, classdict)
enum_class._member_names_ = [] # names in random order
if OrderedDict is not None:
enum_class._member_map_ = OrderedDict()
else:
enum_class._member_map_ = {} # name->value map
enum_class._member_type_ = member_type
# Reverse value->name map for hashable values.
enum_class._value2member_map_ = {}
# instantiate them, checking for duplicates as we go
# we instantiate first instead of checking for duplicates first in case
# a custom __new__ is doing something funky with the values -- such as
# auto-numbering ;)
if __new__ is None:
__new__ = enum_class.__new__
for member_name in _order_:
value = members[member_name]
if not isinstance(value, tuple):
args = (value, )
else:
args = value
if member_type is tuple: # special case for tuple enums
args = (args, ) # wrap it one more time
if not use_args or not args:
enum_member = __new__(enum_class)
if not hasattr(enum_member, '_value_'):
enum_member._value_ = value
else:
enum_member = __new__(enum_class, *args)
if not hasattr(enum_member, '_value_'):
enum_member._value_ = member_type(*args)
value = enum_member._value_
enum_member._name_ = member_name
enum_member.__objclass__ = enum_class
enum_member.__init__(*args)
# If another member with the same value was already defined, the
# new member becomes an alias to the existing one.
for name, canonical_member in enum_class._member_map_.items():
if canonical_member.value == enum_member._value_:
enum_member = canonical_member
break
else:
# Aliases don't appear in member names (only in __members__).
enum_class._member_names_.append(member_name)
# performance boost for any member that would not shadow
# a DynamicClassAttribute (aka _RouteClassAttributeToGetattr)
if member_name not in base_attributes:
setattr(enum_class, member_name, enum_member)
# now add to _member_map_
enum_class._member_map_[member_name] = enum_member
try:
# This may fail if value is not hashable. We can't add the value
# to the map, and by-value lookups for this value will be
# linear.
enum_class._value2member_map_[value] = enum_member
except TypeError:
pass
# If a custom type is mixed into the Enum, and it does not know how
# to pickle itself, pickle.dumps will succeed but pickle.loads will
# fail. Rather than have the error show up later and possibly far
# from the source, sabotage the pickle protocol for this class so
# that pickle.dumps also fails.
#
# However, if the new class implements its own __reduce_ex__, do not
# sabotage -- it's on them to make sure it works correctly. We use
# __reduce_ex__ instead of any of the others as it is preferred by
# pickle over __reduce__, and it handles all pickle protocols.
unpicklable = False
if '__reduce_ex__' not in classdict:
if member_type is not object:
methods = ('__getnewargs_ex__', '__getnewargs__',
'__reduce_ex__', '__reduce__')
if not any(m in member_type.__dict__ for m in methods):
_make_class_unpicklable(enum_class)
unpicklable = True
# double check that repr and friends are not the mixin's or various
# things break (such as pickle)
for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'):
class_method = getattr(enum_class, name)
getattr(member_type, name, None)
enum_method = getattr(first_enum, name, None)
if name not in classdict and class_method is not enum_method:
if name == '__reduce_ex__' and unpicklable:
continue
setattr(enum_class, name, enum_method)
# method resolution and ints are not playing nicely
# Python versions before 2.6 use __cmp__
if pyver < 2.6:
if issubclass(enum_class, int):
setattr(enum_class, '__cmp__', getattr(int, '__cmp__'))
elif pyver < 3.0:
if issubclass(enum_class, int):
for method in (
'__le__',
'__lt__',
'__gt__',
'__ge__',
'__eq__',
'__ne__',
'__hash__',
):
setattr(enum_class, method, getattr(int, method))
# replace any other __new__ with our own (as long as Enum is not None,
# anyway) -- again, this is to support pickle
if Enum is not None:
# if the user defined their own __new__, save it before it gets
# clobbered in case they subclass later
if save_new:
setattr(enum_class, '__member_new__', enum_class.__dict__['__new__'])
setattr(enum_class, '__new__', Enum.__dict__['__new__'])
return enum_class
def __bool__(cls):
"""
classes/types should always be True.
"""
return True
def __call__(cls, value, names=None, module=None, type=None, start=1):
"""Either returns an existing member, or creates a new enum class.
This method is used both when an enum class is given a value to match
to an enumeration member (i.e. Color(3)) and for the functional API
(i.e. Color = Enum('Color', names='red green blue')).
When used for the functional API: `module`, if set, will be stored in
the new class' __module__ attribute; `type`, if set, will be mixed in
as the first base class.
Note: if `module` is not set this routine will attempt to discover the
calling module by walking the frame stack; if this is unsuccessful
the resulting class will not be pickleable.
"""
if names is None: # simple value lookup
return cls.__new__(cls, value)
# otherwise, functional API: we're creating a new Enum type
return cls._create_(value, names, module=module, type=type, start=start)
def __contains__(cls, member):
return isinstance(member, cls) and member.name in cls._member_map_
def __delattr__(cls, attr):
# nicer error message when someone tries to delete an attribute
# (see issue19025).
if attr in cls._member_map_:
raise AttributeError(
"%s: cannot delete Enum member." % cls.__name__)
super(EnumMeta, cls).__delattr__(attr)
def __dir__(self):
return (['__class__', '__doc__', '__members__', '__module__'] +
self._member_names_)
@property
def __members__(cls):
"""Returns a mapping of member name->value.
This mapping lists all enum members, including aliases. Note that this
is a copy of the internal mapping.
"""
return cls._member_map_.copy()
def __getattr__(cls, name):
"""Return the enum member matching `name`
We use __getattr__ instead of descriptors or inserting into the enum
class' __dict__ in order to support `name` and `value` being both
properties for enum members (which live in the class' __dict__) and
enum members themselves.
"""
if _is_dunder(name):
raise AttributeError(name)
try:
return cls._member_map_[name]
except KeyError:
raise AttributeError(name)
def __getitem__(cls, name):
return cls._member_map_[name]
def __iter__(cls):
return (cls._member_map_[name] for name in cls._member_names_)
def __reversed__(cls):
return (cls._member_map_[name] for name in reversed(cls._member_names_))
def __len__(cls):
return len(cls._member_names_)
__nonzero__ = __bool__
def __repr__(cls):
return "<enum %r>" % cls.__name__
def __setattr__(cls, name, value):
"""Block attempts to reassign Enum members.
A simple assignment to the class namespace only changes one of the
several possible ways to get an Enum member from the Enum class,
resulting in an inconsistent Enumeration.
"""
member_map = cls.__dict__.get('_member_map_', {})
if name in member_map:
raise AttributeError('Cannot reassign members.')
super(EnumMeta, cls).__setattr__(name, value)
def _create_(cls, class_name, names=None, module=None, type=None, start=1):
"""Convenience method to create a new Enum class.
`names` can be:
* A string containing member names, separated either with spaces or
commas. Values are auto-numbered from 1.
* An iterable of member names. Values are auto-numbered from 1.
* An iterable of (member name, value) pairs.
* A mapping of member name -> value.
"""
if pyver < 3.0:
# if class_name is unicode, attempt a conversion to ASCII
if isinstance(class_name, unicode):
try:
class_name = class_name.encode('ascii')
except UnicodeEncodeError:
raise TypeError('%r is not representable in ASCII' % class_name)
metacls = cls.__class__
if type is None:
bases = (cls, )
else:
bases = (type, cls)
classdict = metacls.__prepare__(class_name, bases)
_order_ = []
# special processing needed for names?
if isinstance(names, basestring):
names = names.replace(',', ' ').split()
if isinstance(names, (tuple, list)) and isinstance(names[0], basestring):
names = [(e, i + start) for (i, e) in enumerate(names)]
# Here, names is either an iterable of (name, value) or a mapping.
item = None # in case names is empty
for item in names:
if isinstance(item, basestring):
member_name, member_value = item, names[item]
else:
member_name, member_value = item
classdict[member_name] = member_value
_order_.append(member_name)
# only set _order_ in classdict if name/value was not from a mapping
if not isinstance(item, basestring):
classdict['_order_'] = ' '.join(_order_)
enum_class = metacls.__new__(metacls, class_name, bases, classdict)
# TODO: replace the frame hack if a blessed way to know the calling
# module is ever developed
if module is None:
try:
module = _sys._getframe(2).f_globals['__name__']
except (AttributeError, ValueError):
pass
if module is None:
_make_class_unpicklable(enum_class)
else:
enum_class.__module__ = module
return enum_class
@staticmethod
def _get_mixins_(bases):
"""Returns the type for creating enum members, and the first inherited
enum class.
bases: the tuple of bases that was given to __new__
"""
if not bases or Enum is None:
return object, Enum
# double check that we are not subclassing a class with existing
# enumeration members; while we're at it, see if any other data
# type has been mixed in so we can use the correct __new__
member_type = first_enum = None
for base in bases:
if (base is not Enum and
issubclass(base, Enum) and
base._member_names_):
raise TypeError("Cannot extend enumerations")
# base is now the last base in bases
if not issubclass(base, Enum):
raise TypeError("new enumerations must be created as "
"`ClassName([mixin_type,] enum_type)`")
# get correct mix-in type (either mix-in type of Enum subclass, or
# first base if last base is Enum)
if not issubclass(bases[0], Enum):
member_type = bases[0] # first data type
first_enum = bases[-1] # enum type
else:
for base in bases[0].__mro__:
# most common: (IntEnum, int, Enum, object)
# possible: (<Enum 'AutoIntEnum'>, <Enum 'IntEnum'>,
# <class 'int'>, <Enum 'Enum'>,
# <class 'object'>)
if issubclass(base, Enum):
if first_enum is None:
first_enum = base
else:
if member_type is None:
member_type = base
return member_type, first_enum
if pyver < 3.0:
@staticmethod
def _find_new_(classdict, member_type, first_enum):
"""Returns the __new__ to be used for creating the enum members.
classdict: the class dictionary given to __new__
member_type: the data type whose __new__ will be used by default
first_enum: enumeration to check for an overriding __new__
"""
# now find the correct __new__, checking to see if one was defined
# by the user; also check earlier enum classes in case a __new__ was
# saved as __member_new__
__new__ = classdict.get('__new__', None)
if __new__:
return None, True, True # __new__, save_new, use_args
N__new__ = getattr(None, '__new__')
O__new__ = getattr(object, '__new__')
if Enum is None:
E__new__ = N__new__
else:
E__new__ = Enum.__dict__['__new__']
# check all possibles for __member_new__ before falling back to
# __new__
for method in ('__member_new__', '__new__'):
for possible in (member_type, first_enum):
try:
target = possible.__dict__[method]
except (AttributeError, KeyError):
target = getattr(possible, method, None)
if target not in [
None,
N__new__,
O__new__,
E__new__,
]:
if method == '__member_new__':
classdict['__new__'] = target
return None, False, True
if isinstance(target, staticmethod):
target = target.__get__(member_type)
__new__ = target
break
if __new__ is not None:
break
else:
__new__ = object.__new__
# if a non-object.__new__ is used then whatever value/tuple was
# assigned to the enum member name will be passed to __new__ and to the
# new enum member's __init__
if __new__ is object.__new__:
use_args = False
else:
use_args = True
return __new__, False, use_args
else:
@staticmethod
def _find_new_(classdict, member_type, first_enum):
"""Returns the __new__ to be used for creating the enum members.
classdict: the class dictionary given to __new__
member_type: the data type whose __new__ will be used by default
first_enum: enumeration to check for an overriding __new__
"""
# now find the correct __new__, checking to see if one was defined
# by the user; also check earlier enum classes in case a __new__ was
# saved as __member_new__
__new__ = classdict.get('__new__', None)
# should __new__ be saved as __member_new__ later?
save_new = __new__ is not None
if __new__ is None:
# check all possibles for __member_new__ before falling back to
# __new__
for method in ('__member_new__', '__new__'):
for possible in (member_type, first_enum):
target = getattr(possible, method, None)
if target not in (
None,
None.__new__,
object.__new__,
Enum.__new__,
):
__new__ = target
break
if __new__ is not None:
break
else:
__new__ = object.__new__
# if a non-object.__new__ is used then whatever value/tuple was
# assigned to the enum member name will be passed to __new__ and to the
# new enum member's __init__
if __new__ is object.__new__:
use_args = False
else:
use_args = True
return __new__, save_new, use_args
########################################################
# In order to support Python 2 and 3 with a single
# codebase we have to create the Enum methods separately
# and then use the `type(name, bases, dict)` method to
# create the class.
########################################################
temp_enum_dict = {}
temp_enum_dict['__doc__'] = "Generic enumeration.\n\n Derive from this class to define new enumerations.\n\n"
def __new__(cls, value):
# all enum instances are actually created during class construction
# without calling this method; this method is called by the metaclass'
# __call__ (i.e. Color(3) ), and by pickle
if isinstance(value, cls):
# For lookups like Color(Color.red)
value = value.value
# return value
# by-value search for a matching enum member
# see if it's in the reverse mapping (for hashable values)
try:
if value in cls._value2member_map_:
return cls._value2member_map_[value]
except TypeError:
# not there, now do long search -- O(n) behavior
for member in cls._member_map_.values():
if member.value == value:
return member
raise ValueError("%s is not a valid %s" % (value, cls.__name__))
temp_enum_dict['__new__'] = __new__
del __new__
def __repr__(self):
return "<%s.%s: %r>" % (
self.__class__.__name__, self._name_, self._value_)
temp_enum_dict['__repr__'] = __repr__
del __repr__
def __str__(self):
return "%s.%s" % (self.__class__.__name__, self._name_)
temp_enum_dict['__str__'] = __str__
del __str__
if pyver >= 3.0:
def __dir__(self):
added_behavior = [
m
for cls in self.__class__.mro()
for m in cls.__dict__
if m[0] != '_' and m not in self._member_map_
]
return (['__class__', '__doc__', '__module__', ] + added_behavior)
temp_enum_dict['__dir__'] = __dir__
del __dir__
def __format__(self, format_spec):
# mixed-in Enums should use the mixed-in type's __format__, otherwise
# we can get strange results with the Enum name showing up instead of
# the value
# pure Enum branch
if self._member_type_ is object:
cls = str
val = str(self)
# mix-in branch
else:
cls = self._member_type_
val = self.value
return cls.__format__(val, format_spec)
temp_enum_dict['__format__'] = __format__
del __format__
####################################
# Python versions before 2.6 use __cmp__
if pyver < 2.6:
def __cmp__(self, other):
if isinstance(other, self.__class__):
if self is other:
return 0
return -1
return NotImplemented
raise TypeError("unorderable types: %s() and %s()" % (self.__class__.__name__, other.__class__.__name__))
temp_enum_dict['__cmp__'] = __cmp__
del __cmp__
else:
def __le__(self, other):
raise TypeError("unorderable types: %s() <= %s()" % (self.__class__.__name__, other.__class__.__name__))
temp_enum_dict['__le__'] = __le__
del __le__
def __lt__(self, other):
raise TypeError("unorderable types: %s() < %s()" % (self.__class__.__name__, other.__class__.__name__))
temp_enum_dict['__lt__'] = __lt__
del __lt__
def __ge__(self, other):
raise TypeError("unorderable types: %s() >= %s()" % (self.__class__.__name__, other.__class__.__name__))
temp_enum_dict['__ge__'] = __ge__
del __ge__
def __gt__(self, other):
raise TypeError("unorderable types: %s() > %s()" % (self.__class__.__name__, other.__class__.__name__))
temp_enum_dict['__gt__'] = __gt__
del __gt__
def __eq__(self, other):
if isinstance(other, self.__class__):
return self is other
return NotImplemented
temp_enum_dict['__eq__'] = __eq__
del __eq__
def __ne__(self, other):
if isinstance(other, self.__class__):
return self is not other
return NotImplemented
temp_enum_dict['__ne__'] = __ne__
del __ne__
def __hash__(self):
return hash(self._name_)
temp_enum_dict['__hash__'] = __hash__
del __hash__
def __reduce_ex__(self, proto):
return self.__class__, (self._value_, )
temp_enum_dict['__reduce_ex__'] = __reduce_ex__
del __reduce_ex__
# _RouteClassAttributeToGetattr is used to provide access to the `name`
# and `value` properties of enum members while keeping some measure of
# protection from modification, while still allowing for an enumeration
# to have members named `name` and `value`. This works because enumeration
# members are not set directly on the enum class -- __getattr__ is
# used to look them up.
@_RouteClassAttributeToGetattr
def name(self):
return self._name_
temp_enum_dict['name'] = name
del name
@_RouteClassAttributeToGetattr
def value(self):
return self._value_
temp_enum_dict['value'] = value
del value
@classmethod
def _convert(cls, name, module, filter, source=None):
"""
Create a new Enum subclass that replaces a collection of global constants
"""
# convert all constants from source (or module) that pass filter() to
# a new Enum called name, and export the enum and its members back to
# module;
# also, replace the __reduce_ex__ method so unpickling works in
# previous Python versions
module_globals = vars(_sys.modules[module])
if source:
source = vars(source)
else:
source = module_globals
members = {name: value for name, value in source.items() if filter(name)}
cls = cls(name, members, module=module)
cls.__reduce_ex__ = _reduce_ex_by_name
module_globals.update(cls.__members__)
module_globals[name] = cls
return cls
temp_enum_dict['_convert'] = _convert
del _convert
Enum = EnumMeta('Enum', (object, ), temp_enum_dict)
del temp_enum_dict
# Enum has now been created
###########################
class IntEnum(int, Enum):
"""Enum where members are also (and must be) ints"""
def _reduce_ex_by_name(self, proto):
return self.name
def unique(enumeration):
"""Class decorator that ensures only unique members exist in an enumeration."""
duplicates = []
for name, member in enumeration.__members__.items():
if name != member.name:
duplicates.append((name, member.name))
if duplicates:
duplicate_names = ', '.join(
["%s -> %s" % (alias, name) for (alias, name) in duplicates]
)
raise ValueError('duplicate names found in %r: %s' %
(enumeration, duplicate_names)
)
return enumeration
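The entire vendored enum34 backport above is deleted; the Enum now imported from graphql.pyutils.compat provides the same behaviour these docstrings describe. A small sketch of the functional API documented in EnumMeta.__call__/_create_, assuming that stdlib-compatible Enum:

Color = Enum("Color", "red green blue")  # functional API: members auto-numbered from 1
assert Color(1) is Color.red             # calling with a value returns the existing member
assert Color["green"].value == 2         # item access looks members up by name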

View File

@ -1,19 +1,23 @@
is_init_subclass_available = hasattr(object, '__init_subclass__')
is_init_subclass_available = hasattr(object, "__init_subclass__")
if not is_init_subclass_available:
class InitSubclassMeta(type):
"""Metaclass that implements PEP 487 protocol"""
def __new__(cls, name, bases, ns, **kwargs):
__init_subclass__ = ns.pop('__init_subclass__', None)
__init_subclass__ = ns.pop("__init_subclass__", None)
if __init_subclass__:
__init_subclass__ = classmethod(__init_subclass__)
ns['__init_subclass__'] = __init_subclass__
ns["__init_subclass__"] = __init_subclass__
return super(InitSubclassMeta, cls).__new__(cls, name, bases, ns, **kwargs)
def __init__(cls, name, bases, ns, **kwargs):
super(InitSubclassMeta, cls).__init__(name, bases, ns)
super_class = super(cls, cls)
if hasattr(super_class, '__init_subclass__'):
if hasattr(super_class, "__init_subclass__"):
super_class.__init_subclass__.__func__(cls, **kwargs)
else:
InitSubclassMeta = type # type: ignore
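InitSubclassMeta backfills PEP 487 on interpreters without object.__init_subclass__ (anything before Python 3.6). A minimal sketch of what it enables, with hypothetical class names:

import six

class RegistryBase(six.with_metaclass(InitSubclassMeta, object)):
    registry = []

    def __init_subclass__(cls, **kwargs):
        # runs once for every subclass: on Python 2 via the metaclass above,
        # natively on Python 3.6+
        RegistryBase.registry.append(cls)

class Concrete(RegistryBase):
    pass

assert RegistryBase.registry == [Concrete]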

View File

@ -13,22 +13,24 @@ from collections import OrderedDict
__version__ = "0.4"
__all__ = ['BoundArguments', 'Parameter', 'Signature', 'signature']
__all__ = ["BoundArguments", "Parameter", "Signature", "signature"]
_WrapperDescriptor = type(type.__call__)
_MethodWrapper = type(all.__call__)
_NonUserDefinedCallables = (_WrapperDescriptor,
_MethodWrapper,
types.BuiltinFunctionType)
_NonUserDefinedCallables = (
_WrapperDescriptor,
_MethodWrapper,
types.BuiltinFunctionType,
)
def formatannotation(annotation, base_module=None):
if isinstance(annotation, type):
if annotation.__module__ in ('builtins', '__builtin__', base_module):
if annotation.__module__ in ("builtins", "__builtin__", base_module):
return annotation.__name__
return annotation.__module__ + '.' + annotation.__name__
return annotation.__module__ + "." + annotation.__name__
return repr(annotation)
@ -49,20 +51,20 @@ def _get_user_defined_method(cls, method_name, *nested):
def signature(obj):
'''Get a signature object for the passed callable.'''
"""Get a signature object for the passed callable."""
if not callable(obj):
raise TypeError('{!r} is not a callable object'.format(obj))
raise TypeError("{!r} is not a callable object".format(obj))
if isinstance(obj, types.MethodType):
sig = signature(obj.__func__)
if obj.__self__ is None:
# Unbound method: the first parameter becomes positional-only
if sig.parameters:
first = sig.parameters.values()[0].replace(
kind=_POSITIONAL_ONLY)
first = sig.parameters.values()[0].replace(kind=_POSITIONAL_ONLY)
return sig.replace(
parameters=(first,) + tuple(sig.parameters.values())[1:])
parameters=(first,) + tuple(sig.parameters.values())[1:]
)
else:
return sig
else:
@ -99,7 +101,7 @@ def signature(obj):
try:
ba = sig.bind_partial(*partial_args, **partial_keywords)
except TypeError as ex:
msg = 'partial object {!r} has incorrect arguments'.format(obj)
msg = "partial object {!r} has incorrect arguments".format(obj)
raise ValueError(msg)
for arg_name, arg_value in ba.arguments.items():
@ -122,11 +124,14 @@ def signature(obj):
# flag. Later, in '_bind', the 'default' value of this
# parameter will be added to 'kwargs', to simulate
# the 'functools.partial' real call.
new_params[arg_name] = param.replace(default=arg_value,
_partial_kwarg=True)
new_params[arg_name] = param.replace(
default=arg_value, _partial_kwarg=True
)
elif (param.kind not in (_VAR_KEYWORD, _VAR_POSITIONAL) and
not param._partial_kwarg):
elif (
param.kind not in (_VAR_KEYWORD, _VAR_POSITIONAL)
and not param._partial_kwarg
):
new_params.pop(arg_name)
return sig.replace(parameters=new_params.values())
@ -137,17 +142,17 @@ def signature(obj):
# First, let's see if it has an overloaded __call__ defined
# in its metaclass
call = _get_user_defined_method(type(obj), '__call__')
call = _get_user_defined_method(type(obj), "__call__")
if call is not None:
sig = signature(call)
else:
# Now we check if the 'obj' class has a '__new__' method
new = _get_user_defined_method(obj, '__new__')
new = _get_user_defined_method(obj, "__new__")
if new is not None:
sig = signature(new)
else:
# Finally, we should have at least __init__ implemented
init = _get_user_defined_method(obj, '__init__')
init = _get_user_defined_method(obj, "__init__")
if init is not None:
sig = signature(init)
elif not isinstance(obj, _NonUserDefinedCallables):
@ -155,7 +160,7 @@ def signature(obj):
# We also check that the 'obj' is not an instance of
# _WrapperDescriptor or _MethodWrapper to avoid
# infinite recursion (and even potential segfault)
call = _get_user_defined_method(type(obj), '__call__', 'im_func')
call = _get_user_defined_method(type(obj), "__call__", "im_func")
if call is not None:
sig = signature(call)
@ -166,14 +171,14 @@ def signature(obj):
if isinstance(obj, types.BuiltinFunctionType):
# Raise a nicer error message for builtins
msg = 'no signature found for builtin function {!r}'.format(obj)
msg = "no signature found for builtin function {!r}".format(obj)
raise ValueError(msg)
raise ValueError('callable {!r} is not supported by signature'.format(obj))
raise ValueError("callable {!r} is not supported by signature".format(obj))
class _void(object):
'''A private marker - used in Parameter & Signature'''
"""A private marker - used in Parameter & Signature"""
class _empty(object):
@ -183,25 +188,25 @@ class _empty(object):
class _ParameterKind(int):
def __new__(self, *args, **kwargs):
obj = int.__new__(self, *args)
obj._name = kwargs['name']
obj._name = kwargs["name"]
return obj
def __str__(self):
return self._name
def __repr__(self):
return '<_ParameterKind: {!r}>'.format(self._name)
return "<_ParameterKind: {!r}>".format(self._name)
_POSITIONAL_ONLY = _ParameterKind(0, name='POSITIONAL_ONLY')
_POSITIONAL_OR_KEYWORD = _ParameterKind(1, name='POSITIONAL_OR_KEYWORD')
_VAR_POSITIONAL = _ParameterKind(2, name='VAR_POSITIONAL')
_KEYWORD_ONLY = _ParameterKind(3, name='KEYWORD_ONLY')
_VAR_KEYWORD = _ParameterKind(4, name='VAR_KEYWORD')
_POSITIONAL_ONLY = _ParameterKind(0, name="POSITIONAL_ONLY")
_POSITIONAL_OR_KEYWORD = _ParameterKind(1, name="POSITIONAL_OR_KEYWORD")
_VAR_POSITIONAL = _ParameterKind(2, name="VAR_POSITIONAL")
_KEYWORD_ONLY = _ParameterKind(3, name="KEYWORD_ONLY")
_VAR_KEYWORD = _ParameterKind(4, name="VAR_KEYWORD")
class Parameter(object):
'''Represents a parameter in a function signature.
"""Represents a parameter in a function signature.
Has the following public attributes:
* name : str
The name of the parameter as a string.
@ -216,9 +221,9 @@ class Parameter(object):
Possible values: `Parameter.POSITIONAL_ONLY`,
`Parameter.POSITIONAL_OR_KEYWORD`, `Parameter.VAR_POSITIONAL`,
`Parameter.KEYWORD_ONLY`, `Parameter.VAR_KEYWORD`.
'''
"""
__slots__ = ('_name', '_kind', '_default', '_annotation', '_partial_kwarg')
__slots__ = ("_name", "_kind", "_default", "_annotation", "_partial_kwarg")
POSITIONAL_ONLY = _POSITIONAL_ONLY
POSITIONAL_OR_KEYWORD = _POSITIONAL_OR_KEYWORD
@ -228,30 +233,37 @@ class Parameter(object):
empty = _empty
def __init__(self, name, kind, default=_empty, annotation=_empty,
_partial_kwarg=False):
def __init__(
self, name, kind, default=_empty, annotation=_empty, _partial_kwarg=False
):
if kind not in (_POSITIONAL_ONLY, _POSITIONAL_OR_KEYWORD,
_VAR_POSITIONAL, _KEYWORD_ONLY, _VAR_KEYWORD):
if kind not in (
_POSITIONAL_ONLY,
_POSITIONAL_OR_KEYWORD,
_VAR_POSITIONAL,
_KEYWORD_ONLY,
_VAR_KEYWORD,
):
raise ValueError("invalid value for 'Parameter.kind' attribute")
self._kind = kind
if default is not _empty:
if kind in (_VAR_POSITIONAL, _VAR_KEYWORD):
msg = '{} parameters cannot have default values'.format(kind)
msg = "{} parameters cannot have default values".format(kind)
raise ValueError(msg)
self._default = default
self._annotation = annotation
if name is None:
if kind != _POSITIONAL_ONLY:
raise ValueError("None is not a valid name for a "
"non-positional-only parameter")
raise ValueError(
"None is not a valid name for a " "non-positional-only parameter"
)
self._name = name
else:
name = str(name)
if kind != _POSITIONAL_ONLY and not re.match(r'[a-z_]\w*$', name, re.I):
msg = '{!r} is not a valid parameter name'.format(name)
if kind != _POSITIONAL_ONLY and not re.match(r"[a-z_]\w*$", name, re.I):
msg = "{!r} is not a valid parameter name".format(name)
raise ValueError(msg)
self._name = name
@ -273,9 +285,15 @@ class Parameter(object):
def kind(self):
return self._kind
def replace(self, name=_void, kind=_void, annotation=_void,
default=_void, _partial_kwarg=_void):
'''Creates a customized copy of the Parameter.'''
def replace(
self,
name=_void,
kind=_void,
annotation=_void,
default=_void,
_partial_kwarg=_void,
):
"""Creates a customized copy of the Parameter."""
if name is _void:
name = self._name
@ -292,8 +310,13 @@ class Parameter(object):
if _partial_kwarg is _void:
_partial_kwarg = self._partial_kwarg
return type(self)(name, kind, default=default, annotation=annotation,
_partial_kwarg=_partial_kwarg)
return type(self)(
name,
kind,
default=default,
annotation=annotation,
_partial_kwarg=_partial_kwarg,
)
def __str__(self):
kind = self.kind
@ -301,45 +324,45 @@ class Parameter(object):
formatted = self._name
if kind == _POSITIONAL_ONLY:
if formatted is None:
formatted = ''
formatted = '<{}>'.format(formatted)
formatted = ""
formatted = "<{}>".format(formatted)
# Add annotation and default value
if self._annotation is not _empty:
formatted = '{}:{}'.format(formatted,
formatannotation(self._annotation))
formatted = "{}:{}".format(formatted, formatannotation(self._annotation))
if self._default is not _empty:
formatted = '{}={}'.format(formatted, repr(self._default))
formatted = "{}={}".format(formatted, repr(self._default))
if kind == _VAR_POSITIONAL:
formatted = '*' + formatted
formatted = "*" + formatted
elif kind == _VAR_KEYWORD:
formatted = '**' + formatted
formatted = "**" + formatted
return formatted
def __repr__(self):
return '<{} at {:#x} {!r}>'.format(self.__class__.__name__,
id(self), self.name)
return "<{} at {:#x} {!r}>".format(self.__class__.__name__, id(self), self.name)
def __hash__(self):
msg = "unhashable type: '{}'".format(self.__class__.__name__)
raise TypeError(msg)
def __eq__(self, other):
return (issubclass(other.__class__, Parameter) and
self._name == other._name and
self._kind == other._kind and
self._default == other._default and
self._annotation == other._annotation)
return (
issubclass(other.__class__, Parameter)
and self._name == other._name
and self._kind == other._kind
and self._default == other._default
and self._annotation == other._annotation
)
def __ne__(self, other):
return not self.__eq__(other)
class BoundArguments(object):
'''Result of `Signature.bind` call. Holds the mapping of arguments
"""Result of `Signature.bind` call. Holds the mapping of arguments
to the function's parameters.
Has the following public attributes:
* arguments : OrderedDict
@ -351,7 +374,7 @@ class BoundArguments(object):
Tuple of positional arguments values.
* kwargs : dict
Dict of keyword arguments values.
'''
"""
def __init__(self, signature, arguments):
self.arguments = arguments
@ -365,8 +388,7 @@ class BoundArguments(object):
def args(self):
args = []
for param_name, param in self._signature.parameters.items():
if (param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or
param._partial_kwarg):
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or param._partial_kwarg:
# Keyword arguments mapped by 'functools.partial'
# (Parameter._partial_kwarg is True) are mapped
# in 'BoundArguments.kwargs', along with VAR_KEYWORD &
@ -395,8 +417,7 @@ class BoundArguments(object):
kwargs_started = False
for param_name, param in self._signature.parameters.items():
if not kwargs_started:
if (param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or
param._partial_kwarg):
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY) or param._partial_kwarg:
kwargs_started = True
else:
if param_name not in self.arguments:
@ -425,16 +446,18 @@ class BoundArguments(object):
raise TypeError(msg)
def __eq__(self, other):
return (issubclass(other.__class__, BoundArguments) and
self.signature == other.signature and
self.arguments == other.arguments)
return (
issubclass(other.__class__, BoundArguments)
and self.signature == other.signature
and self.arguments == other.arguments
)
def __ne__(self, other):
return not self.__eq__(other)
class Signature(object):
'''A Signature object represents the overall signature of a function.
"""A Signature object represents the overall signature of a function.
It stores a Parameter object for each parameter accepted by the
function, as well as information specific to the function itself.
A Signature object has the following public attributes and methods:
@ -452,20 +475,21 @@ class Signature(object):
* bind_partial(*args, **kwargs) -> BoundArguments
Creates a partial mapping from positional and keyword arguments
to parameters (simulating 'functools.partial' behavior.)
'''
"""
__slots__ = ('_return_annotation', '_parameters')
__slots__ = ("_return_annotation", "_parameters")
_parameter_cls = Parameter
_bound_arguments_cls = BoundArguments
empty = _empty
def __init__(self, parameters=None, return_annotation=_empty,
__validate_parameters__=True):
'''Constructs Signature from the given list of Parameter
def __init__(
self, parameters=None, return_annotation=_empty, __validate_parameters__=True
):
"""Constructs Signature from the given list of Parameter
objects and 'return_annotation'. All arguments are optional.
'''
"""
if parameters is None:
params = OrderedDict()
@ -477,7 +501,7 @@ class Signature(object):
for idx, param in enumerate(parameters):
kind = param.kind
if kind < top_kind:
msg = 'wrong parameter order: {0} before {1}'
msg = "wrong parameter order: {0} before {1}"
msg = msg.format(top_kind, param.kind)
raise ValueError(msg)
else:
@ -489,22 +513,21 @@ class Signature(object):
param = param.replace(name=name)
if name in params:
msg = 'duplicate parameter name: {!r}'.format(name)
msg = "duplicate parameter name: {!r}".format(name)
raise ValueError(msg)
params[name] = param
else:
params = OrderedDict(((param.name, param)
for param in parameters))
params = OrderedDict(((param.name, param) for param in parameters))
self._parameters = params
self._return_annotation = return_annotation
@classmethod
def from_function(cls, func):
'''Constructs Signature for the given python function'''
"""Constructs Signature for the given python function"""
if not isinstance(func, types.FunctionType):
raise TypeError('{!r} is not a Python function'.format(func))
raise TypeError("{!r} is not a Python function".format(func))
Parameter = cls._parameter_cls
@ -513,11 +536,11 @@ class Signature(object):
pos_count = func_code.co_argcount
arg_names = func_code.co_varnames
positional = tuple(arg_names[:pos_count])
keyword_only_count = getattr(func_code, 'co_kwonlyargcount', 0)
keyword_only = arg_names[pos_count:(pos_count + keyword_only_count)]
annotations = getattr(func, '__annotations__', {})
keyword_only_count = getattr(func_code, "co_kwonlyargcount", 0)
keyword_only = arg_names[pos_count : (pos_count + keyword_only_count)]
annotations = getattr(func, "__annotations__", {})
defaults = func.__defaults__
kwdefaults = getattr(func, '__kwdefaults__', None)
kwdefaults = getattr(func, "__kwdefaults__", None)
if defaults:
pos_default_count = len(defaults)
@ -530,22 +553,29 @@ class Signature(object):
non_default_count = pos_count - pos_default_count
for name in positional[:non_default_count]:
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation,
kind=_POSITIONAL_OR_KEYWORD))
parameters.append(
Parameter(name, annotation=annotation, kind=_POSITIONAL_OR_KEYWORD)
)
# ... w/ defaults.
for offset, name in enumerate(positional[non_default_count:]):
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation,
kind=_POSITIONAL_OR_KEYWORD,
default=defaults[offset]))
parameters.append(
Parameter(
name,
annotation=annotation,
kind=_POSITIONAL_OR_KEYWORD,
default=defaults[offset],
)
)
# *args
if func_code.co_flags & 0x04:
name = arg_names[pos_count + keyword_only_count]
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation,
kind=_VAR_POSITIONAL))
parameters.append(
Parameter(name, annotation=annotation, kind=_VAR_POSITIONAL)
)
# Keyword-only parameters.
for name in keyword_only:
@ -554,9 +584,11 @@ class Signature(object):
default = kwdefaults.get(name, _empty)
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation,
kind=_KEYWORD_ONLY,
default=default))
parameters.append(
Parameter(
name, annotation=annotation, kind=_KEYWORD_ONLY, default=default
)
)
# **kwargs
if func_code.co_flags & 0x08:
index = pos_count + keyword_only_count
@ -565,12 +597,13 @@ class Signature(object):
name = arg_names[index]
annotation = annotations.get(name, _empty)
parameters.append(Parameter(name, annotation=annotation,
kind=_VAR_KEYWORD))
parameters.append(Parameter(name, annotation=annotation, kind=_VAR_KEYWORD))
return cls(parameters,
return_annotation=annotations.get('return', _empty),
__validate_parameters__=False)
return cls(
parameters,
return_annotation=annotations.get("return", _empty),
__validate_parameters__=False,
)
@property
def parameters(self):
@ -584,10 +617,10 @@ class Signature(object):
return self._return_annotation
def replace(self, parameters=_void, return_annotation=_void):
'''Creates a customized copy of the Signature.
"""Creates a customized copy of the Signature.
Pass 'parameters' and/or 'return_annotation' arguments
to override them in the new copy.
'''
"""
if parameters is _void:
parameters = self.parameters.values()
@ -595,21 +628,23 @@ class Signature(object):
if return_annotation is _void:
return_annotation = self._return_annotation
return type(self)(parameters,
return_annotation=return_annotation)
return type(self)(parameters, return_annotation=return_annotation)
def __hash__(self):
msg = "unhashable type: '{}'".format(self.__class__.__name__)
raise TypeError(msg)
def __eq__(self, other):
if (not issubclass(type(other), Signature) or
self.return_annotation != other.return_annotation or
len(self.parameters) != len(other.parameters)):
if (
not issubclass(type(other), Signature)
or self.return_annotation != other.return_annotation
or len(self.parameters) != len(other.parameters)
):
return False
other_positions = {param: idx
for idx, param in enumerate(other.parameters.keys())}
other_positions = {
param: idx for idx, param in enumerate(other.parameters.keys())
}
for idx, (param_name, param) in enumerate(self.parameters.items()):
if param.kind == _KEYWORD_ONLY:
@ -626,8 +661,7 @@ class Signature(object):
except KeyError:
return False
else:
if (idx != other_idx or
param != other.parameters[param_name]):
if idx != other_idx or param != other.parameters[param_name]:
return False
return True
@ -636,7 +670,7 @@ class Signature(object):
return not self.__eq__(other)
def _bind(self, args, kwargs, partial=False):
'''Private method. Don't use directly.'''
"""Private method. Don't use directly."""
arguments = OrderedDict()
@ -649,7 +683,7 @@ class Signature(object):
# See 'functools.partial' case in 'signature()' implementation
# for details.
for param_name, param in self.parameters.items():
if (param._partial_kwarg and param_name not in kwargs):
if param._partial_kwarg and param_name not in kwargs:
# Simulating 'functools.partial' behavior
kwargs[param_name] = param.default
@ -673,14 +707,12 @@ class Signature(object):
break
elif param.name in kwargs:
if param.kind == _POSITIONAL_ONLY:
msg = '{arg!r} parameter is positional only, ' \
'but was passed as a keyword'
msg = "{arg!r} parameter is positional only, " "but was passed as a keyword"
msg = msg.format(arg=param.name)
raise TypeError(msg)
parameters_ex = (param,)
break
elif (param.kind == _VAR_KEYWORD or
param.default is not _empty):
elif param.kind == _VAR_KEYWORD or param.default is not _empty:
# That's fine too - we have a default value for this
# parameter. So, lets start parsing `kwargs`, starting
# with the current parameter
@ -691,7 +723,7 @@ class Signature(object):
parameters_ex = (param,)
break
else:
msg = '{arg!r} parameter lacking default value'
msg = "{arg!r} parameter lacking default value"
msg = msg.format(arg=param.name)
raise TypeError(msg)
else:
@ -699,12 +731,12 @@ class Signature(object):
try:
param = next(parameters)
except StopIteration:
raise TypeError('too many positional arguments')
raise TypeError("too many positional arguments")
else:
if param.kind in (_VAR_KEYWORD, _KEYWORD_ONLY):
# Looks like we have no parameter for this positional
# argument
raise TypeError('too many positional arguments')
raise TypeError("too many positional arguments")
if param.kind == _VAR_POSITIONAL:
# We have an '*args'-like argument, let's fill it with
@ -716,8 +748,10 @@ class Signature(object):
break
if param.name in kwargs:
raise TypeError('multiple values for argument '
'{arg!r}'.format(arg=param.name))
raise TypeError(
"multiple values for argument "
"{arg!r}".format(arg=param.name)
)
arguments[param.name] = arg_val
@ -729,9 +763,10 @@ class Signature(object):
# This should never happen in case of a properly built
# Signature object (but let's have this check here
# to ensure correct behaviour just in case)
raise TypeError('{arg!r} parameter is positional only, '
'but was passed as a keyword'.
format(arg=param.name))
raise TypeError(
"{arg!r} parameter is positional only, "
"but was passed as a keyword".format(arg=param.name)
)
if param.kind == _VAR_KEYWORD:
# Memorize that we have a '**kwargs'-like parameter
@ -746,10 +781,14 @@ class Signature(object):
# if it has a default value, or it is an '*args'-like
# parameter, left alone by the processing of positional
# arguments.
if (not partial and param.kind != _VAR_POSITIONAL and
param.default is _empty):
raise TypeError('{arg!r} parameter lacking default value'.
format(arg=param_name))
if (
not partial
and param.kind != _VAR_POSITIONAL
and param.default is _empty
):
raise TypeError(
"{arg!r} parameter lacking default value".format(arg=param_name)
)
else:
arguments[param_name] = arg_val
@ -759,22 +798,22 @@ class Signature(object):
# Process our '**kwargs'-like parameter
arguments[kwargs_param.name] = kwargs
else:
raise TypeError('too many keyword arguments')
raise TypeError("too many keyword arguments")
return self._bound_arguments_cls(self, arguments)
def bind(self, *args, **kwargs):
'''Get a BoundArguments object, that maps the passed `args`
"""Get a BoundArguments object, that maps the passed `args`
and `kwargs` to the function's signature. Raises `TypeError`
if the passed arguments can not be bound.
'''
"""
return self._bind(args, kwargs)
def bind_partial(self, *args, **kwargs):
'''Get a BoundArguments object, that partially maps the
"""Get a BoundArguments object, that partially maps the
passed `args` and `kwargs` to the function's signature.
Raises `TypeError` if the passed arguments can not be bound.
'''
"""
return self._bind(args, kwargs, partial=True)
def __str__(self):
@ -792,17 +831,17 @@ class Signature(object):
# We have a keyword-only parameter to render and we haven't
# rendered an '*args'-like parameter before, so add a '*'
# separator to the parameters list ("foo(arg1, *, arg2)" case)
result.append('*')
result.append("*")
# This condition should be only triggered once, so
# reset the flag
render_kw_only_separator = False
result.append(formatted)
rendered = '({})'.format(', '.join(result))
rendered = "({})".format(", ".join(result))
if self.return_annotation is not _empty:
anno = formatannotation(self.return_annotation)
rendered += ' -> {}'.format(anno)
rendered += " -> {}".format(anno)
return rendered
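For orientation, a minimal sketch of what bind and bind_partial do, using the standard library's inspect module, which exposes the same PEP 362 API that this vendored Signature backport mirrors (the greet function is a made-up example):

from inspect import signature

def greet(name, greeting="hello"):
    return "{} {}".format(greeting, name)

sig = signature(greet)

# bind() maps the given arguments onto the parameters and raises TypeError
# when they cannot be bound (missing required values, unknown keywords, ...)
bound = sig.bind("world")
bound.apply_defaults()
assert dict(bound.arguments) == {"name": "world", "greeting": "hello"}

# bind_partial() allows parameters to remain unbound
partial = sig.bind_partial(greeting="hi")
assert dict(partial.arguments) == {"greeting": "hi"}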

View File

@ -1,42 +0,0 @@
from ..enum import _is_dunder, _is_sunder
def test__is_dunder():
dunder_names = [
'__i__',
'__test__',
]
non_dunder_names = [
'test',
'__test',
'_test',
'_test_',
'test__',
'',
]
for name in dunder_names:
assert _is_dunder(name) is True
for name in non_dunder_names:
assert _is_dunder(name) is False
def test__is_sunder():
sunder_names = [
'_i_',
'_test_',
]
non_sunder_names = [
'__i__',
'_i__',
'__i_',
'',
]
for name in sunder_names:
assert _is_sunder(name) is True
for name in non_sunder_names:
assert _is_sunder(name) is False

View File

@ -16,15 +16,15 @@ def get_version(version=None):
main = get_main_version(version)
sub = ''
if version[3] == 'alpha' and version[4] == 0:
sub = ""
if version[3] == "alpha" and version[4] == 0:
git_changeset = get_git_changeset()
if git_changeset:
sub = '.dev%s' % git_changeset
sub = ".dev%s" % git_changeset
else:
sub = '.dev'
elif version[3] != 'final':
mapping = {'alpha': 'a', 'beta': 'b', 'rc': 'rc'}
sub = ".dev"
elif version[3] != "final":
mapping = {"alpha": "a", "beta": "b", "rc": "rc"}
sub = mapping[version[3]] + str(version[4])
return str(main + sub)
@ -34,7 +34,7 @@ def get_main_version(version=None):
"Returns main version (X.Y[.Z]) from VERSION."
version = get_complete_version(version)
parts = 2 if version[2] == 0 else 3
return '.'.join(str(x) for x in version[:parts])
return ".".join(str(x) for x in version[:parts])
def get_complete_version(version=None):
@ -45,17 +45,17 @@ def get_complete_version(version=None):
from graphene import VERSION as version
else:
assert len(version) == 5
assert version[3] in ('alpha', 'beta', 'rc', 'final')
assert version[3] in ("alpha", "beta", "rc", "final")
return version
def get_docs_version(version=None):
version = get_complete_version(version)
if version[3] != 'final':
return 'dev'
if version[3] != "final":
return "dev"
else:
return '%d.%d' % version[:2]
return "%d.%d" % version[:2]
def get_git_changeset():
@ -67,12 +67,15 @@ def get_git_changeset():
repo_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
try:
git_log = subprocess.Popen(
'git log --pretty=format:%ct --quiet -1 HEAD',
stdout=subprocess.PIPE, stderr=subprocess.PIPE,
shell=True, cwd=repo_dir, universal_newlines=True,
"git log --pretty=format:%ct --quiet -1 HEAD",
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
shell=True,
cwd=repo_dir,
universal_newlines=True,
)
timestamp = git_log.communicate()[0]
timestamp = datetime.datetime.utcfromtimestamp(int(timestamp))
except:
return None
return timestamp.strftime('%Y%m%d%H%M%S')
return timestamp.strftime("%Y%m%d%H%M%S")
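As a rough illustration of the version scheme these helpers implement, a sketch with hypothetical VERSION tuples (not taken from this commit; the module path graphene.pyutils.version is assumed):

from graphene.pyutils.version import get_version

assert get_version((2, 1, 0, "final", 0)) == "2.1"    # a zero micro version is dropped
assert get_version((2, 1, 1, "final", 0)) == "2.1.1"
assert get_version((2, 2, 0, "rc", 1)) == "2.2rc1"    # via {"alpha": "a", "beta": "b", "rc": "rc"}
assert get_version((2, 2, 0, "beta", 2)) == "2.2b2"
# (2, 2, 0, "alpha", 0) gives "2.2.dev<git commit timestamp>" inside a git checkout,
# otherwise "2.2.dev"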

View File

@ -3,11 +3,11 @@ from .mutation import ClientIDMutation
from .connection import Connection, ConnectionField, PageInfo
__all__ = [
'Node',
'is_node',
'GlobalID',
'ClientIDMutation',
'Connection',
'ConnectionField',
'PageInfo',
"Node",
"is_node",
"GlobalID",
"ClientIDMutation",
"Connection",
"ConnectionField",
"PageInfo",
]

View File

@ -3,36 +3,41 @@ from collections import Iterable, OrderedDict
from functools import partial
from graphql_relay import connection_from_list
from promise import Promise, is_thenable
from ..types import (Boolean, Enum, Int, Interface, List, NonNull, Scalar,
String, Union)
from ..types import Boolean, Enum, Int, Interface, List, NonNull, Scalar, String, Union
from ..types.field import Field
from ..types.objecttype import ObjectType, ObjectTypeOptions
from ..utils.thenables import maybe_thenable
from .node import is_node
class PageInfo(ObjectType):
class Meta:
description = (
"The Relay compliant `PageInfo` type, containing data necessary to"
" paginate this connection."
)
has_next_page = Boolean(
required=True,
name='hasNextPage',
description='When paginating forwards, are there more items?',
name="hasNextPage",
description="When paginating forwards, are there more items?",
)
has_previous_page = Boolean(
required=True,
name='hasPreviousPage',
description='When paginating backwards, are there more items?',
name="hasPreviousPage",
description="When paginating backwards, are there more items?",
)
start_cursor = String(
name='startCursor',
description='When paginating backwards, the cursor to continue.',
name="startCursor",
description="When paginating backwards, the cursor to continue.",
)
end_cursor = String(
name='endCursor',
description='When paginating forwards, the cursor to continue.',
name="endCursor",
description="When paginating forwards, the cursor to continue.",
)
@ -41,59 +46,78 @@ class ConnectionOptions(ObjectTypeOptions):
class Connection(ObjectType):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options):
_meta = ConnectionOptions(cls)
assert node, 'You have to provide a node in {}.Meta'.format(cls.__name__)
assert issubclass(node, (Scalar, Enum, ObjectType, Interface, Union, NonNull)), (
'Received incompatible node "{}" for Connection {}.'
).format(node, cls.__name__)
assert node, "You have to provide a node in {}.Meta".format(cls.__name__)
assert issubclass(
node, (Scalar, Enum, ObjectType, Interface, Union, NonNull)
), ('Received incompatible node "{}" for Connection {}.').format(
node, cls.__name__
)
base_name = re.sub('Connection$', '', name or cls.__name__) or node._meta.name
base_name = re.sub("Connection$", "", name or cls.__name__) or node._meta.name
if not name:
name = '{}Connection'.format(base_name)
name = "{}Connection".format(base_name)
edge_class = getattr(cls, 'Edge', None)
edge_class = getattr(cls, "Edge", None)
_node = node
class EdgeBase(object):
node = Field(_node, description='The item at the end of the edge')
cursor = String(required=True, description='A cursor for use in pagination')
node = Field(_node, description="The item at the end of the edge")
cursor = String(required=True, description="A cursor for use in pagination")
edge_name = '{}Edge'.format(base_name)
class EdgeMeta:
description = "A Relay edge containing a `{}` and its cursor.".format(
base_name
)
edge_name = "{}Edge".format(base_name)
if edge_class:
edge_bases = (edge_class, EdgeBase, ObjectType,)
edge_bases = (edge_class, EdgeBase, ObjectType)
else:
edge_bases = (EdgeBase, ObjectType,)
edge_bases = (EdgeBase, ObjectType)
edge = type(edge_name, edge_bases, {})
edge = type(edge_name, edge_bases, {"Meta": EdgeMeta})
cls.Edge = edge
options['name'] = name
options["name"] = name
_meta.node = node
_meta.fields = OrderedDict([
('page_info', Field(PageInfo, name='pageInfo', required=True)),
('edges', Field(NonNull(List(edge)))),
])
return super(Connection, cls).__init_subclass_with_meta__(_meta=_meta, **options)
_meta.fields = OrderedDict(
[
(
"page_info",
Field(
PageInfo,
name="pageInfo",
required=True,
description="Pagination data for this connection.",
),
),
(
"edges",
Field(
NonNull(List(edge)),
description="Contains the nodes in this connection.",
),
),
]
)
return super(Connection, cls).__init_subclass_with_meta__(
_meta=_meta, **options
)
class IterableConnectionField(Field):
def __init__(self, type, *args, **kwargs):
kwargs.setdefault('before', String())
kwargs.setdefault('after', String())
kwargs.setdefault('first', Int())
kwargs.setdefault('last', Int())
super(IterableConnectionField, self).__init__(
type,
*args,
**kwargs
)
kwargs.setdefault("before", String())
kwargs.setdefault("after", String())
kwargs.setdefault("first", Int())
kwargs.setdefault("last", Int())
super(IterableConnectionField, self).__init__(type, *args, **kwargs)
@property
def type(self):
@ -104,7 +128,7 @@ class IterableConnectionField(Field):
if is_node(connection_type):
raise Exception(
"ConnectionField's now need a explicit ConnectionType for Nodes.\n"
"ConnectionFields now need a explicit ConnectionType for Nodes.\n"
"Read more: https://github.com/graphql-python/graphene/blob/v2.0.0/UPGRADE-v2.0.md#node-connections"
)
@ -119,7 +143,7 @@ class IterableConnectionField(Field):
return resolved
assert isinstance(resolved, Iterable), (
'Resolved value from the connection field have to be iterable or instance of {}. '
"Resolved value from the connection field have to be iterable or instance of {}. "
'Received "{}"'
).format(connection_type, resolved)
connection = connection_from_list(
@ -127,7 +151,7 @@ class IterableConnectionField(Field):
args,
connection_type=connection_type,
edge_type=connection_type.Edge,
pageinfo_type=PageInfo
pageinfo_type=PageInfo,
)
connection.iterable = resolved
return connection
@ -140,10 +164,7 @@ class IterableConnectionField(Field):
connection_type = connection_type.of_type
on_resolve = partial(cls.resolve_connection, connection_type, args)
if is_thenable(resolved):
return Promise.resolve(resolved).then(on_resolve)
return on_resolve(resolved)
return maybe_thenable(resolved, on_resolve)
def get_resolver(self, parent_resolver):
resolver = super(IterableConnectionField, self).get_resolver(parent_resolver)
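For orientation, a minimal sketch of how a Connection is declared against these classes; Ship, ShipConnection and Query are illustrative names, not part of this commit:

from graphene import ObjectType, String
from graphene.relay import Connection, ConnectionField, Node

class Ship(ObjectType):
    class Meta:
        interfaces = (Node,)

    name = String()

class ShipConnection(Connection):
    class Meta:
        node = Ship  # required: the assert in __init_subclass_with_meta__ fails without it

    class Edge:
        is_friend = String()  # extra attributes are merged into the generated ShipEdge type

class Query(ObjectType):
    ships = ConnectionField(ShipConnection)

    def resolve_ships(self, info, **kwargs):
        # any iterable (or a thenable resolving to one) is accepted; the
        # first/last/before/after arguments come from IterableConnectionField
        return []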

View File

@ -1,72 +1,71 @@
import re
from collections import OrderedDict
from promise import Promise, is_thenable
from ..types import Field, InputObjectType, String
from ..types.mutation import Mutation
from ..utils.thenables import maybe_thenable
class ClientIDMutation(Mutation):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, output=None, input_fields=None,
arguments=None, name=None, **options):
input_class = getattr(cls, 'Input', None)
base_name = re.sub('Payload$', '', name or cls.__name__)
def __init_subclass_with_meta__(
cls, output=None, input_fields=None, arguments=None, name=None, **options
):
input_class = getattr(cls, "Input", None)
base_name = re.sub("Payload$", "", name or cls.__name__)
assert not output, "Can't specify any output"
assert not arguments, "Can't specify any arguments"
bases = (InputObjectType, )
bases = (InputObjectType,)
if input_class:
bases += (input_class, )
bases += (input_class,)
if not input_fields:
input_fields = {}
cls.Input = type(
'{}Input'.format(base_name),
"{}Input".format(base_name),
bases,
OrderedDict(input_fields, client_mutation_id=String(
name='clientMutationId'))
OrderedDict(
input_fields, client_mutation_id=String(name="clientMutationId")
),
)
arguments = OrderedDict(
input=cls.Input(required=True)
# 'client_mutation_id': String(name='clientMutationId')
)
mutate_and_get_payload = getattr(cls, 'mutate_and_get_payload', None)
mutate_and_get_payload = getattr(cls, "mutate_and_get_payload", None)
if cls.mutate and cls.mutate.__func__ == ClientIDMutation.mutate.__func__:
assert mutate_and_get_payload, (
"{name}.mutate_and_get_payload method is required"
" in a ClientIDMutation.").format(name=name or cls.__name__)
" in a ClientIDMutation."
).format(name=name or cls.__name__)
if not name:
name = '{}Payload'.format(base_name)
name = "{}Payload".format(base_name)
super(ClientIDMutation, cls).__init_subclass_with_meta__(
output=None, arguments=arguments, name=name, **options)
cls._meta.fields['client_mutation_id'] = (
Field(String, name='clientMutationId')
output=None, arguments=arguments, name=name, **options
)
cls._meta.fields["client_mutation_id"] = Field(String, name="clientMutationId")
@classmethod
def mutate(cls, root, info, input):
def on_resolve(payload):
try:
payload.client_mutation_id = input.get('client_mutation_id')
payload.client_mutation_id = input.get("client_mutation_id")
except Exception:
raise Exception(
('Cannot set client_mutation_id in the payload object {}'
).format(repr(payload)))
("Cannot set client_mutation_id in the payload object {}").format(
repr(payload)
)
)
return payload
result = cls.mutate_and_get_payload(root, info, **input)
if is_thenable(result):
return Promise.resolve(result).then(on_resolve)
return on_resolve(result)
return maybe_thenable(result, on_resolve)
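A minimal sketch of a ClientIDMutation built on this machinery; the IntroduceShip name and its fields are illustrative assumptions:

from graphene import String
from graphene.relay import ClientIDMutation

class IntroduceShip(ClientIDMutation):
    class Input:  # becomes IntroduceShipInput, with clientMutationId added automatically
        ship_name = String(required=True)

    ship_name = String()  # payload field on the generated IntroduceShipPayload type

    @classmethod
    def mutate_and_get_payload(cls, root, info, ship_name, client_mutation_id=None):
        # the clientMutationId is copied back onto the payload by ClientIDMutation.mutate
        return IntroduceShip(ship_name=ship_name)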

View File

@ -10,9 +10,9 @@ from ..types.utils import get_type
def is_node(objecttype):
'''
"""
Check if the given objecttype has Node as an interface
'''
"""
if not isclass(objecttype):
return False
@ -27,7 +27,6 @@ def is_node(objecttype):
class GlobalID(Field):
def __init__(self, node=None, parent_type=None, required=True, *args, **kwargs):
super(GlobalID, self).__init__(ID, required=required, *args, **kwargs)
self.node = node or Node
@ -41,15 +40,16 @@ class GlobalID(Field):
def get_resolver(self, parent_resolver):
return partial(
self.id_resolver, parent_resolver, self.node, parent_type_name=self.parent_type_name
self.id_resolver,
parent_resolver,
self.node,
parent_type_name=self.parent_type_name,
)
class NodeField(Field):
def __init__(self, node, type=False, deprecation_reason=None,
name=None, **kwargs):
assert issubclass(node, Node), 'NodeField can only operate in Nodes'
def __init__(self, node, type=False, deprecation_reason=None, name=None, **kwargs):
assert issubclass(node, Node), "NodeField can only operate in Nodes"
self.node_type = node
self.field_type = type
@ -57,8 +57,8 @@ class NodeField(Field):
# If we don't specify a type, the field type will be the node
# interface
type or node,
description='The ID of the object',
id=ID(required=True)
description="The ID of the object",
id=ID(required=True),
)
def get_resolver(self, parent_resolver):
@ -66,7 +66,6 @@ class NodeField(Field):
class AbstractNode(Interface):
class Meta:
abstract = True
@ -74,14 +73,13 @@ class AbstractNode(Interface):
def __init_subclass_with_meta__(cls, **options):
_meta = InterfaceOptions(cls)
_meta.fields = OrderedDict(
id=GlobalID(cls, description='The ID of the object.')
id=GlobalID(cls, description="The ID of the object.")
)
super(AbstractNode, cls).__init_subclass_with_meta__(
_meta=_meta, **options)
super(AbstractNode, cls).__init_subclass_with_meta__(_meta=_meta, **options)
class Node(AbstractNode):
'''An object with an ID'''
"""An object with an ID"""
@classmethod
def Field(cls, *args, **kwargs): # noqa: N802
@ -100,15 +98,15 @@ class Node(AbstractNode):
return None
if only_type:
assert graphene_type == only_type, (
'Must receive a {} id.'
).format(only_type._meta.name)
assert graphene_type == only_type, ("Must receive a {} id.").format(
only_type._meta.name
)
# We make sure the ObjectType implements the "Node" interface
if cls not in graphene_type._meta.interfaces:
return None
get_node = getattr(graphene_type, 'get_node', None)
get_node = getattr(graphene_type, "get_node", None)
if get_node:
return get_node(info, _id)
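A minimal sketch of an ObjectType implementing this Node interface; Ship and its fake lookup are illustrative assumptions:

from graphene import ObjectType, Schema, String
from graphene.relay import Node

class Ship(ObjectType):
    class Meta:
        interfaces = (Node,)  # contributes the required "id: ID!" GlobalID field

    name = String()

    @classmethod
    def get_node(cls, info, id):
        # called by the Node interface with the type-local part of the global id
        return Ship(id=id, name="ship-%s" % id)

class Query(ObjectType):
    node = Node.Field()  # exposes node(id: ID!): Node

schema = Schema(query=Query, types=[Ship])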

View File

@ -1,15 +1,14 @@
import pytest
from ...types import (Argument, Field, Int, List, NonNull, ObjectType, Schema,
String)
from ...types import Argument, Field, Int, List, NonNull, ObjectType, Schema, String
from ..connection import Connection, ConnectionField, PageInfo
from ..node import Node
class MyObject(ObjectType):
class Meta:
interfaces = [Node]
field = String()
@ -23,11 +22,11 @@ def test_connection():
class Edge:
other = String()
assert MyObjectConnection._meta.name == 'MyObjectConnection'
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields.keys()) == ['page_info', 'edges', 'extra']
edge_field = fields['edges']
pageinfo_field = fields['page_info']
assert list(fields.keys()) == ["page_info", "edges", "extra"]
edge_field = fields["edges"]
pageinfo_field = fields["page_info"]
assert isinstance(edge_field, Field)
assert isinstance(edge_field.type, NonNull)
@ -44,13 +43,12 @@ def test_connection_inherit_abstracttype():
extra = String()
class MyObjectConnection(BaseConnection, Connection):
class Meta:
node = MyObject
assert MyObjectConnection._meta.name == 'MyObjectConnection'
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields.keys()) == ['page_info', 'edges', 'extra']
assert list(fields.keys()) == ["page_info", "edges", "extra"]
def test_connection_name():
@ -60,7 +58,6 @@ def test_connection_name():
extra = String()
class MyObjectConnection(BaseConnection, Connection):
class Meta:
node = MyObject
name = custom_name
@ -70,7 +67,6 @@ def test_connection_name():
def test_edge():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
@ -78,15 +74,15 @@ def test_edge():
other = String()
Edge = MyObjectConnection.Edge
assert Edge._meta.name == 'MyObjectEdge'
assert Edge._meta.name == "MyObjectEdge"
edge_fields = Edge._meta.fields
assert list(edge_fields.keys()) == ['node', 'cursor', 'other']
assert list(edge_fields.keys()) == ["node", "cursor", "other"]
assert isinstance(edge_fields['node'], Field)
assert edge_fields['node'].type == MyObject
assert isinstance(edge_fields["node"], Field)
assert edge_fields["node"].type == MyObject
assert isinstance(edge_fields['other'], Field)
assert edge_fields['other'].type == String
assert isinstance(edge_fields["other"], Field)
assert edge_fields["other"].type == String
def test_edge_with_bases():
@ -94,7 +90,6 @@ def test_edge_with_bases():
extra = String()
class MyObjectConnection(Connection):
class Meta:
node = MyObject
@ -102,35 +97,39 @@ def test_edge_with_bases():
other = String()
Edge = MyObjectConnection.Edge
assert Edge._meta.name == 'MyObjectEdge'
assert Edge._meta.name == "MyObjectEdge"
edge_fields = Edge._meta.fields
assert list(edge_fields.keys()) == ['node', 'cursor', 'extra', 'other']
assert list(edge_fields.keys()) == ["node", "cursor", "extra", "other"]
assert isinstance(edge_fields['node'], Field)
assert edge_fields['node'].type == MyObject
assert isinstance(edge_fields["node"], Field)
assert edge_fields["node"].type == MyObject
assert isinstance(edge_fields['other'], Field)
assert edge_fields['other'].type == String
assert isinstance(edge_fields["other"], Field)
assert edge_fields["other"].type == String
def test_pageinfo():
assert PageInfo._meta.name == 'PageInfo'
assert PageInfo._meta.name == "PageInfo"
fields = PageInfo._meta.fields
assert list(fields.keys()) == ['has_next_page', 'has_previous_page', 'start_cursor', 'end_cursor']
assert list(fields.keys()) == [
"has_next_page",
"has_previous_page",
"start_cursor",
"end_cursor",
]
def test_connectionfield():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
field = ConnectionField(MyObjectConnection)
assert field.args == {
'before': Argument(String),
'after': Argument(String),
'first': Argument(Int),
'last': Argument(Int),
"before": Argument(String),
"after": Argument(String),
"first": Argument(Int),
"last": Argument(Int),
}
@ -139,28 +138,30 @@ def test_connectionfield_node_deprecated():
with pytest.raises(Exception) as exc_info:
field.type
assert "ConnectionField's now need a explicit ConnectionType for Nodes." in str(exc_info.value)
assert "ConnectionFields now need a explicit ConnectionType for Nodes." in str(
exc_info.value
)
def test_connectionfield_custom_args():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
field = ConnectionField(MyObjectConnection, before=String(required=True), extra=String())
field = ConnectionField(
MyObjectConnection, before=String(required=True), extra=String()
)
assert field.args == {
'before': Argument(NonNull(String)),
'after': Argument(String),
'first': Argument(Int),
'last': Argument(Int),
'extra': Argument(String),
"before": Argument(NonNull(String)),
"after": Argument(String),
"first": Argument(Int),
"last": Argument(Int),
"extra": Argument(String),
}
def test_connectionfield_required():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
@ -171,8 +172,6 @@ def test_connectionfield_required():
return []
schema = Schema(query=Query)
executed = schema.execute(
'{ testConnection { edges { cursor } } }'
)
executed = schema.execute("{ testConnection { edges { cursor } } }")
assert not executed.errors
assert executed.data == {'testConnection': {'edges': []}}
assert executed.data == {"testConnection": {"edges": []}}

View File

@ -7,19 +7,17 @@ from ...types import ObjectType, Schema, String
from ..connection import Connection, ConnectionField, PageInfo
from ..node import Node
letter_chars = ['A', 'B', 'C', 'D', 'E']
letter_chars = ["A", "B", "C", "D", "E"]
class Letter(ObjectType):
class Meta:
interfaces = (Node, )
interfaces = (Node,)
letter = String()
class LetterConnection(Connection):
class Meta:
node = Letter
@ -39,16 +37,10 @@ class Query(ObjectType):
def resolve_connection_letters(self, info, **args):
return LetterConnection(
page_info=PageInfo(
has_next_page=True,
has_previous_page=False
),
page_info=PageInfo(has_next_page=True, has_previous_page=False),
edges=[
LetterConnection.Edge(
node=Letter(id=0, letter='A'),
cursor='a-cursor'
),
]
LetterConnection.Edge(node=Letter(id=0, letter="A"), cursor="a-cursor")
],
)
@ -62,11 +54,8 @@ for i, letter in enumerate(letter_chars):
def edges(selected_letters):
return [
{
'node': {
'id': base64('Letter:%s' % l.id),
'letter': l.letter
},
'cursor': base64('arrayconnection:%s' % l.id)
"node": {"id": base64("Letter:%s" % l.id), "letter": l.letter},
"cursor": base64("arrayconnection:%s" % l.id),
}
for l in [letters[i] for i in selected_letters]
]
@ -74,14 +63,15 @@ def edges(selected_letters):
def cursor_for(ltr):
letter = letters[ltr]
return base64('arrayconnection:%s' % letter.id)
return base64("arrayconnection:%s" % letter.id)
def execute(args=''):
def execute(args=""):
if args:
args = '(' + args + ')'
args = "(" + args + ")"
return schema.execute('''
return schema.execute(
"""
{
letters%s {
edges {
@ -99,112 +89,136 @@ def execute(args=''):
}
}
}
''' % args)
"""
% args
)
def check(args, letters, has_previous_page=False, has_next_page=False):
result = execute(args)
expected_edges = edges(letters)
expected_page_info = {
'hasPreviousPage': has_previous_page,
'hasNextPage': has_next_page,
'endCursor': expected_edges[-1]['cursor'] if expected_edges else None,
'startCursor': expected_edges[0]['cursor'] if expected_edges else None
"hasPreviousPage": has_previous_page,
"hasNextPage": has_next_page,
"endCursor": expected_edges[-1]["cursor"] if expected_edges else None,
"startCursor": expected_edges[0]["cursor"] if expected_edges else None,
}
assert not result.errors
assert result.data == {
'letters': {
'edges': expected_edges,
'pageInfo': expected_page_info
}
"letters": {"edges": expected_edges, "pageInfo": expected_page_info}
}
def test_returns_all_elements_without_filters():
check('', 'ABCDE')
check("", "ABCDE")
def test_respects_a_smaller_first():
check('first: 2', 'AB', has_next_page=True)
check("first: 2", "AB", has_next_page=True)
def test_respects_an_overly_large_first():
check('first: 10', 'ABCDE')
check("first: 10", "ABCDE")
def test_respects_a_smaller_last():
check('last: 2', 'DE', has_previous_page=True)
check("last: 2", "DE", has_previous_page=True)
def test_respects_an_overly_large_last():
check('last: 10', 'ABCDE')
check("last: 10", "ABCDE")
def test_respects_first_and_after():
check('first: 2, after: "{}"'.format(cursor_for('B')), 'CD', has_next_page=True)
check('first: 2, after: "{}"'.format(cursor_for("B")), "CD", has_next_page=True)
def test_respects_first_and_after_with_long_first():
check('first: 10, after: "{}"'.format(cursor_for('B')), 'CDE')
check('first: 10, after: "{}"'.format(cursor_for("B")), "CDE")
def test_respects_last_and_before():
check('last: 2, before: "{}"'.format(cursor_for('D')), 'BC', has_previous_page=True)
check('last: 2, before: "{}"'.format(cursor_for("D")), "BC", has_previous_page=True)
def test_respects_last_and_before_with_long_last():
check('last: 10, before: "{}"'.format(cursor_for('D')), 'ABC')
check('last: 10, before: "{}"'.format(cursor_for("D")), "ABC")
def test_respects_first_and_after_and_before_too_few():
check('first: 2, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), 'BC', has_next_page=True)
check(
'first: 2, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"BC",
has_next_page=True,
)
def test_respects_first_and_after_and_before_too_many():
check('first: 4, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), 'BCD')
check(
'first: 4, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"BCD",
)
def test_respects_first_and_after_and_before_exactly_right():
check('first: 3, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), "BCD")
check(
'first: 3, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"BCD",
)
def test_respects_last_and_after_and_before_too_few():
check('last: 2, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), 'CD', has_previous_page=True)
check(
'last: 2, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"CD",
has_previous_page=True,
)
def test_respects_last_and_after_and_before_too_many():
check('last: 4, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), 'BCD')
check(
'last: 4, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"BCD",
)
def test_respects_last_and_after_and_before_exactly_right():
check('last: 3, after: "{}", before: "{}"'.format(cursor_for('A'), cursor_for('E')), 'BCD')
check(
'last: 3, after: "{}", before: "{}"'.format(cursor_for("A"), cursor_for("E")),
"BCD",
)
def test_returns_no_elements_if_first_is_0():
check('first: 0', '', has_next_page=True)
check("first: 0", "", has_next_page=True)
def test_returns_all_elements_if_cursors_are_invalid():
check('before: "invalid" after: "invalid"', 'ABCDE')
check('before: "invalid" after: "invalid"', "ABCDE")
def test_returns_all_elements_if_cursors_are_on_the_outside():
check(
'before: "{}" after: "{}"'.format(
base64(
'arrayconnection:%s' % 6),
base64(
'arrayconnection:%s' % -1)),
'ABCDE')
base64("arrayconnection:%s" % 6), base64("arrayconnection:%s" % -1)
),
"ABCDE",
)
def test_returns_no_elements_if_cursors_cross():
check('before: "{}" after: "{}"'.format(base64('arrayconnection:%s' % 2), base64('arrayconnection:%s' % 4)), '')
check(
'before: "{}" after: "{}"'.format(
base64("arrayconnection:%s" % 2), base64("arrayconnection:%s" % 4)
),
"",
)
def test_connection_type_nodes():
result = schema.execute('''
result = schema.execute(
"""
{
connectionLetters {
edges {
@ -220,28 +234,23 @@ def test_connection_type_nodes():
}
}
}
''')
"""
)
assert not result.errors
assert result.data == {
'connectionLetters': {
'edges': [{
'node': {
'id': 'TGV0dGVyOjA=',
'letter': 'A',
},
'cursor': 'a-cursor',
}],
'pageInfo': {
'hasPreviousPage': False,
'hasNextPage': True,
}
"connectionLetters": {
"edges": [
{"node": {"id": "TGV0dGVyOjA=", "letter": "A"}, "cursor": "a-cursor"}
],
"pageInfo": {"hasPreviousPage": False, "hasNextPage": True},
}
}
def test_connection_promise():
result = schema.execute('''
result = schema.execute(
"""
{
promiseLetters(first:1) {
edges {
@ -256,20 +265,13 @@ def test_connection_promise():
}
}
}
''')
"""
)
assert not result.errors
assert result.data == {
'promiseLetters': {
'edges': [{
'node': {
'id': 'TGV0dGVyOjA=',
'letter': 'A',
},
}],
'pageInfo': {
'hasPreviousPage': False,
'hasNextPage': True,
}
"promiseLetters": {
"edges": [{"node": {"id": "TGV0dGVyOjA=", "letter": "A"}}],
"pageInfo": {"hasPreviousPage": False, "hasNextPage": True},
}
}

View File

@ -6,20 +6,18 @@ from ..node import GlobalID, Node
class CustomNode(Node):
class Meta:
name = 'Node'
name = "Node"
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
class Info(object):
def __init__(self, parent_type):
self.parent_type = GrapheneObjectType(
graphene_type=parent_type,
@ -27,7 +25,7 @@ class Info(object):
description=parent_type._meta.description,
fields=None,
is_type_of=parent_type.is_type_of,
interfaces=None
interfaces=None,
)
@ -45,7 +43,7 @@ def test_global_id_allows_overriding_of_node_and_required():
def test_global_id_defaults_to_info_parent_type():
my_id = '1'
my_id = "1"
gid = GlobalID()
id_resolver = gid.get_resolver(lambda *_: my_id)
my_global_id = id_resolver(None, Info(User))
@ -53,7 +51,7 @@ def test_global_id_defaults_to_info_parent_type():
def test_global_id_allows_setting_customer_parent_type():
my_id = '1'
my_id = "1"
gid = GlobalID(parent_type=User)
id_resolver = gid.get_resolver(lambda *_: my_id)
my_global_id = id_resolver(None, None)

View File

@ -1,8 +1,16 @@
import pytest
from promise import Promise
from ...types import (ID, Argument, Field, InputField, InputObjectType,
NonNull, ObjectType, Schema)
from ...types import (
ID,
Argument,
Field,
InputField,
InputObjectType,
NonNull,
ObjectType,
Schema,
)
from ...types.scalars import String
from ..mutation import ClientIDMutation
@ -19,7 +27,6 @@ class MyNode(ObjectType):
class SaySomething(ClientIDMutation):
class Input:
what = String()
@ -31,14 +38,13 @@ class SaySomething(ClientIDMutation):
class FixedSaySomething(object):
__slots__ = 'phrase',
__slots__ = ("phrase",)
def __init__(self, phrase):
self.phrase = phrase
class SaySomethingFixed(ClientIDMutation):
class Input:
what = String()
@ -50,7 +56,6 @@ class SaySomethingFixed(ClientIDMutation):
class SaySomethingPromise(ClientIDMutation):
class Input:
what = String()
@ -68,7 +73,6 @@ class MyEdge(ObjectType):
class OtherMutation(ClientIDMutation):
class Input(SharedFields):
additional_field = String()
@ -76,11 +80,14 @@ class OtherMutation(ClientIDMutation):
my_node_edge = Field(MyEdge)
@staticmethod
def mutate_and_get_payload(self, info, shared='', additional_field='', client_mutation_id=None):
def mutate_and_get_payload(
self, info, shared="", additional_field="", client_mutation_id=None
):
edge_type = MyEdge
return OtherMutation(
name=shared + additional_field,
my_node_edge=edge_type(cursor='1', node=MyNode(name='name')))
my_node_edge=edge_type(cursor="1", node=MyNode(name="name")),
)
class RootQuery(ObjectType):
@ -103,64 +110,62 @@ def test_no_mutate_and_get_payload():
class MyMutation(ClientIDMutation):
pass
assert "MyMutation.mutate_and_get_payload method is required in a ClientIDMutation." == str(
excinfo.value)
assert (
"MyMutation.mutate_and_get_payload method is required in a ClientIDMutation."
== str(excinfo.value)
)
def test_mutation():
fields = SaySomething._meta.fields
assert list(fields.keys()) == ['phrase', 'client_mutation_id']
assert list(fields.keys()) == ["phrase", "client_mutation_id"]
assert SaySomething._meta.name == "SaySomethingPayload"
assert isinstance(fields['phrase'], Field)
assert isinstance(fields["phrase"], Field)
field = SaySomething.Field()
assert field.type == SaySomething
assert list(field.args.keys()) == ['input']
assert isinstance(field.args['input'], Argument)
assert isinstance(field.args['input'].type, NonNull)
assert field.args['input'].type.of_type == SaySomething.Input
assert isinstance(fields['client_mutation_id'], Field)
assert fields['client_mutation_id'].name == 'clientMutationId'
assert fields['client_mutation_id'].type == String
assert list(field.args.keys()) == ["input"]
assert isinstance(field.args["input"], Argument)
assert isinstance(field.args["input"].type, NonNull)
assert field.args["input"].type.of_type == SaySomething.Input
assert isinstance(fields["client_mutation_id"], Field)
assert fields["client_mutation_id"].name == "clientMutationId"
assert fields["client_mutation_id"].type == String
def test_mutation_input():
Input = SaySomething.Input
assert issubclass(Input, InputObjectType)
fields = Input._meta.fields
assert list(fields.keys()) == ['what', 'client_mutation_id']
assert isinstance(fields['what'], InputField)
assert fields['what'].type == String
assert isinstance(fields['client_mutation_id'], InputField)
assert fields['client_mutation_id'].type == String
assert list(fields.keys()) == ["what", "client_mutation_id"]
assert isinstance(fields["what"], InputField)
assert fields["what"].type == String
assert isinstance(fields["client_mutation_id"], InputField)
assert fields["client_mutation_id"].type == String
def test_subclassed_mutation():
fields = OtherMutation._meta.fields
assert list(fields.keys()) == [
'name', 'my_node_edge', 'client_mutation_id'
]
assert isinstance(fields['name'], Field)
assert list(fields.keys()) == ["name", "my_node_edge", "client_mutation_id"]
assert isinstance(fields["name"], Field)
field = OtherMutation.Field()
assert field.type == OtherMutation
assert list(field.args.keys()) == ['input']
assert isinstance(field.args['input'], Argument)
assert isinstance(field.args['input'].type, NonNull)
assert field.args['input'].type.of_type == OtherMutation.Input
assert list(field.args.keys()) == ["input"]
assert isinstance(field.args["input"], Argument)
assert isinstance(field.args["input"].type, NonNull)
assert field.args["input"].type.of_type == OtherMutation.Input
def test_subclassed_mutation_input():
Input = OtherMutation.Input
assert issubclass(Input, InputObjectType)
fields = Input._meta.fields
assert list(fields.keys()) == [
'shared', 'additional_field', 'client_mutation_id'
]
assert isinstance(fields['shared'], InputField)
assert fields['shared'].type == String
assert isinstance(fields['additional_field'], InputField)
assert fields['additional_field'].type == String
assert isinstance(fields['client_mutation_id'], InputField)
assert fields['client_mutation_id'].type == String
assert list(fields.keys()) == ["shared", "additional_field", "client_mutation_id"]
assert isinstance(fields["shared"], InputField)
assert fields["shared"].type == String
assert isinstance(fields["additional_field"], InputField)
assert fields["additional_field"].type == String
assert isinstance(fields["client_mutation_id"], InputField)
assert fields["client_mutation_id"].type == String
def test_node_query():
@ -168,14 +173,16 @@ def test_node_query():
'mutation a { say(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
)
assert not executed.errors
assert executed.data == {'say': {'phrase': 'hello'}}
assert executed.data == {"say": {"phrase": "hello"}}
def test_node_query_fixed():
executed = schema.execute(
'mutation a { sayFixed(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
)
assert "Cannot set client_mutation_id in the payload object" in str(executed.errors[0])
assert "Cannot set client_mutation_id in the payload object" in str(
executed.errors[0]
)
def test_node_query_promise():
@ -183,7 +190,7 @@ def test_node_query_promise():
'mutation a { sayPromise(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
)
assert not executed.errors
assert executed.data == {'sayPromise': {'phrase': 'hello'}}
assert executed.data == {"sayPromise": {"phrase": "hello"}}
def test_edge_query():
@ -192,13 +199,8 @@ def test_edge_query():
)
assert not executed.errors
assert dict(executed.data) == {
'other': {
'clientMutationId': '1',
'myNodeEdge': {
'cursor': '1',
'node': {
'name': 'name'
}
}
"other": {
"clientMutationId": "1",
"myNodeEdge": {"cursor": "1", "node": {"name": "name"}},
}
}

View File

@ -12,13 +12,13 @@ class SharedNodeFields(object):
something_else = String()
def resolve_something_else(*_):
return '----'
return "----"
class MyNode(ObjectType):
class Meta:
interfaces = (Node, )
interfaces = (Node,)
name = String()
@staticmethod
@ -30,10 +30,10 @@ class MyOtherNode(SharedNodeFields, ObjectType):
extra_field = String()
class Meta:
interfaces = (Node, )
interfaces = (Node,)
def resolve_extra_field(self, *_):
return 'extra field info.'
return "extra field info."
@staticmethod
def get_node(info, id):
@ -51,7 +51,7 @@ schema = Schema(query=RootQuery, types=[MyNode, MyOtherNode])
def test_node_good():
assert 'id' in MyNode._meta.fields
assert "id" in MyNode._meta.fields
assert is_node(MyNode)
assert not is_node(object)
@ -61,25 +61,33 @@ def test_node_query():
'{ node(id:"%s") { ... on MyNode { name } } }' % Node.to_global_id("MyNode", 1)
)
assert not executed.errors
assert executed.data == {'node': {'name': '1'}}
assert executed.data == {"node": {"name": "1"}}
def test_subclassed_node_query():
executed = schema.execute(
'{ node(id:"%s") { ... on MyOtherNode { shared, extraField, somethingElse } } }' %
to_global_id("MyOtherNode", 1))
'{ node(id:"%s") { ... on MyOtherNode { shared, extraField, somethingElse } } }'
% to_global_id("MyOtherNode", 1)
)
assert not executed.errors
assert executed.data == OrderedDict({'node': OrderedDict(
[('shared', '1'), ('extraField', 'extra field info.'), ('somethingElse', '----')])})
assert executed.data == OrderedDict(
{
"node": OrderedDict(
[
("shared", "1"),
("extraField", "extra field info."),
("somethingElse", "----"),
]
)
}
)
def test_node_requesting_non_node():
executed = schema.execute(
'{ node(id:"%s") { __typename } } ' % Node.to_global_id("RootQuery", 1)
)
assert executed.data == {
'node': None
}
assert executed.data == {"node": None}
def test_node_query_incorrect_id():
@ -87,7 +95,7 @@ def test_node_query_incorrect_id():
'{ node(id:"%s") { ... on MyNode { name } } }' % "something:2"
)
assert not executed.errors
assert executed.data == {'node': None}
assert executed.data == {"node": None}
def test_node_field():
@ -107,37 +115,42 @@ def test_node_field_only_type():
'{ onlyNode(id:"%s") { __typename, name } } ' % Node.to_global_id("MyNode", 1)
)
assert not executed.errors
assert executed.data == {'onlyNode': {'__typename': 'MyNode', 'name': '1'}}
assert executed.data == {"onlyNode": {"__typename": "MyNode", "name": "1"}}
def test_node_field_only_type_wrong():
executed = schema.execute(
'{ onlyNode(id:"%s") { __typename, name } } ' % Node.to_global_id("MyOtherNode", 1)
'{ onlyNode(id:"%s") { __typename, name } } '
% Node.to_global_id("MyOtherNode", 1)
)
assert len(executed.errors) == 1
assert str(executed.errors[0]) == 'Must receive a MyNode id.'
assert executed.data == {'onlyNode': None}
assert str(executed.errors[0]) == "Must receive a MyNode id."
assert executed.data == {"onlyNode": None}
def test_node_field_only_lazy_type():
executed = schema.execute(
'{ onlyNodeLazy(id:"%s") { __typename, name } } ' % Node.to_global_id("MyNode", 1)
'{ onlyNodeLazy(id:"%s") { __typename, name } } '
% Node.to_global_id("MyNode", 1)
)
assert not executed.errors
assert executed.data == {'onlyNodeLazy': {'__typename': 'MyNode', 'name': '1'}}
assert executed.data == {"onlyNodeLazy": {"__typename": "MyNode", "name": "1"}}
def test_node_field_only_lazy_type_wrong():
executed = schema.execute(
'{ onlyNodeLazy(id:"%s") { __typename, name } } ' % Node.to_global_id("MyOtherNode", 1)
'{ onlyNodeLazy(id:"%s") { __typename, name } } '
% Node.to_global_id("MyOtherNode", 1)
)
assert len(executed.errors) == 1
assert str(executed.errors[0]) == 'Must receive a MyNode id.'
assert executed.data == {'onlyNodeLazy': None}
assert str(executed.errors[0]) == "Must receive a MyNode id."
assert executed.data == {"onlyNodeLazy": None}
def test_str_schema():
assert str(schema) == """
assert (
str(schema)
== """
schema {
query: RootQuery
}
@ -165,3 +178,4 @@ type RootQuery {
onlyNodeLazy(id: ID!): MyNode
}
""".lstrip()
)

View File

@ -6,9 +6,8 @@ from ..node import Node
class CustomNode(Node):
class Meta:
name = 'Node'
name = "Node"
@staticmethod
def to_global_id(type, id):
@ -28,27 +27,20 @@ class BasePhoto(Interface):
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
class Photo(ObjectType):
class Meta:
interfaces = [CustomNode, BasePhoto]
user_data = {
'1': User(id='1', name='John Doe'),
'2': User(id='2', name='Jane Smith'),
}
user_data = {"1": User(id="1", name="John Doe"), "2": User(id="2", name="Jane Smith")}
photo_data = {
'3': Photo(id='3', width=300),
'4': Photo(id='4', width=400),
}
photo_data = {"3": Photo(id="3", width=300), "4": Photo(id="4", width=400)}
class RootQuery(ObjectType):
@ -59,7 +51,9 @@ schema = Schema(query=RootQuery, types=[User, Photo])
def test_str_schema_correct():
assert str(schema) == '''schema {
assert (
str(schema)
== """schema {
query: RootQuery
}
@ -84,47 +78,40 @@ type User implements Node {
id: ID!
name: String
}
'''
"""
)
def test_gets_the_correct_id_for_users():
query = '''
query = """
{
node(id: "1") {
id
}
}
'''
expected = {
'node': {
'id': '1',
}
}
"""
expected = {"node": {"id": "1"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_gets_the_correct_id_for_photos():
query = '''
query = """
{
node(id: "4") {
id
}
}
'''
expected = {
'node': {
'id': '4',
}
}
"""
expected = {"node": {"id": "4"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_gets_the_correct_name_for_users():
query = '''
query = """
{
node(id: "1") {
id
@ -133,20 +120,15 @@ def test_gets_the_correct_name_for_users():
}
}
}
'''
expected = {
'node': {
'id': '1',
'name': 'John Doe'
}
}
"""
expected = {"node": {"id": "1", "name": "John Doe"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_gets_the_correct_width_for_photos():
query = '''
query = """
{
node(id: "4") {
id
@ -155,60 +137,45 @@ def test_gets_the_correct_width_for_photos():
}
}
}
'''
expected = {
'node': {
'id': '4',
'width': 400
}
}
"""
expected = {"node": {"id": "4", "width": 400}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_gets_the_correct_typename_for_users():
query = '''
query = """
{
node(id: "1") {
id
__typename
}
}
'''
expected = {
'node': {
'id': '1',
'__typename': 'User'
}
}
"""
expected = {"node": {"id": "1", "__typename": "User"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_gets_the_correct_typename_for_photos():
query = '''
query = """
{
node(id: "4") {
id
__typename
}
}
'''
expected = {
'node': {
'id': '4',
'__typename': 'Photo'
}
}
"""
expected = {"node": {"id": "4", "__typename": "Photo"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_ignores_photo_fragments_on_user():
query = '''
query = """
{
node(id: "1") {
id
@ -217,35 +184,29 @@ def test_ignores_photo_fragments_on_user():
}
}
}
'''
expected = {
'node': {
'id': '1',
}
}
"""
expected = {"node": {"id": "1"}}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_returns_null_for_bad_ids():
query = '''
query = """
{
node(id: "5") {
id
}
}
'''
expected = {
'node': None
}
"""
expected = {"node": None}
result = graphql(schema, query)
assert not result.errors
assert result.data == expected
def test_have_correct_node_interface():
query = '''
query = """
{
__type(name: "Node") {
name
@ -262,23 +223,20 @@ def test_have_correct_node_interface():
}
}
}
'''
"""
expected = {
'__type': {
'name': 'Node',
'kind': 'INTERFACE',
'fields': [
"__type": {
"name": "Node",
"kind": "INTERFACE",
"fields": [
{
'name': 'id',
'type': {
'kind': 'NON_NULL',
'ofType': {
'name': 'ID',
'kind': 'SCALAR'
}
}
"name": "id",
"type": {
"kind": "NON_NULL",
"ofType": {"name": "ID", "kind": "SCALAR"},
},
}
]
],
}
}
result = graphql(schema, query)
@ -287,7 +245,7 @@ def test_have_correct_node_interface():
def test_has_correct_node_root_field():
query = '''
query = """
{
__schema {
queryType {
@ -311,29 +269,23 @@ def test_has_correct_node_root_field():
}
}
}
'''
"""
expected = {
'__schema': {
'queryType': {
'fields': [
"__schema": {
"queryType": {
"fields": [
{
'name': 'node',
'type': {
'name': 'Node',
'kind': 'INTERFACE'
},
'args': [
"name": "node",
"type": {"name": "Node", "kind": "INTERFACE"},
"args": [
{
'name': 'id',
'type': {
'kind': 'NON_NULL',
'ofType': {
'name': 'ID',
'kind': 'SCALAR'
}
}
"name": "id",
"type": {
"kind": "NON_NULL",
"ofType": {"name": "ID", "kind": "SCALAR"},
},
}
]
],
}
]
}

View File

@ -10,7 +10,7 @@ def default_format_error(error):
if isinstance(error, GraphQLError):
return format_graphql_error(error)
return {'message': six.text_type(error)}
return {"message": six.text_type(error)}
def format_execution_result(execution_result, format_error):
@ -18,18 +18,15 @@ def format_execution_result(execution_result, format_error):
response = {}
if execution_result.errors:
response['errors'] = [
format_error(e) for e in execution_result.errors
]
response["errors"] = [format_error(e) for e in execution_result.errors]
if not execution_result.invalid:
response['data'] = execution_result.data
response["data"] = execution_result.data
return response
class Client(object):
def __init__(self, schema, format_error=None, **execute_options):
assert isinstance(schema, Schema)
self.schema = schema
@ -40,8 +37,7 @@ class Client(object):
return format_execution_result(result, self.format_error)
def execute(self, *args, **kwargs):
executed = self.schema.execute(*args,
**dict(self.execute_options, **kwargs))
executed = self.schema.execute(*args, **dict(self.execute_options, **kwargs))
if is_thenable(executed):
return Promise.resolve(executed).then(self.format_result)
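A minimal usage sketch of this test Client; the schema and query are illustrative:

from graphene import ObjectType, Schema, String
from graphene.test import Client

class Query(ObjectType):
    hello = String()

    def resolve_hello(self, info):
        return "world"

client = Client(Schema(query=Query))
# format_execution_result shapes the response as a plain dict
assert client.execute("{ hello }") == {"data": {"hello": "world"}}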

View File

@ -16,20 +16,18 @@ class Error(graphene.ObjectType):
class CreatePostResult(graphene.Union):
class Meta:
types = [Success, Error]
class CreatePost(graphene.Mutation):
class Input:
text = graphene.String(required=True)
result = graphene.Field(CreatePostResult)
def mutate(self, info, text):
result = Success(yeah='yeah')
result = Success(yeah="yeah")
return CreatePost(result=result)
@ -37,11 +35,12 @@ class CreatePost(graphene.Mutation):
class Mutations(graphene.ObjectType):
create_post = CreatePost.Field()
# tests.py
def test_create_post():
query_string = '''
query_string = """
mutation {
createPost(text: "Try this out") {
result {
@ -49,10 +48,10 @@ def test_create_post():
}
}
}
'''
"""
schema = graphene.Schema(query=Query, mutation=Mutations)
result = schema.execute(query_string)
assert not result.errors
assert result.data['createPost']['result']['__typename'] == 'Success'
assert result.data["createPost"]["result"]["__typename"] == "Success"

View File

@ -15,7 +15,6 @@ class SomeTypeTwo(graphene.ObjectType):
class MyUnion(graphene.Union):
class Meta:
types = (SomeTypeOne, SomeTypeTwo)
@ -28,6 +27,6 @@ def test_issue():
graphene.Schema(query=Query)
assert str(exc_info.value) == (
'IterableConnectionField type have to be a subclass of Connection. '
"IterableConnectionField type have to be a subclass of Connection. "
'Received "MyUnion".'
)

View File

@ -12,35 +12,35 @@ class SpecialOptions(ObjectTypeOptions):
class SpecialObjectType(ObjectType):
@classmethod
def __init_subclass_with_meta__(cls, other_attr='default', **options):
def __init_subclass_with_meta__(cls, other_attr="default", **options):
_meta = SpecialOptions(cls)
_meta.other_attr = other_attr
super(SpecialObjectType, cls).__init_subclass_with_meta__(_meta=_meta, **options)
super(SpecialObjectType, cls).__init_subclass_with_meta__(
_meta=_meta, **options
)
def test_special_objecttype_could_be_subclassed():
class MyType(SpecialObjectType):
class Meta:
other_attr = 'yeah!'
other_attr = "yeah!"
assert MyType._meta.other_attr == 'yeah!'
assert MyType._meta.other_attr == "yeah!"
def test_special_objecttype_could_be_subclassed_default():
class MyType(SpecialObjectType):
pass
assert MyType._meta.other_attr == 'default'
assert MyType._meta.other_attr == "default"
def test_special_objecttype_inherit_meta_options():
class MyType(SpecialObjectType):
pass
assert MyType._meta.name == 'MyType'
assert MyType._meta.name == "MyType"
assert MyType._meta.default_resolver is None
assert MyType._meta.interfaces == ()
@ -51,35 +51,35 @@ class SpecialInputObjectTypeOptions(ObjectTypeOptions):
class SpecialInputObjectType(InputObjectType):
@classmethod
def __init_subclass_with_meta__(cls, other_attr='default', **options):
def __init_subclass_with_meta__(cls, other_attr="default", **options):
_meta = SpecialInputObjectTypeOptions(cls)
_meta.other_attr = other_attr
super(SpecialInputObjectType, cls).__init_subclass_with_meta__(_meta=_meta, **options)
super(SpecialInputObjectType, cls).__init_subclass_with_meta__(
_meta=_meta, **options
)
def test_special_inputobjecttype_could_be_subclassed():
class MyInputObjectType(SpecialInputObjectType):
class Meta:
other_attr = 'yeah!'
other_attr = "yeah!"
assert MyInputObjectType._meta.other_attr == 'yeah!'
assert MyInputObjectType._meta.other_attr == "yeah!"
def test_special_inputobjecttype_could_be_subclassed_default():
class MyInputObjectType(SpecialInputObjectType):
pass
assert MyInputObjectType._meta.other_attr == 'default'
assert MyInputObjectType._meta.other_attr == "default"
def test_special_inputobjecttype_inherit_meta_options():
class MyInputObjectType(SpecialInputObjectType):
pass
assert MyInputObjectType._meta.name == 'MyInputObjectType'
assert MyInputObjectType._meta.name == "MyInputObjectType"
# Enum
@ -88,9 +88,8 @@ class SpecialEnumOptions(EnumOptions):
class SpecialEnum(Enum):
@classmethod
def __init_subclass_with_meta__(cls, other_attr='default', **options):
def __init_subclass_with_meta__(cls, other_attr="default", **options):
_meta = SpecialEnumOptions(cls)
_meta.other_attr = other_attr
super(SpecialEnum, cls).__init_subclass_with_meta__(_meta=_meta, **options)
@ -98,22 +97,21 @@ class SpecialEnum(Enum):
def test_special_enum_could_be_subclassed():
class MyEnum(SpecialEnum):
class Meta:
other_attr = 'yeah!'
other_attr = "yeah!"
assert MyEnum._meta.other_attr == 'yeah!'
assert MyEnum._meta.other_attr == "yeah!"
def test_special_enum_could_be_subclassed_default():
class MyEnum(SpecialEnum):
pass
assert MyEnum._meta.other_attr == 'default'
assert MyEnum._meta.other_attr == "default"
def test_special_enum_inherit_meta_options():
class MyEnum(SpecialEnum):
pass
assert MyEnum._meta.name == 'MyEnum'
assert MyEnum._meta.name == "MyEnum"

View File

@ -11,14 +11,14 @@ class Query(graphene.ObjectType):
def test_issue():
query_string = '''
query_string = """
query myQuery {
someField(from: "Oh")
}
'''
"""
schema = graphene.Schema(query=Query)
result = schema.execute(query_string)
assert not result.errors
assert result.data['someField'] == 'Oh'
assert result.data["someField"] == "Oh"

View File

@ -7,19 +7,19 @@ import graphene
class MyInputClass(graphene.InputObjectType):
@classmethod
def __init_subclass_with_meta__(
cls, container=None, _meta=None, fields=None, **options):
cls, container=None, _meta=None, fields=None, **options
):
if _meta is None:
_meta = graphene.types.inputobjecttype.InputObjectTypeOptions(cls)
_meta.fields = fields
super(MyInputClass, cls).__init_subclass_with_meta__(
container=container, _meta=_meta, **options)
container=container, _meta=_meta, **options
)
class MyInput(MyInputClass):
class Meta:
fields = dict(x=graphene.Field(graphene.Int))
@ -28,15 +28,15 @@ class Query(graphene.ObjectType):
myField = graphene.Field(graphene.String, input=graphene.Argument(MyInput))
def resolve_myField(parent, info, input):
return 'ok'
return "ok"
def test_issue():
query_string = '''
query_string = """
query myQuery {
myField(input: {x: 1})
}
'''
"""
schema = graphene.Schema(query=Query)
result = schema.execute(query_string)

View File

@ -6,6 +6,7 @@ from .interface import Interface
from .mutation import Mutation
from .scalars import Scalar, String, ID, Int, Float, Boolean
from .datetime import Date, DateTime, Time
from .decimal import Decimal
from .json import JSONString
from .uuid import UUID
from .schema import Schema
@ -24,33 +25,33 @@ from .abstracttype import AbstractType
__all__ = [
'ObjectType',
'InputObjectType',
'Interface',
'Mutation',
'Enum',
'Field',
'InputField',
'Schema',
'Scalar',
'String',
'ID',
'Int',
'Float',
'Date',
'DateTime',
'Time',
'JSONString',
'UUID',
'Boolean',
'List',
'NonNull',
'Argument',
'Dynamic',
'Union',
'Context',
'ResolveInfo',
"ObjectType",
"InputObjectType",
"Interface",
"Mutation",
"Enum",
"Field",
"InputField",
"Schema",
"Scalar",
"String",
"ID",
"Int",
"Float",
"Date",
"DateTime",
"Time",
"Decimal",
"JSONString",
"UUID",
"Boolean",
"List",
"NonNull",
"Argument",
"Dynamic",
"Union",
"Context",
"ResolveInfo",
# Deprecated
'AbstractType',
"AbstractType",
]

View File

@ -3,7 +3,6 @@ from ..utils.subclass_with_meta import SubclassWithMeta
class AbstractType(SubclassWithMeta):
def __init_subclass__(cls, *args, **kwargs):
warn_deprecation(
"Abstract type is deprecated, please use normal object inheritance instead.\n"

View File

@ -8,8 +8,15 @@ from .utils import get_type
class Argument(MountedType):
def __init__(self, type, default_value=None, description=None, name=None, required=False, _creation_counter=None):
def __init__(
self,
type,
default_value=None,
description=None,
name=None,
required=False,
_creation_counter=None,
):
super(Argument, self).__init__(_creation_counter=_creation_counter)
if required:
@ -26,10 +33,10 @@ class Argument(MountedType):
def __eq__(self, other):
return isinstance(other, Argument) and (
self.name == other.name and
self.type == other.type and
self.default_value == other.default_value and
self.description == other.description
self.name == other.name
and self.type == other.type
and self.default_value == other.default_value
and self.description == other.description
)
@ -37,6 +44,7 @@ def to_arguments(args, extra_args=None):
from .unmountedtype import UnmountedType
from .field import Field
from .inputfield import InputField
if extra_args:
extra_args = sorted(extra_args.items(), key=lambda f: f[1])
else:
@ -55,17 +63,21 @@ def to_arguments(args, extra_args=None):
arg = Argument.mounted(arg)
if isinstance(arg, (InputField, Field)):
raise ValueError('Expected {} to be Argument, but received {}. Try using Argument({}).'.format(
default_name,
type(arg).__name__,
arg.type
))
raise ValueError(
"Expected {} to be Argument, but received {}. Try using Argument({}).".format(
default_name, type(arg).__name__, arg.type
)
)
if not isinstance(arg, Argument):
raise ValueError('Unknown argument "{}".'.format(default_name))
arg_name = default_name or arg.name
assert arg_name not in arguments, 'More than one Argument have same name "{}".'.format(arg_name)
assert (
arg_name not in arguments
), 'More than one Argument have same name "{}".'.format(
arg_name
)
arguments[arg_name] = arg
return arguments
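A small sketch of how arguments are normally declared so that to_arguments accepts them; the Query type and its names are illustrative:

from graphene import Argument, Field, ObjectType, String

class Query(ObjectType):
    # unmounted types such as String() are mounted as Argument(String) by to_arguments;
    # passing a Field or InputField here triggers the ValueError above
    greeting = Field(
        String,
        who=String(required=True),
        lang=Argument(String, default_value="en"),
    )

    def resolve_greeting(self, info, who, lang="en"):
        return "hello {} ({})".format(who, lang)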

View File

@ -25,10 +25,9 @@ class BaseOptions(object):
class BaseType(SubclassWithMeta):
@classmethod
def create_type(cls, class_name, **options):
return type(class_name, (cls, ), {'Meta': options})
return type(class_name, (cls,), {"Meta": options})
@classmethod
def __init_subclass_with_meta__(cls, name=None, description=None, _meta=None):

View File

@ -9,19 +9,19 @@ from .scalars import Scalar
class Date(Scalar):
'''
"""
The `Date` scalar type represents a Date
value as specified by
[iso8601](https://en.wikipedia.org/wiki/ISO_8601).
'''
"""
@staticmethod
def serialize(date):
if isinstance(date, datetime.datetime):
date = date.date()
assert isinstance(date, datetime.date), (
'Received not compatible date "{}"'.format(repr(date))
)
assert isinstance(
date, datetime.date
), 'Received not compatible date "{}"'.format(repr(date))
return date.isoformat()
@classmethod
@ -38,17 +38,17 @@ class Date(Scalar):
class DateTime(Scalar):
'''
"""
The `DateTime` scalar type represents a DateTime
value as specified by
[iso8601](https://en.wikipedia.org/wiki/ISO_8601).
'''
"""
@staticmethod
def serialize(dt):
assert isinstance(dt, (datetime.datetime, datetime.date)), (
'Received not compatible datetime "{}"'.format(repr(dt))
)
assert isinstance(
dt, (datetime.datetime, datetime.date)
), 'Received not compatible datetime "{}"'.format(repr(dt))
return dt.isoformat()
@classmethod
@ -65,17 +65,17 @@ class DateTime(Scalar):
class Time(Scalar):
'''
"""
The `Time` scalar type represents a Time value as
specified by
[iso8601](https://en.wikipedia.org/wiki/ISO_8601).
'''
"""
@staticmethod
def serialize(time):
assert isinstance(time, datetime.time), (
'Received not compatible time "{}"'.format(repr(time))
)
assert isinstance(
time, datetime.time
), 'Received not compatible time "{}"'.format(repr(time))
return time.isoformat()
@classmethod

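For context (a usage sketch, not in the diff): the Date/DateTime/Time scalars reformatted above serialize via isoformat(), so resolvers can return standard library datetime objects directly. Assuming graphene 2.x:

import datetime

import graphene

class Query(graphene.ObjectType):
    now = graphene.DateTime()

    def resolve_now(self, info):
        return datetime.datetime(2019, 3, 7, 21, 25, 48)

schema = graphene.Schema(query=Query)
result = schema.execute("{ now }")
assert result.data == {"now": "2019-03-07T21:25:48"}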
34
graphene/types/decimal.py Normal file
View File

@ -0,0 +1,34 @@
from __future__ import absolute_import
from decimal import Decimal as _Decimal
from graphql.language import ast
from .scalars import Scalar
class Decimal(Scalar):
"""
The `Decimal` scalar type represents a python Decimal.
"""
@staticmethod
def serialize(dec):
if isinstance(dec, str):
dec = _Decimal(dec)
assert isinstance(dec, _Decimal), 'Received not compatible Decimal "{}"'.format(
repr(dec)
)
return str(dec)
@classmethod
def parse_literal(cls, node):
if isinstance(node, ast.StringValue):
return cls.parse_value(node.value)
@staticmethod
def parse_value(value):
try:
return _Decimal(value)
except ValueError:
return None

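For context (a sketch, not in the diff): the new Decimal scalar serializes to a string and parses back to decimal.Decimal; strings passed to serialize are coerced before the type check. Assuming the module lands at graphene/types/decimal.py as shown above:

from decimal import Decimal as PyDecimal

from graphene.types.decimal import Decimal

assert Decimal.serialize(PyDecimal("1969.1974")) == "1969.1974"
assert Decimal.serialize("1969.1974") == "1969.1974"  # str input is converted first
assert Decimal.parse_value("1969.1974") == PyDecimal("1969.1974")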
View File

@ -1,16 +1,21 @@
from graphql import (GraphQLEnumType, GraphQLInputObjectType,
GraphQLInterfaceType, GraphQLObjectType,
GraphQLScalarType, GraphQLUnionType)
from graphql import (
GraphQLEnumType,
GraphQLInputObjectType,
GraphQLInterfaceType,
GraphQLObjectType,
GraphQLScalarType,
GraphQLUnionType,
)
class GrapheneGraphQLType(object):
'''
"""
A class for extending the base GraphQLType with the related
graphene_type
'''
"""
def __init__(self, *args, **kwargs):
self.graphene_type = kwargs.pop('graphene_type')
self.graphene_type = kwargs.pop("graphene_type")
super(GrapheneGraphQLType, self).__init__(*args, **kwargs)

View File

@ -5,10 +5,10 @@ from .mountedtype import MountedType
class Dynamic(MountedType):
'''
"""
A Dynamic Type let us get the type in runtime when we generate
the schema. So we can have lazy fields.
'''
"""
def __init__(self, type, with_schema=False, _creation_counter=None):
super(Dynamic, self).__init__(_creation_counter=_creation_counter)

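Aside (illustrative, not part of the diff): Dynamic defers the wrapped type until the schema is built, which is the usual way to express lazy or self-referential fields. A sketch with made-up names, assuming graphene 2.x:

from graphene import Dynamic, Field, ObjectType, String

class Person(ObjectType):
    name = String()
    # the lambda is only evaluated at schema-build time, so Person can refer to itself
    best_friend = Dynamic(lambda: Field(Person))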
View File

@ -24,14 +24,15 @@ class EnumOptions(BaseOptions):
class EnumMeta(SubclassWithMeta_Meta):
def __new__(cls, name, bases, classdict, **options):
enum_members = OrderedDict(classdict, __eq__=eq_enum)
# We remove the Meta attribute from the class to not collide
# with the enum values.
enum_members.pop('Meta', None)
enum_members.pop("Meta", None)
enum = PyEnum(cls.__name__, enum_members)
return SubclassWithMeta_Meta.__new__(cls, name, bases, OrderedDict(classdict, __enum__=enum), **options)
return SubclassWithMeta_Meta.__new__(
cls, name, bases, OrderedDict(classdict, __enum__=enum), **options
)
def get(cls, value):
return cls._meta.enum(value)
@ -44,7 +45,7 @@ class EnumMeta(SubclassWithMeta_Meta):
def __call__(cls, *args, **kwargs): # noqa: N805
if cls is Enum:
description = kwargs.pop('description', None)
description = kwargs.pop("description", None)
return cls.from_enum(PyEnum(*args, **kwargs), description=description)
return super(EnumMeta, cls).__call__(*args, **kwargs)
# return cls._meta.enum(*args, **kwargs)
@ -52,22 +53,21 @@ class EnumMeta(SubclassWithMeta_Meta):
def from_enum(cls, enum, description=None, deprecation_reason=None): # noqa: N805
description = description or enum.__doc__
meta_dict = {
'enum': enum,
'description': description,
'deprecation_reason': deprecation_reason
"enum": enum,
"description": description,
"deprecation_reason": deprecation_reason,
}
meta_class = type('Meta', (object,), meta_dict)
return type(meta_class.enum.__name__, (Enum,), {'Meta': meta_class})
meta_class = type("Meta", (object,), meta_dict)
return type(meta_class.enum.__name__, (Enum,), {"Meta": meta_class})
class Enum(six.with_metaclass(EnumMeta, UnmountedType, BaseType)):
@classmethod
def __init_subclass_with_meta__(cls, enum=None, _meta=None, **options):
if not _meta:
_meta = EnumOptions(cls)
_meta.enum = enum or cls.__enum__
_meta.deprecation_reason = options.pop('deprecation_reason', None)
_meta.deprecation_reason = options.pop("deprecation_reason", None)
for key, value in _meta.enum.__members__.items():
setattr(cls, key, value)
@ -75,8 +75,8 @@ class Enum(six.with_metaclass(EnumMeta, UnmountedType, BaseType)):
@classmethod
def get_type(cls):
'''
"""
This function is called when the unmounted type (Enum instance)
is mounted (as a Field, InputField or Argument)
'''
"""
return cls

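For reference (not part of the diff): the EnumMeta paths touched above cover both the class-based declaration and the instance-style constructor that wraps a Python enum via Enum.from_enum. A sketch, assuming graphene 2.x:

import graphene

class Episode(graphene.Enum):
    NEWHOPE = 4
    EMPIRE = 5
    JEDI = 6

# instance-style: builds a stdlib enum first, then wraps it through from_enum
Color = graphene.Enum("Color", [("RED", 1), ("GREEN", 2), ("BLUE", 3)])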
View File

@ -19,18 +19,27 @@ def source_resolver(source, root, info, **args):
class Field(MountedType):
def __init__(self, type, args=None, resolver=None, source=None,
deprecation_reason=None, name=None, description=None,
required=False, _creation_counter=None, default_value=None,
**extra_args):
def __init__(
self,
type,
args=None,
resolver=None,
source=None,
deprecation_reason=None,
name=None,
description=None,
required=False,
_creation_counter=None,
default_value=None,
**extra_args
):
super(Field, self).__init__(_creation_counter=_creation_counter)
assert not args or isinstance(args, Mapping), (
'Arguments in a field have to be a mapping, received "{}".'
).format(args)
assert not (source and resolver), (
'A Field cannot have a source and a resolver in at the same time.'
)
assert not (
source and resolver
), "A Field cannot have a source and a resolver in at the same time."
assert not callable(default_value), (
'The default value can not be a function but received "{}".'
).format(base_type(default_value))
@ -40,12 +49,12 @@ class Field(MountedType):
# Check if name is actually an argument of the field
if isinstance(name, (Argument, UnmountedType)):
extra_args['name'] = name
extra_args["name"] = name
name = None
# Check if source is actually an argument of the field
if isinstance(source, (Argument, UnmountedType)):
extra_args['source'] = source
extra_args["source"] = source
source = None
self.name = name

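Aside (a sketch, not in the diff): the assertions reformatted above enforce that a Field takes either a source or a resolver, never both, and that default_value is not callable. Assuming graphene 2.x:

from graphene import Field, String

name = Field(String, source="name", description="copies the attribute from the parent instance")
# Field(String, source="name", resolver=lambda root, info: root.name) would raise:
# "A Field cannot have a source and a resolver in at the same time."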
View File

@ -1,7 +1,13 @@
from __future__ import unicode_literals
from graphql.language.ast import (BooleanValue, FloatValue, IntValue,
ListValue, ObjectValue, StringValue)
from graphql.language.ast import (
BooleanValue,
FloatValue,
IntValue,
ListValue,
ObjectValue,
StringValue,
)
from graphene.types.scalars import MAX_INT, MIN_INT
@ -35,6 +41,9 @@ class GenericScalar(Scalar):
elif isinstance(ast, ListValue):
return [GenericScalar.parse_literal(value) for value in ast.values]
elif isinstance(ast, ObjectValue):
return {field.name.value: GenericScalar.parse_literal(field.value) for field in ast.fields}
return {
field.name.value: GenericScalar.parse_literal(field.value)
for field in ast.fields
}
else:
return None

View File

@ -4,10 +4,17 @@ from .utils import get_type
class InputField(MountedType):
def __init__(self, type, name=None, default_value=None,
deprecation_reason=None, description=None,
required=False, _creation_counter=None, **extra_args):
def __init__(
self,
type,
name=None,
default_value=None,
deprecation_reason=None,
description=None,
required=False,
_creation_counter=None,
**extra_args
):
super(InputField, self).__init__(_creation_counter=_creation_counter)
self.name = name
if required:

View File

@ -30,14 +30,14 @@ class InputObjectTypeContainer(dict, BaseType):
class InputObjectType(UnmountedType, BaseType):
'''
"""
Input Object Type Definition
An input object defines a structured collection of fields which may be
supplied to a field argument.
Using `NonNull` will ensure that a value must be provided by the query
'''
"""
@classmethod
def __init_subclass_with_meta__(cls, container=None, _meta=None, **options):
@ -46,9 +46,7 @@ class InputObjectType(UnmountedType, BaseType):
fields = OrderedDict()
for base in reversed(cls.__mro__):
fields.update(
yank_fields_from_attrs(base.__dict__, _as=InputField)
)
fields.update(yank_fields_from_attrs(base.__dict__, _as=InputField))
if _meta.fields:
_meta.fields.update(fields)
@ -57,13 +55,12 @@ class InputObjectType(UnmountedType, BaseType):
if container is None:
container = type(cls.__name__, (InputObjectTypeContainer, cls), {})
_meta.container = container
super(InputObjectType, cls).__init_subclass_with_meta__(
_meta=_meta, **options)
super(InputObjectType, cls).__init_subclass_with_meta__(_meta=_meta, **options)
@classmethod
def get_type(cls):
'''
"""
This function is called when the unmounted type (InputObjectType instance)
is mounted (as a Field, InputField or Argument)
'''
"""
return cls

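For context (illustrative names, not part of the diff): the container class created above is what lets resolvers read input values by attribute. A usage sketch, assuming graphene 2.x:

import graphene

class PersonInput(graphene.InputObjectType):
    first_name = graphene.String(required=True)
    last_name = graphene.String(required=True)

class Query(graphene.ObjectType):
    full_name = graphene.String(person=PersonInput())

    def resolve_full_name(self, info, person):
        # person arrives as an InputObjectTypeContainer, so attribute access works
        return "{} {}".format(person.first_name, person.last_name)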
View File

@ -15,14 +15,15 @@ class InterfaceOptions(BaseOptions):
class Interface(BaseType):
'''
"""
Interface Type Definition
When a field can return one of a heterogeneous set of types, a Interface type
is used to describe what types are possible, what fields are in common across
all types, as well as a function to determine which type is actually used
when the field is resolved.
'''
"""
@classmethod
def __init_subclass_with_meta__(cls, _meta=None, **options):
if not _meta:
@ -30,9 +31,7 @@ class Interface(BaseType):
fields = OrderedDict()
for base in reversed(cls.__mro__):
fields.update(
yank_fields_from_attrs(base.__dict__, _as=Field)
)
fields.update(yank_fields_from_attrs(base.__dict__, _as=Field))
if _meta.fields:
_meta.fields.update(fields)
@ -44,6 +43,7 @@ class Interface(BaseType):
@classmethod
def resolve_type(cls, instance, info):
from .objecttype import ObjectType
if isinstance(instance, ObjectType):
return type(instance)

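For context (illustrative names, not part of the diff): the default resolve_type above returns type(instance) for ObjectType instances, so most interfaces only need to be listed in Meta.interfaces of their concrete types. A sketch, assuming graphene 2.x:

import graphene

class Character(graphene.Interface):
    id = graphene.ID(required=True)
    name = graphene.String()

class Human(graphene.ObjectType):
    class Meta:
        interfaces = (Character,)

    home_planet = graphene.String()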
View File

@ -8,7 +8,7 @@ from .scalars import Scalar
class JSONString(Scalar):
'''JSON String'''
"""JSON String"""
@staticmethod
def serialize(dt):

View File

@ -3,15 +3,14 @@ from .unmountedtype import UnmountedType
class MountedType(OrderedType):
@classmethod
def mounted(cls, unmounted): # noqa: N802
'''
"""
Mount the UnmountedType instance
'''
assert isinstance(unmounted, UnmountedType), (
'{} can\'t mount {}'
).format(cls.__name__, repr(unmounted))
"""
assert isinstance(unmounted, UnmountedType), ("{} can't mount {}").format(
cls.__name__, repr(unmounted)
)
return cls(
unmounted.get_type(),

View File

@ -21,37 +21,39 @@ class MutationOptions(ObjectTypeOptions):
class Mutation(ObjectType):
'''
"""
Mutation Type Definition
'''
"""
@classmethod
def __init_subclass_with_meta__(cls, resolver=None, output=None, arguments=None,
_meta=None, **options):
def __init_subclass_with_meta__(
cls, resolver=None, output=None, arguments=None, _meta=None, **options
):
if not _meta:
_meta = MutationOptions(cls)
output = output or getattr(cls, 'Output', None)
output = output or getattr(cls, "Output", None)
fields = {}
if not output:
# If output is defined, we don't need to get the fields
fields = OrderedDict()
for base in reversed(cls.__mro__):
fields.update(
yank_fields_from_attrs(base.__dict__, _as=Field)
)
fields.update(yank_fields_from_attrs(base.__dict__, _as=Field))
output = cls
if not arguments:
input_class = getattr(cls, 'Arguments', None)
input_class = getattr(cls, "Arguments", None)
if not input_class:
input_class = getattr(cls, 'Input', None)
input_class = getattr(cls, "Input", None)
if input_class:
warn_deprecation((
"Please use {name}.Arguments instead of {name}.Input."
"Input is now only used in ClientMutationID.\n"
"Read more:"
" https://github.com/graphql-python/graphene/blob/v2.0.0/UPGRADE-v2.0.md#mutation-input"
).format(name=cls.__name__))
warn_deprecation(
(
"Please use {name}.Arguments instead of {name}.Input."
"Input is now only used in ClientMutationID.\n"
"Read more:"
" https://github.com/graphql-python/graphene/blob/v2.0.0/UPGRADE-v2.0.md#mutation-input"
).format(name=cls.__name__)
)
if input_class:
arguments = props(input_class)
@ -59,8 +61,8 @@ class Mutation(ObjectType):
arguments = {}
if not resolver:
mutate = getattr(cls, 'mutate', None)
assert mutate, 'All mutations must define a mutate method in it'
mutate = getattr(cls, "mutate", None)
assert mutate, "All mutations must define a mutate method in it"
resolver = get_unbound_function(mutate)
if _meta.fields:
@ -72,17 +74,18 @@ class Mutation(ObjectType):
_meta.resolver = resolver
_meta.arguments = arguments
super(Mutation, cls).__init_subclass_with_meta__(
_meta=_meta, **options)
super(Mutation, cls).__init_subclass_with_meta__(_meta=_meta, **options)
@classmethod
def Field(cls, name=None, description=None, deprecation_reason=None, required=False):
def Field(
cls, name=None, description=None, deprecation_reason=None, required=False
):
return Field(
cls._meta.output,
args=cls._meta.arguments,
resolver=cls._meta.resolver,
name=name,
description=description,
description=description or cls._meta.description,
deprecation_reason=deprecation_reason,
required=required,
)

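For context (illustrative names, not part of the diff): the hunk above is the machinery behind the usual Arguments/mutate contract, and it also changes Mutation.Field() to fall back to the mutation's own description. A usage sketch, assuming graphene 2.x:

import graphene

class CreatePerson(graphene.Mutation):
    """Creates a person."""

    class Arguments:
        name = graphene.String(required=True)

    ok = graphene.Boolean()
    name = graphene.String()

    def mutate(self, info, name):
        return CreatePerson(name=name, ok=True)

class MyMutations(graphene.ObjectType):
    create_person = CreatePerson.Field()  # now inherits "Creates a person." as its description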
View File

@ -17,17 +17,22 @@ class ObjectTypeOptions(BaseOptions):
class ObjectType(BaseType):
'''
"""
Object Type Definition
Almost all of the GraphQL types you define will be object types. Object types
have a name, but most importantly describe their fields.
'''
"""
@classmethod
def __init_subclass_with_meta__(
cls, interfaces=(),
possible_types=(),
default_resolver=None, _meta=None, **options):
cls,
interfaces=(),
possible_types=(),
default_resolver=None,
_meta=None,
**options
):
if not _meta:
_meta = ObjectTypeOptions(cls)
@ -40,13 +45,11 @@ class ObjectType(BaseType):
fields.update(interface._meta.fields)
for base in reversed(cls.__mro__):
fields.update(
yank_fields_from_attrs(base.__dict__, _as=Field)
)
fields.update(yank_fields_from_attrs(base.__dict__, _as=Field))
assert not (possible_types and cls.is_type_of), (
'{name}.Meta.possible_types will cause type collision with {name}.is_type_of. '
'Please use one or other.'
"{name}.Meta.possible_types will cause type collision with {name}.is_type_of. "
"Please use one or other."
).format(name=cls.__name__)
if _meta.fields:
@ -82,8 +85,7 @@ class ObjectType(BaseType):
for name, field in fields_iter:
try:
val = kwargs.pop(
name,
field.default_value if isinstance(field, Field) else None
name, field.default_value if isinstance(field, Field) else None
)
setattr(self, name, val)
except KeyError:
@ -92,14 +94,15 @@ class ObjectType(BaseType):
if kwargs:
for prop in list(kwargs):
try:
if isinstance(getattr(self.__class__, prop), property) or prop.startswith('_'):
if isinstance(
getattr(self.__class__, prop), property
) or prop.startswith("_"):
setattr(self, prop, kwargs.pop(prop))
except AttributeError:
pass
if kwargs:
raise TypeError(
"'{}' is an invalid keyword argument for {}".format(
list(kwargs)[0],
self.__class__.__name__
list(kwargs)[0], self.__class__.__name__
)
)

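Aside (a sketch, not in the diff): the __init__ hunk above is what validates keyword arguments when an ObjectType is instantiated directly. Assuming graphene 2.x:

import graphene

class Person(graphene.ObjectType):
    first_name = graphene.String()
    last_name = graphene.String()

peter = Person(first_name="Peter", last_name="Griffin")
assert peter.first_name == "Peter"
# Person(nickname="P") raises TypeError: "'nickname' is an invalid keyword argument for Person"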
View File

@ -11,7 +11,7 @@ default_resolver = attr_resolver
def set_default_resolver(resolver):
global default_resolver
assert callable(resolver), 'Received non-callable resolver.'
assert callable(resolver), "Received non-callable resolver."
default_resolver = resolver

View File

@ -1,6 +1,5 @@
import six
from graphql.language.ast import (BooleanValue, FloatValue, IntValue,
StringValue)
from graphql.language.ast import BooleanValue, FloatValue, IntValue, StringValue
from .base import BaseOptions, BaseType
from .unmountedtype import UnmountedType
@ -11,13 +10,14 @@ class ScalarOptions(BaseOptions):
class Scalar(UnmountedType, BaseType):
'''
"""
Scalar Type Definition
The leaf values of any request and input values to arguments are
Scalars (or Enums) and are defined with a name and a series of functions
used to parse input from ast or variables and to ensure validity.
'''
"""
@classmethod
def __init_subclass_with_meta__(cls, **options):
_meta = ScalarOptions(cls)
@ -29,10 +29,10 @@ class Scalar(UnmountedType, BaseType):
@classmethod
def get_type(cls):
'''
"""
This function is called when the unmounted type (Scalar instance)
is mounted (as a Field, InputField or Argument)
'''
"""
return cls
@ -46,12 +46,12 @@ MIN_INT = -2147483648
class Int(Scalar):
'''
"""
The `Int` scalar type represents non-fractional signed whole numeric
values. Int can represent values between -(2^53 - 1) and 2^53 - 1 since
represented in JSON as double-precision floating point numbers specified
by [IEEE 754](http://en.wikipedia.org/wiki/IEEE_floating_point).
'''
"""
@staticmethod
def coerce_int(value):
@ -77,11 +77,11 @@ class Int(Scalar):
class Float(Scalar):
'''
"""
The `Float` scalar type represents signed double-precision fractional
values as specified by
[IEEE 754](http://en.wikipedia.org/wiki/IEEE_floating_point).
'''
"""
@staticmethod
def coerce_float(value):
@ -101,16 +101,16 @@ class Float(Scalar):
class String(Scalar):
'''
"""
The `String` scalar type represents textual data, represented as UTF-8
character sequences. The String type is most often used by GraphQL to
represent free-form human-readable text.
'''
"""
@staticmethod
def coerce_string(value):
if isinstance(value, bool):
return u'true' if value else u'false'
return u"true" if value else u"false"
return six.text_type(value)
serialize = coerce_string
@ -123,9 +123,9 @@ class String(Scalar):
class Boolean(Scalar):
'''
"""
The `Boolean` scalar type represents `true` or `false`.
'''
"""
serialize = bool
parse_value = bool
@ -137,13 +137,13 @@ class Boolean(Scalar):
class ID(Scalar):
'''
"""
The `ID` scalar type represents a unique identifier, often used to
refetch an object or as key for a cache. The ID type appears in a JSON
response as a String; however, it is not intended to be human-readable.
When expected as an input type, any string (such as `"4"`) or integer
(such as `4`) input value will be accepted as an ID.
'''
"""
serialize = str
parse_value = str

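Aside (a sketch, not in the diff): the coercion helpers reformatted above can be exercised directly, assuming graphene 2.x:

from graphene import Boolean, Int, String

assert String.coerce_string(True) == u"true"  # booleans become "true"/"false"
assert Int.coerce_int("42") == 42
assert Boolean.serialize(0) is False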
View File

@ -1,8 +1,11 @@
import inspect
from graphql import GraphQLObjectType, GraphQLSchema, graphql, is_type
from graphql.type.directives import (GraphQLDirective, GraphQLIncludeDirective,
GraphQLSkipDirective)
from graphql.type.directives import (
GraphQLDirective,
GraphQLIncludeDirective,
GraphQLSkipDirective,
)
from graphql.type.introspection import IntrospectionSchema
from graphql.utils.introspection_query import introspection_query
from graphql.utils.schema_printer import print_schema
@ -15,8 +18,7 @@ from .typemap import TypeMap, is_graphene_type
def assert_valid_root_type(_type):
if _type is None:
return
is_graphene_objecttype = inspect.isclass(
_type) and issubclass(_type, ObjectType)
is_graphene_objecttype = inspect.isclass(_type) and issubclass(_type, ObjectType)
is_graphql_objecttype = isinstance(_type, GraphQLObjectType)
assert is_graphene_objecttype or is_graphql_objecttype, (
"Type {} is not a valid ObjectType."
@ -24,20 +26,22 @@ def assert_valid_root_type(_type):
class Schema(GraphQLSchema):
'''
"""
Schema Definition
A Schema is created by supplying the root types of each type of operation,
query and mutation (optional).
'''
"""
def __init__(self,
query=None,
mutation=None,
subscription=None,
directives=None,
types=None,
auto_camelcase=True):
def __init__(
self,
query=None,
mutation=None,
subscription=None,
directives=None,
types=None,
auto_camelcase=True,
):
assert_valid_root_type(query)
assert_valid_root_type(mutation)
assert_valid_root_type(subscription)
@ -49,9 +53,10 @@ class Schema(GraphQLSchema):
if directives is None:
directives = [GraphQLIncludeDirective, GraphQLSkipDirective]
assert all(isinstance(d, GraphQLDirective) for d in directives), \
'Schema directives must be List[GraphQLDirective] if provided but got: {}.'.format(
directives
assert all(
isinstance(d, GraphQLDirective) for d in directives
), "Schema directives must be List[GraphQLDirective] if provided but got: {}.".format(
directives
)
self._directives = directives
self.build_typemap()
@ -66,16 +71,15 @@ class Schema(GraphQLSchema):
return self.get_graphql_type(self._subscription)
def __getattr__(self, type_name):
'''
"""
This function let the developer select a type in a given schema
by accessing its attrs.
Example: using schema.Query for accessing the "Query" type in the Schema
'''
"""
_type = super(Schema, self).get_type(type_name)
if _type is None:
raise AttributeError(
'Type "{}" not found in the Schema'.format(type_name))
raise AttributeError('Type "{}" not found in the Schema'.format(type_name))
if isinstance(_type, GrapheneGraphQLType):
return _type.graphene_type
return _type
@ -88,7 +92,8 @@ class Schema(GraphQLSchema):
if is_graphene_type(_type):
graphql_type = self.get_type(_type._meta.name)
assert graphql_type, "Type {} not found in this schema.".format(
_type._meta.name)
_type._meta.name
)
assert graphql_type.graphene_type == _type
return graphql_type
raise Exception("{} is not a valid GraphQL type.".format(_type))
@ -113,12 +118,10 @@ class Schema(GraphQLSchema):
self._query,
self._mutation,
self._subscription,
IntrospectionSchema
IntrospectionSchema,
]
if self.types:
initial_types += self.types
self._type_map = TypeMap(
initial_types,
auto_camelcase=self.auto_camelcase,
schema=self
initial_types, auto_camelcase=self.auto_camelcase, schema=self
)

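For context (a sketch, not in the diff): the __getattr__ reformatted above is what allows selecting registered types as schema attributes. Assuming graphene 2.x:

import graphene

class Query(graphene.ObjectType):
    hello = graphene.String()

schema = graphene.Schema(query=Query)
assert schema.Query is Query          # attribute access resolves back to the graphene type
result = schema.execute("{ hello }")  # hello resolves to None via the default resolver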
View File

@ -3,22 +3,21 @@ from .utils import get_type
class Structure(UnmountedType):
'''
"""
A structure is a GraphQL type instance that
wraps a main type with certain structure.
'''
"""
def __init__(self, of_type, *args, **kwargs):
super(Structure, self).__init__(*args, **kwargs)
if not isinstance(of_type, Structure) and isinstance(of_type, UnmountedType):
cls_name = type(self).__name__
of_type_name = type(of_type).__name__
raise Exception("{} could not have a mounted {}() as inner type. Try with {}({}).".format(
cls_name,
of_type_name,
cls_name,
of_type_name,
))
raise Exception(
"{} could not have a mounted {}() as inner type. Try with {}({}).".format(
cls_name, of_type_name, cls_name, of_type_name
)
)
self._of_type = of_type
@property
@ -26,35 +25,35 @@ class Structure(UnmountedType):
return get_type(self._of_type)
def get_type(self):
'''
"""
This function is called when the unmounted type (List or NonNull instance)
is mounted (as a Field, InputField or Argument)
'''
"""
return self
class List(Structure):
'''
"""
List Modifier
A list is a kind of type marker, a wrapping type which points to another
type. Lists are often created within the context of defining the fields of
an object type.
'''
"""
def __str__(self):
return '[{}]'.format(self.of_type)
return "[{}]".format(self.of_type)
def __eq__(self, other):
return isinstance(other, List) and (
self.of_type == other.of_type and
self.args == other.args and
self.kwargs == other.kwargs
self.of_type == other.of_type
and self.args == other.args
and self.kwargs == other.kwargs
)
class NonNull(Structure):
'''
"""
Non-Null Modifier
A non-null is a kind of type marker, a wrapping type which points to another
@ -64,20 +63,20 @@ class NonNull(Structure):
usually the id field of a database row will never be null.
Note: the enforcement of non-nullability occurs within the executor.
'''
"""
def __init__(self, *args, **kwargs):
super(NonNull, self).__init__(*args, **kwargs)
assert not isinstance(self._of_type, NonNull), (
'Can only create NonNull of a Nullable GraphQLType but got: {}.'
"Can only create NonNull of a Nullable GraphQLType but got: {}."
).format(self._of_type)
def __str__(self):
return '{}!'.format(self.of_type)
return "{}!".format(self.of_type)
def __eq__(self, other):
return isinstance(other, NonNull) and (
self.of_type == other.of_type and
self.args == other.args and
self.kwargs == other.kwargs
self.of_type == other.of_type
and self.args == other.args
and self.kwargs == other.kwargs
)

View File

@ -10,13 +10,12 @@ class MyType(ObjectType):
class MyScalar(UnmountedType):
def get_type(self):
return MyType
def test_abstract_objecttype_warn_deprecation(mocker):
mocker.patch.object(abstracttype, 'warn_deprecation')
mocker.patch.object(abstracttype, "warn_deprecation")
class MyAbstractType(AbstractType):
field1 = MyScalar()
@ -34,5 +33,5 @@ def test_generate_objecttype_inherit_abstracttype():
assert MyObjectType._meta.description is None
assert MyObjectType._meta.interfaces == ()
assert MyObjectType._meta.name == "MyObjectType"
assert list(MyObjectType._meta.fields.keys()) == ['field1', 'field2']
assert list(MyObjectType._meta.fields.keys()) == ["field1", "field2"]
assert list(map(type, MyObjectType._meta.fields.values())) == [Field, Field]

View File

@ -10,16 +10,16 @@ from ..structures import NonNull
def test_argument():
arg = Argument(String, default_value='a', description='desc', name='b')
arg = Argument(String, default_value="a", description="desc", name="b")
assert arg.type == String
assert arg.default_value == 'a'
assert arg.description == 'desc'
assert arg.name == 'b'
assert arg.default_value == "a"
assert arg.description == "desc"
assert arg.name == "b"
def test_argument_comparasion():
arg1 = Argument(String, name='Hey', description='Desc', default_value='default')
arg2 = Argument(String, name='Hey', description='Desc', default_value='default')
arg1 = Argument(String, name="Hey", description="Desc", default_value="default")
arg2 = Argument(String, name="Hey", description="Desc", default_value="default")
assert arg1 == arg2
assert arg1 != String()
@ -31,43 +31,36 @@ def test_argument_required():
def test_to_arguments():
args = {
'arg_string': Argument(String),
'unmounted_arg': String(required=True)
}
args = {"arg_string": Argument(String), "unmounted_arg": String(required=True)}
my_args = to_arguments(args)
assert my_args == {
'arg_string': Argument(String),
'unmounted_arg': Argument(String, required=True)
"arg_string": Argument(String),
"unmounted_arg": Argument(String, required=True),
}
def test_to_arguments_raises_if_field():
args = {
'arg_string': Field(String),
}
args = {"arg_string": Field(String)}
with pytest.raises(ValueError) as exc_info:
to_arguments(args)
assert str(exc_info.value) == (
'Expected arg_string to be Argument, but received Field. Try using '
'Argument(String).'
"Expected arg_string to be Argument, but received Field. Try using "
"Argument(String)."
)
def test_to_arguments_raises_if_inputfield():
args = {
'arg_string': InputField(String),
}
args = {"arg_string": InputField(String)}
with pytest.raises(ValueError) as exc_info:
to_arguments(args)
assert str(exc_info.value) == (
'Expected arg_string to be Argument, but received InputField. Try '
'using Argument(String).'
"Expected arg_string to be Argument, but received InputField. Try "
"using Argument(String)."
)

View File

@ -23,7 +23,8 @@ def test_basetype():
def test_basetype_nones():
class MyBaseType(CustomType):
'''Documentation'''
"""Documentation"""
class Meta:
name = None
description = None
@ -35,10 +36,11 @@ def test_basetype_nones():
def test_basetype_custom():
class MyBaseType(CustomType):
'''Documentation'''
"""Documentation"""
class Meta:
name = 'Base'
description = 'Desc'
name = "Base"
description = "Desc"
assert isinstance(MyBaseType._meta, CustomOptions)
assert MyBaseType._meta.name == "Base"
@ -46,7 +48,7 @@ def test_basetype_custom():
def test_basetype_create():
MyBaseType = CustomType.create_type('MyBaseType')
MyBaseType = CustomType.create_type("MyBaseType")
assert isinstance(MyBaseType._meta, CustomOptions)
assert MyBaseType._meta.name == "MyBaseType"
@ -54,7 +56,7 @@ def test_basetype_create():
def test_basetype_create_extra():
MyBaseType = CustomType.create_type('MyBaseType', name='Base', description='Desc')
MyBaseType = CustomType.create_type("MyBaseType", name="Base", description="Desc")
assert isinstance(MyBaseType._meta, CustomOptions)
assert MyBaseType._meta.name == "Base"

View File

@ -9,9 +9,9 @@ from ..schema import Schema
class Query(ObjectType):
datetime = DateTime(_in=DateTime(name='in'))
date = Date(_in=Date(name='in'))
time = Time(_at=Time(name='at'))
datetime = DateTime(_in=DateTime(name="in"))
date = Date(_in=Date(name="in"))
time = Time(_at=Time(name="at"))
def resolve_datetime(self, info, _in=None):
return _in
@ -30,35 +30,34 @@ def test_datetime_query():
now = datetime.datetime.now().replace(tzinfo=pytz.utc)
isoformat = now.isoformat()
result = schema.execute('''{ datetime(in: "%s") }''' % isoformat)
result = schema.execute("""{ datetime(in: "%s") }""" % isoformat)
assert not result.errors
assert result.data == {'datetime': isoformat}
assert result.data == {"datetime": isoformat}
def test_date_query():
now = datetime.datetime.now().replace(tzinfo=pytz.utc).date()
isoformat = now.isoformat()
result = schema.execute('''{ date(in: "%s") }''' % isoformat)
result = schema.execute("""{ date(in: "%s") }""" % isoformat)
assert not result.errors
assert result.data == {'date': isoformat}
assert result.data == {"date": isoformat}
def test_time_query():
now = datetime.datetime.now().replace(tzinfo=pytz.utc)
time = datetime.time(now.hour, now.minute, now.second, now.microsecond,
now.tzinfo)
time = datetime.time(now.hour, now.minute, now.second, now.microsecond, now.tzinfo)
isoformat = time.isoformat()
result = schema.execute('''{ time(at: "%s") }''' % isoformat)
result = schema.execute("""{ time(at: "%s") }""" % isoformat)
assert not result.errors
assert result.data == {'time': isoformat}
assert result.data == {"time": isoformat}
def test_bad_datetime_query():
not_a_date = "Some string that's not a date"
result = schema.execute('''{ datetime(in: "%s") }''' % not_a_date)
result = schema.execute("""{ datetime(in: "%s") }""" % not_a_date)
assert len(result.errors) == 1
assert isinstance(result.errors[0], GraphQLError)
@ -68,7 +67,7 @@ def test_bad_datetime_query():
def test_bad_date_query():
not_a_date = "Some string that's not a date"
result = schema.execute('''{ date(in: "%s") }''' % not_a_date)
result = schema.execute("""{ date(in: "%s") }""" % not_a_date)
assert len(result.errors) == 1
assert isinstance(result.errors[0], GraphQLError)
@ -78,7 +77,7 @@ def test_bad_date_query():
def test_bad_time_query():
not_a_date = "Some string that's not a date"
result = schema.execute('''{ time(at: "%s") }''' % not_a_date)
result = schema.execute("""{ time(at: "%s") }""" % not_a_date)
assert len(result.errors) == 1
assert isinstance(result.errors[0], GraphQLError)
@ -90,10 +89,11 @@ def test_datetime_query_variable():
isoformat = now.isoformat()
result = schema.execute(
'''query Test($date: DateTime){ datetime(in: $date) }''',
variable_values={'date': isoformat})
"""query Test($date: DateTime){ datetime(in: $date) }""",
variables={"date": isoformat},
)
assert not result.errors
assert result.data == {'datetime': isoformat}
assert result.data == {"datetime": isoformat}
def test_date_query_variable():
@ -101,20 +101,19 @@ def test_date_query_variable():
isoformat = now.isoformat()
result = schema.execute(
'''query Test($date: Date){ date(in: $date) }''',
variable_values={'date': isoformat})
"""query Test($date: Date){ date(in: $date) }""", variables={"date": isoformat}
)
assert not result.errors
assert result.data == {'date': isoformat}
assert result.data == {"date": isoformat}
def test_time_query_variable():
now = datetime.datetime.now().replace(tzinfo=pytz.utc)
time = datetime.time(now.hour, now.minute, now.second, now.microsecond,
now.tzinfo)
time = datetime.time(now.hour, now.minute, now.second, now.microsecond, now.tzinfo)
isoformat = time.isoformat()
result = schema.execute(
'''query Test($time: Time){ time(at: $time) }''',
variable_values={'time': isoformat})
"""query Test($time: Time){ time(at: $time) }""", variables={"time": isoformat}
)
assert not result.errors
assert result.data == {'time': isoformat}
assert result.data == {"time": isoformat}

View File

@ -0,0 +1,43 @@
import decimal
from ..decimal import Decimal
from ..objecttype import ObjectType
from ..schema import Schema
class Query(ObjectType):
decimal = Decimal(input=Decimal())
def resolve_decimal(self, info, input):
return input
schema = Schema(query=Query)
def test_decimal_string_query():
decimal_value = decimal.Decimal("1969.1974")
result = schema.execute("""{ decimal(input: "%s") }""" % decimal_value)
assert not result.errors
assert result.data == {"decimal": str(decimal_value)}
assert decimal.Decimal(result.data["decimal"]) == decimal_value
def test_decimal_string_query_variable():
decimal_value = decimal.Decimal("1969.1974")
result = schema.execute(
"""query Test($decimal: Decimal){ decimal(input: $decimal) }""",
variables={"decimal": decimal_value},
)
assert not result.errors
assert result.data == {"decimal": str(decimal_value)}
assert decimal.Decimal(result.data["decimal"]) == decimal_value
def test_bad_decimal_query():
not_a_decimal = "Nobody expects the Spanish Inquisition!"
result = schema.execute("""{ decimal(input: "%s") }""" % not_a_decimal)
assert len(result.errors) == 1
assert result.data is None

View File

@ -56,13 +56,12 @@ class MyInterface(Interface):
class MyUnion(Union):
class Meta:
types = (Article, )
types = (Article,)
class MyEnum(Enum):
foo = 'foo'
foo = "foo"
class MyInputObjectType(InputObjectType):
@ -74,24 +73,24 @@ def test_defines_a_query_only_schema():
assert blog_schema.get_query_type().graphene_type == Query
article_field = Query._meta.fields['article']
article_field = Query._meta.fields["article"]
assert article_field.type == Article
assert article_field.type._meta.name == 'Article'
assert article_field.type._meta.name == "Article"
article_field_type = article_field.type
assert issubclass(article_field_type, ObjectType)
title_field = article_field_type._meta.fields['title']
title_field = article_field_type._meta.fields["title"]
assert title_field.type == String
author_field = article_field_type._meta.fields['author']
author_field = article_field_type._meta.fields["author"]
author_field_type = author_field.type
assert issubclass(author_field_type, ObjectType)
recent_article_field = author_field_type._meta.fields['recent_article']
recent_article_field = author_field_type._meta.fields["recent_article"]
assert recent_article_field.type == Article
feed_field = Query._meta.fields['feed']
feed_field = Query._meta.fields["feed"]
assert feed_field.type.of_type == Article
@ -100,9 +99,9 @@ def test_defines_a_mutation_schema():
assert blog_schema.get_mutation_type().graphene_type == Mutation
write_mutation = Mutation._meta.fields['write_article']
write_mutation = Mutation._meta.fields["write_article"]
assert write_mutation.type == Article
assert write_mutation.type._meta.name == 'Article'
assert write_mutation.type._meta.name == "Article"
def test_defines_a_subscription_schema():
@ -110,9 +109,9 @@ def test_defines_a_subscription_schema():
assert blog_schema.get_subscription_type().graphene_type == Subscription
subscription = Subscription._meta.fields['article_subscribe']
subscription = Subscription._meta.fields["article_subscribe"]
assert subscription.type == Article
assert subscription.type._meta.name == 'Article'
assert subscription.type._meta.name == "Article"
def test_includes_nested_input_objects_in_the_map():
@ -128,13 +127,9 @@ def test_includes_nested_input_objects_in_the_map():
class SomeSubscription(Mutation):
subscribe_to_something = Field(Article, input=Argument(SomeInputObject))
schema = Schema(
query=Query,
mutation=SomeMutation,
subscription=SomeSubscription
)
schema = Schema(query=Query, mutation=SomeMutation, subscription=SomeSubscription)
assert schema.get_type_map()['NestedInputObject'].graphene_type is NestedInputObject
assert schema.get_type_map()["NestedInputObject"].graphene_type is NestedInputObject
def test_includes_interfaces_thunk_subtypes_in_the_type_map():
@ -142,19 +137,15 @@ def test_includes_interfaces_thunk_subtypes_in_the_type_map():
f = Int()
class SomeSubtype(ObjectType):
class Meta:
interfaces = (SomeInterface, )
interfaces = (SomeInterface,)
class Query(ObjectType):
iface = Field(lambda: SomeInterface)
schema = Schema(
query=Query,
types=[SomeSubtype]
)
schema = Schema(query=Query, types=[SomeSubtype])
assert schema.get_type_map()['SomeSubtype'].graphene_type is SomeSubtype
assert schema.get_type_map()["SomeSubtype"].graphene_type is SomeSubtype
def test_includes_types_in_union():
@ -165,19 +156,16 @@ def test_includes_types_in_union():
b = String()
class MyUnion(Union):
class Meta:
types = (SomeType, OtherType)
class Query(ObjectType):
union = Field(MyUnion)
schema = Schema(
query=Query,
)
schema = Schema(query=Query)
assert schema.get_type_map()['OtherType'].graphene_type is OtherType
assert schema.get_type_map()['SomeType'].graphene_type is SomeType
assert schema.get_type_map()["OtherType"].graphene_type is OtherType
assert schema.get_type_map()["SomeType"].graphene_type is SomeType
def test_maps_enum():
@ -188,19 +176,16 @@ def test_maps_enum():
b = String()
class MyUnion(Union):
class Meta:
types = (SomeType, OtherType)
class Query(ObjectType):
union = Field(MyUnion)
schema = Schema(
query=Query,
)
schema = Schema(query=Query)
assert schema.get_type_map()['OtherType'].graphene_type is OtherType
assert schema.get_type_map()['SomeType'].graphene_type is SomeType
assert schema.get_type_map()["OtherType"].graphene_type is OtherType
assert schema.get_type_map()["SomeType"].graphene_type is SomeType
def test_includes_interfaces_subtypes_in_the_type_map():
@ -208,33 +193,29 @@ def test_includes_interfaces_subtypes_in_the_type_map():
f = Int()
class SomeSubtype(ObjectType):
class Meta:
interfaces = (SomeInterface, )
interfaces = (SomeInterface,)
class Query(ObjectType):
iface = Field(SomeInterface)
schema = Schema(
query=Query,
types=[SomeSubtype]
)
schema = Schema(query=Query, types=[SomeSubtype])
assert schema.get_type_map()['SomeSubtype'].graphene_type is SomeSubtype
assert schema.get_type_map()["SomeSubtype"].graphene_type is SomeSubtype
def test_stringifies_simple_types():
assert str(Int) == 'Int'
assert str(Article) == 'Article'
assert str(MyInterface) == 'MyInterface'
assert str(MyUnion) == 'MyUnion'
assert str(MyEnum) == 'MyEnum'
assert str(MyInputObjectType) == 'MyInputObjectType'
assert str(NonNull(Int)) == 'Int!'
assert str(List(Int)) == '[Int]'
assert str(NonNull(List(Int))) == '[Int]!'
assert str(List(NonNull(Int))) == '[Int!]'
assert str(List(List(Int))) == '[[Int]]'
assert str(Int) == "Int"
assert str(Article) == "Article"
assert str(MyInterface) == "MyInterface"
assert str(MyUnion) == "MyUnion"
assert str(MyEnum) == "MyEnum"
assert str(MyInputObjectType) == "MyInputObjectType"
assert str(NonNull(Int)) == "Int!"
assert str(List(Int)) == "[Int]"
assert str(NonNull(List(Int))) == "[Int]!"
assert str(List(NonNull(Int))) == "[Int!]"
assert str(List(List(Int))) == "[[Int]]"
# def test_identifies_input_types():

View File

@ -8,30 +8,31 @@ from ..structures import List, NonNull
def test_dynamic():
dynamic = Dynamic(lambda: String)
assert dynamic.get_type() == String
assert str(dynamic.get_type()) == 'String'
assert str(dynamic.get_type()) == "String"
def test_nonnull():
dynamic = Dynamic(lambda: NonNull(String))
assert dynamic.get_type().of_type == String
assert str(dynamic.get_type()) == 'String!'
assert str(dynamic.get_type()) == "String!"
def test_list():
dynamic = Dynamic(lambda: List(String))
assert dynamic.get_type().of_type == String
assert str(dynamic.get_type()) == '[String]'
assert str(dynamic.get_type()) == "[String]"
def test_list_non_null():
dynamic = Dynamic(lambda: List(NonNull(String)))
assert dynamic.get_type().of_type.of_type == String
assert str(dynamic.get_type()) == '[String!]'
assert str(dynamic.get_type()) == "[String!]"
def test_partial():
def __type(_type):
return _type
dynamic = Dynamic(partial(__type, String))
assert dynamic.get_type() == String
assert str(dynamic.get_type()) == 'String'
assert str(dynamic.get_type()) == "String"

View File

@ -9,7 +9,8 @@ from ..schema import ObjectType, Schema
def test_enum_construction():
class RGB(Enum):
'''Description'''
"""Description"""
RED = 1
GREEN = 2
BLUE = 3
@ -18,49 +19,41 @@ def test_enum_construction():
def description(self):
return "Description {}".format(self.name)
assert RGB._meta.name == 'RGB'
assert RGB._meta.description == 'Description'
assert RGB._meta.name == "RGB"
assert RGB._meta.description == "Description"
values = RGB._meta.enum.__members__.values()
assert sorted([v.name for v in values]) == [
'BLUE',
'GREEN',
'RED'
]
assert sorted([v.name for v in values]) == ["BLUE", "GREEN", "RED"]
assert sorted([v.description for v in values]) == [
'Description BLUE',
'Description GREEN',
'Description RED'
"Description BLUE",
"Description GREEN",
"Description RED",
]
def test_enum_construction_meta():
class RGB(Enum):
class Meta:
name = 'RGBEnum'
description = 'Description'
name = "RGBEnum"
description = "Description"
RED = 1
GREEN = 2
BLUE = 3
assert RGB._meta.name == 'RGBEnum'
assert RGB._meta.description == 'Description'
assert RGB._meta.name == "RGBEnum"
assert RGB._meta.description == "Description"
def test_enum_instance_construction():
RGB = Enum('RGB', 'RED,GREEN,BLUE')
RGB = Enum("RGB", "RED,GREEN,BLUE")
values = RGB._meta.enum.__members__.values()
assert sorted([v.name for v in values]) == [
'BLUE',
'GREEN',
'RED'
]
assert sorted([v.name for v in values]) == ["BLUE", "GREEN", "RED"]
def test_enum_from_builtin_enum():
PyRGB = PyEnum('RGB', 'RED,GREEN,BLUE')
PyRGB = PyEnum("RGB", "RED,GREEN,BLUE")
RGB = Enum.from_enum(PyRGB)
assert RGB._meta.enum == PyRGB
@ -74,30 +67,51 @@ def test_enum_from_builtin_enum_accepts_lambda_description():
if not value:
return "StarWars Episodes"
return 'New Hope Episode' if value == Episode.NEWHOPE else 'Other'
return "New Hope Episode" if value == Episode.NEWHOPE else "Other"
def custom_deprecation_reason(value):
return 'meh' if value == Episode.NEWHOPE else None
return "meh" if value == Episode.NEWHOPE else None
PyEpisode = PyEnum('PyEpisode', 'NEWHOPE,EMPIRE,JEDI')
Episode = Enum.from_enum(PyEpisode, description=custom_description,
deprecation_reason=custom_deprecation_reason)
PyEpisode = PyEnum("PyEpisode", "NEWHOPE,EMPIRE,JEDI")
Episode = Enum.from_enum(
PyEpisode,
description=custom_description,
deprecation_reason=custom_deprecation_reason,
)
class Query(ObjectType):
foo = Episode()
schema = Schema(query=Query)
GraphQLPyEpisode = schema._type_map['PyEpisode'].values
GraphQLPyEpisode = schema._type_map["PyEpisode"].values
assert schema._type_map['PyEpisode'].description == "StarWars Episodes"
assert GraphQLPyEpisode[0].name == 'NEWHOPE' and GraphQLPyEpisode[0].description == 'New Hope Episode'
assert GraphQLPyEpisode[1].name == 'EMPIRE' and GraphQLPyEpisode[1].description == 'Other'
assert GraphQLPyEpisode[2].name == 'JEDI' and GraphQLPyEpisode[2].description == 'Other'
assert schema._type_map["PyEpisode"].description == "StarWars Episodes"
assert (
GraphQLPyEpisode[0].name == "NEWHOPE"
and GraphQLPyEpisode[0].description == "New Hope Episode"
)
assert (
GraphQLPyEpisode[1].name == "EMPIRE"
and GraphQLPyEpisode[1].description == "Other"
)
assert (
GraphQLPyEpisode[2].name == "JEDI"
and GraphQLPyEpisode[2].description == "Other"
)
assert GraphQLPyEpisode[0].name == 'NEWHOPE' and GraphQLPyEpisode[0].deprecation_reason == 'meh'
assert GraphQLPyEpisode[1].name == 'EMPIRE' and GraphQLPyEpisode[1].deprecation_reason is None
assert GraphQLPyEpisode[2].name == 'JEDI' and GraphQLPyEpisode[2].deprecation_reason is None
assert (
GraphQLPyEpisode[0].name == "NEWHOPE"
and GraphQLPyEpisode[0].deprecation_reason == "meh"
)
assert (
GraphQLPyEpisode[1].name == "EMPIRE"
and GraphQLPyEpisode[1].deprecation_reason is None
)
assert (
GraphQLPyEpisode[2].name == "JEDI"
and GraphQLPyEpisode[2].deprecation_reason is None
)
def test_enum_from_python3_enum_uses_enum_doc():
@ -108,6 +122,7 @@ def test_enum_from_python3_enum_uses_enum_doc():
class Color(PyEnum):
"""This is the description"""
RED = 1
GREEN = 2
BLUE = 3
@ -196,9 +211,9 @@ def test_enum_can_retrieve_members():
GREEN = 2
BLUE = 3
assert RGB['RED'] == RGB.RED
assert RGB['GREEN'] == RGB.GREEN
assert RGB['BLUE'] == RGB.BLUE
assert RGB["RED"] == RGB.RED
assert RGB["GREEN"] == RGB.GREEN
assert RGB["BLUE"] == RGB.BLUE
def test_enum_to_enum_comparison_should_differ():
@ -220,14 +235,14 @@ def test_enum_to_enum_comparison_should_differ():
def test_enum_skip_meta_from_members():
class RGB1(Enum):
class Meta:
name = 'RGB'
name = "RGB"
RED = 1
GREEN = 2
BLUE = 3
assert dict(RGB1._meta.enum.__members__) == {
'RED': RGB1.RED,
'GREEN': RGB1.GREEN,
'BLUE': RGB1.BLUE,
"RED": RGB1.RED,
"GREEN": RGB1.GREEN,
"BLUE": RGB1.BLUE,
}

View File

@ -10,31 +10,33 @@ from .utils import MyLazyType
class MyInstance(object):
value = 'value'
value_func = staticmethod(lambda: 'value_func')
value = "value"
value_func = staticmethod(lambda: "value_func")
def value_method(self):
return 'value_method'
return "value_method"
def test_field_basic():
MyType = object()
args = {'my arg': Argument(True)}
args = {"my arg": Argument(True)}
def resolver(): return None
deprecation_reason = 'Deprecated now'
description = 'My Field'
my_default = 'something'
def resolver():
return None
deprecation_reason = "Deprecated now"
description = "My Field"
my_default = "something"
field = Field(
MyType,
name='name',
name="name",
args=args,
resolver=resolver,
description=description,
deprecation_reason=deprecation_reason,
default_value=my_default,
)
assert field.name == 'name'
assert field.name == "name"
assert field.args == args
assert field.resolver == resolver
assert field.deprecation_reason == deprecation_reason
@ -55,12 +57,12 @@ def test_field_default_value_not_callable():
Field(MyType, default_value=lambda: True)
except AssertionError as e:
# substring comparison for py 2/3 compatibility
assert 'The default value can not be a function but received' in str(e)
assert "The default value can not be a function but received" in str(e)
def test_field_source():
MyType = object()
field = Field(MyType, source='value')
field = Field(MyType, source="value")
assert field.resolver(MyInstance(), None) == MyInstance.value
@ -84,46 +86,48 @@ def test_field_with_string_type():
def test_field_not_source_and_resolver():
MyType = object()
with pytest.raises(Exception) as exc_info:
Field(MyType, source='value', resolver=lambda: None)
assert str(
exc_info.value) == 'A Field cannot have a source and a resolver in at the same time.'
Field(MyType, source="value", resolver=lambda: None)
assert (
str(exc_info.value)
== "A Field cannot have a source and a resolver in at the same time."
)
def test_field_source_func():
MyType = object()
field = Field(MyType, source='value_func')
field = Field(MyType, source="value_func")
assert field.resolver(MyInstance(), None) == MyInstance.value_func()
def test_field_source_method():
MyType = object()
field = Field(MyType, source='value_method')
field = Field(MyType, source="value_method")
assert field.resolver(MyInstance(), None) == MyInstance().value_method()
def test_field_source_as_argument():
MyType = object()
field = Field(MyType, source=String())
assert 'source' in field.args
assert field.args['source'].type == String
assert "source" in field.args
assert field.args["source"].type == String
def test_field_name_as_argument():
MyType = object()
field = Field(MyType, name=String())
assert 'name' in field.args
assert field.args['name'].type == String
assert "name" in field.args
assert field.args["name"].type == String
def test_field_source_argument_as_kw():
MyType = object()
field = Field(MyType, b=NonNull(True), c=Argument(None), a=NonNull(False))
assert list(field.args.keys()) == ['b', 'c', 'a']
assert isinstance(field.args['b'], Argument)
assert isinstance(field.args['b'].type, NonNull)
assert field.args['b'].type.of_type is True
assert isinstance(field.args['c'], Argument)
assert field.args['c'].type is None
assert isinstance(field.args['a'], Argument)
assert isinstance(field.args['a'].type, NonNull)
assert field.args['a'].type.of_type is False
assert list(field.args.keys()) == ["b", "c", "a"]
assert isinstance(field.args["b"], Argument)
assert isinstance(field.args["b"].type, NonNull)
assert field.args["b"].type.of_type is True
assert isinstance(field.args["c"], Argument)
assert field.args["c"].type is None
assert isinstance(field.args["a"], Argument)
assert isinstance(field.args["a"].type, NonNull)
assert field.args["a"].type.of_type is False

View File

@ -18,44 +18,36 @@ def test_generic_query_variable():
1,
1.1,
True,
'str',
"str",
[1, 2, 3],
[1.1, 2.2, 3.3],
[True, False],
['str1', 'str2'],
["str1", "str2"],
{"key_a": "a", "key_b": "b"},
{
'key_a': 'a',
'key_b': 'b'
"int": 1,
"float": 1.1,
"boolean": True,
"string": "str",
"int_list": [1, 2, 3],
"float_list": [1.1, 2.2, 3.3],
"boolean_list": [True, False],
"string_list": ["str1", "str2"],
"nested_dict": {"key_a": "a", "key_b": "b"},
},
{
'int': 1,
'float': 1.1,
'boolean': True,
'string': 'str',
'int_list': [1, 2, 3],
'float_list': [1.1, 2.2, 3.3],
'boolean_list': [True, False],
'string_list': ['str1', 'str2'],
'nested_dict': {
'key_a': 'a',
'key_b': 'b'
}
},
None
None,
]:
result = schema.execute(
'''query Test($generic: GenericScalar){ generic(input: $generic) }''',
variable_values={'generic': generic_value}
"""query Test($generic: GenericScalar){ generic(input: $generic) }""",
variables={"generic": generic_value},
)
assert not result.errors
assert result.data == {
'generic': generic_value
}
assert result.data == {"generic": generic_value}
def test_generic_parse_literal_query():
result = schema.execute(
'''
"""
query {
generic(input: {
int: 1,
@ -73,23 +65,20 @@ def test_generic_parse_literal_query():
empty_key: undefined
})
}
'''
"""
)
assert not result.errors
assert result.data == {
'generic': {
'int': 1,
'float': 1.1,
'boolean': True,
'string': 'str',
'int_list': [1, 2, 3],
'float_list': [1.1, 2.2, 3.3],
'boolean_list': [True, False],
'string_list': ['str1', 'str2'],
'nested_dict': {
'key_a': 'a',
'key_b': 'b'
},
'empty_key': None
"generic": {
"int": 1,
"float": 1.1,
"boolean": True,
"string": "str",
"int_list": [1, 2, 3],
"float_list": [1.1, 2.2, 3.3],
"boolean_list": [True, False],
"string_list": ["str1", "str2"],
"nested_dict": {"key_a": "a", "key_b": "b"},
"empty_key": None,
}
}

View File

@ -14,14 +14,13 @@ class MyType(object):
class MyScalar(UnmountedType):
def get_type(self):
return MyType
def test_generate_inputobjecttype():
class MyInputObjectType(InputObjectType):
'''Documentation'''
"""Documentation"""
assert MyInputObjectType._meta.name == "MyInputObjectType"
assert MyInputObjectType._meta.description == "Documentation"
@ -30,10 +29,9 @@ def test_generate_inputobjecttype():
def test_generate_inputobjecttype_with_meta():
class MyInputObjectType(InputObjectType):
class Meta:
name = 'MyOtherInputObjectType'
description = 'Documentation'
name = "MyOtherInputObjectType"
description = "Documentation"
assert MyInputObjectType._meta.name == "MyOtherInputObjectType"
assert MyInputObjectType._meta.description == "Documentation"
@ -43,7 +41,7 @@ def test_generate_inputobjecttype_with_fields():
class MyInputObjectType(InputObjectType):
field = Field(MyType)
assert 'field' in MyInputObjectType._meta.fields
assert "field" in MyInputObjectType._meta.fields
def test_ordered_fields_in_inputobjecttype():
@ -53,16 +51,15 @@ def test_ordered_fields_in_inputobjecttype():
field = MyScalar()
asa = InputField(MyType)
assert list(MyInputObjectType._meta.fields.keys()) == [
'b', 'a', 'field', 'asa']
assert list(MyInputObjectType._meta.fields.keys()) == ["b", "a", "field", "asa"]
def test_generate_inputobjecttype_unmountedtype():
class MyInputObjectType(InputObjectType):
field = MyScalar(MyType)
assert 'field' in MyInputObjectType._meta.fields
assert isinstance(MyInputObjectType._meta.fields['field'], InputField)
assert "field" in MyInputObjectType._meta.fields
assert isinstance(MyInputObjectType._meta.fields["field"], InputField)
def test_generate_inputobjecttype_as_argument():
@ -72,13 +69,13 @@ def test_generate_inputobjecttype_as_argument():
class MyObjectType(ObjectType):
field = Field(MyType, input=MyInputObjectType())
assert 'field' in MyObjectType._meta.fields
field = MyObjectType._meta.fields['field']
assert "field" in MyObjectType._meta.fields
field = MyObjectType._meta.fields["field"]
assert isinstance(field, Field)
assert field.type == MyType
assert 'input' in field.args
assert isinstance(field.args['input'], Argument)
assert field.args['input'].type == MyInputObjectType
assert "input" in field.args
assert isinstance(field.args["input"], Argument)
assert field.args["input"].type == MyInputObjectType
def test_generate_inputobjecttype_inherit_abstracttype():
@ -88,9 +85,11 @@ def test_generate_inputobjecttype_inherit_abstracttype():
class MyInputObjectType(InputObjectType, MyAbstractType):
field2 = MyScalar(MyType)
assert list(MyInputObjectType._meta.fields.keys()) == ['field1', 'field2']
assert list(MyInputObjectType._meta.fields.keys()) == ["field1", "field2"]
assert [type(x) for x in MyInputObjectType._meta.fields.values()] == [
InputField, InputField]
InputField,
InputField,
]
def test_generate_inputobjecttype_inherit_abstracttype_reversed():
@ -100,9 +99,11 @@ def test_generate_inputobjecttype_inherit_abstracttype_reversed():
class MyInputObjectType(MyAbstractType, InputObjectType):
field2 = MyScalar(MyType)
assert list(MyInputObjectType._meta.fields.keys()) == ['field1', 'field2']
assert list(MyInputObjectType._meta.fields.keys()) == ["field1", "field2"]
assert [type(x) for x in MyInputObjectType._meta.fields.values()] == [
InputField, InputField]
InputField,
InputField,
]
def test_inputobjecttype_of_input():
@ -121,14 +122,17 @@ def test_inputobjecttype_of_input():
is_child = Boolean(parent=Parent())
def resolve_is_child(self, info, parent):
return isinstance(parent.child, Child) and parent.child.full_name == "Peter Griffin"
return (
isinstance(parent.child, Child)
and parent.child.full_name == "Peter Griffin"
)
schema = Schema(query=Query)
result = schema.execute('''query basequery {
result = schema.execute(
"""query basequery {
isChild(parent: {child: {firstName: "Peter", lastName: "Griffin"}})
}
''')
"""
)
assert not result.errors
assert result.data == {
'isChild': True
}
assert result.data == {"isChild": True}

View File

@ -8,14 +8,13 @@ class MyType(object):
class MyScalar(UnmountedType):
def get_type(self):
return MyType
def test_generate_interface():
class MyInterface(Interface):
'''Documentation'''
"""Documentation"""
assert MyInterface._meta.name == "MyInterface"
assert MyInterface._meta.description == "Documentation"
@ -24,10 +23,9 @@ def test_generate_interface():
def test_generate_interface_with_meta():
class MyInterface(Interface):
class Meta:
name = 'MyOtherInterface'
description = 'Documentation'
name = "MyOtherInterface"
description = "Documentation"
assert MyInterface._meta.name == "MyOtherInterface"
assert MyInterface._meta.description == "Documentation"
@ -37,7 +35,7 @@ def test_generate_interface_with_fields():
class MyInterface(Interface):
field = Field(MyType)
assert 'field' in MyInterface._meta.fields
assert "field" in MyInterface._meta.fields
def test_ordered_fields_in_interface():
@ -47,15 +45,15 @@ def test_ordered_fields_in_interface():
field = MyScalar()
asa = Field(MyType)
assert list(MyInterface._meta.fields.keys()) == ['b', 'a', 'field', 'asa']
assert list(MyInterface._meta.fields.keys()) == ["b", "a", "field", "asa"]
def test_generate_interface_unmountedtype():
class MyInterface(Interface):
field = MyScalar()
assert 'field' in MyInterface._meta.fields
assert isinstance(MyInterface._meta.fields['field'], Field)
assert "field" in MyInterface._meta.fields
assert isinstance(MyInterface._meta.fields["field"], Field)
def test_generate_interface_inherit_abstracttype():
@ -65,7 +63,7 @@ def test_generate_interface_inherit_abstracttype():
class MyInterface(Interface, MyAbstractType):
field2 = MyScalar()
assert list(MyInterface._meta.fields.keys()) == ['field1', 'field2']
assert list(MyInterface._meta.fields.keys()) == ["field1", "field2"]
assert [type(x) for x in MyInterface._meta.fields.values()] == [Field, Field]
@ -76,8 +74,8 @@ def test_generate_interface_inherit_interface():
class MyInterface(MyBaseInterface):
field2 = MyScalar()
assert MyInterface._meta.name == 'MyInterface'
assert list(MyInterface._meta.fields.keys()) == ['field1', 'field2']
assert MyInterface._meta.name == "MyInterface"
assert list(MyInterface._meta.fields.keys()) == ["field1", "field2"]
assert [type(x) for x in MyInterface._meta.fields.values()] == [Field, Field]
@ -88,5 +86,5 @@ def test_generate_interface_inherit_abstracttype_reversed():
class MyInterface(MyAbstractType, Interface):
field2 = MyScalar()
assert list(MyInterface._meta.fields.keys()) == ['field1', 'field2']
assert list(MyInterface._meta.fields.keys()) == ["field1", "field2"]
assert [type(x) for x in MyInterface._meta.fields.values()] == [Field, Field]

View File

@ -18,21 +18,17 @@ def test_jsonstring_query():
json_value = '{"key": "value"}'
json_value_quoted = json_value.replace('"', '\\"')
result = schema.execute('''{ json(input: "%s") }''' % json_value_quoted)
result = schema.execute("""{ json(input: "%s") }""" % json_value_quoted)
assert not result.errors
assert result.data == {
'json': json_value
}
assert result.data == {"json": json_value}
def test_jsonstring_query_variable():
json_value = '{"key": "value"}'
result = schema.execute(
'''query Test($json: JSONString){ json(input: $json) }''',
variable_values={'json': json_value}
"""query Test($json: JSONString){ json(input: $json) }""",
variables={"json": json_value},
)
assert not result.errors
assert result.data == {
'json': json_value
}
assert result.data == {"json": json_value}

View File

@ -4,9 +4,8 @@ from ..scalars import String
class CustomField(Field):
def __init__(self, *args, **kwargs):
self.metadata = kwargs.pop('metadata', None)
self.metadata = kwargs.pop("metadata", None)
super(CustomField, self).__init__(*args, **kwargs)
@ -18,8 +17,8 @@ def test_mounted_type():
def test_mounted_type_custom():
unmounted = String(metadata={'hey': 'yo!'})
unmounted = String(metadata={"hey": "yo!"})
mounted = CustomField.mounted(unmounted)
assert isinstance(mounted, CustomField)
assert mounted.type == String
assert mounted.metadata == {'hey': 'yo!'}
assert mounted.metadata == {"hey": "yo!"}
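
An aside, not part of this commit's diff: the test above shows how an unmounted type (a bare String carrying extra keyword arguments) can be re-mounted through a custom Field subclass. A self-contained restatement of that pattern, using only names that appear in the diff:

    from graphene import Field, String

    class CustomField(Field):
        def __init__(self, *args, **kwargs):
            # Strip custom keyword arguments before delegating to Field.
            self.metadata = kwargs.pop("metadata", None)
            super(CustomField, self).__init__(*args, **kwargs)

    unmounted = String(metadata={"hey": "yo!"})  # unmounted scalar carrying extra kwargs
    mounted = CustomField.mounted(unmounted)     # re-mount it through the custom Field class
    assert isinstance(mounted, CustomField)
    assert mounted.type == String
    assert mounted.metadata == {"hey": "yo!"}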

View File

@ -11,7 +11,7 @@ from ..structures import NonNull
def test_generate_mutation_no_args():
class MyMutation(Mutation):
'''Documentation'''
"""Documentation"""
def mutate(self, info, **args):
return args
@ -19,24 +19,23 @@ def test_generate_mutation_no_args():
assert issubclass(MyMutation, ObjectType)
assert MyMutation._meta.name == "MyMutation"
assert MyMutation._meta.description == "Documentation"
resolved = MyMutation.Field().resolver(None, None, name='Peter')
assert resolved == {'name': 'Peter'}
resolved = MyMutation.Field().resolver(None, None, name="Peter")
assert resolved == {"name": "Peter"}
def test_generate_mutation_with_meta():
class MyMutation(Mutation):
class Meta:
name = 'MyOtherMutation'
description = 'Documentation'
name = "MyOtherMutation"
description = "Documentation"
def mutate(self, info, **args):
return args
assert MyMutation._meta.name == "MyOtherMutation"
assert MyMutation._meta.description == "Documentation"
resolved = MyMutation.Field().resolver(None, None, name='Peter')
assert resolved == {'name': 'Peter'}
resolved = MyMutation.Field().resolver(None, None, name="Peter")
assert resolved == {"name": "Peter"}
def test_mutation_raises_exception_if_no_mutate():
@ -45,8 +44,7 @@ def test_mutation_raises_exception_if_no_mutate():
class MyMutation(Mutation):
pass
assert "All mutations must define a mutate method in it" == str(
excinfo.value)
assert "All mutations must define a mutate method in it" == str(excinfo.value)
def test_mutation_custom_output_type():
@ -54,7 +52,6 @@ def test_mutation_custom_output_type():
name = String()
class CreateUser(Mutation):
class Arguments:
name = String()
@ -65,15 +62,14 @@ def test_mutation_custom_output_type():
field = CreateUser.Field()
assert field.type == User
assert field.args == {'name': Argument(String)}
resolved = field.resolver(None, None, name='Peter')
assert field.args == {"name": Argument(String)}
resolved = field.resolver(None, None, name="Peter")
assert isinstance(resolved, User)
assert resolved.name == 'Peter'
assert resolved.name == "Peter"
def test_mutation_execution():
class CreateUser(Mutation):
class Arguments:
name = String()
dynamic = Dynamic(lambda: String())
@ -92,20 +88,17 @@ def test_mutation_execution():
create_user = CreateUser.Field()
schema = Schema(query=Query, mutation=MyMutation)
result = schema.execute(''' mutation mymutation {
result = schema.execute(
""" mutation mymutation {
createUser(name:"Peter", dynamic: "dynamic") {
name
dynamic
}
}
''')
"""
)
assert not result.errors
assert result.data == {
'createUser': {
'name': 'Peter',
'dynamic': 'dynamic',
}
}
assert result.data == {"createUser": {"name": "Peter", "dynamic": "dynamic"}}
def test_mutation_no_fields_output():
@ -122,23 +115,20 @@ def test_mutation_no_fields_output():
create_user = CreateUser.Field()
schema = Schema(query=Query, mutation=MyMutation)
result = schema.execute(''' mutation mymutation {
result = schema.execute(
""" mutation mymutation {
createUser {
name
}
}
''')
"""
)
assert not result.errors
assert result.data == {
'createUser': {
'name': None,
}
}
assert result.data == {"createUser": {"name": None}}
def test_mutation_allow_to_have_custom_args():
class CreateUser(Mutation):
class Arguments:
name = String()
@ -149,12 +139,75 @@ def test_mutation_allow_to_have_custom_args():
class MyMutation(ObjectType):
create_user = CreateUser.Field(
description='Create a user',
deprecation_reason='Is deprecated',
required=True
name="createUser",
description="Create a user",
deprecation_reason="Is deprecated",
required=True,
)
field = MyMutation._meta.fields['create_user']
assert field.description == 'Create a user'
assert field.deprecation_reason == 'Is deprecated'
field = MyMutation._meta.fields["create_user"]
assert field.name == "createUser"
assert field.description == "Create a user"
assert field.deprecation_reason == "Is deprecated"
assert field.type == NonNull(CreateUser)
def test_mutation_default_args_output():
class CreateUser(Mutation):
"""Description."""
class Arguments:
name = String()
name = String()
def mutate(self, info, name):
return CreateUser(name=name)
class MyMutation(ObjectType):
create_user = CreateUser.Field()
field = MyMutation._meta.fields["create_user"]
assert field.name is None
assert field.description == "Description."
assert field.deprecation_reason is None
assert field.type == CreateUser
def test_mutation_as_subclass():
class BaseCreateUser(Mutation):
class Arguments:
name = String()
name = String()
def mutate(self, info, **args):
return args
class CreateUserWithPlanet(BaseCreateUser):
class Arguments(BaseCreateUser.Arguments):
planet = String()
planet = String()
def mutate(self, info, **args):
return CreateUserWithPlanet(**args)
class MyMutation(ObjectType):
create_user_with_planet = CreateUserWithPlanet.Field()
class Query(ObjectType):
a = String()
schema = Schema(query=Query, mutation=MyMutation)
result = schema.execute(
""" mutation mymutation {
createUserWithPlanet(name:"Peter", planet: "earth") {
name
planet
}
}
"""
)
assert not result.errors
assert result.data == {"createUserWithPlanet": {"name": "Peter", "planet": "earth"}}

View File

@ -23,42 +23,42 @@ class MyInterface(Interface):
class ContainerWithInterface(ObjectType):
class Meta:
interfaces = (MyInterface, )
interfaces = (MyInterface,)
field1 = Field(MyType)
field2 = Field(MyType)
class MyScalar(UnmountedType):
def get_type(self):
return MyType
def test_generate_objecttype():
class MyObjectType(ObjectType):
'''Documentation'''
"""Documentation"""
assert MyObjectType._meta.name == "MyObjectType"
assert MyObjectType._meta.description == "Documentation"
assert MyObjectType._meta.interfaces == tuple()
assert MyObjectType._meta.fields == {}
assert repr(
MyObjectType) == "<MyObjectType meta=<ObjectTypeOptions name='MyObjectType'>>"
assert (
repr(MyObjectType)
== "<MyObjectType meta=<ObjectTypeOptions name='MyObjectType'>>"
)
def test_generate_objecttype_with_meta():
class MyObjectType(ObjectType):
class Meta:
name = 'MyOtherObjectType'
description = 'Documentation'
interfaces = (MyType, )
name = "MyOtherObjectType"
description = "Documentation"
interfaces = (MyType,)
assert MyObjectType._meta.name == "MyOtherObjectType"
assert MyObjectType._meta.description == "Documentation"
assert MyObjectType._meta.interfaces == (MyType, )
assert MyObjectType._meta.interfaces == (MyType,)
def test_generate_lazy_objecttype():
@ -69,7 +69,7 @@ def test_generate_lazy_objecttype():
field = Field(MyType)
assert MyObjectType._meta.name == "MyObjectType"
example_field = MyObjectType._meta.fields['example']
example_field = MyObjectType._meta.fields["example"]
assert isinstance(example_field.type, NonNull)
assert example_field.type.of_type == InnerObjectType
@ -78,21 +78,21 @@ def test_generate_objecttype_with_fields():
class MyObjectType(ObjectType):
field = Field(MyType)
assert 'field' in MyObjectType._meta.fields
assert "field" in MyObjectType._meta.fields
def test_generate_objecttype_with_private_attributes():
class MyObjectType(ObjectType):
_private_state = None
assert '_private_state' not in MyObjectType._meta.fields
assert hasattr(MyObjectType, '_private_state')
assert "_private_state" not in MyObjectType._meta.fields
assert hasattr(MyObjectType, "_private_state")
m = MyObjectType(_private_state='custom')
assert m._private_state == 'custom'
m = MyObjectType(_private_state="custom")
assert m._private_state == "custom"
with pytest.raises(TypeError):
MyObjectType(_other_private_state='Wrong')
MyObjectType(_other_private_state="Wrong")
def test_ordered_fields_in_objecttype():
@ -102,7 +102,7 @@ def test_ordered_fields_in_objecttype():
field = MyScalar()
asa = Field(MyType)
assert list(MyObjectType._meta.fields.keys()) == ['b', 'a', 'field', 'asa']
assert list(MyObjectType._meta.fields.keys()) == ["b", "a", "field", "asa"]
def test_generate_objecttype_inherit_abstracttype():
@ -115,9 +115,8 @@ def test_generate_objecttype_inherit_abstracttype():
assert MyObjectType._meta.description is None
assert MyObjectType._meta.interfaces == ()
assert MyObjectType._meta.name == "MyObjectType"
assert list(MyObjectType._meta.fields.keys()) == ['field1', 'field2']
assert list(map(type, MyObjectType._meta.fields.values())) == [
Field, Field]
assert list(MyObjectType._meta.fields.keys()) == ["field1", "field2"]
assert list(map(type, MyObjectType._meta.fields.values())) == [Field, Field]
def test_generate_objecttype_inherit_abstracttype_reversed():
@ -130,26 +129,28 @@ def test_generate_objecttype_inherit_abstracttype_reversed():
assert MyObjectType._meta.description is None
assert MyObjectType._meta.interfaces == ()
assert MyObjectType._meta.name == "MyObjectType"
assert list(MyObjectType._meta.fields.keys()) == ['field1', 'field2']
assert list(map(type, MyObjectType._meta.fields.values())) == [
Field, Field]
assert list(MyObjectType._meta.fields.keys()) == ["field1", "field2"]
assert list(map(type, MyObjectType._meta.fields.values())) == [Field, Field]
def test_generate_objecttype_unmountedtype():
class MyObjectType(ObjectType):
field = MyScalar()
assert 'field' in MyObjectType._meta.fields
assert isinstance(MyObjectType._meta.fields['field'], Field)
assert "field" in MyObjectType._meta.fields
assert isinstance(MyObjectType._meta.fields["field"], Field)
def test_parent_container_get_fields():
assert list(Container._meta.fields.keys()) == ['field1', 'field2']
assert list(Container._meta.fields.keys()) == ["field1", "field2"]
def test_parent_container_interface_get_fields():
assert list(ContainerWithInterface._meta.fields.keys()) == [
'ifield', 'field1', 'field2']
"ifield",
"field1",
"field2",
]
def test_objecttype_as_container_only_args():
@ -187,49 +188,49 @@ def test_objecttype_as_container_invalid_kwargs():
Container(unexisting_field="3")
assert "'unexisting_field' is an invalid keyword argument for Container" == str(
excinfo.value)
excinfo.value
)
def test_objecttype_container_benchmark(benchmark):
@benchmark
def create_objecttype():
Container(field1='field1', field2='field2')
Container(field1="field1", field2="field2")
def test_generate_objecttype_description():
class MyObjectType(ObjectType):
'''
"""
Documentation
Documentation line 2
'''
"""
assert MyObjectType._meta.description == "Documentation\n\nDocumentation line 2"
def test_objecttype_with_possible_types():
class MyObjectType(ObjectType):
class Meta:
possible_types = (dict, )
possible_types = (dict,)
assert MyObjectType._meta.possible_types == (dict, )
assert MyObjectType._meta.possible_types == (dict,)
def test_objecttype_with_possible_types_and_is_type_of_should_raise():
with pytest.raises(AssertionError) as excinfo:
class MyObjectType(ObjectType):
class MyObjectType(ObjectType):
class Meta:
possible_types = (dict, )
possible_types = (dict,)
@classmethod
def is_type_of(cls, root, context, info):
return False
assert str(excinfo.value) == (
'MyObjectType.Meta.possible_types will cause type collision with '
'MyObjectType.is_type_of. Please use one or other.'
"MyObjectType.Meta.possible_types will cause type collision with "
"MyObjectType.is_type_of. Please use one or other."
)
@ -244,24 +245,23 @@ def test_objecttype_no_fields_output():
return User()
schema = Schema(query=Query)
result = schema.execute(''' query basequery {
result = schema.execute(
""" query basequery {
user {
name
}
}
''')
"""
)
assert not result.errors
assert result.data == {
'user': {
'name': None,
}
}
assert result.data == {"user": {"name": None}}
def test_abstract_objecttype_can_str():
class MyObjectType(ObjectType):
class Meta:
abstract = True
field = MyScalar()
assert str(MyObjectType) == "MyObjectType"

View File

@ -18,13 +18,13 @@ from ..union import Union
def test_query():
class Query(ObjectType):
hello = String(resolver=lambda *_: 'World')
hello = String(resolver=lambda *_: "World")
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello }')
executed = hello_schema.execute("{ hello }")
assert not executed.errors
assert executed.data == {'hello': 'World'}
assert executed.data == {"hello": "World"}
def test_query_source():
@ -39,9 +39,9 @@ def test_query_source():
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello }', Root())
executed = hello_schema.execute("{ hello }", Root())
assert not executed.errors
assert executed.data == {'hello': 'World'}
assert executed.data == {"hello": "World"}
def test_query_union():
@ -66,7 +66,6 @@ def test_query_union():
return isinstance(root, two_object)
class MyUnion(Union):
class Meta:
types = (One, Two)
@ -78,15 +77,9 @@ def test_query_union():
hello_schema = Schema(Query)
executed = hello_schema.execute('{ unions { __typename } }')
executed = hello_schema.execute("{ unions { __typename } }")
assert not executed.errors
assert executed.data == {
'unions': [{
'__typename': 'One'
}, {
'__typename': 'Two'
}]
}
assert executed.data == {"unions": [{"__typename": "One"}, {"__typename": "Two"}]}
def test_query_interface():
@ -100,9 +93,8 @@ def test_query_interface():
base = String()
class One(ObjectType):
class Meta:
interfaces = (MyInterface, )
interfaces = (MyInterface,)
one = String()
@ -111,9 +103,8 @@ def test_query_interface():
return isinstance(root, one_object)
class Two(ObjectType):
class Meta:
interfaces = (MyInterface, )
interfaces = (MyInterface,)
two = String()
@ -129,30 +120,28 @@ def test_query_interface():
hello_schema = Schema(Query, types=[One, Two])
executed = hello_schema.execute('{ interfaces { __typename } }')
executed = hello_schema.execute("{ interfaces { __typename } }")
assert not executed.errors
assert executed.data == {
'interfaces': [{
'__typename': 'One'
}, {
'__typename': 'Two'
}]
"interfaces": [{"__typename": "One"}, {"__typename": "Two"}]
}
def test_query_dynamic():
class Query(ObjectType):
hello = Dynamic(lambda: String(resolver=lambda *_: 'World'))
hellos = Dynamic(lambda: List(String, resolver=lambda *_: ['Worlds']))
hello_field = Dynamic(lambda: Field(
String, resolver=lambda *_: 'Field World'))
hello = Dynamic(lambda: String(resolver=lambda *_: "World"))
hellos = Dynamic(lambda: List(String, resolver=lambda *_: ["Worlds"]))
hello_field = Dynamic(lambda: Field(String, resolver=lambda *_: "Field World"))
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello hellos helloField }')
executed = hello_schema.execute("{ hello hellos helloField }")
assert not executed.errors
assert executed.data == {'hello': 'World', 'hellos': [
'Worlds'], 'helloField': 'Field World'}
assert executed.data == {
"hello": "World",
"hellos": ["Worlds"],
"helloField": "Field World",
}
def test_query_default_value():
@ -160,13 +149,13 @@ def test_query_default_value():
field = String()
class Query(ObjectType):
hello = Field(MyType, default_value=MyType(field='something else!'))
hello = Field(MyType, default_value=MyType(field="something else!"))
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello { field } }')
executed = hello_schema.execute("{ hello { field } }")
assert not executed.errors
assert executed.data == {'hello': {'field': 'something else!'}}
assert executed.data == {"hello": {"field": "something else!"}}
def test_query_wrong_default_value():
@ -178,15 +167,17 @@ def test_query_wrong_default_value():
return isinstance(root, MyType)
class Query(ObjectType):
hello = Field(MyType, default_value='hello')
hello = Field(MyType, default_value="hello")
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello { field } }')
executed = hello_schema.execute("{ hello { field } }")
assert len(executed.errors) == 1
assert executed.errors[0].message == GraphQLError(
'Expected value of type "MyType" but got: str.').message
assert executed.data == {'hello': None}
assert (
executed.errors[0].message
== GraphQLError('Expected value of type "MyType" but got: str.').message
)
assert executed.data == {"hello": None}
def test_query_default_value_ignored_by_resolver():
@ -194,14 +185,17 @@ def test_query_default_value_ignored_by_resolver():
field = String()
class Query(ObjectType):
hello = Field(MyType, default_value='hello',
resolver=lambda *_: MyType(field='no default.'))
hello = Field(
MyType,
default_value="hello",
resolver=lambda *_: MyType(field="no default."),
)
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello { field } }')
executed = hello_schema.execute("{ hello { field } }")
assert not executed.errors
assert executed.data == {'hello': {'field': 'no default.'}}
assert executed.data == {"hello": {"field": "no default."}}
def test_query_resolve_function():
@ -209,13 +203,13 @@ def test_query_resolve_function():
hello = String()
def resolve_hello(self, info):
return 'World'
return "World"
hello_schema = Schema(Query)
executed = hello_schema.execute('{ hello }')
executed = hello_schema.execute("{ hello }")
assert not executed.errors
assert executed.data == {'hello': 'World'}
assert executed.data == {"hello": "World"}
def test_query_arguments():
@ -223,24 +217,23 @@ def test_query_arguments():
test = String(a_str=String(), a_int=Int())
def resolve_test(self, info, **args):
return json.dumps([self, args], separators=(',', ':'))
return json.dumps([self, args], separators=(",", ":"))
test_schema = Schema(Query)
result = test_schema.execute('{ test }', None)
result = test_schema.execute("{ test }", None)
assert not result.errors
assert result.data == {'test': '[null,{}]'}
assert result.data == {"test": "[null,{}]"}
result = test_schema.execute('{ test(aStr: "String!") }', 'Source!')
result = test_schema.execute('{ test(aStr: "String!") }', "Source!")
assert not result.errors
assert result.data == {'test': '["Source!",{"a_str":"String!"}]'}
assert result.data == {"test": '["Source!",{"a_str":"String!"}]'}
result = test_schema.execute(
'{ test(aInt: -123, aStr: "String!") }', 'Source!')
result = test_schema.execute('{ test(aInt: -123, aStr: "String!") }', "Source!")
assert not result.errors
assert result.data in [
{'test': '["Source!",{"a_str":"String!","a_int":-123}]'},
{'test': '["Source!",{"a_int":-123,"a_str":"String!"}]'}
{"test": '["Source!",{"a_str":"String!","a_int":-123}]'},
{"test": '["Source!",{"a_int":-123,"a_str":"String!"}]'},
]
@ -253,25 +246,25 @@ def test_query_input_field():
test = String(a_input=Input())
def resolve_test(self, info, **args):
return json.dumps([self, args], separators=(',', ':'))
return json.dumps([self, args], separators=(",", ":"))
test_schema = Schema(Query)
result = test_schema.execute('{ test }', None)
result = test_schema.execute("{ test }", None)
assert not result.errors
assert result.data == {'test': '[null,{}]'}
assert result.data == {"test": "[null,{}]"}
result = test_schema.execute('{ test(aInput: {aField: "String!"} ) }', "Source!")
assert not result.errors
assert result.data == {"test": '["Source!",{"a_input":{"a_field":"String!"}}]'}
result = test_schema.execute(
'{ test(aInput: {aField: "String!"} ) }', 'Source!')
'{ test(aInput: {recursiveField: {aField: "String!"}}) }', "Source!"
)
assert not result.errors
assert result.data == {
'test': '["Source!",{"a_input":{"a_field":"String!"}}]'}
result = test_schema.execute(
'{ test(aInput: {recursiveField: {aField: "String!"}}) }', 'Source!')
assert not result.errors
assert result.data == {
'test': '["Source!",{"a_input":{"recursive_field":{"a_field":"String!"}}}]'}
"test": '["Source!",{"a_input":{"recursive_field":{"a_field":"String!"}}}]'
}
def test_query_middlewares():
@ -280,10 +273,10 @@ def test_query_middlewares():
other = String()
def resolve_hello(self, info):
return 'World'
return "World"
def resolve_other(self, info):
return 'other'
return "other"
def reversed_middleware(next, *args, **kwargs):
p = next(*args, **kwargs)
@ -292,14 +285,14 @@ def test_query_middlewares():
hello_schema = Schema(Query)
executed = hello_schema.execute(
'{ hello, other }', middleware=[reversed_middleware])
"{ hello, other }", middleware=[reversed_middleware]
)
assert not executed.errors
assert executed.data == {'hello': 'dlroW', 'other': 'rehto'}
assert executed.data == {"hello": "dlroW", "other": "rehto"}
def test_objecttype_on_instances():
class Ship:
def __init__(self, name):
self.name = name
@ -314,12 +307,12 @@ def test_objecttype_on_instances():
ship = Field(ShipType)
def resolve_ship(self, info):
return Ship(name='xwing')
return Ship(name="xwing")
schema = Schema(query=Query)
executed = schema.execute('{ ship { name } }')
executed = schema.execute("{ ship { name } }")
assert not executed.errors
assert executed.data == {'ship': {'name': 'xwing'}}
assert executed.data == {"ship": {"name": "xwing"}}
def test_big_list_query_benchmark(benchmark):
@ -333,10 +326,10 @@ def test_big_list_query_benchmark(benchmark):
hello_schema = Schema(Query)
big_list_query = partial(hello_schema.execute, '{ allInts }')
big_list_query = partial(hello_schema.execute, "{ allInts }")
result = benchmark(big_list_query)
assert not result.errors
assert result.data == {'allInts': list(big_list)}
assert result.data == {"allInts": list(big_list)}
def test_big_list_query_compiled_query_benchmark(benchmark):
@ -349,13 +342,13 @@ def test_big_list_query_compiled_query_benchmark(benchmark):
return big_list
hello_schema = Schema(Query)
source = Source('{ allInts }')
source = Source("{ allInts }")
query_ast = parse(source)
big_list_query = partial(execute, hello_schema, query_ast)
result = benchmark(big_list_query)
assert not result.errors
assert result.data == {'allInts': list(big_list)}
assert result.data == {"allInts": list(big_list)}
def test_big_list_of_containers_query_benchmark(benchmark):
@ -372,11 +365,10 @@ def test_big_list_of_containers_query_benchmark(benchmark):
hello_schema = Schema(Query)
big_list_query = partial(hello_schema.execute, '{ allContainers { x } }')
big_list_query = partial(hello_schema.execute, "{ allContainers { x } }")
result = benchmark(big_list_query)
assert not result.errors
assert result.data == {'allContainers': [
{'x': c.x} for c in big_container_list]}
assert result.data == {"allContainers": [{"x": c.x} for c in big_container_list]}
def test_big_list_of_containers_multiple_fields_query_benchmark(benchmark):
@ -396,15 +388,19 @@ def test_big_list_of_containers_multiple_fields_query_benchmark(benchmark):
hello_schema = Schema(Query)
big_list_query = partial(hello_schema.execute,
'{ allContainers { x, y, z, o } }')
big_list_query = partial(hello_schema.execute, "{ allContainers { x, y, z, o } }")
result = benchmark(big_list_query)
assert not result.errors
assert result.data == {'allContainers': [
{'x': c.x, 'y': c.y, 'z': c.z, 'o': c.o} for c in big_container_list]}
assert result.data == {
"allContainers": [
{"x": c.x, "y": c.y, "z": c.z, "o": c.o} for c in big_container_list
]
}
def test_big_list_of_containers_multiple_fields_custom_resolvers_query_benchmark(benchmark):
def test_big_list_of_containers_multiple_fields_custom_resolvers_query_benchmark(
benchmark
):
class Container(ObjectType):
x = Int()
y = Int()
@ -433,12 +429,14 @@ def test_big_list_of_containers_multiple_fields_custom_resolvers_query_benchmark
hello_schema = Schema(Query)
big_list_query = partial(hello_schema.execute,
'{ allContainers { x, y, z, o } }')
big_list_query = partial(hello_schema.execute, "{ allContainers { x, y, z, o } }")
result = benchmark(big_list_query)
assert not result.errors
assert result.data == {'allContainers': [
{'x': c.x, 'y': c.y, 'z': c.z, 'o': c.o} for c in big_container_list]}
assert result.data == {
"allContainers": [
{"x": c.x, "y": c.y, "z": c.z, "o": c.o} for c in big_container_list
]
}
def test_query_annotated_resolvers():
@ -464,15 +462,15 @@ def test_query_annotated_resolvers():
result = test_schema.execute('{ annotated(id:"self") }', "base")
assert not result.errors
assert result.data == {'annotated': 'base-self'}
assert result.data == {"annotated": "base-self"}
result = test_schema.execute('{ context }', "base", context_value=context)
result = test_schema.execute("{ context }", "base", context=context)
assert not result.errors
assert result.data == {'context': 'base-context'}
assert result.data == {"context": "base-context"}
result = test_schema.execute('{ info }', "base")
result = test_schema.execute("{ info }", "base")
assert not result.errors
assert result.data == {'info': 'base-info'}
assert result.data == {"info": "base-info"}
def test_default_as_kwarg_to_NonNull():
@ -488,7 +486,7 @@ def test_default_as_kwarg_to_NonNull():
return User(name="foo")
schema = Schema(query=Query)
expected = {'user': {'name': 'foo', 'isAdmin': False}}
expected = {"user": {"name": "foo", "isAdmin": False}}
result = schema.execute("{ user { name isAdmin } }")
assert not result.errors
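
An aside, not part of this commit's diff: alongside the quote changes, this file also moves from context_value= to context= when executing a query. A minimal sketch of reading a per-request context in a resolver (the Query type below is hypothetical; the context= keyword is taken from the diff):

    from graphene import ObjectType, Schema, String

    class Query(ObjectType):
        name = String()

        def resolve_name(self, info):
            # The per-request context object is available on the resolve info.
            return info.context.get("name")

    schema = Schema(query=Query)
    result = schema.execute("{ name }", context={"name": "Peter"})  # formerly context_value=
    assert not result.errors
    assert result.data == {"name": "Peter"}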

View File

@ -1,38 +1,40 @@
from ..resolver import (attr_resolver, dict_resolver, get_default_resolver,
set_default_resolver)
from ..resolver import (
attr_resolver,
dict_resolver,
get_default_resolver,
set_default_resolver,
)
args = {}
context = None
info = None
demo_dict = {
'attr': 'value'
}
demo_dict = {"attr": "value"}
class demo_obj(object):
attr = 'value'
attr = "value"
def test_attr_resolver():
resolved = attr_resolver('attr', None, demo_obj, info, **args)
assert resolved == 'value'
resolved = attr_resolver("attr", None, demo_obj, info, **args)
assert resolved == "value"
def test_attr_resolver_default_value():
resolved = attr_resolver('attr2', 'default', demo_obj, info, **args)
assert resolved == 'default'
resolved = attr_resolver("attr2", "default", demo_obj, info, **args)
assert resolved == "default"
def test_dict_resolver():
resolved = dict_resolver('attr', None, demo_dict, info, **args)
assert resolved == 'value'
resolved = dict_resolver("attr", None, demo_dict, info, **args)
assert resolved == "value"
def test_dict_resolver_default_value():
resolved = dict_resolver('attr2', 'default', demo_dict, info, **args)
assert resolved == 'default'
resolved = dict_resolver("attr2", "default", demo_dict, info, **args)
assert resolved == "default"
def test_get_default_resolver_is_attr_resolver():
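
An aside, not part of this commit's diff: the helpers exercised in this file live in graphene.types.resolver and share the signature (attname, default_value, root, info, **args). A short sketch, with a hypothetical Ship class standing in for the root object:

    from graphene.types.resolver import attr_resolver, dict_resolver

    class Ship(object):
        name = "xwing"

    # attr_resolver reads an attribute from the root, dict_resolver reads a key
    # from a root dict; both fall back to the supplied default value.
    assert attr_resolver("name", None, Ship(), None) == "xwing"
    assert attr_resolver("missing", "fallback", Ship(), None) == "fallback"
    assert dict_resolver("name", None, {"name": "xwing"}, None) == "xwing"
    assert dict_resolver("missing", "fallback", {}, None) == "fallback"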

View File

@ -4,7 +4,7 @@ from ..scalars import Scalar
def test_scalar():
class JSONScalar(Scalar):
'''Documentation'''
"""Documentation"""
assert JSONScalar._meta.name == "JSONScalar"
assert JSONScalar._meta.description == "Documentation"

View File

@ -13,8 +13,8 @@ def test_serializes_output_int():
assert Int.serialize(-9876504321) is None
assert Int.serialize(1e100) is None
assert Int.serialize(-1e100) is None
assert Int.serialize('-1.1') == -1
assert Int.serialize('one') is None
assert Int.serialize("-1.1") == -1
assert Int.serialize("one") is None
assert Int.serialize(False) == 0
assert Int.serialize(True) == 1
@ -26,24 +26,24 @@ def test_serializes_output_float():
assert Float.serialize(0.1) == 0.1
assert Float.serialize(1.1) == 1.1
assert Float.serialize(-1.1) == -1.1
assert Float.serialize('-1.1') == -1.1
assert Float.serialize('one') is None
assert Float.serialize("-1.1") == -1.1
assert Float.serialize("one") is None
assert Float.serialize(False) == 0
assert Float.serialize(True) == 1
def test_serializes_output_string():
assert String.serialize('string') == 'string'
assert String.serialize(1) == '1'
assert String.serialize(-1.1) == '-1.1'
assert String.serialize(True) == 'true'
assert String.serialize(False) == 'false'
assert String.serialize(u'\U0001F601') == u'\U0001F601'
assert String.serialize("string") == "string"
assert String.serialize(1) == "1"
assert String.serialize(-1.1) == "-1.1"
assert String.serialize(True) == "true"
assert String.serialize(False) == "false"
assert String.serialize(u"\U0001F601") == u"\U0001F601"
def test_serializes_output_boolean():
assert Boolean.serialize('string') is True
assert Boolean.serialize('') is False
assert Boolean.serialize("string") is True
assert Boolean.serialize("") is False
assert Boolean.serialize(1) is True
assert Boolean.serialize(0) is False
assert Boolean.serialize(True) is True

Some files were not shown because too many files have changed in this diff.