Compare commits

...

113 Commits

Author SHA1 Message Date
Erik Wrede
8290326308
release: 3.4.3 2024-11-09 21:43:17 +01:00
Philipp Hagemeister
4a274b8424
fix: raise proper error when UUID parsing fails (#1582)
* Do not raise AttributeError when parsing non-string UUIDs

When a user sends a dictionary or other object as a UUID variable like `{[123]}`, previously graphene crashed with an `AttributeError`, like this:

```
(…)
  File "…/lib/python3.12/site-packages/graphql/utils/is_valid_value.py", line 78, in is_valid_value
    parse_result = type.parse_value(value)
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "…/lib/python3.12/site-packages/graphene/types/uuid.py", line 33, in parse_value
    return _UUID(value)
           ^^^^^^^^^^^^
  File "/usr/lib/python3.12/uuid.py", line 175, in __init__
    hex = hex.replace('urn:', '').replace('uuid:', '')
          ^^^^^^^^^^^
AttributeError: 'dict' object has no attribute 'replace'
```

But an `AttributeError` makes it seem like this is the server's fault, when it's obviously the client's.

Report a proper GraphQLError.

* fix: adjust exception message structure

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-11-09 21:42:51 +01:00
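The guard this commit describes can be sketched in plain stdlib Python. This is an illustrative helper (hypothetical name `parse_uuid_value`), not graphene's actual code, which wraps the failure in a `GraphQLError`:

```python
from uuid import UUID

def parse_uuid_value(value):
    """Parse a client-supplied UUID, rejecting non-string input up front.

    Sketch of the fix described above: validate the type before handing
    the value to uuid.UUID, so a dict or list yields a clear client-side
    error instead of an AttributeError deep inside the uuid module.
    """
    if not isinstance(value, str):
        raise ValueError(f"UUID cannot represent value: {value!r}")
    return UUID(value)
```

With this shape, a variable like `{[123]}` is rejected with a message naming the offending value, which the GraphQL layer can surface as a client error.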
Erik Wrede
b3db1c0cb2
release: 3.4.2 2024-11-09 18:18:36 +01:00
Muhammed Al-Dulaimi
3ed7bf6362
chore: Make Union meta overridable (#1583)
This PR makes the Union Options configurable, similar to how it works with ObjectTypes
---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-11-09 18:17:42 +01:00
Erik Wrede
ccae7364e5
release: 3.4.1 2024-10-27 21:16:40 +01:00
Erik Wrede
cf97cbb1de
fix: use dateutil-parse for < 3.11 support (#1581)
* fix: use dateutil-parse for < 3.11 support

* chore: lint

* chore: lint

* fix mypy deps

* fix mypy deps

* chore: lint

* chore: fix test
2024-10-27 21:14:55 +01:00
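For context: `datetime.fromisoformat` only accepts the full range of ISO 8601 strings (including a trailing `Z`) from Python 3.11 onward, which is why older interpreters need a fallback. A stdlib-only sketch of the kind of workaround involved — the actual fix in this commit uses `dateutil` instead:

```python
from datetime import datetime

def parse_iso8601(value: str) -> datetime:
    """Parse an ISO 8601 timestamp on Python versions before 3.11.

    Sketch only: pre-3.11 fromisoformat() rejects the 'Z' UTC suffix,
    so rewrite it as an explicit +00:00 offset before parsing.
    """
    try:
        return datetime.fromisoformat(value)
    except ValueError:
        if value.endswith("Z"):
            return datetime.fromisoformat(value[:-1] + "+00:00")
        raise
```

On 3.11+ the `try` branch succeeds directly; on older versions the `Z` case falls through to the rewrite.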
Erik Wrede
dca31dc61d
release: 3.4.0 2024-10-18 13:43:07 +02:00
Erik Wrede
73df50e3dc
housekeeping: switch 3.13 to non-dev 2024-10-18 13:40:31 +02:00
Dulmandakh
821451fddc
CI: bump upload-artifact and codecov actions (#1567)
CI: bump actions/upload-artifact and codecov/codecov-action actions
2024-09-29 15:23:21 +02:00
Dulmandakh
f2e68141fd
CI: build package (#1564) 2024-09-29 13:40:19 +02:00
Dulmandakh
431826814d
lint: use ruff pre commit hook (#1566)
* lint: use ruff pre commit hook

* dont install ruff

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-09-29 13:33:10 +02:00
Dulmandakh
5b3ed2c2ba
bump pre-commit to 3.7 (#1568) 2024-09-29 13:32:26 +02:00
Erik Wrede
f95e9221bb
refactor: replace @deprecated decorator with upcoming native support (via typing-extensions), bump mypy (#1578)
* refactor: replace @deprecated decorator with upcoming native support (via typing-extensions)

* chore: fix tests

* chore: ruff fmt
2024-09-29 13:31:24 +02:00
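The behavior being adopted here (PEP 702's `@deprecated`, shipped today via `typing-extensions`) can be approximated with a stdlib-only decorator. The sketch below is illustrative — not graphene's code, and the decorated function name is a made-up example:

```python
import functools
import warnings

def deprecated(reason: str):
    """Minimal stand-in for typing_extensions.deprecated (PEP 702):
    emit a DeprecationWarning whenever the wrapped callable is invoked."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(f"{func.__name__} is deprecated: {reason}",
                          DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("use the new resolver instead")  # hypothetical example
def resolve_legacy():
    return 42
```

The native version additionally marks the symbol for static type checkers, which a hand-rolled decorator cannot do.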
Florian Zimmermann
48678afba4
fix: run the tests in python 3.12 and 3.13 and remove snapshottest dependency (#1572)
* actually run the tests in python 3.12 and 3.13

* remove snapshottest from the example tests

so that the tests pass in 3.12 and 3.13 again

* remove the section about snapshot testing from the testing docs

because the snapshottest package doesn't work on Python 3.12 and above

* fix assertion for badly formed JSON input on Python 3.13

* fix deprecation warning about datetime.utcfromtimestamp()
2024-08-08 11:49:26 +02:00
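One of the fixes above concerns `datetime.utcfromtimestamp()`, deprecated since Python 3.12 because it returns a naive datetime. The replacement is the timezone-aware form:

```python
from datetime import datetime, timezone

# Deprecated since Python 3.12 (returns a *naive* datetime):
#   datetime.utcfromtimestamp(0)
# Preferred, timezone-aware equivalent:
epoch = datetime.fromtimestamp(0, tz=timezone.utc)

assert epoch.year == 1970 and epoch.tzinfo is timezone.utc
```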
Dulmandakh
dc3b2e49c1
CI: fix tests on Python 3.13 (#1562) 2024-07-01 17:03:49 +02:00
Dulmandakh
d53a102b08
Lint using Ruff (#1563)
* lint using Ruff

* remove isort config, flake8 comments
2024-07-01 17:03:13 +02:00
Dulmandakh
fd9ecef36e
CI: format check using Ruff (#1557)
* CI: format check using Ruff

* precommit, setup py

* gitignore ruff_cache

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-06-28 15:05:04 +02:00
Dulmandakh
1263e9b41e
pytest 8 (#1549)
* pytest 8

* bump coveralls, pytest-cov

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-06-28 15:04:25 +02:00
Dulmandakh
74b33ae148
remove README.rst, leave only README.md (#1559)
remove README.rst
2024-06-28 15:03:48 +02:00
Dulmandakh
6834385786
support python 3.13 (#1561) 2024-06-28 15:03:34 +02:00
Dulmandakh
c335c5f529
fix lint error in SECURITY.md (#1556)
fix lint SECURITY.md
2024-06-23 18:24:34 +02:00
Erik Wrede
d90d65cafe
chore: adjust incorrect development status 2024-06-22 12:31:14 +02:00
Dulmandakh
5924cc4150
remove Python 2 (#1547)
Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-06-13 16:52:06 +02:00
Erik Wrede
6a668514de
docs: create security.md (#1554) 2024-06-13 16:51:43 +02:00
tijuca
88c3ec539b
pytest: Don't use nose like syntax in graphene/relay/tests/test_custom_global_id.py (#1539) (#1540)
pytest: Don't use nose like syntax

The tests in test_custom_global_id.py use the old nose specific method
'setup(self)' which isn't supported anymore in Pytest 8+. The tests fail
with this error message without modification.

E               pytest.PytestRemovedIn8Warning: Support for nose tests is deprecated and will be removed in a future release.
E               graphene/relay/tests/test_custom_global_id.py::TestIncompleteCustomGlobalID::test_must_define_resolve_global_id is using nose-specific method: `setup(self)`
E               To remove this warning, rename it to `setup_method(self)`
E               See docs: https://docs.pytest.org/en/stable/deprecations.html#support-for-tests-written-for-nose

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-06-13 14:38:48 +00:00
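The rename pytest asks for is mechanical; a minimal before/after sketch (illustrative class and attribute names, not the actual test file):

```python
class TestCustomGlobalID:
    # Old, nose-style hook -- no longer supported in pytest 8+:
    # def setup(self):
    #     self.value = 42

    # New, pytest-native hook, called before every test method:
    def setup_method(self, method):
        self.value = 42

    def test_value(self):
        assert self.value == 42
```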
Dulmandakh
17d09c8ded
remove aniso8601, mock, iso8601 (#1548)
* remove aniso8601

* remove mock, iso8601

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2024-06-13 16:35:00 +02:00
Dulmandakh
614449e651
Python 3.12 (#1550)
* python 3.12

* update classifiers
2024-06-13 14:34:16 +00:00
Dulmandakh
44dcdad182
CI: fix deprecation warning (#1551) 2024-06-13 16:32:50 +02:00
Dulmandakh
221afaf4c4
bump pytest to 7 (#1546)
* bump pytest

* downgrade pytest-cov
2024-05-16 10:17:46 +02:00
Dulmandakh
5db1af039f
Remove Python 3.7 (#1543)
* CI: add Python 3.12

* dd

* remove python 3.12
2024-05-16 10:17:26 +02:00
Dulmandakh
82d0a68a81
remove polyfill for dataclasses (#1545)
* remove polyfill for dataclasses

* fix lint
2024-05-16 10:09:37 +02:00
Dulmandakh
3cd0c30de8
CI: bump GH actions (#1544) 2024-05-16 10:09:19 +02:00
Andrew Swait
5fb7b54377
docs: update docstring for type arg of Field (#1527) 2023-10-06 22:15:26 +02:00
wongcht
baaef0d21a
chore: remove pytz (#1520) 2023-08-30 23:41:17 +02:00
Erik Wrede
93cb33d359
housekeeping: delete outdated ROADMAP.md 2023-07-26 09:43:40 +02:00
Erik Wrede
f5aba2c027
release: 3.3.0 2023-07-26 08:26:30 +02:00
garo (they/them)
ea7ccc350e
feat(relay): add option for strict connection types (#1504)
* types: add option for strict connection types

* chore: appease linter

* chore: appease linter

* test: add test

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-07-26 08:25:57 +02:00
Dulmandakh
6b8cd2dc78
ci: drop python 3.6 (#1507)
Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-07-19 09:08:19 +02:00
Naoya Yamashita
74db349da4
docs: add get_human function (#1380)
Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-07-19 09:01:00 +02:00
Ransom Williams
99f0103e37
test: print schema with InputObjectType with DateTime field with default_value (#1293) (#1513)
* test [1293]: regression test print schema with InputObjectType with DateTime field with default_value

* chore: clarify test title and assertion

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-07-19 07:00:30 +00:00
Erik Wrede
03cf2e131e
chore: remove travis ci link 2023-06-06 20:45:01 +02:00
Jeongseok Kang
d77d0b0571
chore: Use typing.TYPE_CHECKING instead of MYPY (#1503)
Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-06-04 23:49:26 +02:00
senseysensor
c636d984c6
fix: Corrected enum metaclass to fix pickle.dumps() (#1495)
* Corrected enum metaclass to fix pickle.dumps()

* considered case with colliding class names (try to distinguish by file name)

* reverted simple solution back (without attempt to support duplicate Enum class names)

---------

Co-authored-by: sgrekov <sgrekov@lohika.com>
Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-06-04 23:10:05 +02:00
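The invariant behind this fix: pickle stores an Enum member as a reference to its class's module and qualified name, then looks the member up again on load — so a metaclass that mangles either name breaks the round trip. A plain-Python illustration of the round trip that was failing (standard `enum`, not graphene's type):

```python
import pickle
from enum import Enum

class Color(Enum):
    RED = 1
    GREEN = 2

# pickle serializes Color.RED by module + qualname reference;
# loading re-imports the class and returns the identical member.
restored = pickle.loads(pickle.dumps(Color.RED))
assert restored is Color.RED
```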
Cadu
2da8e9db5c
feat: Enable use of Undefined in InputObjectTypes (#1506)
* Changed InputObjectType's default builder-from-dict argument to be `Undefined` instead of `None`, removing ambiguity of undefined optional inputs using dot notation access syntax.

* Move `set_default_input_object_type_to_undefined()` fixture into conftest.py for sharing it between multiple test files.
2023-06-04 23:01:05 +02:00
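The ambiguity this change removes can be shown with a hand-rolled sentinel. The snippet models the idea only; graphene actually reuses `graphql.Undefined` rather than defining its own:

```python
class _Undefined:
    """Sentinel distinguishing 'field not provided' from an explicit None."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __bool__(self):
        return False

    def __repr__(self):
        return "Undefined"

Undefined = _Undefined()

def read_input(data: dict, key: str):
    # With None as the default, {'age': None} and {} are indistinguishable.
    # An Undefined default keeps the two cases apart under dot/get access.
    return data.get(key, Undefined)

assert read_input({"age": None}, "age") is None   # explicitly null
assert read_input({}, "age") is Undefined         # omitted entirely
```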
Firas Kafri
8ede21e063
chore: default enum description to "An enumeration." (#1502)
* Default enum description to "An enumeration."

default to this string, which is used in many tests, is causing

* Use the docstring descriptions of enums when they are present

* Added tests

* chore: add missing newline

* Fix new line

---------

Co-authored-by: Erik Wrede <erikwrede@users.noreply.github.com>
2023-05-25 12:21:55 +02:00
Erik Wrede
57cbef6666
release: 3.2.2 2023-03-13 21:24:16 +01:00
Erik Wrede
d33e38a391
chore: make relay type fields extendable (#1499) 2023-03-13 21:23:28 +01:00
Roman Solomatin
b76e89c0c2
docs: remove unpair bracket (#1500) 2023-03-09 10:09:15 +01:00
Erik Wrede
81e7eee5da
Update README.md 2023-03-03 17:35:46 +01:00
Erik Wrede
969a630541
Update README.md 2023-03-03 17:35:05 +01:00
QuentinN42
8b89afeff1
docs: update sphinx to the latest version (#1497) 2023-02-28 13:21:45 +01:00
Peder Johnsen
52143473ef
docs: Remove prerelease notice (#1487) 2022-12-25 22:59:05 +01:00
Pei-Lun H
8eb2807ce5
docs: Correct the module name of custom scalar example in documentation (#1486) 2022-12-23 07:57:45 +01:00
Erik Wrede
340d5ed12f
release: 3.2.1 2022-12-11 21:05:25 +01:00
Vladyslav Hutov
19ea63b9c5
fix: Input fields and Arguments can now be deprecated (#1472)
Non-required InputFields and arguments now support deprecation via setting the `deprecation_reason` argument upon creation.
2022-12-10 12:25:07 +01:00
Erik Wrede
d5dadb7b1b
release: 3.2.0
fixes previous release number 3.1.2 due to a pending feature release
2022-12-09 10:53:50 +01:00
Erik Wrede
8596349405
release: 3.1.2 2022-12-09 10:46:24 +01:00
Mike Roberts
a141e848c3
Do not interpret Enum members called 'description' as description properties (#1478)
This is a workaround for `TypeError`s being raised when initialising schemas with
Enum members named `description` or `deprecation_reason`.


Fixes #1321
2022-12-01 11:06:24 +01:00
Erik Wrede
f09b2e5a81
housekeeping: pin ubuntu to 20.04 for python 3.6
Ubuntu:latest doesn't include py36 anymore. Keep this until we add 3.11 and drop 3.6.
See:
https://github.com/actions/setup-python/issues/544
https://github.com/rwth-i6/returnn/issues/1226
2022-11-21 15:40:05 +01:00
Mike Roberts
7f6fa16194
feat_ (#1476)
Previously, installing graphene and trying to do `from graphene.test import Client`
as recommended in the docs caused an `ImportError`, as the 'promise' library
is imported but only listed as a requirement in the 'test' section of the setup.py
file.
2022-11-16 21:38:15 +01:00
Rens Groothuijsen
0b1bfbf65b
chore: Make Graphene enums iterable like Python enums (#1473)
* Makes Graphene enums iterable like Python enums by implementing __iter__
2022-11-16 21:30:49 +01:00
Rens Groothuijsen
f891a3683d
docs: Disambiguate argument name in quickstart docs (#1474) 2022-11-16 21:27:34 +01:00
Erik Wrede
a2b63d8d84
fix: MyPy findings due to a mypy version upgrade were corrected (#1477) 2022-11-16 21:23:37 +01:00
Rens Groothuijsen
b349632a82
Clarify execution order in middleware docs (#1475) 2022-11-15 08:48:48 +01:00
Kevin Le
ccdd35b354
hashable Enum (#1461) 2022-10-27 13:55:38 +02:00
Kristian Uzhca
6969023491
Add copy function for GrapheneGraphQLType (#1463) 2022-10-24 20:06:24 +02:00
Thomas Leonard
ee1ff975d7
feat: Add support for custom global (Issue #1276) (#1428)
Co-authored-by: Thomas Leonard <thomas@loftorbital.com>
2022-09-19 10:17:31 +02:00
Erik Wrede
b20bbdcdf7
v3.1.1 2022-09-08 10:55:05 +02:00
Cadu
694c1db21e
Vendor DataLoader from aiodataloader and move get_event_loop() out of __init__ function. (#1459)
* Vendor DataLoader from aiodataloader and also move the get_event_loop behavior from `__init__` to a property which only gets resolved when actually needed (this solves the PyTest-related too-early get_event_loop() issues)

* Added DataLoader's specific tests

* plug `loop` parameter into `self._loop`, so that we still have the ability to pass in a custom event loop, if needed.


Co-authored-by: Erik Wrede <erikwrede2@gmail.com>
2022-09-07 20:32:53 +02:00
Erik Wrede
20219fdc1b
Update README.md
Update
2022-09-06 13:42:38 +02:00
Christian Clauss
45986b18e7
Upgrade GitHub Actions (#1457) 2022-08-28 20:25:55 +02:00
Ülgen Sarıkavak
c5ccc9502d
Upgrade base Python version to 3.10 (#1449) 2022-08-28 17:33:35 +02:00
Erik Wrede
35c281a3cd
Fix BigInt export (#1456) 2022-08-28 17:30:26 +02:00
Ülgen Sarıkavak
355601bd5c
Remove duplicate flake8 call in tox, it's covered by pre-commit (#1448) 2022-08-27 18:13:48 +02:00
Christian Clauss
cbf59a88ad
Add Python 3.11 release candidate 1 to the testing (#1450)
* Add Python 3.11 release candidate 1 to the testing

https://www.python.org/download/pre-releases

* Update tests.yml
2022-08-27 18:06:38 +02:00
Erik Wrede
24059a8b40
Merge pull request #1370 from DrewHoo/fix/ariadne-link 2022-08-20 14:59:22 +02:00
Erik Wrede
d96ec55abb
Merge branch 'master' into fix/ariadne-link 2022-08-20 14:59:02 +02:00
Erik Wrede
f1e8f4862a
Merge pull request #1447 from ulgens/update_precommit 2022-08-19 08:57:13 +02:00
Ülgen Sarıkavak
e6429c3c5b Update pre-commit hooks 2022-08-19 09:20:51 +03:00
Erik Wrede
ed1290aaba
Merge pull request #1407 from alimcmaster1/patch-1
Update quickstart.rst
2022-08-18 09:41:32 +02:00
Erik Wrede
84faf8f57c
Merge pull request #1444 from karmingc/karmingc/readme-typo
fix: use install instead of instal for consistency
2022-08-18 09:39:32 +02:00
karming
0ac4d9397e fix: use install instead of instal for consistency 2022-08-16 19:21:29 -04:00
Erik Wrede
023452a09f
Merge pull request #1442 from graphql-python/erikwrede-patch-1
Delete coveralls.yml
2022-08-14 12:42:55 +02:00
Erik Wrede
6339f489e9
Delete coveralls.yml
We are now using Codecov
2022-08-14 12:09:07 +02:00
Erik Wrede
23ca978918
Merge pull request #1414 from loft-orbital/issue-#1413_fix-invalid-input-type
Providing an invalid value to an input type will now raise an exception.
For example, if the input has the type `UUID` and you provide the value `2`, it will now fail.
2022-08-14 11:55:45 +02:00
Erik Wrede
97abb9db42
Merge pull request #1381 from belkka/patch-1
Avoid ambiguity in graphene.Mutation docstring [documentation]
2022-08-13 15:19:52 +02:00
Erik Wrede
57f3aa3ba9
Merge pull request #1190 from graphql-python/update-dataloaderdocs
Update Dataloader docs
2022-08-13 15:14:44 +02:00
Erik Wrede
8e1c3d3102
Merge branch 'master' into update-dataloaderdocs 2022-08-13 15:11:12 +02:00
Erik Wrede
13c661332e
Fix typo
Co-authored-by: Justin Miller <justinrmiller@users.noreply.github.com>
2022-08-13 15:08:34 +02:00
Erik Wrede
80e3498750
Fix Test Failure due to #1304
assert bar_graphql_type.interfaces == [foo_graphql_type] failed only on tox, because .interfaces was a tuple instead of a list. Error didn't occur using just pytest. Fixed by explicitly
converting both to list.
2022-08-13 14:51:58 +02:00
Erik Wrede
c77d87d205
Merge pull request #1304 from closeio/support-interface-implementations
Add support for interfaces on interfaces
2022-08-13 14:28:17 +02:00
Erik Wrede
8bdcec5cd7
Merge pull request #1402 from RJ722/patch-1
Docs: Highlight .get in backticks
2022-08-13 14:18:21 +02:00
Erik Wrede
dfece7f65d
Merge pull request #1437 from timgates42/bugfix_typos 2022-07-17 00:30:54 +02:00
Tim Gates
8589aaeb98
docs: Fix a few typos
There are small typos in:
- UPGRADE-v1.0.md
- UPGRADE-v2.0.md
- docs/execution/fileuploading.rst

Fixes:
- Should read `standard` rather than `stantard`.
- Should read `library` rather than `libary`.
- Should read `explicitly` rather than `explicity`.

Signed-off-by: Tim Gates <tim.gates@iress.com>
2022-07-16 14:40:00 +10:00
Thomas Leonard
72c2fd5ec3
Merge pull request #1430 from loft-orbital/feat/enable-to-provide-enum-name
feat: add ability to provide a type name to enum when using from_enum
2022-06-27 18:07:49 +02:00
Erik Wrede
69be326290
Merge pull request #1431 from erikwrede/master
Add Codecov to github actions
2022-06-27 16:41:24 +02:00
Erik Wrede
2ee23b0b2c Add codecov action 2022-06-27 16:30:01 +02:00
Thomas Leonard
8f6a8f9c4a feat: add ability to provide a type name to enum when using from_enum 2022-06-24 18:18:04 +02:00
Thomas Leonard
3bdc67c6ae fix: input with invalid types should raise an error 2022-06-20 15:15:14 +02:00
Thomas Leonard
efe4b89015
Merge pull request #1424 from ramonwenger/master
Fix typo in union comments
2022-06-20 15:14:27 +02:00
Ramon Wenger
785fcb38b6
Merge branch 'graphql-python:master' into master 2022-06-15 13:48:25 +02:00
Ramon Wenger
5d4e71f463 Fix typo in union comments 2022-05-25 17:45:28 +02:00
Ali McMaster
bf40e6c419
Update quickstart.rst 2022-02-14 09:01:42 +00:00
Rahul Jha
beb957382d
Highlight .get in backticks
When I first read through the documentation twice, it took me two tries and looking very hard to find out the difference b/w the two. The background highlight using backticks would be helpful in this case.
2022-01-13 15:33:09 +05:30
belkka
b274a607f4
Avoid ambiguity in graphene.Mutation docstring
The code example in docstring starts with `from graphene import Mutation` and defines a `class Mutation` later. This definition would shadow previously imported name and (which is more important) confuses a reader about usage of this class — one need to keep in mind that previous usage of `Mutation` is imported from graphene and have not been overridden yet.

This PR changes an import-from statement to an import statement, so `graphene.Mutation` is used explicitly. This approach is consistent with other code examples in docs (e. g. https://docs.graphene-python.org/en/v2.1.9/types/mutations/).

Another option is to change name of example class Mutation to something more clear (maybe SchemaMutation or RootMutation), but I'm not sure what name to choose.

Only docstring is updated, no code changes.
2021-10-11 23:46:13 +03:00
Drew Hoover
78973964b8
fix: update ariadne url to the new docs 2021-09-21 13:00:19 -04:00
Alec Rosenbaum
7004515f06 implement interface interfaces on TypeMap, fix failing test 2021-01-15 13:13:05 -05:00
Alec Rosenbaum
a17f63cf03 add failing type_map test, bar_graphql_type has no interfaces 2021-01-15 13:10:14 -05:00
Alec Rosenbaum
86b7e6ac86 update InterfaceOptions to fix failing test 2021-01-15 13:02:08 -05:00
Alec Rosenbaum
ae93499a37 add failing test for interface meta 2021-01-15 13:01:43 -05:00
Jonathan Kim
380166989d Update dataloader docs 2020-04-26 13:22:09 +01:00
Jonathan Kim
3e4305259b Add basic test for aiodataloader 2020-04-26 13:17:00 +01:00
Jonathan Kim
a0b522fa39 Add aiodataloader to test deps 2020-04-26 13:16:51 +01:00
99 changed files with 2880 additions and 2360 deletions

.github/workflows/build.yaml (new file)

@ -0,0 +1,21 @@
name: 📦 Build
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python 3.10
        uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine
      - name: Building package
        run: python3 -m build
      - name: Check package with Twine
        run: twine check dist/*


@ -1,25 +0,0 @@
name: 📊 Check Coverage
on:
  push:
    branches:
      - master
      - '*.x'
    paths-ignore:
      - 'docs/**'
      - '*.md'
      - '*.rst'
  pull_request:
    branches:
      - master
      - '*.x'
    paths-ignore:
      - 'docs/**'
      - '*.md'
      - '*.rst'
jobs:
  coveralls_finish:
    # check coverage increase/decrease
    runs-on: ubuntu-latest
    steps:
      - name: Coveralls Finished
        uses: AndreMiras/coveralls-python-action@develop


@@ -10,11 +10,11 @@ jobs:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.9
-        uses: actions/setup-python@v2
+      - uses: actions/checkout@v4
+      - name: Set up Python 3.10
+        uses: actions/setup-python@v5
         with:
-          python-version: 3.9
+          python-version: "3.10"
       - name: Build wheel and source tarball
         run: |
           pip install wheel


@@ -7,11 +7,11 @@
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.9
-        uses: actions/setup-python@v2
+      - uses: actions/checkout@v4
+      - name: Set up Python 3.10
+        uses: actions/setup-python@v5
         with:
-          python-version: 3.9
+          python-version: "3.10"
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip


@@ -25,32 +25,40 @@ jobs:
       fail-fast: false
       matrix:
         include:
+          - {name: '3.13', python: '3.13', os: ubuntu-latest, tox: py313}
+          - {name: '3.12', python: '3.12', os: ubuntu-latest, tox: py312}
+          - {name: '3.11', python: '3.11', os: ubuntu-latest, tox: py311}
           - {name: '3.10', python: '3.10', os: ubuntu-latest, tox: py310}
           - {name: '3.9', python: '3.9', os: ubuntu-latest, tox: py39}
           - {name: '3.8', python: '3.8', os: ubuntu-latest, tox: py38}
-          - {name: '3.7', python: '3.7', os: ubuntu-latest, tox: py37}
-          - {name: '3.6', python: '3.6', os: ubuntu-latest, tox: py36}
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions/setup-python@v2
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - name: update pip
        run: |
-          pip install -U wheel
-          pip install -U setuptools
-          python -m pip install -U pip
+          python -m pip install --upgrade pip
+          pip install --upgrade setuptools wheel
      - name: get pip cache dir
        id: pip-cache
-        run: echo "::set-output name=dir::$(pip cache dir)"
+        run: echo "dir=$(pip cache dir)" >> $GITHUB_OUTPUT
      - name: cache pip dependencies
-        uses: actions/cache@v2
+        uses: actions/cache@v3
        with:
          path: ${{ steps.pip-cache.outputs.dir }}
          key: pip|${{ runner.os }}|${{ matrix.python }}|${{ hashFiles('setup.py') }}
      - run: pip install tox
      - run: tox -e ${{ matrix.tox }}
+      - name: Upload coverage.xml
+        if: ${{ matrix.python == '3.10' }}
+        uses: actions/upload-artifact@v4
+        with:
+          name: graphene-coverage
+          path: coverage.xml
+          if-no-files-found: error
+      - name: Upload coverage.xml to codecov
+        if: ${{ matrix.python == '3.10' }}
+        uses: codecov/codecov-action@v4

.gitignore

@@ -90,3 +90,4 @@ venv/
 *.sqlite3
 .vscode
 .mypy_cache
+.ruff_cache


@ -1,2 +0,0 @@
[settings]
known_third_party = aniso8601,graphql,graphql_relay,promise,pytest,pytz,pyutils,setuptools,snapshottest,sphinx_graphene_theme


@@ -1,9 +1,9 @@
 default_language_version:
-  python: python3.9
+  python: python3.10
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.2.0
+    rev: v4.3.0
     hooks:
       - id: check-merge-conflict
       - id: check-json
@@ -17,14 +17,13 @@ repos:
       - id: trailing-whitespace
        exclude: README.md
  - repo: https://github.com/asottile/pyupgrade
-    rev: v2.32.1
+    rev: v2.37.3
    hooks:
      - id: pyupgrade
-  - repo: https://github.com/ambv/black
-    rev: 22.3.0
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    # Ruff version.
+    rev: v0.5.0
    hooks:
-      - id: black
-  - repo: https://github.com/PyCQA/flake8
-    rev: 4.0.1
-    hooks:
-      - id: flake8
+      - id: ruff
+      - id: ruff-format
+        args: [ --check ]


@@ -7,6 +7,7 @@ help:
 install-dev:
 	pip install -e ".[dev]"
+.PHONY: test ## Run tests
 test:
 	py.test graphene examples


@@ -1,8 +1,8 @@
-# ![Graphene Logo](http://graphene-python.org/favicon.png) [Graphene](http://graphene-python.org) [![Build Status](https://travis-ci.org/graphql-python/graphene.svg?branch=master)](https://travis-ci.org/graphql-python/graphene) [![PyPI version](https://badge.fury.io/py/graphene.svg)](https://badge.fury.io/py/graphene) [![Coverage Status](https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github)](https://coveralls.io/github/graphql-python/graphene?branch=master)
+# ![Graphene Logo](http://graphene-python.org/favicon.png) [Graphene](http://graphene-python.org) [![PyPI version](https://badge.fury.io/py/graphene.svg)](https://badge.fury.io/py/graphene) [![Coverage Status](https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github)](https://coveralls.io/github/graphql-python/graphene?branch=master) [![](https://dcbadge.vercel.app/api/server/T6Gp6NFYHe?style=flat)](https://discord.gg/T6Gp6NFYHe)

-[💬 Join the community on Slack](https://join.slack.com/t/graphenetools/shared_invite/enQtOTE2MDQ1NTg4MDM1LTA4Nzk0MGU0NGEwNzUxZGNjNDQ4ZjAwNDJjMjY0OGE1ZDgxZTg4YjM2ZTc4MjE2ZTAzZjE2ZThhZTQzZTkyMmM)
+[💬 Join the community on Discord](https://discord.gg/T6Gp6NFYHe)

-**We are looking for contributors**! Please check the [ROADMAP](https://github.com/graphql-python/graphene/blob/master/ROADMAP.md) to see how you can help ❤️
+**We are looking for contributors**! Please check the current issues to see how you can help ❤️

 ## Introduction
@@ -10,7 +10,7 @@
 - **Easy to use:** Graphene helps you use GraphQL in Python without effort.
 - **Relay:** Graphene has builtin support for Relay.
-- **Data agnostic:** Graphene supports any kind of data source: SQL (Django, SQLAlchemy), NoSQL, custom Python objects, etc.
+- **Data agnostic:** Graphene supports any kind of data source: SQL (Django, SQLAlchemy), Mongo, custom Python objects, etc.

 We believe that by providing a complete API you could plug Graphene anywhere your data lives and make your data available
 through GraphQL.
@@ -20,18 +20,19 @@ Graphene has multiple integrations with different frameworks:
 | integration       | Package                                                                       |
 | ----------------- | ----------------------------------------------------------------------------- |
-| Django            | [graphene-django](https://github.com/graphql-python/graphene-django/)         |
 | SQLAlchemy        | [graphene-sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy/) |
-| Google App Engine | [graphene-gae](https://github.com/graphql-python/graphene-gae/)               |
+| Mongo             | [graphene-mongo](https://github.com/graphql-python/graphene-mongo/)           |
+| Apollo Federation | [graphene-federation](https://github.com/graphql-python/graphene-federation/) |
+| Django            | [graphene-django](https://github.com/graphql-python/graphene-django/)         |

 Also, Graphene is fully compatible with the GraphQL spec, working seamlessly with all GraphQL clients, such as [Relay](https://github.com/facebook/relay), [Apollo](https://github.com/apollographql/apollo-client) and [gql](https://github.com/graphql-python/gql).

 ## Installation

-For instaling graphene, just run this command in your shell
+To install `graphene`, just run this command in your shell

 ```bash
-pip install "graphene>=3.0"
+pip install "graphene>=3.1"
 ```

 ## Examples
@@ -84,18 +85,24 @@ pip install -e ".[test]"
 Well-written tests and maintaining good test coverage is important to this project. While developing, run new and existing tests with:

 ```sh
-py.test graphene/relay/tests/test_node.py # Single file
-py.test graphene/relay # All tests in directory
+pytest graphene/relay/tests/test_node.py # Single file
+pytest graphene/relay # All tests in directory
 ```

 Add the `-s` flag if you have introduced breakpoints into the code for debugging.
 Add the `-v` ("verbose") flag to get more detailed test output. For even more detailed output, use `-vv`.
 Check out the [pytest documentation](https://docs.pytest.org/en/latest/) for more options and test running controls.

+Regularly ensure your `pre-commit` hooks are up to date and enabled:
+
+```sh
+pre-commit install
+```
+
 You can also run the benchmarks with:

 ```sh
-py.test graphene --benchmark-only
+pytest graphene --benchmark-only
 ```

 Graphene supports several versions of Python. To make sure that changes do not break compatibility with any of those versions, we use `tox` to create virtualenvs for each Python version and run tests with that version. To run against all Python versions defined in the `tox.ini` config file, just run:
@@ -107,10 +114,10 @@ tox
 If you wish to run against a specific version defined in the `tox.ini` file:

 ```sh
-tox -e py36
+tox -e py39
 ```

-Tox can only use whatever versions of Python are installed on your system. When you create a pull request, Travis will also be running the same tests and report the results, so there is no need for potential contributors to try to install every single version of Python on their own system ahead of time. We appreciate opening issues and pull requests to make graphene even more stable & useful!
+Tox can only use whatever versions of Python are installed on your system. When you create a pull request, GitHub Actions pipelines will also be running the same tests and report the results, so there is no need for potential contributors to try to install every single version of Python on their own system ahead of time. We appreciate opening issues and pull requests to make graphene even more stable & useful!
### Building Documentation ### Building Documentation


@ -1,174 +0,0 @@
|Graphene Logo| `Graphene <http://graphene-python.org>`__ |Build Status| |PyPI version| |Coverage Status|
=========================================================================================================
`💬 Join the community on
Slack <https://join.slack.com/t/graphenetools/shared_invite/enQtOTE2MDQ1NTg4MDM1LTA4Nzk0MGU0NGEwNzUxZGNjNDQ4ZjAwNDJjMjY0OGE1ZDgxZTg4YjM2ZTc4MjE2ZTAzZjE2ZThhZTQzZTkyMmM>`__
**We are looking for contributors**! Please check the
`ROADMAP <https://github.com/graphql-python/graphene/blob/master/ROADMAP.md>`__
to see how you can help ❤️
Introduction
------------
`Graphene <http://graphene-python.org>`__ is an opinionated Python
library for building GraphQL schemas/types fast and easily.
- **Easy to use:** Graphene helps you use GraphQL in Python without
effort.
- **Relay:** Graphene has builtin support for Relay.
- **Data agnostic:** Graphene supports any kind of data source: SQL
(Django, SQLAlchemy), NoSQL, custom Python objects, etc. We believe
that by providing a complete API you could plug Graphene anywhere
your data lives and make your data available through GraphQL.
Integrations
------------
Graphene has multiple integrations with different frameworks:
+-------------------+-------------------------------------------------+
| integration | Package |
+===================+=================================================+
| Django | `graphene-django <https:/ |
| | /github.com/graphql-python/graphene-django/>`__ |
+-------------------+-------------------------------------------------+
| SQLAlchemy | `graphene-sqlalchemy <https://git |
| | hub.com/graphql-python/graphene-sqlalchemy/>`__ |
+-------------------+-------------------------------------------------+
| Google App Engine | `graphene-gae <http |
| | s://github.com/graphql-python/graphene-gae/>`__ |
+-------------------+-------------------------------------------------+
Also, Graphene is fully compatible with the GraphQL spec, working
seamlessly with all GraphQL clients, such as
`Relay <https://github.com/facebook/relay>`__,
`Apollo <https://github.com/apollographql/apollo-client>`__ and
`gql <https://github.com/graphql-python/gql>`__.
Installation
------------
For installing graphene, just run this command in your shell:
.. code:: bash
pip install "graphene>=3.0"
Examples
--------
Here is one example for you to get started:
.. code:: python
import graphene
class Query(graphene.ObjectType):
hello = graphene.String(description='A typical hello world')
def resolve_hello(self, info):
return 'World'
schema = graphene.Schema(query=Query)
Then querying ``graphene.Schema`` is as simple as:
.. code:: python
query = '''
query SayHello {
hello
}
'''
result = schema.execute(query)
If you want to learn even more, you can also check the following
`examples <examples/>`__:
- **Basic Schema**: `Starwars example <examples/starwars>`__
- **Relay Schema**: `Starwars Relay
example <examples/starwars_relay>`__
Documentation
-------------
Documentation and links to additional resources are available at
https://docs.graphene-python.org/en/latest/
Contributing
------------
After cloning this repo, create a
`virtualenv <https://virtualenv.pypa.io/en/stable/>`__ and ensure
dependencies are installed by running:
.. code:: sh
virtualenv venv
source venv/bin/activate
pip install -e ".[test]"
Well-written tests and maintaining good test coverage are important to
this project. While developing, run new and existing tests with:
.. code:: sh
py.test graphene/relay/tests/test_node.py # Single file
py.test graphene/relay # All tests in directory
Add the ``-s`` flag if you have introduced breakpoints into the code for
debugging. Add the ``-v`` (“verbose”) flag to get more detailed test
output. For even more detailed output, use ``-vv``. Check out the
`pytest documentation <https://docs.pytest.org/en/latest/>`__ for more
options and test running controls.
You can also run the benchmarks with:
.. code:: sh
py.test graphene --benchmark-only
Graphene supports several versions of Python. To make sure that changes
do not break compatibility with any of those versions, we use ``tox`` to
create virtualenvs for each Python version and run tests with that
version. To run against all Python versions defined in the ``tox.ini``
config file, just run:
.. code:: sh
tox
If you wish to run against a specific version defined in the ``tox.ini``
file:
.. code:: sh
tox -e py36
Tox can only use whatever versions of Python are installed on your
system. When you create a pull request, Travis will also be running the
same tests and report the results, so there is no need for potential
contributors to try to install every single version of Python on their
own system ahead of time. We appreciate opening issues and pull requests
to make graphene even more stable & useful!
Building Documentation
~~~~~~~~~~~~~~~~~~~~~~
The documentation is generated using the excellent
`Sphinx <http://www.sphinx-doc.org/>`__ and a custom theme.
An HTML version of the documentation is produced by running:
.. code:: sh
make docs
.. |Graphene Logo| image:: http://graphene-python.org/favicon.png
.. |Build Status| image:: https://travis-ci.org/graphql-python/graphene.svg?branch=master
:target: https://travis-ci.org/graphql-python/graphene
.. |PyPI version| image:: https://badge.fury.io/py/graphene.svg
:target: https://badge.fury.io/py/graphene
.. |Coverage Status| image:: https://coveralls.io/repos/graphql-python/graphene/badge.svg?branch=master&service=github
:target: https://coveralls.io/github/graphql-python/graphene?branch=master


@ -1,54 +0,0 @@
# GraphQL Python Roadmap
In order to move Graphene and the GraphQL Python ecosystem forward it's essential to be clear with the community on next steps, so we can move uniformly.
_👋 If you have more ideas on how to move the Graphene ecosystem forward, don't hesitate to [open a PR](https://github.com/graphql-python/graphene/edit/master/ROADMAP.md)_
## Now
- [ ] Continue to support v2.x with security releases
- [ ] Last major/feature release is cut and graphene-* libraries should pin to that version number
## Next
New features will only be developed on version 3 of ecosystem libraries.
### [Core-Next](https://github.com/graphql-python/graphql-core-next)
Targeted as v3 of [graphql-core](https://pypi.org/project/graphql-core/), Python 3 only
### Graphene
- [ ] Integrate with the core-next API and resolve all breaking changes
- [ ] GraphQL types from type annotations - [See issue](https://github.com/graphql-python/graphene/issues/729)
- [ ] Add support for coroutines in Connection, Mutation (abstracting out Promise requirement) - [See PR](https://github.com/graphql-python/graphene/pull/824)
### Graphene-*
- [ ] Integrate with the graphene core-next API and resolve all breaking changes
### *-graphql
- [ ] Integrate with the graphql core-next API and resolve all breaking changes
## Ongoing Initiatives
- [ ] Improve documentation, especially for new users to the library
- [ ] Recipes for “quick start” that people can ideally use/run
## Dependent Libraries
| Repo | Release Manager | CODEOWNERS | Pinned | next/master created | Labels Standardized |
| ---------------------------------------------------------------------------- | --------------- | ---------- | ---------- | ------------------- | ------------------- |
| [graphene](https://github.com/graphql-python/graphene) | ekampf | ✅ | | ✅ | |
| [graphql-core](https://github.com/graphql-python/graphql-core) | Cito | ✅ | N/A | N/A | |
| [graphql-core-next](https://github.com/graphql-python/graphql-core-next) | Cito | ✅ | N/A | N/A | |
| [graphql-server-core](https://github.com/graphql-python/graphql-server-core) | Cito | | ✅ | ✅ | |
| [gql](https://github.com/graphql-python/gql) | ekampf | | | | |
| [gql-next](https://github.com/graphql-python/gql-next) | ekampf | | N/A | N/A | |
| ...[aiohttp](https://github.com/graphql-python/aiohttp-graphql) | | | | | |
| ...[django](https://github.com/graphql-python/graphene-django) | mvanlonden | | ✅ | ✅ | |
| ...[sanic](https://github.com/graphql-python/sanic-graphql) | ekampf | | | | |
| ...[flask](https://github.com/graphql-python/flask-graphql) | | | | | |
| ...[webob](https://github.com/graphql-python/webob-graphql) | | | | | |
| ...[tornado](https://github.com/graphql-python/graphene-tornado) | ewhauser | | PR created | ✅ | |
| ...[ws](https://github.com/graphql-python/graphql-ws) | Cito/dfee | | ✅ | ✅ | |
| ...[gae](https://github.com/graphql-python/graphene-gae) | ekampf | | PR created | ✅ | |
| ...[sqlalchemy](https://github.com/graphql-python/graphene-sqlalchemy) | jnak/Nabell | ✅ | ✅ | ✅ | |
| ...[mongo](https://github.com/graphql-python/graphene-mongo) | | | ✅ | ✅ | |
| ...[relay-py](https://github.com/graphql-python/graphql-relay-py) | Cito | | | | |
| ...[wsgi](https://github.com/moritzmhmk/wsgi-graphql) | | | | | |

SECURITY.md Normal file

@ -0,0 +1,15 @@
# Security Policy
## Supported Versions
Support for security issues is currently provided for Graphene 3.0 and above. Support on earlier versions cannot be guaranteed by the maintainers of this library, but community PRs may be accepted in critical cases.
The preferred mitigation strategy is via an upgrade to Graphene 3.
| Version | Supported |
| ------- | ------------------ |
| 3.x | :white_check_mark: |
| <3.x | :x: |
## Reporting a Vulnerability
Please use responsible disclosure by contacting a core maintainer via Discord or E-Mail.


@ -153,7 +153,7 @@ class Query(ObjectType):
```
Also, if you wanted to create an `ObjectType` that implements `Node`, you have to do it
explicitly.

## Django


@ -123,7 +123,7 @@ def resolve_my_field(root, info, my_arg):
    return ...
```
**PS.: Take care when receiving args like `my_arg` as above. This doesn't work for optional (non-required) arguments such as the standard `Connection` arguments (first, last, after, before).**

You may need something like this:
```python


@ -1,7 +0,0 @@
#!/bin/bash
# Install the required scripts with
# pip install autoflake autopep8 isort
autoflake ./examples/ ./graphene/ -r --remove-unused-variables --remove-all-unused-imports --in-place
autopep8 ./examples/ ./graphene/ -r --in-place --experimental --aggressive --max-line-length 120
isort -rc ./examples/ ./graphene/


@ -92,7 +92,7 @@ Execution Metadata
.. autoclass:: graphene.Context

.. autoclass:: graphql.ExecutionResult

.. Relay
.. -----


@ -1,4 +1,5 @@
import os
import sys

import sphinx_graphene_theme
@ -22,8 +23,6 @@ on_rtd = os.environ.get("READTHEDOCS", None) == "True"
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
sys.path.insert(0, os.path.abspath(".."))
@ -82,7 +81,7 @@ release = "1.0"
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
# language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
@ -456,5 +455,4 @@ intersphinx_mapping = {
        "http://docs.graphene-python.org/projects/sqlalchemy/en/latest/",
        None,
    ),
}


@ -4,7 +4,7 @@ Dataloader
DataLoader is a generic utility to be used as part of your application's
data fetching layer to provide a simplified and consistent API over
various remote data sources such as databases or web services via batching
and caching. It is provided by a separate package, `aiodataloader <https://pypi.org/project/aiodataloader/>`_.

Batching
@ -15,21 +15,19 @@ Create loaders by providing a batch loading function.

.. code:: python

    from aiodataloader import DataLoader

    class UserLoader(DataLoader):
        async def batch_load_fn(self, keys):
            # Here we call a function to return a user for each key in keys
            return [get_user(id=key) for key in keys]

A batch loading async function accepts a list of keys and returns a list of ``values``.

``DataLoader`` will coalesce all individual loads which occur within a
single frame of execution (executed once the wrapping event loop is resolved)
and then call your batch function with all requested keys.
@ -37,9 +35,11 @@ and then call your batch function with all requested keys.

    user_loader = UserLoader()

    user1 = await user_loader.load(1)
    user1_best_friend = await user_loader.load(user1.best_friend_id)

    user2 = await user_loader.load(2)
    user2_best_friend = await user_loader.load(user2.best_friend_id)

A naive application may have issued *four* round-trips to a backend for the
@ -53,9 +53,9 @@ make sure that you then order the query result for the results to match the keys

.. code:: python

    class UserLoader(DataLoader):
        async def batch_load_fn(self, keys):
            users = {user.id: user for user in User.objects.filter(id__in=keys)}
            return [users.get(user_id) for user_id in keys]

``DataLoader`` allows you to decouple unrelated parts of your application without
@ -110,8 +110,8 @@ leaner code and at most 4 database requests, and possibly fewer if there are cac

        best_friend = graphene.Field(lambda: User)
        friends = graphene.List(lambda: User)

        async def resolve_best_friend(root, info):
            return await user_loader.load(root.best_friend_id)

        async def resolve_friends(root, info):
            return await user_loader.load_many(root.friend_ids)
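The coalescing behaviour described above can be illustrated without graphene or aiodataloader at all. The following stdlib-only sketch (class and function names are invented for illustration, not library APIs) batches all ``load()`` calls made in the same event-loop tick into a single batch call:

```python
import asyncio

class TinyLoader:
    """Minimal DataLoader-style batcher: coalesces load() calls made
    within one event-loop tick into a single batch_load_fn call."""

    def __init__(self, batch_load_fn):
        self.batch_load_fn = batch_load_fn
        self._queue = []            # pending (key, future) pairs
        self._scheduled = False

    def load(self, key):
        loop = asyncio.get_running_loop()
        fut = loop.create_future()
        self._queue.append((key, fut))
        if not self._scheduled:
            self._scheduled = True
            # dispatch after the current tick so sibling loads can batch up
            loop.call_soon(asyncio.ensure_future, self._dispatch())
        return fut

    async def _dispatch(self):
        queue, self._queue = self._queue, []
        self._scheduled = False
        values = await self.batch_load_fn([key for key, _ in queue])
        for (_, fut), value in zip(queue, values):
            fut.set_result(value)

batches = []

async def batch_get_users(keys):
    batches.append(keys)            # record each batch call for demonstration
    return [f"user-{key}" for key in keys]

async def main():
    loader = TinyLoader(batch_get_users)
    # two loads issued in the same tick -> a single batch of both keys
    return await asyncio.gather(loader.load(1), loader.load(2))

users = asyncio.run(main())
```

Both loads resolve from one call to ``batch_get_users`` with ``[1, 2]``, which is the round-trip saving the section describes.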


@ -4,5 +4,5 @@ File uploading
File uploading is not part of the official GraphQL spec yet and is not natively
implemented in Graphene.
If your server needs to support file uploading then you can use the library `graphene-file-upload <https://github.com/lmcgartland/graphene-file-upload>`_, which enhances Graphene to add file
uploads and conforms to the unofficial GraphQL `multipart request spec <https://github.com/jaydenseric/graphql-multipart-request-spec>`_.


@ -41,6 +41,8 @@ And then execute it with:
result = schema.execute('THE QUERY', middleware=[AuthorizationMiddleware()])
If the ``middleware`` argument includes multiple middlewares,
these middlewares will be executed bottom-up, i.e. from last to first.
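The "last to first" ordering can be sketched with plain functions (``make_resolver``, ``mw_a``, ``mw_b`` are invented names for illustration, not graphene APIs): each middleware in the list wraps the chain built so far, so the last middleware ends up outermost and runs first.

```python
def make_resolver(middlewares, resolve_field):
    """Compose middlewares so the list executes bottom-up (last to first)."""
    resolver = resolve_field
    for mw in middlewares:
        # each middleware wraps the chain built so far;
        # the last one therefore wraps outermost and runs first
        resolver = (lambda m, nxt: lambda *args: m(nxt, *args))(mw, resolver)
    return resolver

order = []

def mw_a(next_, value):
    order.append("a")
    return next_(value)

def mw_b(next_, value):
    order.append("b")
    return next_(value)

resolver = make_resolver([mw_a, mw_b], lambda value: value * 2)
result = resolver(21)
```

Calling ``resolver(21)`` runs ``mw_b`` before ``mw_a``, matching the bottom-up order described above.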
Functional example
------------------


@ -1,5 +1,5 @@
Query Validation
================

GraphQL uses query validators to check if the Query AST is valid and can be executed. Every GraphQL server implements
standard query validators. For example, there is a validator that tests if a queried field exists on the queried type, which
makes the query fail with a "Cannot query field on type" error if it doesn't.
@ -8,7 +8,7 @@ To help with common use cases, graphene provides a few validation rules out of t
Depth limit Validator
---------------------

The depth limit validator helps to prevent execution of malicious
queries. It takes in the following arguments.

- ``callback`` Called each time validation runs. Receives an Object which is a map of the depths for each operation.

Usage
-----

Here is how you would implement depth-limiting on your schema.
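As a rough illustration of what the depth check measures, here is a toy brace-counting helper (``query_depth`` is invented for illustration; the real validator walks the parsed GraphQL AST, not the raw string):

```python
def query_depth(query: str) -> int:
    # count the deepest nesting of selection sets by brace balance;
    # a real depth-limit validator visits the parsed AST instead
    depth = max_depth = 0
    for ch in query:
        if ch == "{":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == "}":
            depth -= 1
    return max_depth

depth = query_depth("{ hero { friends { name } } }")
```

A depth limit of 2 would reject this query, since its selection sets nest three levels deep.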
@ -54,7 +54,7 @@ the disable introspection validation rule ensures that your schema cannot be int
This is a useful security measure in production environments.

Usage
-----

Here is how you would disable introspection for your schema.


@ -1,12 +1,6 @@
Graphene
========

Contents:

.. toctree::
@ -27,7 +21,6 @@ Integrations
* `Graphene-Django <http://docs.graphene-python.org/projects/django/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-django/>`_)
* Flask-Graphql (`source <https://github.com/graphql-python/flask-graphql>`_)
* `Graphene-SQLAlchemy <http://docs.graphene-python.org/projects/sqlalchemy/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-sqlalchemy/>`_)
* `Graphene-Mongo <http://graphene-mongo.readthedocs.io/en/latest/>`_ (`source <https://github.com/graphql-python/graphene-mongo>`_)
* `Starlette <https://www.starlette.io/graphql/>`_ (`source <https://github.com/encode/starlette>`_)
* `FastAPI <https://fastapi.tiangolo.com/advanced/graphql/>`_ (`source <https://github.com/tiangolo/fastapi>`_)


@ -28,7 +28,7 @@ Compare Graphene's *code-first* approach to building a GraphQL API with *schema-
.. _Apollo Server: https://www.apollographql.com/docs/apollo-server/
.. _Ariadne: https://ariadnegraphql.org/

Graphene is fully featured with integrations for the most popular web frameworks and ORMs. Graphene produces schemas that are fully compliant with the GraphQL spec and provides tools and patterns for building a Relay-Compliant API as well.
@ -37,12 +37,12 @@ An example in Graphene
Let's build a basic GraphQL schema to say "hello" and "goodbye" in Graphene.

When we send a **Query** requesting only one **Field**, ``hello``, and specify a value for the ``firstName`` **Argument**...

.. code::

    {
      hello(firstName: "friend")
    }

...we would expect the following Response containing only the data requested (the ``goodbye`` field is not resolved).
@ -59,7 +59,7 @@ When we send a **Query** requesting only one **Field**, ``hello``, and specify a
Requirements
~~~~~~~~~~~~

- Python (3.8, 3.9, 3.10, 3.11, 3.12, pypy)
- Graphene (3.0)

Project setup
@ -79,14 +79,15 @@ In Graphene, we can define a simple schema using the following code:
    from graphene import ObjectType, String, Schema

    class Query(ObjectType):
        # this defines a Field `hello` in our Schema with a single Argument `first_name`
        # By default, the argument name will automatically be camel-based into firstName in the generated schema
        hello = String(first_name=String(default_value="stranger"))
        goodbye = String()

        # our Resolver method takes the GraphQL context (root, info) as well as
        # Argument (first_name) for the Field and returns data for the query Response
        def resolve_hello(root, info, first_name):
            return f'Hello {first_name}!'

        def resolve_goodbye(root, info):
            return 'See ya!'
@ -110,7 +111,7 @@ In the `GraphQL Schema Definition Language`_, we could describe the fields defin
.. code::

    type Query {
      hello(firstName: String = "stranger"): String
      goodbye: String
    }
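The snake_case-to-camelCase conversion mentioned in the comment above can be approximated like this (``to_camel_case`` here is a hand-rolled sketch, not graphene's internal helper):

```python
def to_camel_case(snake_name: str) -> str:
    # keep the first word lowercase and capitalize the rest: first_name -> firstName
    head, *rest = snake_name.split("_")
    return head + "".join(word.capitalize() for word in rest)

camel = to_camel_case("first_name")
```

This is why `first_name` in the Python class appears as `firstName` in the generated schema, while single-word names like `goodbye` are unchanged.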
@ -130,7 +131,7 @@ Then we can start querying our **Schema** by passing a GraphQL query string to `
    # "Hello stranger!"

    # or passing the argument in the query
    query_with_argument = '{ hello(firstName: "GraphQL") }'
    result = schema.execute(query_with_argument)
    print(result.data['hello'])
    # "Hello GraphQL!"


@ -1,5 +1,5 @@
# Required library
Sphinx==6.1.3
sphinx-autobuild==2021.3.14
# Docs template
http://graphene-python.org/sphinx_graphene_theme.zip


@ -69,43 +69,3 @@ You can also add extra keyword arguments to the ``execute`` method, such as
        'hey': 'hello Peter!'
    }
}
Snapshot testing
~~~~~~~~~~~~~~~~
As our APIs evolve, we need to know when our changes introduce any breaking changes that might break
some of the clients of our GraphQL app.
However, writing tests and replicating the same response we expect from our GraphQL application can be a
tedious and repetitive task, and sometimes it's easier to skip this process.
Because of that, we recommend the usage of `SnapshotTest <https://github.com/syrusakbary/snapshottest/>`_.
SnapshotTest lets us write all these tests in a breeze, as it automatically creates the ``snapshots`` for us
the first time the tests are executed.
Here is a simple example of how our tests will look if we use ``pytest``:
.. code:: python
def test_hey(snapshot):
client = Client(my_schema)
# This will create a snapshot dir and a snapshot file
# the first time the test is executed, with the response
# of the execution.
snapshot.assert_match(client.execute('''{ hey }'''))
If we are using ``unittest``:
.. code:: python
from snapshottest import TestCase
class APITestCase(TestCase):
def test_api_me(self):
"""Testing the API for /me"""
client = Client(my_schema)
self.assertMatchSnapshot(client.execute('''{ hey }'''))


@ -86,7 +86,7 @@ In the Python ``Enum`` implementation you can access a member by initing the Enu
assert Color(1) == Color.RED

However, in Graphene ``Enum`` you need to call ``.get`` to have the same effect:

.. code:: python


@ -80,6 +80,10 @@ If we have a schema with Person type and one field on the root query.
from graphene import ObjectType, String, Field

def get_human(name):
    first_name, last_name = name.split()
    return Person(first_name, last_name)

class Person(ObjectType):
    full_name = String()
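Outside of graphene, the same resolver shape can be exercised with a plain dataclass stand-in (this ``Person``/``get_human`` pair mirrors the snippet above but is hypothetical, independent code, not the graphene types):

```python
from dataclasses import dataclass

@dataclass
class Person:
    first_name: str
    last_name: str

    @property
    def full_name(self) -> str:
        # what a resolve_full_name resolver would compute from the two fields
        return f"{self.first_name} {self.last_name}"

def get_human(name: str) -> Person:
    # split a display name into the two stored fields
    first_name, last_name = name.split()
    return Person(first_name, last_name)

human = get_human("Luke Skywalker")
```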


@ -270,8 +270,8 @@ The following is an example for creating a DateTime scalar:
        return dt.isoformat()

    @staticmethod
    def parse_literal(node, _variables=None):
        if isinstance(node, ast.StringValueNode):
            return datetime.datetime.strptime(
                node.value, "%Y-%m-%dT%H:%M:%S.%f")
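The serialize/parse pair used by such a scalar can be checked in isolation with the standard library (plain ``serialize``/``parse`` functions here for illustration, not the graphene Scalar class itself):

```python
import datetime

def serialize(dt: datetime.datetime) -> str:
    return dt.isoformat()

def parse(value: str) -> datetime.datetime:
    # this format matches what isoformat() produces for microsecond datetimes
    return datetime.datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f")

dt = datetime.datetime(2024, 11, 9, 21, 43, 17, 123456)
roundtrip = parse(serialize(dt))
```

Note the format string only round-trips datetimes that carry microseconds; a production scalar would want the more forgiving `datetime.datetime.fromisoformat`.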


@ -8,7 +8,6 @@ class Patron(graphene.ObjectType):
class Query(graphene.ObjectType):
    patron = graphene.Field(Patron)

    def resolve_patron(root, info):


@ -1,100 +0,0 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_hero_name_query 1"] = {"data": {"hero": {"name": "R2-D2"}}}
snapshots["test_hero_name_and_friends_query 1"] = {
"data": {
"hero": {
"id": "2001",
"name": "R2-D2",
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "Leia Organa"},
],
}
}
}
snapshots["test_nested_query 1"] = {
"data": {
"hero": {
"name": "R2-D2",
"friends": [
{
"name": "Luke Skywalker",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Han Solo"},
{"name": "Leia Organa"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
{
"name": "Han Solo",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Leia Organa"},
{"name": "R2-D2"},
],
},
{
"name": "Leia Organa",
"appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
"friends": [
{"name": "Luke Skywalker"},
{"name": "Han Solo"},
{"name": "C-3PO"},
{"name": "R2-D2"},
],
},
],
}
}
}
snapshots["test_fetch_luke_query 1"] = {"data": {"human": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_some_id_query 1"] = {
"data": {"human": {"name": "Luke Skywalker"}}
}
snapshots["test_fetch_some_id_query2 1"] = {"data": {"human": {"name": "Han Solo"}}}
snapshots["test_invalid_id_query 1"] = {"data": {"human": None}}
snapshots["test_fetch_luke_aliased 1"] = {"data": {"luke": {"name": "Luke Skywalker"}}}
snapshots["test_fetch_luke_and_leia_aliased 1"] = {
"data": {"luke": {"name": "Luke Skywalker"}, "leia": {"name": "Leia Organa"}}
}
snapshots["test_duplicate_fields 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots["test_use_fragment 1"] = {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
snapshots["test_check_type_of_r2 1"] = {
"data": {"hero": {"__typename": "Droid", "name": "R2-D2"}}
}
snapshots["test_check_type_of_luke 1"] = {
"data": {"hero": {"__typename": "Human", "name": "Luke Skywalker"}}
}


@ -8,19 +8,19 @@ setup()
client = Client(schema)

def test_hero_name_query():
    result = client.execute("""
    query HeroNameQuery {
      hero {
        name
      }
    }
    """)
    assert result == {"data": {"hero": {"name": "R2-D2"}}}

def test_hero_name_and_friends_query():
    result = client.execute("""
    query HeroNameAndFriendsQuery {
      hero {
        id
        name
        friends {
          name
        }
      }
    }
    """)
    assert result == {
        "data": {
            "hero": {
                "id": "2001",
                "name": "R2-D2",
                "friends": [
                    {"name": "Luke Skywalker"},
                    {"name": "Han Solo"},
                    {"name": "Leia Organa"},
                ],
            }
        }
    }

def test_nested_query():
    result = client.execute("""
    query NestedQuery {
      hero {
        name
        friends {
          name
          appearsIn
          friends {
            name
          }
        }
      }
    }
    """)
    assert result == {
        "data": {
            "hero": {
                "name": "R2-D2",
                "friends": [
                    {
                        "name": "Luke Skywalker",
                        "appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
                        "friends": [
                            {"name": "Han Solo"},
                            {"name": "Leia Organa"},
                            {"name": "C-3PO"},
                            {"name": "R2-D2"},
                        ],
                    },
                    {
                        "name": "Han Solo",
                        "appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
                        "friends": [
                            {"name": "Luke Skywalker"},
                            {"name": "Leia Organa"},
                            {"name": "R2-D2"},
                        ],
                    },
                    {
                        "name": "Leia Organa",
                        "appearsIn": ["NEWHOPE", "EMPIRE", "JEDI"],
                        "friends": [
                            {"name": "Luke Skywalker"},
                            {"name": "Han Solo"},
                            {"name": "C-3PO"},
                            {"name": "R2-D2"},
                        ],
                    },
                ],
            }
        }
    }

def test_fetch_luke_query():
    result = client.execute("""
    query FetchLukeQuery {
      human(id: "1000") {
        name
      }
    }
    """)
    assert result == {"data": {"human": {"name": "Luke Skywalker"}}}

def test_fetch_some_id_query():
    result = client.execute(
        """
    query FetchSomeIDQuery($someId: String!) {
      human(id: $someId) {
        name
      }
    }
    """,
        variables={"someId": "1000"},
    )
    assert result == {"data": {"human": {"name": "Luke Skywalker"}}}

def test_fetch_some_id_query2():
    result = client.execute(
        """
    query FetchSomeIDQuery($someId: String!) {
      human(id: $someId) {
        name
      }
    }
    """,
        variables={"someId": "1002"},
    )
    assert result == {"data": {"human": {"name": "Han Solo"}}}

def test_invalid_id_query():
    result = client.execute(
        """
    query humanQuery($id: String!) {
      human(id: $id) {
        name
      }
    }
    """,
        variables={"id": "not a valid id"},
    )
    assert result == {"data": {"human": None}}

def test_fetch_luke_aliased():
    result = client.execute("""
    query FetchLukeAliased {
      luke: human(id: "1000") {
        name
      }
    }
    """)
    assert result == {"data": {"luke": {"name": "Luke Skywalker"}}}
def test_fetch_luke_and_leia_aliased(snapshot): def test_fetch_luke_and_leia_aliased():
query = """ result = client.execute("""
query FetchLukeAndLeiaAliased { query FetchLukeAndLeiaAliased {
luke: human(id: "1000") { luke: human(id: "1000") {
name name
@ -120,12 +175,14 @@ def test_fetch_luke_and_leia_aliased(snapshot):
name name
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {"luke": {"name": "Luke Skywalker"}, "leia": {"name": "Leia Organa"}}
}
def test_duplicate_fields(snapshot): def test_duplicate_fields():
query = """ result = client.execute("""
query DuplicateFields { query DuplicateFields {
luke: human(id: "1000") { luke: human(id: "1000") {
name name
@ -136,12 +193,17 @@ def test_duplicate_fields(snapshot):
homePlanet homePlanet
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
def test_use_fragment(snapshot): def test_use_fragment():
query = """ result = client.execute("""
query UseFragment { query UseFragment {
luke: human(id: "1000") { luke: human(id: "1000") {
...HumanFragment ...HumanFragment
@ -154,29 +216,36 @@ def test_use_fragment(snapshot):
name name
homePlanet homePlanet
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"luke": {"name": "Luke Skywalker", "homePlanet": "Tatooine"},
"leia": {"name": "Leia Organa", "homePlanet": "Alderaan"},
}
}
def test_check_type_of_r2(snapshot): def test_check_type_of_r2():
query = """ result = client.execute("""
query CheckTypeOfR2 { query CheckTypeOfR2 {
hero { hero {
__typename __typename
name name
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {"data": {"hero": {"__typename": "Droid", "name": "R2-D2"}}}
def test_check_type_of_luke(snapshot): def test_check_type_of_luke():
query = """ result = client.execute("""
query CheckTypeOfLuke { query CheckTypeOfLuke {
hero(episode: EMPIRE) { hero(episode: EMPIRE) {
__typename __typename
name name
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {"hero": {"__typename": "Human", "name": "Luke Skywalker"}}
}


@ -1,26 +0,0 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_correct_fetch_first_ship_rebels 1"] = {
"data": {
"rebels": {
"name": "Alliance to Restore the Republic",
"ships": {
"pageInfo": {
"startCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"endCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"hasNextPage": True,
"hasPreviousPage": False,
},
"edges": [
{"cursor": "YXJyYXljb25uZWN0aW9uOjA=", "node": {"name": "X-Wing"}}
],
},
}
}
}


@ -1,28 +0,0 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_mutations 1"] = {
"data": {
"introduceShip": {
"ship": {"id": "U2hpcDo5", "name": "Peter"},
"faction": {
"name": "Alliance to Restore the Republic",
"ships": {
"edges": [
{"node": {"id": "U2hpcDox", "name": "X-Wing"}},
{"node": {"id": "U2hpcDoy", "name": "Y-Wing"}},
{"node": {"id": "U2hpcDoz", "name": "A-Wing"}},
{"node": {"id": "U2hpcDo0", "name": "Millennium Falcon"}},
{"node": {"id": "U2hpcDo1", "name": "Home One"}},
{"node": {"id": "U2hpcDo5", "name": "Peter"}},
]
},
},
}
}
}


@ -1,118 +0,0 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_correctly_fetches_id_name_rebels 1"] = {
"data": {
"rebels": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
snapshots["test_correctly_refetches_rebels 1"] = {
"data": {"node": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}}
}
snapshots["test_correctly_fetches_id_name_empire 1"] = {
"data": {"empire": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots["test_correctly_refetches_empire 1"] = {
"data": {"node": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
snapshots["test_correctly_refetches_xwing 1"] = {
"data": {"node": {"id": "U2hpcDox", "name": "X-Wing"}}
}
snapshots[
"test_str_schema 1"
] = '''type Query {
rebels: Faction
empire: Faction
node(
"""The ID of the object"""
id: ID!
): Node
}
"""A faction in the Star Wars saga"""
type Faction implements Node {
"""The ID of the object"""
id: ID!
"""The name of the faction."""
name: String
"""The ships used by the faction."""
ships(before: String, after: String, first: Int, last: Int): ShipConnection
}
"""An object with an ID"""
interface Node {
"""The ID of the object"""
id: ID!
}
type ShipConnection {
"""Pagination data for this connection."""
pageInfo: PageInfo!
"""Contains the nodes in this connection."""
edges: [ShipEdge]!
}
"""
The Relay compliant `PageInfo` type, containing data necessary to paginate this connection.
"""
type PageInfo {
"""When paginating forwards, are there more items?"""
hasNextPage: Boolean!
"""When paginating backwards, are there more items?"""
hasPreviousPage: Boolean!
"""When paginating backwards, the cursor to continue."""
startCursor: String
"""When paginating forwards, the cursor to continue."""
endCursor: String
}
"""A Relay edge containing a `Ship` and its cursor."""
type ShipEdge {
"""The item at the end of the edge"""
node: Ship
"""A cursor for use in pagination"""
cursor: String!
}
"""A ship in the Star Wars saga"""
type Ship implements Node {
"""The ID of the object"""
id: ID!
"""The name of the ship."""
name: String
}
type Mutation {
introduceShip(input: IntroduceShipInput!): IntroduceShipPayload
}
type IntroduceShipPayload {
ship: Ship
faction: Faction
clientMutationId: String
}
input IntroduceShipInput {
shipName: String!
factionId: String!
clientMutationId: String
}'''


@ -8,8 +8,8 @@ setup()
client = Client(schema) client = Client(schema)
def test_correct_fetch_first_ship_rebels(snapshot): def test_correct_fetch_first_ship_rebels():
query = """ result = client.execute("""
query RebelsShipsQuery { query RebelsShipsQuery {
rebels { rebels {
name, name,
@ -29,5 +29,25 @@ def test_correct_fetch_first_ship_rebels(snapshot):
} }
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"rebels": {
"name": "Alliance to Restore the Republic",
"ships": {
"pageInfo": {
"startCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"endCursor": "YXJyYXljb25uZWN0aW9uOjA=",
"hasNextPage": True,
"hasPreviousPage": False,
},
"edges": [
{
"cursor": "YXJyYXljb25uZWN0aW9uOjA=",
"node": {"name": "X-Wing"},
}
],
},
}
}
}


@ -8,8 +8,8 @@ setup()
client = Client(schema) client = Client(schema)
def test_mutations(snapshot): def test_mutations():
query = """ result = client.execute("""
mutation MyMutation { mutation MyMutation {
introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) { introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) {
ship { ship {
@ -29,5 +29,24 @@ def test_mutations(snapshot):
} }
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"introduceShip": {
"ship": {"id": "U2hpcDo5", "name": "Peter"},
"faction": {
"name": "Alliance to Restore the Republic",
"ships": {
"edges": [
{"node": {"id": "U2hpcDox", "name": "X-Wing"}},
{"node": {"id": "U2hpcDoy", "name": "Y-Wing"}},
{"node": {"id": "U2hpcDoz", "name": "A-Wing"}},
{"node": {"id": "U2hpcDo0", "name": "Millennium Falcon"}},
{"node": {"id": "U2hpcDo1", "name": "Home One"}},
{"node": {"id": "U2hpcDo5", "name": "Peter"}},
]
},
},
}
}
}


@ -1,3 +1,5 @@
import textwrap
from graphene.test import Client from graphene.test import Client
from ..data import setup from ..data import setup
@ -8,24 +10,115 @@ setup()
client = Client(schema) client = Client(schema)
def test_str_schema(snapshot): def test_str_schema():
snapshot.assert_match(str(schema).strip()) assert str(schema).strip() == textwrap.dedent(
'''\
type Query {
rebels: Faction
empire: Faction
node(
"""The ID of the object"""
id: ID!
): Node
}
"""A faction in the Star Wars saga"""
type Faction implements Node {
"""The ID of the object"""
id: ID!
"""The name of the faction."""
name: String
"""The ships used by the faction."""
ships(before: String, after: String, first: Int, last: Int): ShipConnection
}
"""An object with an ID"""
interface Node {
"""The ID of the object"""
id: ID!
}
type ShipConnection {
"""Pagination data for this connection."""
pageInfo: PageInfo!
"""Contains the nodes in this connection."""
edges: [ShipEdge]!
}
"""
The Relay compliant `PageInfo` type, containing data necessary to paginate this connection.
"""
type PageInfo {
"""When paginating forwards, are there more items?"""
hasNextPage: Boolean!
"""When paginating backwards, are there more items?"""
hasPreviousPage: Boolean!
"""When paginating backwards, the cursor to continue."""
startCursor: String
"""When paginating forwards, the cursor to continue."""
endCursor: String
}
"""A Relay edge containing a `Ship` and its cursor."""
type ShipEdge {
"""The item at the end of the edge"""
node: Ship
"""A cursor for use in pagination"""
cursor: String!
}
"""A ship in the Star Wars saga"""
type Ship implements Node {
"""The ID of the object"""
id: ID!
"""The name of the ship."""
name: String
}
type Mutation {
introduceShip(input: IntroduceShipInput!): IntroduceShipPayload
}
type IntroduceShipPayload {
ship: Ship
faction: Faction
clientMutationId: String
}
input IntroduceShipInput {
shipName: String!
factionId: String!
clientMutationId: String
}'''
)
def test_correctly_fetches_id_name_rebels(snapshot): def test_correctly_fetches_id_name_rebels():
query = """ result = client.execute("""
query RebelsQuery { query RebelsQuery {
rebels { rebels {
id id
name name
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"rebels": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
def test_correctly_refetches_rebels(snapshot): def test_correctly_refetches_rebels():
query = """ result = client.execute("""
query RebelsRefetchQuery { query RebelsRefetchQuery {
node(id: "RmFjdGlvbjox") { node(id: "RmFjdGlvbjox") {
id id
@ -34,24 +127,30 @@ def test_correctly_refetches_rebels(snapshot):
} }
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {
"node": {"id": "RmFjdGlvbjox", "name": "Alliance to Restore the Republic"}
}
}
def test_correctly_fetches_id_name_empire(snapshot): def test_correctly_fetches_id_name_empire():
query = """ result = client.execute("""
query EmpireQuery { query EmpireQuery {
empire { empire {
id id
name name
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {"empire": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
def test_correctly_refetches_empire(snapshot): def test_correctly_refetches_empire():
query = """ result = client.execute("""
query EmpireRefetchQuery { query EmpireRefetchQuery {
node(id: "RmFjdGlvbjoy") { node(id: "RmFjdGlvbjoy") {
id id
@ -60,12 +159,14 @@ def test_correctly_refetches_empire(snapshot):
} }
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {
"data": {"node": {"id": "RmFjdGlvbjoy", "name": "Galactic Empire"}}
}
def test_correctly_refetches_xwing(snapshot): def test_correctly_refetches_xwing():
query = """ result = client.execute("""
query XWingRefetchQuery { query XWingRefetchQuery {
node(id: "U2hpcDox") { node(id: "U2hpcDox") {
id id
@ -74,5 +175,5 @@ def test_correctly_refetches_xwing(snapshot):
} }
} }
} }
""" """)
snapshot.assert_match(client.execute(query)) assert result == {"data": {"node": {"id": "U2hpcDox", "name": "X-Wing"}}}


@@ -1,11 +1,15 @@
 from .pyutils.version import get_version
 from .relay import (
+    BaseGlobalIDType,
     ClientIDMutation,
     Connection,
     ConnectionField,
+    DefaultGlobalIDType,
     GlobalID,
     Node,
     PageInfo,
+    SimpleGlobalIDType,
+    UUIDGlobalIDType,
     is_node,
 )
 from .types import (
@@ -13,6 +17,7 @@ from .types import (
     UUID,
     Argument,
     Base64,
+    BigInt,
     Boolean,
     Context,
     Date,
@@ -41,7 +46,7 @@ from .types import (
 from .utils.module_loading import lazy_import
 from .utils.resolve_only_args import resolve_only_args
 
-VERSION = (3, 1, 0, "final", 0)
+VERSION = (3, 4, 3, "final", 0)
 
 __version__ = get_version(VERSION)
 
@@ -50,6 +55,8 @@ __all__ = [
     "__version__",
     "Argument",
     "Base64",
+    "BigInt",
+    "BaseGlobalIDType",
     "Boolean",
     "ClientIDMutation",
     "Connection",
@@ -58,6 +65,7 @@ __all__ = [
     "Date",
     "DateTime",
     "Decimal",
+    "DefaultGlobalIDType",
    "Dynamic",
     "Enum",
     "Field",
@@ -78,10 +86,12 @@ __all__ = [
     "ResolveInfo",
     "Scalar",
     "Schema",
+    "SimpleGlobalIDType",
     "String",
     "Time",
-    "UUID",
     "Union",
+    "UUID",
+    "UUIDGlobalIDType",
     "is_node",
     "lazy_import",
     "resolve_only_args",

File diff suppressed because it is too large.


@@ -1,5 +1,3 @@
-from __future__ import unicode_literals
-
 import datetime
 import os
 import subprocess
@@ -73,6 +71,6 @@ def get_git_changeset():
         )
         timestamp = git_log.communicate()[0]
         timestamp = datetime.datetime.utcfromtimestamp(int(timestamp))
-    except:
+    except Exception:
         return None
     return timestamp.strftime("%Y%m%d%H%M%S")


@@ -1,13 +1,23 @@
 from .node import Node, is_node, GlobalID
 from .mutation import ClientIDMutation
 from .connection import Connection, ConnectionField, PageInfo
+from .id_type import (
+    BaseGlobalIDType,
+    DefaultGlobalIDType,
+    SimpleGlobalIDType,
+    UUIDGlobalIDType,
+)
 
 __all__ = [
-    "Node",
-    "is_node",
-    "GlobalID",
+    "BaseGlobalIDType",
     "ClientIDMutation",
     "Connection",
     "ConnectionField",
+    "DefaultGlobalIDType",
+    "GlobalID",
+    "Node",
     "PageInfo",
+    "SimpleGlobalIDType",
+    "UUIDGlobalIDType",
+    "is_node",
 ]


@ -1,6 +1,7 @@
import re import re
from collections.abc import Iterable from collections.abc import Iterable
from functools import partial from functools import partial
from typing import Type
from graphql_relay import connection_from_array from graphql_relay import connection_from_array
@ -8,7 +9,34 @@ from ..types import Boolean, Enum, Int, Interface, List, NonNull, Scalar, String
from ..types.field import Field from ..types.field import Field
from ..types.objecttype import ObjectType, ObjectTypeOptions from ..types.objecttype import ObjectType, ObjectTypeOptions
from ..utils.thenables import maybe_thenable from ..utils.thenables import maybe_thenable
from .node import is_node from .node import is_node, AbstractNode
def get_edge_class(
connection_class: Type["Connection"],
_node: Type[AbstractNode],
base_name: str,
strict_types: bool = False,
):
edge_class = getattr(connection_class, "Edge", None)
class EdgeBase:
node = Field(
NonNull(_node) if strict_types else _node,
description="The item at the end of the edge",
)
cursor = String(required=True, description="A cursor for use in pagination")
class EdgeMeta:
description = f"A Relay edge containing a `{base_name}` and its cursor."
edge_name = f"{base_name}Edge"
edge_bases = [edge_class, EdgeBase] if edge_class else [EdgeBase]
if not isinstance(edge_class, ObjectType):
edge_bases = [*edge_bases, ObjectType]
return type(edge_name, tuple(edge_bases), {"Meta": EdgeMeta})
class PageInfo(ObjectType): class PageInfo(ObjectType):
@ -61,7 +89,10 @@ class Connection(ObjectType):
abstract = True abstract = True
@classmethod @classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options): def __init_subclass_with_meta__(
cls, node=None, name=None, strict_types=False, _meta=None, **options
):
if not _meta:
_meta = ConnectionOptions(cls) _meta = ConnectionOptions(cls)
assert node, f"You have to provide a node in {cls.__name__}.Meta" assert node, f"You have to provide a node in {cls.__name__}.Meta"
assert isinstance(node, NonNull) or issubclass( assert isinstance(node, NonNull) or issubclass(
@ -72,39 +103,29 @@ class Connection(ObjectType):
if not name: if not name:
name = f"{base_name}Connection" name = f"{base_name}Connection"
edge_class = getattr(cls, "Edge", None)
_node = node
class EdgeBase:
node = Field(_node, description="The item at the end of the edge")
cursor = String(required=True, description="A cursor for use in pagination")
class EdgeMeta:
description = f"A Relay edge containing a `{base_name}` and its cursor."
edge_name = f"{base_name}Edge"
if edge_class:
edge_bases = (edge_class, EdgeBase, ObjectType)
else:
edge_bases = (EdgeBase, ObjectType)
edge = type(edge_name, edge_bases, {"Meta": EdgeMeta})
cls.Edge = edge
options["name"] = name options["name"] = name
_meta.node = node _meta.node = node
_meta.fields = {
"page_info": Field( if not _meta.fields:
_meta.fields = {}
if "page_info" not in _meta.fields:
_meta.fields["page_info"] = Field(
PageInfo, PageInfo,
name="pageInfo", name="pageInfo",
required=True, required=True,
description="Pagination data for this connection.", description="Pagination data for this connection.",
), )
"edges": Field(
NonNull(List(edge)), if "edges" not in _meta.fields:
edge_class = get_edge_class(cls, node, base_name, strict_types) # type: ignore
cls.Edge = edge_class
_meta.fields["edges"] = Field(
NonNull(List(NonNull(edge_class) if strict_types else edge_class)),
description="Contains the nodes in this connection.", description="Contains the nodes in this connection.",
), )
}
return super(Connection, cls).__init_subclass_with_meta__( return super(Connection, cls).__init_subclass_with_meta__(
_meta=_meta, **options _meta=_meta, **options
) )

graphene/relay/id_type.py (new file, 87 lines)

@ -0,0 +1,87 @@
from graphql_relay import from_global_id, to_global_id
from ..types import ID, UUID
from ..types.base import BaseType
from typing import Type
class BaseGlobalIDType:
"""
    Base class that defines the required attributes/methods for a global ID type.
"""
graphene_type: Type[BaseType] = ID
@classmethod
def resolve_global_id(cls, info, global_id):
# return _type, _id
raise NotImplementedError
@classmethod
def to_global_id(cls, _type, _id):
# return _id
raise NotImplementedError
class DefaultGlobalIDType(BaseGlobalIDType):
"""
    Default global ID type: base64 encoded version of "<node type name>:<node id>".
"""
graphene_type = ID
@classmethod
def resolve_global_id(cls, info, global_id):
try:
_type, _id = from_global_id(global_id)
if not _type:
raise ValueError("Invalid Global ID")
return _type, _id
except Exception as e:
raise Exception(
f'Unable to parse global ID "{global_id}". '
'Make sure it is a base64 encoded string in the format: "TypeName:id". '
f"Exception message: {e}"
)
@classmethod
def to_global_id(cls, _type, _id):
return to_global_id(_type, _id)
class SimpleGlobalIDType(BaseGlobalIDType):
"""
Simple global ID type: simply the id of the object.
To be used carefully as the user is responsible for ensuring that the IDs are indeed global
(otherwise it could cause request caching issues).
"""
graphene_type = ID
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id
class UUIDGlobalIDType(BaseGlobalIDType):
"""
UUID global ID type.
By definition UUID are global so they are used as they are.
"""
graphene_type = UUID
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id


@ -1,11 +1,10 @@
from functools import partial from functools import partial
from inspect import isclass from inspect import isclass
from graphql_relay import from_global_id, to_global_id from ..types import Field, Interface, ObjectType
from ..types import ID, Field, Interface, ObjectType
from ..types.interface import InterfaceOptions from ..types.interface import InterfaceOptions
from ..types.utils import get_type from ..types.utils import get_type
from .id_type import BaseGlobalIDType, DefaultGlobalIDType
def is_node(objecttype): def is_node(objecttype):
@ -22,8 +21,18 @@ def is_node(objecttype):
class GlobalID(Field): class GlobalID(Field):
def __init__(self, node=None, parent_type=None, required=True, *args, **kwargs): def __init__(
super(GlobalID, self).__init__(ID, required=required, *args, **kwargs) self,
node=None,
parent_type=None,
required=True,
global_id_type=DefaultGlobalIDType,
*args,
**kwargs,
):
super(GlobalID, self).__init__(
global_id_type.graphene_type, required=required, *args, **kwargs
)
self.node = node or Node self.node = node or Node
self.parent_type_name = parent_type._meta.name if parent_type else None self.parent_type_name = parent_type._meta.name if parent_type else None
@ -47,12 +56,14 @@ class NodeField(Field):
assert issubclass(node, Node), "NodeField can only operate in Nodes" assert issubclass(node, Node), "NodeField can only operate in Nodes"
self.node_type = node self.node_type = node
self.field_type = type_ self.field_type = type_
global_id_type = node._meta.global_id_type
super(NodeField, self).__init__( super(NodeField, self).__init__(
# If we don's specify a type, the field type will be the node # If we don't specify a type, the field type will be the node interface
# interface
type_ or node, type_ or node,
id=ID(required=True, description="The ID of the object"), id=global_id_type.graphene_type(
required=True, description="The ID of the object"
),
**kwargs, **kwargs,
) )
@ -65,11 +76,23 @@ class AbstractNode(Interface):
abstract = True abstract = True
@classmethod @classmethod
def __init_subclass_with_meta__(cls, **options): def __init_subclass_with_meta__(cls, global_id_type=DefaultGlobalIDType, **options):
assert issubclass(
global_id_type, BaseGlobalIDType
), "Custom ID type need to be implemented as a subclass of BaseGlobalIDType."
_meta = InterfaceOptions(cls) _meta = InterfaceOptions(cls)
_meta.fields = {"id": GlobalID(cls, description="The ID of the object")} _meta.global_id_type = global_id_type
_meta.fields = {
"id": GlobalID(
cls, global_id_type=global_id_type, description="The ID of the object"
)
}
super(AbstractNode, cls).__init_subclass_with_meta__(_meta=_meta, **options) super(AbstractNode, cls).__init_subclass_with_meta__(_meta=_meta, **options)
@classmethod
def resolve_global_id(cls, info, global_id):
return cls._meta.global_id_type.resolve_global_id(info, global_id)
class Node(AbstractNode): class Node(AbstractNode):
"""An object with an ID""" """An object with an ID"""
@ -84,16 +107,7 @@ class Node(AbstractNode):
@classmethod @classmethod
def get_node_from_global_id(cls, info, global_id, only_type=None): def get_node_from_global_id(cls, info, global_id, only_type=None):
try: _type, _id = cls.resolve_global_id(info, global_id)
_type, _id = cls.from_global_id(global_id)
if not _type:
raise ValueError("Invalid Global ID")
except Exception as e:
raise Exception(
f'Unable to parse global ID "{global_id}". '
'Make sure it is a base64 encoded string in the format: "TypeName:id". '
f"Exception message: {e}"
)
graphene_type = info.schema.get_type(_type) graphene_type = info.schema.get_type(_type)
if graphene_type is None: if graphene_type is None:
@ -116,10 +130,6 @@ class Node(AbstractNode):
if get_node: if get_node:
return get_node(info, _id) return get_node(info, _id)
@classmethod
def from_global_id(cls, global_id):
return from_global_id(global_id)
@classmethod @classmethod
def to_global_id(cls, type_, id): def to_global_id(cls, type_, id):
return to_global_id(type_, id) return cls._meta.global_id_type.to_global_id(type_, id)


@ -1,7 +1,15 @@
import re
from pytest import raises from pytest import raises
from ...types import Argument, Field, Int, List, NonNull, ObjectType, Schema, String from ...types import Argument, Field, Int, List, NonNull, ObjectType, Schema, String
from ..connection import Connection, ConnectionField, PageInfo from ..connection import (
Connection,
ConnectionField,
PageInfo,
ConnectionOptions,
get_edge_class,
)
from ..node import Node from ..node import Node
@ -51,6 +59,111 @@ def test_connection_inherit_abstracttype():
assert list(fields) == ["page_info", "edges", "extra"] assert list(fields) == ["page_info", "edges", "extra"]
def test_connection_extra_abstract_fields():
class ConnectionWithNodes(Connection):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options):
_meta = ConnectionOptions(cls)
_meta.fields = {
"nodes": Field(
NonNull(List(node)),
description="Contains all the nodes in this connection.",
),
}
return super(ConnectionWithNodes, cls).__init_subclass_with_meta__(
node=node, name=name, _meta=_meta, **options
)
class MyObjectConnection(ConnectionWithNodes):
class Meta:
node = MyObject
class Edge:
other = String()
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields) == ["nodes", "page_info", "edges"]
edge_field = fields["edges"]
pageinfo_field = fields["page_info"]
nodes_field = fields["nodes"]
assert isinstance(edge_field, Field)
assert isinstance(edge_field.type, NonNull)
assert isinstance(edge_field.type.of_type, List)
assert edge_field.type.of_type.of_type == MyObjectConnection.Edge
assert isinstance(pageinfo_field, Field)
assert isinstance(pageinfo_field.type, NonNull)
assert pageinfo_field.type.of_type == PageInfo
assert isinstance(nodes_field, Field)
assert isinstance(nodes_field.type, NonNull)
assert isinstance(nodes_field.type.of_type, List)
assert nodes_field.type.of_type.of_type == MyObject
def test_connection_override_fields():
class ConnectionWithNodes(Connection):
class Meta:
abstract = True
@classmethod
def __init_subclass_with_meta__(cls, node=None, name=None, **options):
_meta = ConnectionOptions(cls)
base_name = (
re.sub("Connection$", "", name or cls.__name__) or node._meta.name
)
edge_class = get_edge_class(cls, node, base_name)
_meta.fields = {
"page_info": Field(
NonNull(
PageInfo,
name="pageInfo",
required=True,
description="Pagination data for this connection.",
)
),
"edges": Field(
NonNull(List(NonNull(edge_class))),
description="Contains the nodes in this connection.",
),
}
return super(ConnectionWithNodes, cls).__init_subclass_with_meta__(
node=node, name=name, _meta=_meta, **options
)
class MyObjectConnection(ConnectionWithNodes):
class Meta:
node = MyObject
assert MyObjectConnection._meta.name == "MyObjectConnection"
fields = MyObjectConnection._meta.fields
assert list(fields) == ["page_info", "edges"]
edge_field = fields["edges"]
pageinfo_field = fields["page_info"]
assert isinstance(edge_field, Field)
assert isinstance(edge_field.type, NonNull)
assert isinstance(edge_field.type.of_type, List)
assert isinstance(edge_field.type.of_type.of_type, NonNull)
assert edge_field.type.of_type.of_type.of_type.__name__ == "MyObjectEdge"
# This page info is NonNull
assert isinstance(pageinfo_field, Field)
assert isinstance(edge_field.type, NonNull)
assert pageinfo_field.type.of_type == PageInfo
def test_connection_name(): def test_connection_name():
custom_name = "MyObjectCustomNameConnection" custom_name = "MyObjectCustomNameConnection"
@ -186,3 +299,20 @@ def test_connectionfield_required():
executed = schema.execute("{ testConnection { edges { cursor } } }") executed = schema.execute("{ testConnection { edges { cursor } } }")
assert not executed.errors assert not executed.errors
assert executed.data == {"testConnection": {"edges": []}} assert executed.data == {"testConnection": {"edges": []}}
def test_connectionfield_strict_types():
class MyObjectConnection(Connection):
class Meta:
node = MyObject
strict_types = True
connection_field = ConnectionField(MyObjectConnection)
edges_field_type = connection_field.type._meta.fields["edges"].type
assert isinstance(edges_field_type, NonNull)
edges_list_element_type = edges_field_type.of_type.of_type
assert isinstance(edges_list_element_type, NonNull)
node_field = edges_list_element_type.of_type._meta.fields["node"]
assert isinstance(node_field.type, NonNull)


@ -0,0 +1,325 @@
import re
from uuid import uuid4
from graphql import graphql_sync
from ..id_type import BaseGlobalIDType, SimpleGlobalIDType, UUIDGlobalIDType
from ..node import Node
from ...types import Int, ObjectType, Schema, String
class TestUUIDGlobalID:
def setup_method(self):
self.user_list = [
{"id": uuid4(), "name": "First"},
{"id": uuid4(), "name": "Second"},
{"id": uuid4(), "name": "Third"},
{"id": uuid4(), "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomNode(Node):
class Meta:
global_id_type = UUIDGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
        Check that the schema has the expected custom node interface and user type, and that both use UUIDs.
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: UUID!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: UUID!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query userById($id: UUID!) {
user(id: $id) {
id
name
}
}"""
# UUIDs need to be converted to strings for variable serialization
result = graphql_sync(
self.graphql_schema,
query,
variable_values={"id": str(self.user_list[0]["id"])},
)
assert not result.errors
assert result.data["user"]["id"] == str(self.user_list[0]["id"])
assert result.data["user"]["name"] == self.user_list[0]["name"]
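The variable is passed as `str(uuid)` because the UUID scalar travels through its canonical string form. A stdlib-only sketch of that round-trip, and of the `AttributeError` that the 3.4.3 fix in this commit range converts into a proper `GraphQLError`:

```python
from uuid import UUID, uuid4

u = uuid4()
# UUIDs survive a round-trip through their canonical string form
assert UUID(str(u)) == u

# Non-string input makes uuid.UUID fail with AttributeError (it calls
# hex.replace), which graphene 3.4.3 now wraps in a GraphQLError
try:
    UUID({"not": "a string"})
    raised = None
except (AttributeError, TypeError) as exc:
    raised = type(exc).__name__

assert raised in ("AttributeError", "TypeError")
```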
class TestSimpleGlobalID:
def setup_method(self):
self.user_list = [
{"id": "my global primary key in clear 1", "name": "First"},
{"id": "my global primary key in clear 2", "name": "Second"},
{"id": "my global primary key in clear 3", "name": "Third"},
{"id": "my global primary key in clear 4", "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomNode(Node):
class Meta:
global_id_type = SimpleGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
Check that the schema has the custom node interface and the user type, and that both use plain string IDs
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: ID!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: ID!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query {
user(id: "my global primary key in clear 3") {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["id"] == self.user_list[2]["id"]
assert result.data["user"]["name"] == self.user_list[2]["name"]
class TestCustomGlobalID:
def setup_method(self):
self.user_list = [
{"id": 1, "name": "First"},
{"id": 2, "name": "Second"},
{"id": 3, "name": "Third"},
{"id": 4, "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
class CustomGlobalIDType(BaseGlobalIDType):
"""
Global ID that is simply an integer in the clear.
"""
graphene_type = Int
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
@classmethod
def to_global_id(cls, _type, _id):
return _id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
def test_str_schema_correct(self):
"""
Check that the schema has the custom node interface and the user type, and that both use Int IDs
"""
parsed = re.findall(r"(.+) \{\n\s*([\w\W]*?)\n\}", str(self.schema))
types = [t for t, f in parsed]
fields = [f for t, f in parsed]
custom_node_interface = "interface CustomNode"
assert custom_node_interface in types
assert (
'"""The ID of the object"""\n id: Int!'
== fields[types.index(custom_node_interface)]
)
user_type = "type User implements CustomNode"
assert user_type in types
assert (
'"""The ID of the object"""\n id: Int!\n name: String'
== fields[types.index(user_type)]
)
def test_get_by_id(self):
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["id"] == self.user_list[1]["id"]
assert result.data["user"]["name"] == self.user_list[1]["name"]
class TestIncompleteCustomGlobalID:
def setup_method(self):
self.user_list = [
{"id": 1, "name": "First"},
{"id": 2, "name": "Second"},
{"id": 3, "name": "Third"},
{"id": 4, "name": "Fourth"},
]
self.users = {user["id"]: user for user in self.user_list}
def test_must_define_to_global_id(self):
"""
Test that if the `to_global_id` method is not defined, we can query the object, but we can't request its ID.
"""
class CustomGlobalIDType(BaseGlobalIDType):
graphene_type = Int
@classmethod
def resolve_global_id(cls, info, global_id):
_type = info.return_type.graphene_type._meta.name
return _type, global_id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
query = """query {
user(id: 2) {
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert not result.errors
assert result.data["user"]["name"] == self.user_list[1]["name"]
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert result.errors is not None
assert len(result.errors) == 1
assert result.errors[0].path == ["user", "id"]
def test_must_define_resolve_global_id(self):
"""
Test that if the `resolve_global_id` method is not defined, we can't query the object by ID.
"""
class CustomGlobalIDType(BaseGlobalIDType):
graphene_type = Int
@classmethod
def to_global_id(cls, _type, _id):
return _id
class CustomNode(Node):
class Meta:
global_id_type = CustomGlobalIDType
class User(ObjectType):
class Meta:
interfaces = [CustomNode]
name = String()
@classmethod
def get_node(cls, _type, _id):
return self.users[_id]
class RootQuery(ObjectType):
user = CustomNode.Field(User)
self.schema = Schema(query=RootQuery, types=[User])
self.graphql_schema = self.schema.graphql_schema
query = """query {
user(id: 2) {
id
name
}
}"""
result = graphql_sync(self.graphql_schema, query)
assert result.errors is not None
assert len(result.errors) == 1
assert result.errors[0].path == ["user"]
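For contrast with the custom ID types exercised above, graphene's default Relay scheme base64-encodes a `Type:id` pair into an opaque string. A rough stdlib sketch of that contract (function names here are illustrative, not graphene's API):

```python
import base64

def to_global_id(type_name: str, local_id: str) -> str:
    # default Relay-style scheme: opaque base64 of "TypeName:localId"
    return base64.b64encode(f"{type_name}:{local_id}".encode()).decode()

def from_global_id(global_id: str) -> tuple:
    # split on the first ":" only, so local IDs may themselves contain colons
    type_name, _, local_id = base64.b64decode(global_id).decode().partition(":")
    return type_name, local_id

gid = to_global_id("User", "42")
assert from_global_id(gid) == ("User", "42")
```

A custom `BaseGlobalIDType` subclass replaces both halves of this round-trip via `to_global_id` and `resolve_global_id`.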


@@ -3,6 +3,7 @@ from pytest import mark
from graphene.types import ID, Field, ObjectType, Schema
from graphene.types.scalars import String
from graphene.relay.mutation import ClientIDMutation
+from graphene.test import Client

class SharedFields(object):
@@ -61,24 +62,27 @@ class Mutation(ObjectType):
schema = Schema(query=RootQuery, mutation=Mutation)
+client = Client(schema)

@mark.asyncio
async def test_node_query_promise():
-executed = await schema.execute_async(
+executed = await client.execute_async(
'mutation a { sayPromise(input: {what:"hello", clientMutationId:"1"}) { phrase } }'
)
-assert not executed.errors
-assert executed.data == {"sayPromise": {"phrase": "hello"}}
+assert isinstance(executed, dict)
+assert "errors" not in executed
+assert executed["data"] == {"sayPromise": {"phrase": "hello"}}

@mark.asyncio
async def test_edge_query():
-executed = await schema.execute_async(
+executed = await client.execute_async(
'mutation a { other(input: {clientMutationId:"1"}) { clientMutationId, myNodeEdge { cursor node { name }} } }'
)
-assert not executed.errors
-assert dict(executed.data) == {
+assert isinstance(executed, dict)
+assert "errors" not in executed
+assert executed["data"] == {
"other": {
"clientMutationId": "1",
"myNodeEdge": {"cursor": "1", "node": {"name": "name"}},


@@ -8,7 +8,6 @@ from ..node import Node, is_node

class SharedNodeFields:
shared = String()
something_else = String()
@@ -55,6 +54,7 @@ def test_node_good():
assert "id" in MyNode._meta.fields
assert is_node(MyNode)
assert not is_node(object)
+assert not is_node("node")

def test_node_query():
def test_node_query(): def test_node_query():


@@ -1,4 +1,3 @@
-from promise import Promise, is_thenable
from graphql.error import GraphQLError

from graphene.types.schema import Schema
@@ -31,7 +30,10 @@ class Client:
def execute(self, *args, **kwargs):
executed = self.schema.execute(*args, **dict(self.execute_options, **kwargs))
-if is_thenable(executed):
-return Promise.resolve(executed).then(self.format_result)
+return self.format_result(executed)

+async def execute_async(self, *args, **kwargs):
+executed = await self.schema.execute_async(
+*args, **dict(self.execute_options, **kwargs)
+)
return self.format_result(executed)


@@ -0,0 +1,41 @@
# https://github.com/graphql-python/graphene/issues/1293
from datetime import datetime, timezone
import graphene
from graphql.utilities import print_schema
class Filters(graphene.InputObjectType):
datetime_after = graphene.DateTime(
required=False,
default_value=datetime.fromtimestamp(1434549820.776, timezone.utc),
)
datetime_before = graphene.DateTime(
required=False,
default_value=datetime.fromtimestamp(1444549820.776, timezone.utc),
)
class SetDatetime(graphene.Mutation):
class Arguments:
filters = Filters(required=True)
ok = graphene.Boolean()
def mutate(root, info, filters):
return SetDatetime(ok=True)
class Query(graphene.ObjectType):
goodbye = graphene.String()
class Mutations(graphene.ObjectType):
set_datetime = SetDatetime.Field()
def test_schema_printable_with_default_datetime_value():
schema = graphene.Schema(query=Query, mutation=Mutations)
schema_str = print_schema(schema.graphql_schema)
assert schema_str, "empty schema printed"
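The default values above only print cleanly because the `DateTime` scalar serializes through ISO 8601. A quick stdlib check that this representation round-trips losslessly, making it safe to embed as a schema default:

```python
from datetime import datetime, timezone

dt = datetime.fromtimestamp(1434549820.776, timezone.utc)
iso = dt.isoformat()  # ISO 8601 form, as used by the DateTime scalar
# the ISO form round-trips losslessly back to the same aware datetime
assert datetime.fromisoformat(iso) == dt
```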


@@ -24,8 +24,8 @@ from ...types.uuid import UUID
(ID, "1"),
(DateTime, '"2022-02-02T11:11:11"'),
(UUID, '"cbebbc62-758e-4f75-a890-bc73b5017d81"'),
-(Decimal, "1.1"),
-(JSONString, '{key:"foo",value:"bar"}'),
+(Decimal, '"1.1"'),
+(JSONString, '"{\\"key\\":\\"foo\\",\\"value\\":\\"bar\\"}"'),
(Base64, '"Q2hlbG8gd29ycmxkCg=="'),
],
)


@@ -0,0 +1,27 @@
import pickle
from ...types.enum import Enum
class PickleEnum(Enum):
# defined outside the test because pickle is unable to dump a class defined inside a pytest function
A = "a"
B = 1
def test_enums_pickling():
a = PickleEnum.A
pickled = pickle.dumps(a)
restored = pickle.loads(pickled)
assert type(a) is type(restored)
assert a == restored
assert a.value == restored.value
assert a.name == restored.name
b = PickleEnum.B
pickled = pickle.dumps(b)
restored = pickle.loads(pickled)
assert type(b) is type(restored)
assert b == restored
assert b.value == restored.value
assert b.name == restored.name
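The same constraint applies to plain Python enums: pickle stores only a qualified name, so the class must be resolvable at module level. This is also why the `enum.py` change later in this diff registers the generated enum class in `globals()`:

```python
import pickle
from enum import Enum

class Color(Enum):
    # defined at module scope so pickle can resolve it by qualified name
    RED = "red"
    BLUE = 1

restored = pickle.loads(pickle.dumps(Color.RED))
# pickle restores the very same member, not a copy
assert restored is Color.RED
assert restored.value == "red"
```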


@@ -1,4 +1,3 @@
-# flake8: noqa
from graphql import GraphQLResolveInfo as ResolveInfo

from .argument import Argument
@@ -15,7 +14,7 @@ from .interface import Interface
from .json import JSONString
from .mutation import Mutation
from .objecttype import ObjectType
-from .scalars import ID, Boolean, Float, Int, Scalar, String
+from .scalars import ID, BigInt, Boolean, Float, Int, Scalar, String
from .schema import Schema
from .structures import List, NonNull
from .union import Union
@@ -24,6 +23,7 @@ from .uuid import UUID
__all__ = [
"Argument",
"Base64",
+"BigInt",
"Boolean",
"Context",
"Date",


@@ -31,18 +31,22 @@ class Argument(MountedType):
type (class for a graphene.UnmountedType): must be a class (not an instance) of an
unmounted graphene type (ex. scalar or object) which is used for the type of this
argument in the GraphQL schema.
-required (bool): indicates this argument as not null in the graphql schema. Same behavior
+required (optional, bool): indicates this argument as not null in the graphql schema. Same behavior
as graphene.NonNull. Default False.
-name (str): the name of the GraphQL argument. Defaults to parameter name.
-description (str): the description of the GraphQL argument in the schema.
-default_value (Any): The value to be provided if the user does not set this argument in
+name (optional, str): the name of the GraphQL argument. Defaults to parameter name.
+description (optional, str): the description of the GraphQL argument in the schema.
+default_value (optional, Any): The value to be provided if the user does not set this argument in
the operation.
+deprecation_reason (optional, str): Setting this value indicates that the argument is
+deprecated and may provide instruction or reason on how clients should proceed. Cannot be
+set if the argument is required (see spec).
"""

def __init__(
self,
type_,
default_value=Undefined,
+deprecation_reason=None,
description=None,
name=None,
required=False,
@@ -51,12 +55,16 @@ class Argument(MountedType):
super(Argument, self).__init__(_creation_counter=_creation_counter)

if required:
+assert (
+deprecation_reason is None
+), f"Argument {name} is required, cannot deprecate it."
type_ = NonNull(type_)

self.name = name
self._type = type_
self.default_value = default_value
self.description = description
+self.deprecation_reason = deprecation_reason

@property
def type(self):
@@ -68,6 +76,7 @@ class Argument(MountedType):
and self.type == other.type
and self.default_value == other.default_value
and self.description == other.description
+and self.deprecation_reason == other.deprecation_reason
)
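The new guard enforces an invariant from the GraphQL spec: a required argument cannot be deprecated. A minimal stdlib sketch of the same check, with a hypothetical `make_argument` helper standing in for the `Argument` constructor:

```python
def make_argument(name, required=False, deprecation_reason=None):
    # mirror of the assertion added above: required arguments cannot be deprecated
    if required:
        assert (
            deprecation_reason is None
        ), f"Argument {name} is required, cannot deprecate it."
    return {"name": name, "required": required, "deprecation_reason": deprecation_reason}

assert make_argument("id", required=True)["required"] is True

try:
    make_argument("id", required=True, deprecation_reason="use uuid instead")
    rejected = False
except AssertionError:
    rejected = True  # the invalid combination is refused

assert rejected
```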


@@ -1,17 +1,17 @@
-from typing import Type
+from typing import Type, Optional

from ..utils.subclass_with_meta import SubclassWithMeta, SubclassWithMeta_Meta
from ..utils.trim_docstring import trim_docstring

class BaseOptions:
-name = None  # type: str
-description = None  # type: str
-_frozen = False  # type: bool
+name: Optional[str] = None
+description: Optional[str] = None
+_frozen: bool = False

-def __init__(self, class_type):
-self.class_type = class_type  # type: Type
+def __init__(self, class_type: Type):
+self.class_type: Type = class_type

def freeze(self):
self._frozen = True


@@ -1,8 +1,7 @@
-from __future__ import absolute_import
import datetime

-from aniso8601 import parse_date, parse_datetime, parse_time
+from dateutil.parser import isoparse
from graphql.error import GraphQLError
from graphql.language import StringValueNode, print_ast
@@ -39,7 +38,7 @@ class Date(Scalar):
if not isinstance(value, str):
raise GraphQLError(f"Date cannot represent non-string value: {repr(value)}")
try:
-return parse_date(value)
+return datetime.date.fromisoformat(value)
except ValueError:
raise GraphQLError(f"Date cannot represent value: {repr(value)}")
@@ -74,7 +73,7 @@ class DateTime(Scalar):
f"DateTime cannot represent non-string value: {repr(value)}"
)
try:
-return parse_datetime(value)
+return isoparse(value)
except ValueError:
raise GraphQLError(f"DateTime cannot represent value: {repr(value)}")
@@ -107,6 +106,6 @@ class Time(Scalar):
if not isinstance(value, str):
raise GraphQLError(f"Time cannot represent non-string value: {repr(value)}")
try:
-return parse_time(value)
+return datetime.time.fromisoformat(value)
except ValueError:
raise GraphQLError(f"Time cannot represent value: {repr(value)}")
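A hedged illustration of why `Date`/`Time` can use the stdlib here while `DateTime` needs `dateutil` (the "< 3.11 support" from the commit message): before Python 3.11, `datetime.fromisoformat` rejects common ISO 8601 variants such as a trailing `Z`, which `dateutil.parser.isoparse` accepts on all versions.

```python
from datetime import date, time, datetime

# date/time ISO parsing is fully covered by the stdlib on all supported versions
assert date.fromisoformat("2022-02-02") == date(2022, 2, 2)
assert time.fromisoformat("11:11:11") == time(11, 11, 11)

# a trailing "Z" is only accepted by datetime.fromisoformat from 3.11 onward,
# hence the dateutil.parser.isoparse fallback for the DateTime scalar
try:
    parsed = datetime.fromisoformat("2022-02-02T11:11:11Z")
except ValueError:
    parsed = None  # expected on Python < 3.11
```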


@@ -1,7 +1,6 @@
-from __future__ import absolute_import
from decimal import Decimal as _Decimal

+from graphql import Undefined
from graphql.language.ast import StringValueNode, IntValueNode

from .scalars import Scalar
@@ -25,10 +24,11 @@ class Decimal(Scalar):
def parse_literal(cls, node, _variables=None):
if isinstance(node, (StringValueNode, IntValueNode)):
return cls.parse_value(node.value)
+return Undefined

@staticmethod
def parse_value(value):
try:
return _Decimal(value)
-except ValueError:
-return None
+except Exception:
+return Undefined
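Broadening `except ValueError` to `except Exception` matters here because `decimal` signals unparsable input with `InvalidOperation`, which derives from `ArithmeticError`, not `ValueError`, so the old handler would have let it escape:

```python
from decimal import Decimal, InvalidOperation

# string construction is exact, unlike going through float
assert Decimal("1.1") == Decimal("1.1")

try:
    Decimal("not-a-number")
    caught = None
except ValueError:
    caught = "ValueError"
except InvalidOperation:
    caught = "InvalidOperation"  # this branch is the one taken

assert caught == "InvalidOperation"
```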


@@ -20,6 +20,11 @@ class GrapheneGraphQLType:
self.graphene_type = kwargs.pop("graphene_type")
super(GrapheneGraphQLType, self).__init__(*args, **kwargs)

+def __copy__(self):
+result = GrapheneGraphQLType(graphene_type=self.graphene_type)
+result.__dict__.update(self.__dict__)
+return result

class GrapheneInterfaceType(GrapheneGraphQLType, GraphQLInterfaceType):
pass
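This `__copy__` hook follows a common pattern for classes whose constructor has a required argument: rebuild the instance with that argument, then overwrite its state wholesale. A stdlib sketch with a stand-in class (not graphene's actual type):

```python
import copy

class Wrapped:
    # stand-in for GrapheneGraphQLType: the constructor demands graphene_type
    def __init__(self, graphene_type):
        self.graphene_type = graphene_type

    def __copy__(self):
        # rebuild with the required argument, then copy all remaining state
        result = Wrapped(graphene_type=self.graphene_type)
        result.__dict__.update(self.__dict__)
        return result

original = Wrapped("User")
original.extra = "state"
clone = copy.copy(original)
assert clone.graphene_type == "User" and clone.extra == "state"
```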


@ -12,6 +12,10 @@ def eq_enum(self, other):
return self.value is other return self.value is other
def hash_enum(self):
return hash(self.name)
EnumType = type(PyEnum) EnumType = type(PyEnum)
@ -22,14 +26,16 @@ class EnumOptions(BaseOptions):
class EnumMeta(SubclassWithMeta_Meta): class EnumMeta(SubclassWithMeta_Meta):
def __new__(cls, name_, bases, classdict, **options): def __new__(cls, name_, bases, classdict, **options):
enum_members = dict(classdict, __eq__=eq_enum) enum_members = dict(classdict, __eq__=eq_enum, __hash__=hash_enum)
# We remove the Meta attribute from the class to not collide # We remove the Meta attribute from the class to not collide
# with the enum values. # with the enum values.
enum_members.pop("Meta", None) enum_members.pop("Meta", None)
enum = PyEnum(cls.__name__, enum_members) enum = PyEnum(cls.__name__, enum_members)
return SubclassWithMeta_Meta.__new__( obj = SubclassWithMeta_Meta.__new__(
cls, name_, bases, dict(classdict, __enum__=enum), **options cls, name_, bases, dict(classdict, __enum__=enum), **options
) )
globals()[name_] = obj.__enum__
return obj
def get(cls, value): def get(cls, value):
return cls._meta.enum(value) return cls._meta.enum(value)
@ -52,15 +58,19 @@ class EnumMeta(SubclassWithMeta_Meta):
return super(EnumMeta, cls).__call__(*args, **kwargs) return super(EnumMeta, cls).__call__(*args, **kwargs)
# return cls._meta.enum(*args, **kwargs) # return cls._meta.enum(*args, **kwargs)
def from_enum(cls, enum, description=None, deprecation_reason=None): # noqa: N805 def __iter__(cls):
description = description or enum.__doc__ return cls._meta.enum.__iter__()
def from_enum(cls, enum, name=None, description=None, deprecation_reason=None): # noqa: N805
name = name or enum.__name__
description = description or enum.__doc__ or "An enumeration."
meta_dict = { meta_dict = {
"enum": enum, "enum": enum,
"description": description, "description": description,
"deprecation_reason": deprecation_reason, "deprecation_reason": deprecation_reason,
} }
meta_class = type("Meta", (object,), meta_dict) meta_class = type("Meta", (object,), meta_dict)
return type(meta_class.enum.__name__, (Enum,), {"Meta": meta_class}) return type(name, (Enum,), {"Meta": meta_class})
class Enum(UnmountedType, BaseType, metaclass=EnumMeta): class Enum(UnmountedType, BaseType, metaclass=EnumMeta):


@@ -43,7 +43,8 @@ class Field(MountedType):
args:
type (class for a graphene.UnmountedType): Must be a class (not an instance) of an
unmounted graphene type (ex. scalar or object) which is used for the type of this
-field in the GraphQL schema.
+field in the GraphQL schema. You can provide a dotted module import path (string)
+to the class instead of the class itself (e.g. to avoid circular import issues).
args (optional, Dict[str, graphene.Argument]): Arguments that can be input to the field.
Prefer to use ``**extra_args``, unless you use an argument name that clashes with one
of the Field arguments presented here (see :ref:`example<ResolverParamGraphQLArguments>`).


@@ -1,5 +1,3 @@
-from __future__ import unicode_literals

from graphql.language.ast import (
BooleanValueNode,
FloatValueNode,


@@ -55,11 +55,14 @@ class InputField(MountedType):
description=None,
required=False,
_creation_counter=None,
-**extra_args
+**extra_args,
):
super(InputField, self).__init__(_creation_counter=_creation_counter)
self.name = name
if required:
+assert (
+deprecation_reason is None
+), f"InputField {name} is required, cannot deprecate it."
type_ = NonNull(type_)
self._type = type_
self.deprecation_reason = deprecation_reason


@@ -1,11 +1,12 @@
+from typing import TYPE_CHECKING

from .base import BaseOptions, BaseType
from .inputfield import InputField
from .unmountedtype import UnmountedType
from .utils import yank_fields_from_attrs

-# For static type checking with Mypy
-MYPY = False
-if MYPY:
+# For static type checking with type checker
+if TYPE_CHECKING:
from typing import Dict, Callable  # NOQA
@@ -14,14 +15,39 @@ class InputObjectTypeOptions(BaseOptions):
container = None  # type: InputObjectTypeContainer

+# Currently in Graphene, we get a `None` whenever we access an (optional) field that was not set in an InputObjectType
+# using the InputObjectType.<attribute> dot access syntax. This is ambiguous, because in this current (Graphene
+# historical) arrangement, we cannot distinguish between a field not being set and a field being set to None.
+# At the same time, we shouldn't break existing code that expects a `None` when accessing a field that was not set.
+_INPUT_OBJECT_TYPE_DEFAULT_VALUE = None

+# To mitigate this, we provide the function `set_input_object_type_default_value` to allow users to change the default
+# value returned in non-specified fields in InputObjectType to another meaningful sentinel value (e.g. Undefined)
+# if they want to. This way, we can keep code that expects a `None` working while we figure out a better solution (or
+# a well-documented breaking change) for this issue.

+def set_input_object_type_default_value(default_value):
+"""
+Change the sentinel value returned by non-specified fields in an InputObjectType
+Useful to differentiate between a field not being set and a field being set to None by using a sentinel value
+(e.g. Undefined is a good sentinel value for this purpose)
+This function should be called at the beginning of the app or in some other place where it is guaranteed to
+be called before any InputObjectType is defined.
+"""
+global _INPUT_OBJECT_TYPE_DEFAULT_VALUE
+_INPUT_OBJECT_TYPE_DEFAULT_VALUE = default_value

-class InputObjectTypeContainer(dict, BaseType):
+class InputObjectTypeContainer(dict, BaseType):  # type: ignore
class Meta:
abstract = True

def __init__(self, *args, **kwargs):
dict.__init__(self, *args, **kwargs)
for key in self._meta.fields:
-setattr(self, key, self.get(key, None))
+setattr(self, key, self.get(key, _INPUT_OBJECT_TYPE_DEFAULT_VALUE))

def __init_subclass__(cls, *args, **kwargs):
pass
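The sentinel idea itself is plain Python: with a dedicated `UNSET` object (graphql-core's `Undefined` plays this role), `dict.get` can tell "explicitly set to None" apart from "never provided":

```python
UNSET = object()  # stands in for graphql.Undefined

submitted = {"name": None}  # the client explicitly sent name: null

assert submitted.get("name", UNSET) is None   # set, and its value is None
assert submitted.get("age", UNSET) is UNSET   # never provided at all
```

With the default sentinel of `None`, these two cases are indistinguishable, which is exactly the ambiguity the comment block above describes.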


@@ -1,15 +1,17 @@
+from typing import TYPE_CHECKING

from .base import BaseOptions, BaseType
from .field import Field
from .utils import yank_fields_from_attrs

-# For static type checking with Mypy
-MYPY = False
-if MYPY:
-from typing import Dict  # NOQA
+# For static type checking with type checker
+if TYPE_CHECKING:
+from typing import Dict, Iterable, Type  # NOQA

class InterfaceOptions(BaseOptions):
fields = None  # type: Dict[str, Field]
+interfaces = ()  # type: Iterable[Type[Interface]]

class Interface(BaseType):
@@ -45,7 +47,7 @@ class Interface(BaseType):
"""
@classmethod
-def __init_subclass_with_meta__(cls, _meta=None, **options):
+def __init_subclass_with_meta__(cls, _meta=None, interfaces=(), **options):
if not _meta:
_meta = InterfaceOptions(cls)
@@ -58,6 +60,9 @@ class Interface(BaseType):
else:
_meta.fields = fields

+if not _meta.interfaces:
+_meta.interfaces = interfaces

super(Interface, cls).__init_subclass_with_meta__(_meta=_meta, **options)

@classmethod


@@ -1,7 +1,6 @@
-from __future__ import absolute_import
import json

+from graphql import Undefined
from graphql.language.ast import StringValueNode

from .scalars import Scalar
@@ -22,7 +21,11 @@ class JSONString(Scalar):
@staticmethod
def parse_literal(node, _variables=None):
if isinstance(node, StringValueNode):
+try:
return json.loads(node.value)
+except Exception as error:
+raise ValueError(f"Badly formed JSONString: {str(error)}")
+return Undefined

@staticmethod
def parse_value(value):
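The wrapped error can be reproduced with the stdlib alone: a literal like `{key:"foo"}` with unquoted keys is simply not valid JSON, and the new `parse_literal` turns the decode failure into a readable `ValueError`:

```python
import json

try:
    json.loads('{key:"foo",value:"bar"}')  # unquoted keys: invalid JSON
    message = None
except Exception as error:
    # same shape as the error the new parse_literal raises
    message = f"Badly formed JSONString: {error}"

assert message is not None and message.startswith("Badly formed JSONString")
```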


@@ -1,3 +1,5 @@
+from typing import TYPE_CHECKING

from ..utils.deprecated import warn_deprecation
from ..utils.get_unbound_function import get_unbound_function
from ..utils.props import props
@@ -6,9 +8,8 @@ from .objecttype import ObjectType, ObjectTypeOptions
from .utils import yank_fields_from_attrs
from .interface import Interface

-# For static type checking with Mypy
-MYPY = False
-if MYPY:
+# For static type checking with type checker
+if TYPE_CHECKING:
from .argument import Argument  # NOQA
from typing import Dict, Type, Callable, Iterable  # NOQA
@@ -29,21 +30,21 @@ class Mutation(ObjectType):
.. code:: python

-from graphene import Mutation, ObjectType, String, Boolean, Field
+import graphene

-class CreatePerson(Mutation):
+class CreatePerson(graphene.Mutation):
class Arguments:
-name = String()
+name = graphene.String()

-ok = Boolean()
-person = Field(Person)
+ok = graphene.Boolean()
+person = graphene.Field(Person)

def mutate(parent, info, name):
person = Person(name=name)
ok = True
return CreatePerson(person=person, ok=ok)

-class Mutation(ObjectType):
+class Mutation(graphene.ObjectType):
create_person = CreatePerson.Field()

Meta class options (optional):


@@ -1,15 +1,14 @@
+from typing import TYPE_CHECKING

from .base import BaseOptions, BaseType, BaseTypeMeta
from .field import Field
from .interface import Interface
from .utils import yank_fields_from_attrs

-try:
-from dataclasses import make_dataclass, field
-except ImportError:
-from ..pyutils.dataclasses import make_dataclass, field  # type: ignore
+from dataclasses import make_dataclass, field

-# For static type checking with Mypy
-MYPY = False
-if MYPY:
+# For static type checking with type checker
+if TYPE_CHECKING:
from typing import Dict, Iterable, Type  # NOQA
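Dropping the vendored fallback is safe because `dataclasses.make_dataclass` has been in the stdlib since Python 3.7, well below graphene's supported floor. It builds a class dynamically from a name and field specs, which is how the ObjectType container is constructed:

```python
from dataclasses import make_dataclass, field

# dynamic dataclass construction: (name, type) or (name, type, field(...)) specs
Point = make_dataclass("Point", [("x", int), ("y", int, field(default=0))])

p = Point(1)
assert (p.x, p.y) == (1, 0)   # default applied for y
assert Point(2, 3).y == 3     # or overridden positionally
```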


@@ -1,5 +1,6 @@
 from typing import Any
+from graphql import Undefined
 from graphql.language.ast import (
     BooleanValueNode,
     FloatValueNode,
@@ -67,9 +68,10 @@ class Int(Scalar):
         try:
             num = int(float(value))
         except ValueError:
-            return None
+            return Undefined
         if MIN_INT <= num <= MAX_INT:
             return num
+        return Undefined
     serialize = coerce_int
     parse_value = coerce_int
@@ -80,6 +82,7 @@ class Int(Scalar):
             num = int(ast.value)
             if MIN_INT <= num <= MAX_INT:
                 return num
+        return Undefined
 class BigInt(Scalar):
@@ -97,7 +100,7 @@ class BigInt(Scalar):
         try:
             num = int(float(value))
         except ValueError:
-            return None
+            return Undefined
         return num
     serialize = coerce_int
@@ -107,6 +110,7 @@ class BigInt(Scalar):
     def parse_literal(ast, _variables=None):
         if isinstance(ast, IntValueNode):
             return int(ast.value)
+        return Undefined
 class Float(Scalar):
@@ -117,12 +121,11 @@ class Float(Scalar):
     """
     @staticmethod
-    def coerce_float(value):
-        # type: (Any) -> float
+    def coerce_float(value: Any) -> float:
         try:
             return float(value)
         except ValueError:
-            return None
+            return Undefined
     serialize = coerce_float
     parse_value = coerce_float
@@ -131,6 +134,7 @@ class Float(Scalar):
     def parse_literal(ast, _variables=None):
         if isinstance(ast, (FloatValueNode, IntValueNode)):
             return float(ast.value)
+        return Undefined
 class String(Scalar):
@@ -153,6 +157,7 @@ class String(Scalar):
     def parse_literal(ast, _variables=None):
         if isinstance(ast, StringValueNode):
             return ast.value
+        return Undefined
 class Boolean(Scalar):
@@ -167,6 +172,7 @@ class Boolean(Scalar):
     def parse_literal(ast, _variables=None):
         if isinstance(ast, BooleanValueNode):
             return ast.value
+        return Undefined
 class ID(Scalar):
@@ -185,3 +191,4 @@ class ID(Scalar):
     def parse_literal(ast, _variables=None):
         if isinstance(ast, (StringValueNode, IntValueNode)):
             return ast.value
+        return Undefined
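The recurring `return Undefined` change in this file replaces `None` as the failure signal for scalar coercion, since `None` is also a legitimate GraphQL null. A minimal stdlib sketch of why a dedicated sentinel matters — the `_Undefined` class here is a hypothetical stand-in for graphql-core's `Undefined` singleton, and the broader `(ValueError, TypeError)` catch is an assumption of this sketch, not graphene's exact code:

```python
# Hypothetical stand-in for graphql-core's Undefined singleton.
class _Undefined:
    def __repr__(self):
        return "Undefined"

    def __bool__(self):
        return False


Undefined = _Undefined()

MIN_INT, MAX_INT = -(2**31), 2**31 - 1


def coerce_int(value):
    # Mirrors the diff above: a failed coercion yields Undefined, never
    # None, so invalid input is no longer confused with an explicit null.
    try:
        num = int(float(value))
    except (ValueError, TypeError):
        return Undefined
    return num if MIN_INT <= num <= MAX_INT else Undefined
```

A caller can then test `result is Undefined` to distinguish "coercion failed" from falsy-but-valid values such as `0`.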

View File

@@ -1,3 +1,4 @@
+from enum import Enum as PyEnum
 import inspect
 from functools import partial
@@ -169,10 +170,16 @@ class TypeMap(dict):
         values = {}
         for name, value in graphene_type._meta.enum.__members__.items():
             description = getattr(value, "description", None)
-            deprecation_reason = getattr(value, "deprecation_reason", None)
+            # if the "description" attribute is an Enum, it is likely an enum member
+            # called description, not a description property
+            if isinstance(description, PyEnum):
+                description = None
             if not description and callable(graphene_type._meta.description):
                 description = graphene_type._meta.description(value)
+            deprecation_reason = getattr(value, "deprecation_reason", None)
+            if isinstance(deprecation_reason, PyEnum):
+                deprecation_reason = None
             if not deprecation_reason and callable(
                 graphene_type._meta.deprecation_reason
             ):
@@ -233,11 +240,20 @@ class TypeMap(dict):
                 else None
             )
+        def interfaces():
+            interfaces = []
+            for graphene_interface in graphene_type._meta.interfaces:
+                interface = self.add_type(graphene_interface)
+                assert interface.graphene_type == graphene_interface
+                interfaces.append(interface)
+            return interfaces
         return GrapheneInterfaceType(
             graphene_type=graphene_type,
             name=graphene_type._meta.name,
             description=graphene_type._meta.description,
             fields=partial(self.create_fields_for_type, graphene_type),
+            interfaces=interfaces,
             resolve_type=resolve_type,
         )
@@ -300,6 +316,7 @@ class TypeMap(dict):
                 default_value=field.default_value,
                 out_name=name,
                 description=field.description,
+                deprecation_reason=field.deprecation_reason,
             )
         else:
             args = {}
@@ -311,6 +328,7 @@ class TypeMap(dict):
                     out_name=arg_name,
                     description=arg.description,
                     default_value=arg.default_value,
+                    deprecation_reason=arg.deprecation_reason,
                 )
             subscribe = field.wrap_subscribe(
                 self.get_function_for_type(
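The `isinstance(description, PyEnum)` guard above addresses a plain-Python pitfall: attribute lookup on an enum member falls through to the class, so a member literally named `description` shadows any description metadata. A standalone illustration with only the stdlib:

```python
from enum import Enum


class RGB(Enum):
    RED = "red"
    description = "description"  # an ordinary member, not metadata

# Attribute lookup on a member resolves through the class, so this
# returns the `description` *member* of RGB, not a string:
value = RGB.RED.description
assert isinstance(value, Enum)  # exactly what the guard in the diff detects
```

Without the guard, the type map would try to use that enum member as the GraphQL description string.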

View File

@@ -0,0 +1,12 @@
import pytest
from graphql import Undefined
from graphene.types.inputobjecttype import set_input_object_type_default_value
@pytest.fixture()
def set_default_input_object_type_to_undefined():
"""This fixture is used to change the default value of optional inputs in InputObjectTypes for specific tests"""
set_input_object_type_default_value(Undefined)
yield
set_input_object_type_default_value(None)

View File

@@ -18,8 +18,20 @@ def test_argument():
 def test_argument_comparasion():
-    arg1 = Argument(String, name="Hey", description="Desc", default_value="default")
-    arg2 = Argument(String, name="Hey", description="Desc", default_value="default")
+    arg1 = Argument(
+        String,
+        name="Hey",
+        description="Desc",
+        default_value="default",
+        deprecation_reason="deprecated",
+    )
+    arg2 = Argument(
+        String,
+        name="Hey",
+        description="Desc",
+        default_value="default",
+        deprecation_reason="deprecated",
+    )
     assert arg1 == arg2
     assert arg1 != String()
@@ -40,6 +52,30 @@ def test_to_arguments():
}
def test_to_arguments_deprecated():
args = {"unmounted_arg": String(required=False, deprecation_reason="deprecated")}
my_args = to_arguments(args)
assert my_args == {
"unmounted_arg": Argument(
String, required=False, deprecation_reason="deprecated"
),
}
def test_to_arguments_required_deprecated():
args = {
"unmounted_arg": String(
required=True, name="arg", deprecation_reason="deprecated"
)
}
with raises(AssertionError) as exc_info:
to_arguments(args)
assert str(exc_info.value) == "Argument arg is required, cannot deprecate it."
def test_to_arguments_raises_if_field():
args = {"arg_string": Field(String)}

View File

@@ -1,6 +1,5 @@
 import datetime
-import pytz
 from graphql import GraphQLError
 from pytest import fixture
@@ -30,7 +29,7 @@ schema = Schema(query=Query)
 @fixture
 def sample_datetime():
-    utc_datetime = datetime.datetime(2019, 5, 25, 5, 30, 15, 10, pytz.utc)
+    utc_datetime = datetime.datetime(2019, 5, 25, 5, 30, 15, 10, datetime.timezone.utc)
     return utc_datetime
@@ -228,6 +227,18 @@ def test_time_query_variable(sample_time):
assert result.data == {"time": isoformat}
def test_support_isoformat():
isoformat = "2011-11-04T00:05:23Z"
# test time variable provided as Python time
result = schema.execute(
"""query DateTime($time: DateTime){ datetime(in: $time) }""",
variables={"time": isoformat},
)
assert not result.errors
assert result.data == {"datetime": "2011-11-04T00:05:23+00:00"}
def test_bad_variables(sample_date, sample_datetime, sample_time):
def _test_bad_variables(type_, input_):
result = schema.execute(
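The new `test_support_isoformat` above exercises `Z`-suffixed timestamps, which is also why the 3.4.1 release in the commit list swapped in dateutil for Python < 3.11. A stdlib sketch of the version split (the `strptime` fallback here is an illustrative stand-in for the dateutil path, not graphene's actual code):

```python
from datetime import datetime, timezone

iso = "2011-11-04T00:05:23Z"
try:
    # Python >= 3.11: fromisoformat understands the trailing "Z".
    dt = datetime.fromisoformat(iso)
except ValueError:
    # Older interpreters reject "Z"; graphene 3.4.1 falls back to
    # dateutil.parser for this case -- strptime is a stdlib stand-in here.
    dt = datetime.strptime(iso, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

assert dt.isoformat() == "2011-11-04T00:05:23+00:00"
```

Either branch yields the same timezone-aware datetime, matching the `+00:00` serialization the test asserts.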

View File

@@ -39,8 +39,25 @@ def test_bad_decimal_query():
not_a_decimal = "Nobody expects the Spanish Inquisition!"
result = schema.execute("""{ decimal(input: "%s") }""" % not_a_decimal)
assert result.errors
assert len(result.errors) == 1
assert result.data is None
assert (
result.errors[0].message
== "Expected value of type 'Decimal', found \"Nobody expects the Spanish Inquisition!\"."
)
result = schema.execute("{ decimal(input: true) }")
assert result.errors
assert len(result.errors) == 1
assert result.data is None
assert result.errors[0].message == "Expected value of type 'Decimal', found true."
result = schema.execute("{ decimal(input: 1.2) }")
assert result.errors
assert len(result.errors) == 1
assert result.data is None
assert result.errors[0].message == "Expected value of type 'Decimal', found 1.2."
def test_decimal_string_query_integer():
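A rough stdlib sketch of the coercion rule the Decimal assertions above pin down: booleans, floats, and junk strings are rejected with a typed error, while decimal strings and integers pass. The function name and exact checks are illustrative assumptions, not graphene's implementation:

```python
from decimal import Decimal, InvalidOperation


def parse_decimal(value):
    # Reject bool explicitly first: bool is a subclass of int in Python,
    # so a plain isinstance(value, int) check would let True/False through.
    if isinstance(value, bool) or not isinstance(value, (str, int)):
        raise ValueError(f"Expected value of type 'Decimal', found {value!r}.")
    try:
        return Decimal(value)
    except InvalidOperation:
        raise ValueError(f"Expected value of type 'Decimal', found {value!r}.")
```

Rejecting float input (rather than converting it) avoids silently inheriting binary floating-point imprecision in a type whose whole point is exactness.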

View File

@@ -1,4 +1,7 @@
+import copy
 from ..argument import Argument
+from ..definitions import GrapheneGraphQLType
 from ..enum import Enum
 from ..field import Field
 from ..inputfield import InputField
@@ -312,3 +315,16 @@ def test_does_not_mutate_passed_field_definitions():
     pass
 assert TestInputObject1._meta.fields == TestInputObject2._meta.fields
def test_graphene_graphql_type_can_be_copied():
class Query(ObjectType):
field = String()
def resolve_field(self, info):
return ""
schema = Schema(query=Query)
query_type_copy = copy.copy(schema.graphql_schema.query_type)
assert query_type_copy.__dict__ == schema.graphql_schema.query_type.__dict__
assert isinstance(schema.graphql_schema.query_type, GrapheneGraphQLType)

View File

@@ -65,6 +65,21 @@ def test_enum_from_builtin_enum():
assert RGB.BLUE
def test_enum_custom_description_in_constructor():
description = "An enumeration, but with a custom description"
RGB = Enum(
"RGB",
"RED,GREEN,BLUE",
description=description,
)
assert RGB._meta.description == description
def test_enum_from_python3_enum_uses_default_builtin_doc():
RGB = Enum("RGB", "RED,GREEN,BLUE")
assert RGB._meta.description == "An enumeration."
def test_enum_from_builtin_enum_accepts_lambda_description():
def custom_description(value):
if not value:
@@ -328,6 +343,52 @@ def test_enum_resolver_compat():
assert results.data["colorByName"] == Color.RED.name
def test_enum_with_name():
from enum import Enum as PyEnum
class Color(PyEnum):
RED = 1
YELLOW = 2
BLUE = 3
GColor = Enum.from_enum(Color, description="original colors")
UniqueGColor = Enum.from_enum(
Color, name="UniqueColor", description="unique colors"
)
class Query(ObjectType):
color = GColor(required=True)
unique_color = UniqueGColor(required=True)
schema = Schema(query=Query)
assert (
str(schema).strip()
== dedent(
'''
type Query {
color: Color!
uniqueColor: UniqueColor!
}
"""original colors"""
enum Color {
RED
YELLOW
BLUE
}
"""unique colors"""
enum UniqueColor {
RED
YELLOW
BLUE
}
'''
).strip()
)
def test_enum_resolver_invalid():
from enum import Enum as PyEnum
@@ -472,3 +533,83 @@ def test_mutation_enum_input_type():
assert result.data == {"createPaint": {"color": "RED"}}
assert color_input_value == RGB.RED
def test_hashable_enum():
class RGB(Enum):
"""Available colors"""
RED = 1
GREEN = 2
BLUE = 3
color_map = {RGB.RED: "a", RGB.BLUE: "b", 1: "c"}
assert color_map[RGB.RED] == "a"
assert color_map[RGB.BLUE] == "b"
assert color_map[1] == "c"
def test_hashable_instance_creation_enum():
Episode = Enum("Episode", [("NEWHOPE", 4), ("EMPIRE", 5), ("JEDI", 6)])
trilogy_map = {Episode.NEWHOPE: "better", Episode.EMPIRE: "best", 5: "foo"}
assert trilogy_map[Episode.NEWHOPE] == "better"
assert trilogy_map[Episode.EMPIRE] == "best"
assert trilogy_map[5] == "foo"
def test_enum_iteration():
class TestEnum(Enum):
FIRST = 1
SECOND = 2
result = []
expected_values = ["FIRST", "SECOND"]
for c in TestEnum:
result.append(c.name)
assert result == expected_values
def test_iterable_instance_creation_enum():
TestEnum = Enum("TestEnum", [("FIRST", 1), ("SECOND", 2)])
result = []
expected_values = ["FIRST", "SECOND"]
for c in TestEnum:
result.append(c.name)
assert result == expected_values
# https://github.com/graphql-python/graphene/issues/1321
def test_enum_description_member_not_interpreted_as_property():
class RGB(Enum):
"""Description"""
red = "red"
green = "green"
blue = "blue"
description = "description"
deprecation_reason = "deprecation_reason"
class Query(ObjectType):
color = RGB()
def resolve_color(_, info):
return RGB.description
values = RGB._meta.enum.__members__.values()
assert sorted(v.name for v in values) == [
"blue",
"deprecation_reason",
"description",
"green",
"red",
]
schema = Schema(query=Query)
results = schema.execute("query { color }")
assert not results.errors
assert results.data["color"] == RGB.description.name

View File

@@ -128,13 +128,20 @@ def test_field_name_as_argument():
 def test_field_source_argument_as_kw():
     MyType = object()
-    field = Field(MyType, b=NonNull(True), c=Argument(None), a=NonNull(False))
+    deprecation_reason = "deprecated"
+    field = Field(
+        MyType,
+        b=NonNull(True),
+        c=Argument(None, deprecation_reason=deprecation_reason),
+        a=NonNull(False),
+    )
     assert list(field.args) == ["b", "c", "a"]
     assert isinstance(field.args["b"], Argument)
     assert isinstance(field.args["b"].type, NonNull)
     assert field.args["b"].type.of_type is True
     assert isinstance(field.args["c"], Argument)
     assert field.args["c"].type is None
+    assert field.args["c"].deprecation_reason == deprecation_reason
     assert isinstance(field.args["a"], Argument)
     assert isinstance(field.args["a"].type, NonNull)
     assert field.args["a"].type.of_type is False

View File

@@ -1,5 +1,7 @@
 from functools import partial
+from pytest import raises
 from ..inputfield import InputField
 from ..structures import NonNull
 from .utils import MyLazyType
@@ -12,6 +14,22 @@ def test_inputfield_required():
 assert field.type.of_type == MyType
def test_inputfield_deprecated():
MyType = object()
deprecation_reason = "deprecated"
field = InputField(MyType, required=False, deprecation_reason=deprecation_reason)
assert isinstance(field.type, type(MyType))
assert field.deprecation_reason == deprecation_reason
def test_inputfield_required_deprecated():
MyType = object()
with raises(AssertionError) as exc_info:
InputField(MyType, name="input", required=True, deprecation_reason="deprecated")
assert str(exc_info.value) == "InputField input is required, cannot deprecate it."
def test_inputfield_with_lazy_type():
MyType = object()
field = InputField(lambda: MyType)
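The new InputField tests encode a GraphQL spec rule: a required input field or argument cannot carry a deprecation reason, since clients are still forced to send it. Sketched standalone below — `validate_input_field` is a hypothetical helper for illustration, not graphene's API:

```python
def validate_input_field(name, required=False, deprecation_reason=None):
    # The spec-level invariant the tests assert: deprecating something
    # that clients must still supply is contradictory, so fail loudly.
    if required and deprecation_reason:
        raise AssertionError(f"InputField {name} is required, cannot deprecate it.")
    return {
        "name": name,
        "required": required,
        "deprecation_reason": deprecation_reason,
    }
```

The deprecation path for a required input is therefore two releases: first make it optional, then deprecate it.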

View File

@@ -1,3 +1,4 @@
+from graphql import Undefined
 from ..argument import Argument
 from ..field import Field
 from ..inputfield import InputField
@@ -6,6 +8,7 @@ from ..objecttype import ObjectType
 from ..scalars import Boolean, String
 from ..schema import Schema
 from ..unmountedtype import UnmountedType
+from ... import NonNull
 class MyType:
@@ -136,3 +139,31 @@ def test_inputobjecttype_of_input():
 assert not result.errors
 assert result.data == {"isChild": True}
def test_inputobjecttype_default_input_as_undefined(
set_default_input_object_type_to_undefined,
):
class TestUndefinedInput(InputObjectType):
required_field = String(required=True)
optional_field = String()
class Query(ObjectType):
undefined_optionals_work = Field(NonNull(Boolean), input=TestUndefinedInput())
def resolve_undefined_optionals_work(self, info, input: TestUndefinedInput):
# Confirm that optional_field comes as Undefined
return (
input.required_field == "required" and input.optional_field is Undefined
)
schema = Schema(query=Query)
result = schema.execute(
"""query basequery {
undefinedOptionalsWork(input: {requiredField: "required"})
}
"""
)
assert not result.errors
assert result.data == {"undefinedOptionalsWork": True}
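What `set_input_object_type_default_value(Undefined)` buys resolvers is a three-way distinction: field provided, field explicitly null, and field omitted entirely. A stdlib sketch with a stand-in sentinel — the names here are illustrative, not graphene's internals:

```python
Undefined = object()  # stand-in for graphql-core's Undefined sentinel


def build_container(provided, field_names, default=Undefined):
    # Omitted optional fields get the configured default (Undefined here),
    # while an explicit null stays None -- callers can tell the two apart.
    return {name: provided.get(name, default) for name in field_names}


container = build_container(
    {"required_field": "required", "nullable_field": None},
    ["required_field", "nullable_field", "optional_field"],
)
```

With the stock default of `None`, the last two fields would be indistinguishable; with `Undefined`, only the explicitly-null one is `None`.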

View File

@@ -25,13 +25,18 @@ def test_generate_interface():
 def test_generate_interface_with_meta():
+    class MyFirstInterface(Interface):
+        pass
     class MyInterface(Interface):
         class Meta:
             name = "MyOtherInterface"
             description = "Documentation"
+            interfaces = [MyFirstInterface]
     assert MyInterface._meta.name == "MyOtherInterface"
     assert MyInterface._meta.description == "Documentation"
+    assert MyInterface._meta.interfaces == [MyFirstInterface]
 def test_generate_interface_with_fields():

View File

@@ -21,6 +21,10 @@ def test_jsonstring_query():
assert not result.errors
assert result.data == {"json": json_value}
result = schema.execute("""{ json(input: "{}") }""")
assert not result.errors
assert result.data == {"json": "{}"}
def test_jsonstring_query_variable():
json_value = '{"key": "value"}'
@@ -31,3 +35,46 @@ def test_jsonstring_query_variable():
)
assert not result.errors
assert result.data == {"json": json_value}
def test_jsonstring_optional_uuid_input():
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ json(input: null) }")
assert not result.errors
assert result.data == {"json": None}
def test_jsonstring_invalid_query():
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute("{ json(input: 1) }")
assert result.errors == [
{"message": "Expected value of type 'JSONString', found 1."},
]
result = schema.execute("{ json(input: {}) }")
assert result.errors == [
{"message": "Expected value of type 'JSONString', found {}."},
]
result = schema.execute('{ json(input: "a") }')
assert result.errors == [
{
"message": "Expected value of type 'JSONString', found \"a\"; "
"Badly formed JSONString: Expecting value: line 1 column 1 (char 0)",
},
]
result = schema.execute("""{ json(input: "{\\'key\\': 0}") }""")
assert result.errors == [
{"message": "Syntax Error: Invalid character escape sequence: '\\''."},
]
result = schema.execute("""{ json(input: "{\\"key\\": 0,}") }""")
assert len(result.errors) == 1
assert result.errors[0].message.startswith(
'Expected value of type \'JSONString\', found "{\\"key\\": 0,}"; Badly formed JSONString:'
)
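A sketch of the error shape the JSONString tests above assert: the underlying `json` parser's own message is folded into the coercion error, so clients see *why* the string was malformed. The function is illustrative, not graphene's actual `parse_value`:

```python
import json


def parse_json_string(value):
    # Validate but return the original string: the scalar transports
    # JSON text, it does not deserialize it for the resolver.
    try:
        json.loads(value)
    except ValueError as e:  # json.JSONDecodeError subclasses ValueError
        raise ValueError(f"Badly formed JSONString: {e}")
    return value
```

This is why the trailing-comma test only checks the message prefix: the suffix after "Badly formed JSONString:" comes from the stdlib parser and varies across Python versions.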

View File

@@ -1,4 +1,7 @@
-from ..scalars import Scalar, Int, BigInt
+from ..objecttype import ObjectType, Field
+from ..scalars import Scalar, Int, BigInt, Float, String, Boolean
+from ..schema import Schema
+from graphql import Undefined
 from graphql.language.ast import IntValueNode
@@ -11,19 +14,295 @@ def test_scalar():
 def test_ints():
-    assert Int.parse_value(2**31 - 1) is not None
+    assert Int.parse_value(2**31 - 1) is not Undefined
-    assert Int.parse_value("2.0") is not None
+    assert Int.parse_value("2.0") == 2
-    assert Int.parse_value(2**31) is None
+    assert Int.parse_value(2**31) is Undefined
     assert Int.parse_literal(IntValueNode(value=str(2**31 - 1))) == 2**31 - 1
-    assert Int.parse_literal(IntValueNode(value=str(2**31))) is None
+    assert Int.parse_literal(IntValueNode(value=str(2**31))) is Undefined
-    assert Int.parse_value(-(2**31)) is not None
+    assert Int.parse_value(-(2**31)) is not Undefined
-    assert Int.parse_value(-(2**31) - 1) is None
+    assert Int.parse_value(-(2**31) - 1) is Undefined
-    assert BigInt.parse_value(2**31) is not None
+    assert BigInt.parse_value(2**31) is not Undefined
-    assert BigInt.parse_value("2.0") is not None
+    assert BigInt.parse_value("2.0") == 2
-    assert BigInt.parse_value(-(2**31) - 1) is not None
+    assert BigInt.parse_value(-(2**31) - 1) is not Undefined
     assert BigInt.parse_literal(IntValueNode(value=str(2**31 - 1))) == 2**31 - 1
     assert BigInt.parse_literal(IntValueNode(value=str(2**31))) == 2**31
def return_input(_parent, _info, input):
return input
class Optional(ObjectType):
int = Int(input=Int(), resolver=return_input)
big_int = BigInt(input=BigInt(), resolver=return_input)
float = Float(input=Float(), resolver=return_input)
bool = Boolean(input=Boolean(), resolver=return_input)
string = String(input=String(), resolver=return_input)
class Query(ObjectType):
optional = Field(Optional)
def resolve_optional(self, info):
return Optional()
def resolve_required(self, info, input):
return input
schema = Schema(query=Query)
class TestInt:
def test_query(self):
"""
Test that a normal query works.
"""
result = schema.execute("{ optional { int(input: 20) } }")
assert not result.errors
assert result.data == {"optional": {"int": 20}}
def test_optional_input(self):
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ optional { int(input: null) } }")
assert not result.errors
assert result.data == {"optional": {"int": None}}
def test_invalid_input(self):
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute('{ optional { int(input: "20") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == 'Int cannot represent non-integer value: "20"'
)
result = schema.execute('{ optional { int(input: "a") } }')
assert result.errors
assert len(result.errors) == 1
assert result.errors[0].message == 'Int cannot represent non-integer value: "a"'
result = schema.execute("{ optional { int(input: true) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "Int cannot represent non-integer value: true"
)
class TestBigInt:
def test_query(self):
"""
Test that a normal query works.
"""
value = 2**31
result = schema.execute("{ optional { bigInt(input: %s) } }" % value)
assert not result.errors
assert result.data == {"optional": {"bigInt": value}}
def test_optional_input(self):
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ optional { bigInt(input: null) } }")
assert not result.errors
assert result.data == {"optional": {"bigInt": None}}
def test_invalid_input(self):
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute('{ optional { bigInt(input: "20") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "Expected value of type 'BigInt', found \"20\"."
)
result = schema.execute('{ optional { bigInt(input: "a") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "Expected value of type 'BigInt', found \"a\"."
)
result = schema.execute("{ optional { bigInt(input: true) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "Expected value of type 'BigInt', found true."
)
class TestFloat:
def test_query(self):
"""
Test that a normal query works.
"""
result = schema.execute("{ optional { float(input: 20) } }")
assert not result.errors
assert result.data == {"optional": {"float": 20.0}}
result = schema.execute("{ optional { float(input: 20.2) } }")
assert not result.errors
assert result.data == {"optional": {"float": 20.2}}
def test_optional_input(self):
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ optional { float(input: null) } }")
assert not result.errors
assert result.data == {"optional": {"float": None}}
def test_invalid_input(self):
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute('{ optional { float(input: "20") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == 'Float cannot represent non numeric value: "20"'
)
result = schema.execute('{ optional { float(input: "a") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == 'Float cannot represent non numeric value: "a"'
)
result = schema.execute("{ optional { float(input: true) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "Float cannot represent non numeric value: true"
)
class TestBoolean:
def test_query(self):
"""
Test that a normal query works.
"""
result = schema.execute("{ optional { bool(input: true) } }")
assert not result.errors
assert result.data == {"optional": {"bool": True}}
result = schema.execute("{ optional { bool(input: false) } }")
assert not result.errors
assert result.data == {"optional": {"bool": False}}
def test_optional_input(self):
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ optional { bool(input: null) } }")
assert not result.errors
assert result.data == {"optional": {"bool": None}}
def test_invalid_input(self):
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute('{ optional { bool(input: "True") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== 'Boolean cannot represent a non boolean value: "True"'
)
result = schema.execute('{ optional { bool(input: "true") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== 'Boolean cannot represent a non boolean value: "true"'
)
result = schema.execute('{ optional { bool(input: "a") } }')
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== 'Boolean cannot represent a non boolean value: "a"'
)
result = schema.execute("{ optional { bool(input: 1) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== "Boolean cannot represent a non boolean value: 1"
)
result = schema.execute("{ optional { bool(input: 0) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== "Boolean cannot represent a non boolean value: 0"
)
class TestString:
def test_query(self):
"""
Test that a normal query works.
"""
result = schema.execute('{ optional { string(input: "something something") } }')
assert not result.errors
assert result.data == {"optional": {"string": "something something"}}
result = schema.execute('{ optional { string(input: "True") } }')
assert not result.errors
assert result.data == {"optional": {"string": "True"}}
result = schema.execute('{ optional { string(input: "0") } }')
assert not result.errors
assert result.data == {"optional": {"string": "0"}}
def test_optional_input(self):
"""
Test that we can provide a null value to an optional input
"""
result = schema.execute("{ optional { string(input: null) } }")
assert not result.errors
assert result.data == {"optional": {"string": None}}
def test_invalid_input(self):
"""
Test that if an invalid type is provided we get an error
"""
result = schema.execute("{ optional { string(input: 1) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message == "String cannot represent a non string value: 1"
)
result = schema.execute("{ optional { string(input: 3.2) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== "String cannot represent a non string value: 3.2"
)
result = schema.execute("{ optional { string(input: true) } }")
assert result.errors
assert len(result.errors) == 1
assert (
result.errors[0].message
== "String cannot represent a non string value: true"
)

View File

@@ -1,3 +1,4 @@
+from graphql import Undefined
 from ..scalars import Boolean, Float, Int, String
@@ -9,12 +10,12 @@ def test_serializes_output_int():
     assert Int.serialize(1.1) == 1
     assert Int.serialize(-1.1) == -1
     assert Int.serialize(1e5) == 100000
-    assert Int.serialize(9876504321) is None
+    assert Int.serialize(9876504321) is Undefined
-    assert Int.serialize(-9876504321) is None
+    assert Int.serialize(-9876504321) is Undefined
-    assert Int.serialize(1e100) is None
+    assert Int.serialize(1e100) is Undefined
-    assert Int.serialize(-1e100) is None
+    assert Int.serialize(-1e100) is Undefined
     assert Int.serialize("-1.1") == -1
-    assert Int.serialize("one") is None
+    assert Int.serialize("one") is Undefined
     assert Int.serialize(False) == 0
     assert Int.serialize(True) == 1
@@ -27,7 +28,7 @@ def test_serializes_output_float():
     assert Float.serialize(1.1) == 1.1
     assert Float.serialize(-1.1) == -1.1
     assert Float.serialize("-1.1") == -1.1
-    assert Float.serialize("one") is None
+    assert Float.serialize("one") is Undefined
     assert Float.serialize(False) == 0
     assert Float.serialize(True) == 1
@@ -38,7 +39,7 @@ def test_serializes_output_string():
     assert String.serialize(-1.1) == "-1.1"
     assert String.serialize(True) == "true"
     assert String.serialize(False) == "false"
-    assert String.serialize("\U0001F601") == "\U0001F601"
+    assert String.serialize("\U0001f601") == "\U0001f601"
 def test_serializes_output_boolean():
def test_serializes_output_boolean(): def test_serializes_output_boolean():

View File

@@ -20,8 +20,8 @@ from ..inputobjecttype import InputObjectType
 from ..interface import Interface
 from ..objecttype import ObjectType
 from ..scalars import Int, String
-from ..structures import List, NonNull
 from ..schema import Schema
+from ..structures import List, NonNull
 def create_type_map(types, auto_camelcase=True):
@@ -227,6 +227,18 @@ def test_inputobject():
 assert foo_field.description == "Field description"
def test_inputobject_undefined(set_default_input_object_type_to_undefined):
class OtherObjectType(InputObjectType):
optional_field = String()
type_map = create_type_map([OtherObjectType])
assert "OtherObjectType" in type_map
graphql_type = type_map["OtherObjectType"]
container = graphql_type.out_type({})
assert container.optional_field is Undefined
def test_objecttype_camelcase():
class MyObjectType(ObjectType):
"""Description"""
@@ -289,3 +301,33 @@ def test_objecttype_with_possible_types():
assert graphql_type.is_type_of
assert graphql_type.is_type_of({}, None) is True
assert graphql_type.is_type_of(MyObjectType(), None) is False
def test_interface_with_interfaces():
class FooInterface(Interface):
foo = String()
class BarInterface(Interface):
class Meta:
interfaces = [FooInterface]
foo = String()
bar = String()
type_map = create_type_map([FooInterface, BarInterface])
assert "FooInterface" in type_map
foo_graphql_type = type_map["FooInterface"]
assert isinstance(foo_graphql_type, GraphQLInterfaceType)
assert foo_graphql_type.name == "FooInterface"
assert "BarInterface" in type_map
bar_graphql_type = type_map["BarInterface"]
assert isinstance(bar_graphql_type, GraphQLInterfaceType)
assert bar_graphql_type.name == "BarInterface"
fields = bar_graphql_type.fields
assert list(fields) == ["foo", "bar"]
assert isinstance(fields["foo"], GraphQLField)
assert isinstance(fields["bar"], GraphQLField)
assert list(bar_graphql_type.interfaces) == list([foo_graphql_type])
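The test above checks that an interface implementing another interface exposes the parent's fields first, then its own. A toy sketch of that field-merge ordering (the helper name and plain-dict field map are illustrative, not graphene's internals):

```python
# Hypothetical sketch of inherited-field ordering for interfaces that
# implement other interfaces: parent fields come first, own fields after,
# and a redefined field keeps its inherited position.
def merge_interface_fields(parent_fields, own_fields):
    merged = dict(parent_fields)   # inherited fields, in parent order
    merged.update(own_fields)      # own fields appended; overlaps keep slot
    return merged


fields = merge_interface_fields({"foo": "String"}, {"foo": "String", "bar": "String"})
```

Because Python dicts preserve insertion order, `BarInterface` redefining `foo` does not move it behind `bar`, which matches the `["foo", "bar"]` assertion in the test.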


@@ -1,14 +1,19 @@
 from ..objecttype import ObjectType
 from ..schema import Schema
 from ..uuid import UUID
+from ..structures import NonNull

class Query(ObjectType):
    uuid = UUID(input=UUID())
    required_uuid = UUID(input=NonNull(UUID), required=True)

    def resolve_uuid(self, info, input):
        return input

    def resolve_required_uuid(self, info, input):
        return input

schema = Schema(query=Query)
@@ -29,3 +34,50 @@ def test_uuidstring_query_variable():
 )
 assert not result.errors
 assert result.data == {"uuid": uuid_value}

def test_uuidstring_invalid_argument():
    uuid_value = {"not": "a string"}

    result = schema.execute(
        """query Test($uuid: UUID){ uuid(input: $uuid) }""",
        variables={"uuid": uuid_value},
    )
    assert result.errors
    assert len(result.errors) == 1
    assert (
        result.errors[0].message
        == "Variable '$uuid' got invalid value {'not': 'a string'}; UUID cannot represent value: {'not': 'a string'}"
    )

def test_uuidstring_optional_uuid_input():
    """
    Test that we can provide a null value to an optional input
    """
    result = schema.execute("{ uuid(input: null) }")
    assert not result.errors
    assert result.data == {"uuid": None}

def test_uuidstring_invalid_query():
    """
    Test that if an invalid type is provided we get an error
    """
    result = schema.execute("{ uuid(input: 1) }")
    assert result.errors
    assert len(result.errors) == 1
    assert result.errors[0].message == "Expected value of type 'UUID', found 1."

    result = schema.execute('{ uuid(input: "a") }')
    assert result.errors
    assert len(result.errors) == 1
    assert (
        result.errors[0].message
        == "Expected value of type 'UUID', found \"a\"; badly formed hexadecimal UUID string"
    )

    result = schema.execute("{ requiredUuid(input: null) }")
    assert result.errors
    assert len(result.errors) == 1
    assert result.errors[0].message == "Expected value of type 'UUID!', found null."


@@ -1,9 +1,10 @@
+from typing import TYPE_CHECKING

 from .base import BaseOptions, BaseType
 from .unmountedtype import UnmountedType

-# For static type checking with Mypy
-MYPY = False
-if MYPY:
+# For static type checking with type checker
+if TYPE_CHECKING:
     from .objecttype import ObjectType  # NOQA
     from typing import Iterable, Type  # NOQA
@@ -21,7 +22,7 @@ class Union(UnmountedType, BaseType):
 to determine which type is actually used when the field is resolved.

 The schema in this example can take a search text and return any of the GraphQL object types
-indicated: Human, Droid or Startship.
+indicated: Human, Droid or Starship.

 Ambiguous return types can be resolved on each ObjectType through ``Meta.possible_types``
 attribute or ``is_type_of`` method. Or by implementing ``resolve_type`` class method on the
@@ -50,12 +51,14 @@ class Union(UnmountedType, BaseType):
 """

 @classmethod
-def __init_subclass_with_meta__(cls, types=None, **options):
+def __init_subclass_with_meta__(cls, types=None, _meta=None, **options):
     assert (
         isinstance(types, (list, tuple)) and len(types) > 0
     ), f"Must provide types for Union {cls.__name__}."

+    if not _meta:
         _meta = UnionOptions(cls)
     _meta.types = types

     super(Union, cls).__init_subclass_with_meta__(_meta=_meta, **options)
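The new `_meta=None` parameter lets a `Union` subclass supply its own options object, mirroring how `ObjectType` already works. A framework-free sketch of the pattern (class and function names here are illustrative, not graphene's API):

```python
# Sketch of the overridable-meta pattern: a caller may pass a pre-built
# options object; only when none is given is the default constructed.
class UnionOptions:
    def __init__(self, cls):
        self.cls = cls
        self.types = None


class ExtendedUnionOptions(UnionOptions):
    """Hypothetical subclass carrying an extra attribute."""

    def __init__(self, cls, tag=None):
        super().__init__(cls)
        self.tag = tag


def init_union(cls_name, types, _meta=None):
    assert isinstance(types, (list, tuple)) and len(types) > 0
    if not _meta:
        _meta = UnionOptions(cls_name)  # default, as before the change
    _meta.types = types
    return _meta
```

Before this change the default `UnionOptions` was always constructed, so subclasses had no hook to attach extra metadata.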


@@ -1,7 +1,8 @@
-from __future__ import absolute_import
 from uuid import UUID as _UUID

+from graphql.error import GraphQLError
 from graphql.language.ast import StringValueNode
+from graphql import Undefined

 from .scalars import Scalar
@@ -24,7 +25,13 @@ class UUID(Scalar):
 def parse_literal(node, _variables=None):
     if isinstance(node, StringValueNode):
         return _UUID(node.value)
+    return Undefined

 @staticmethod
 def parse_value(value):
+    if isinstance(value, _UUID):
+        return value
+    try:
         return _UUID(value)
+    except (ValueError, AttributeError):
+        raise GraphQLError(f"UUID cannot represent value: {repr(value)}")


@@ -0,0 +1,280 @@
from asyncio import (
    gather,
    ensure_future,
    get_event_loop,
    iscoroutine,
    iscoroutinefunction,
)
from collections import namedtuple
from collections.abc import Iterable
from functools import partial
from typing import List

Loader = namedtuple("Loader", "key,future")


def iscoroutinefunctionorpartial(fn):
    return iscoroutinefunction(fn.func if isinstance(fn, partial) else fn)


class DataLoader(object):
    batch = True
    max_batch_size = None  # type: int
    cache = True

    def __init__(
        self,
        batch_load_fn=None,
        batch=None,
        max_batch_size=None,
        cache=None,
        get_cache_key=None,
        cache_map=None,
        loop=None,
    ):
        self._loop = loop

        if batch_load_fn is not None:
            self.batch_load_fn = batch_load_fn

        assert iscoroutinefunctionorpartial(
            self.batch_load_fn
        ), "batch_load_fn must be coroutine. Received: {}".format(self.batch_load_fn)

        if not callable(self.batch_load_fn):
            raise TypeError(  # pragma: no cover
                (
                    "DataLoader must have a batch_load_fn which accepts "
                    "Iterable<key> and returns Future<Iterable<value>>, but got: {}."
                ).format(batch_load_fn)
            )

        if batch is not None:
            self.batch = batch  # pragma: no cover

        if max_batch_size is not None:
            self.max_batch_size = max_batch_size

        if cache is not None:
            self.cache = cache  # pragma: no cover

        self.get_cache_key = get_cache_key or (lambda x: x)

        self._cache = cache_map if cache_map is not None else {}
        self._queue: List[Loader] = []

    @property
    def loop(self):
        if not self._loop:
            self._loop = get_event_loop()
        return self._loop

    def load(self, key=None):
        """
        Loads a key, returning a `Future` for the value represented by that key.
        """
        if key is None:
            raise TypeError(  # pragma: no cover
                (
                    "The loader.load() function must be called with a value, "
                    "but got: {}."
                ).format(key)
            )

        cache_key = self.get_cache_key(key)

        # If caching and there is a cache-hit, return cached Future.
        if self.cache:
            cached_result = self._cache.get(cache_key)
            if cached_result:
                return cached_result

        # Otherwise, produce a new Future for this value.
        future = self.loop.create_future()
        # If caching, cache this Future.
        if self.cache:
            self._cache[cache_key] = future

        self.do_resolve_reject(key, future)
        return future

    def do_resolve_reject(self, key, future):
        # Enqueue this Future to be dispatched.
        self._queue.append(Loader(key=key, future=future))
        # Determine if a dispatch of this queue should be scheduled.
        # A single dispatch should be scheduled per queue at the time when the
        # queue changes from "empty" to "full".
        if len(self._queue) == 1:
            if self.batch:
                # If batching, schedule a task to dispatch the queue.
                enqueue_post_future_job(self.loop, self)
            else:
                # Otherwise dispatch the (queue of one) immediately.
                dispatch_queue(self)  # pragma: no cover

    def load_many(self, keys):
        """
        Loads multiple keys, returning a list of values

        >>> a, b = await my_loader.load_many([ 'a', 'b' ])

        This is equivalent to the more verbose:

        >>> a, b = await gather(
        >>>     my_loader.load('a'),
        >>>     my_loader.load('b')
        >>> )
        """
        if not isinstance(keys, Iterable):
            raise TypeError(  # pragma: no cover
                (
                    "The loader.load_many() function must be called with Iterable<key> "
                    "but got: {}."
                ).format(keys)
            )

        return gather(*[self.load(key) for key in keys])

    def clear(self, key):
        """
        Clears the value at `key` from the cache, if it exists. Returns itself for
        method chaining.
        """
        cache_key = self.get_cache_key(key)
        self._cache.pop(cache_key, None)
        return self

    def clear_all(self):
        """
        Clears the entire cache. To be used when some event results in unknown
        invalidations across this particular `DataLoader`. Returns itself for
        method chaining.
        """
        self._cache.clear()
        return self

    def prime(self, key, value):
        """
        Adds the provided key and value to the cache. If the key already exists, no
        change is made. Returns itself for method chaining.
        """
        cache_key = self.get_cache_key(key)

        # Only add the key if it does not already exist.
        if cache_key not in self._cache:
            # Cache a rejected future if the value is an Error, in order to match
            # the behavior of load(key).
            future = self.loop.create_future()
            if isinstance(value, Exception):
                future.set_exception(value)
            else:
                future.set_result(value)

            self._cache[cache_key] = future

        return self


def enqueue_post_future_job(loop, loader):
    async def dispatch():
        dispatch_queue(loader)

    loop.call_soon(ensure_future, dispatch())


def get_chunks(iterable_obj, chunk_size=1):
    chunk_size = max(1, chunk_size)
    return (
        iterable_obj[i : i + chunk_size]
        for i in range(0, len(iterable_obj), chunk_size)
    )


def dispatch_queue(loader):
    """
    Given the current state of a Loader instance, perform a batch load
    from its current queue.
    """
    # Take the current loader queue, replacing it with an empty queue.
    queue = loader._queue
    loader._queue = []

    # If a max_batch_size was provided and the queue is longer, then segment the
    # queue into multiple batches, otherwise treat the queue as a single batch.
    max_batch_size = loader.max_batch_size

    if max_batch_size and max_batch_size < len(queue):
        chunks = get_chunks(queue, max_batch_size)
        for chunk in chunks:
            ensure_future(dispatch_queue_batch(loader, chunk))
    else:
        ensure_future(dispatch_queue_batch(loader, queue))


async def dispatch_queue_batch(loader, queue):
    # Collect all keys to be loaded in this dispatch
    keys = [loaded.key for loaded in queue]

    # Call the provided batch_load_fn for this loader with the loader queue's keys.
    batch_future = loader.batch_load_fn(keys)

    # Assert the expected response from batch_load_fn
    if not batch_future or not iscoroutine(batch_future):
        return failed_dispatch(  # pragma: no cover
            loader,
            queue,
            TypeError(
                (
                    "DataLoader must be constructed with a function which accepts "
                    "Iterable<key> and returns Future<Iterable<value>>, but the function did "
                    "not return a Coroutine: {}."
                ).format(batch_future)
            ),
        )

    try:
        values = await batch_future
        if not isinstance(values, Iterable):
            raise TypeError(  # pragma: no cover
                (
                    "DataLoader must be constructed with a function which accepts "
                    "Iterable<key> and returns Future<Iterable<value>>, but the function did "
                    "not return a Future of a Iterable: {}."
                ).format(values)
            )

        values = list(values)
        if len(values) != len(keys):
            raise TypeError(  # pragma: no cover
                (
                    "DataLoader must be constructed with a function which accepts "
                    "Iterable<key> and returns Future<Iterable<value>>, but the function did "
                    "not return a Future of a Iterable with the same length as the Iterable "
                    "of keys."
                    "\n\nKeys:\n{}"
                    "\n\nValues:\n{}"
                ).format(keys, values)
            )

        # Step through the values, resolving or rejecting each Future in the
        # loaded queue.
        for loaded, value in zip(queue, values):
            if isinstance(value, Exception):
                loaded.future.set_exception(value)
            else:
                loaded.future.set_result(value)
    except Exception as e:
        return failed_dispatch(loader, queue, e)


def failed_dispatch(loader, queue, error):
    """
    Do not cache individual loads if the entire batch dispatch fails,
    but still reject each request so they do not hang.
    """
    for loaded in queue:
        loader.clear(loaded.key)
        loaded.future.set_exception(error)
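The core trick in the file above is that the first `load()` of an event-loop tick schedules exactly one dispatch via `call_soon`, so every `load()` issued synchronously before the loop's next turn lands in the same batch. A self-contained toy version of that dispatch pattern (`MiniLoader` is illustrative and omits chunking, error propagation, and priming):

```python
import asyncio


class MiniLoader:
    """Toy sketch of the dispatch-on-first-enqueue pattern above."""

    def __init__(self, batch_load_fn):
        self.batch_load_fn = batch_load_fn
        self._queue = []
        self._cache = {}

    def load(self, key):
        # Coalesce identical requests via the cached Future.
        if key in self._cache:
            return self._cache[key]
        loop = asyncio.get_running_loop()
        future = loop.create_future()
        self._cache[key] = future
        self._queue.append((key, future))
        # Only the load that turns the queue non-empty schedules a dispatch.
        if len(self._queue) == 1:
            loop.call_soon(lambda: asyncio.ensure_future(self._dispatch()))
        return future

    async def _dispatch(self):
        queue, self._queue = self._queue, []
        keys = [key for key, _ in queue]
        values = await self.batch_load_fn(keys)
        for (_, future), value in zip(queue, values):
            future.set_result(value)
```

Awaiting the futures with `gather` lets the loop run the scheduled callback first, so both keys arrive at `batch_load_fn` in a single call.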


@@ -1,70 +1,5 @@
-import functools
-import inspect
-import warnings
+from warnings import warn

-string_types = (type(b""), type(""))

-def warn_deprecation(text):
-    warnings.warn(text, category=DeprecationWarning, stacklevel=2)
+def warn_deprecation(text: str):
+    warn(text, category=DeprecationWarning, stacklevel=2)

-def deprecated(reason):
-    """
-    This is a decorator which can be used to mark functions
-    as deprecated. It will result in a warning being emitted
-    when the function is used.
-    """
-    if isinstance(reason, string_types):
-        # The @deprecated is used with a 'reason'.
-        #
-        # .. code-block:: python
-        #
-        #     @deprecated("please, use another function")
-        #     def old_function(x, y):
-        #         pass
-        def decorator(func1):
-            if inspect.isclass(func1):
-                fmt1 = f"Call to deprecated class {func1.__name__} ({reason})."
-            else:
-                fmt1 = f"Call to deprecated function {func1.__name__} ({reason})."
-
-            @functools.wraps(func1)
-            def new_func1(*args, **kwargs):
-                warn_deprecation(fmt1)
-                return func1(*args, **kwargs)
-
-            return new_func1
-
-        return decorator
-    elif inspect.isclass(reason) or inspect.isfunction(reason):
-        # The @deprecated is used without any 'reason'.
-        #
-        # .. code-block:: python
-        #
-        #     @deprecated
-        #     def old_function(x, y):
-        #         pass
-        func2 = reason
-        if inspect.isclass(func2):
-            fmt2 = f"Call to deprecated class {func2.__name__}."
-        else:
-            fmt2 = f"Call to deprecated function {func2.__name__}."
-
-        @functools.wraps(func2)
-        def new_func2(*args, **kwargs):
-            warn_deprecation(fmt2)
-            return func2(*args, **kwargs)
-
-        return new_func2
-    else:
-        raise TypeError(repr(type(reason)))
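After this change only the thin `warn_deprecation` helper remains; the handwritten decorator is replaced elsewhere by `typing_extensions.deprecated`. The surviving helper's behavior can be demonstrated with the stdlib alone:

```python
import warnings


def warn_deprecation(text: str):
    # Same shape as the simplified helper retained above.
    warnings.warn(text, category=DeprecationWarning, stacklevel=2)


# Capture the warning instead of printing it, to inspect what was emitted.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warn_deprecation("OH!")
```

`stacklevel=2` points the warning at the helper's caller rather than the helper itself, which is what makes the message actionable in application code.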


@@ -1,6 +1,5 @@
 from functools import wraps

-from .deprecated import deprecated
+from typing_extensions import deprecated

 @deprecated("This function is deprecated")


@@ -0,0 +1,452 @@
from asyncio import gather
from collections import namedtuple
from functools import partial
from unittest.mock import Mock

from graphene.utils.dataloader import DataLoader
from pytest import mark, raises

from graphene import ObjectType, String, Schema, Field, List

CHARACTERS = {
    "1": {"name": "Luke Skywalker", "sibling": "3"},
    "2": {"name": "Darth Vader", "sibling": None},
    "3": {"name": "Leia Organa", "sibling": "1"},
}

get_character = Mock(side_effect=lambda character_id: CHARACTERS[character_id])


class CharacterType(ObjectType):
    name = String()
    sibling = Field(lambda: CharacterType)

    async def resolve_sibling(character, info):
        if character["sibling"]:
            return await info.context.character_loader.load(character["sibling"])
        return None


class Query(ObjectType):
    skywalker_family = List(CharacterType)

    async def resolve_skywalker_family(_, info):
        return await info.context.character_loader.load_many(["1", "2", "3"])


mock_batch_load_fn = Mock(
    side_effect=lambda character_ids: [get_character(id) for id in character_ids]
)


class CharacterLoader(DataLoader):
    async def batch_load_fn(self, character_ids):
        return mock_batch_load_fn(character_ids)


Context = namedtuple("Context", "character_loader")


@mark.asyncio
async def test_basic_dataloader():
    schema = Schema(query=Query)

    character_loader = CharacterLoader()
    context = Context(character_loader=character_loader)

    query = """
        {
            skywalkerFamily {
                name
                sibling {
                    name
                }
            }
        }
    """

    result = await schema.execute_async(query, context=context)

    assert not result.errors
    assert result.data == {
        "skywalkerFamily": [
            {"name": "Luke Skywalker", "sibling": {"name": "Leia Organa"}},
            {"name": "Darth Vader", "sibling": None},
            {"name": "Leia Organa", "sibling": {"name": "Luke Skywalker"}},
        ]
    }

    assert mock_batch_load_fn.call_count == 1
    assert get_character.call_count == 3


def id_loader(**options):
    load_calls = []

    async def default_resolve(x):
        return x

    resolve = options.pop("resolve", default_resolve)

    async def fn(keys):
        load_calls.append(keys)
        return await resolve(keys)
        # return keys

    identity_loader = DataLoader(fn, **options)
    return identity_loader, load_calls


@mark.asyncio
async def test_build_a_simple_data_loader():
    async def call_fn(keys):
        return keys

    identity_loader = DataLoader(call_fn)

    promise1 = identity_loader.load(1)

    value1 = await promise1
    assert value1 == 1


@mark.asyncio
async def test_can_build_a_data_loader_from_a_partial():
    value_map = {1: "one"}

    async def call_fn(context, keys):
        return [context.get(key) for key in keys]

    partial_fn = partial(call_fn, value_map)
    identity_loader = DataLoader(partial_fn)

    promise1 = identity_loader.load(1)

    value1 = await promise1
    assert value1 == "one"


@mark.asyncio
async def test_supports_loading_multiple_keys_in_one_call():
    async def call_fn(keys):
        return keys

    identity_loader = DataLoader(call_fn)

    promise_all = identity_loader.load_many([1, 2])

    values = await promise_all
    assert values == [1, 2]

    promise_all = identity_loader.load_many([])

    values = await promise_all
    assert values == []


@mark.asyncio
async def test_batches_multiple_requests():
    identity_loader, load_calls = id_loader()

    promise1 = identity_loader.load(1)
    promise2 = identity_loader.load(2)

    p = gather(promise1, promise2)

    value1, value2 = await p

    assert value1 == 1
    assert value2 == 2

    assert load_calls == [[1, 2]]


@mark.asyncio
async def test_batches_multiple_requests_with_max_batch_sizes():
    identity_loader, load_calls = id_loader(max_batch_size=2)

    promise1 = identity_loader.load(1)
    promise2 = identity_loader.load(2)
    promise3 = identity_loader.load(3)

    p = gather(promise1, promise2, promise3)

    value1, value2, value3 = await p

    assert value1 == 1
    assert value2 == 2
    assert value3 == 3

    assert load_calls == [[1, 2], [3]]


@mark.asyncio
async def test_coalesces_identical_requests():
    identity_loader, load_calls = id_loader()

    promise1 = identity_loader.load(1)
    promise2 = identity_loader.load(1)

    assert promise1 == promise2
    p = gather(promise1, promise2)

    value1, value2 = await p

    assert value1 == 1
    assert value2 == 1

    assert load_calls == [[1]]


@mark.asyncio
async def test_caches_repeated_requests():
    identity_loader, load_calls = id_loader()

    a, b = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a == "A"
    assert b == "B"

    assert load_calls == [["A", "B"]]

    a2, c = await gather(identity_loader.load("A"), identity_loader.load("C"))

    assert a2 == "A"
    assert c == "C"

    assert load_calls == [["A", "B"], ["C"]]

    a3, b2, c2 = await gather(
        identity_loader.load("A"), identity_loader.load("B"), identity_loader.load("C")
    )

    assert a3 == "A"
    assert b2 == "B"
    assert c2 == "C"

    assert load_calls == [["A", "B"], ["C"]]


@mark.asyncio
async def test_clears_single_value_in_loader():
    identity_loader, load_calls = id_loader()

    a, b = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a == "A"
    assert b == "B"

    assert load_calls == [["A", "B"]]

    identity_loader.clear("A")

    a2, b2 = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a2 == "A"
    assert b2 == "B"

    assert load_calls == [["A", "B"], ["A"]]


@mark.asyncio
async def test_clears_all_values_in_loader():
    identity_loader, load_calls = id_loader()

    a, b = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a == "A"
    assert b == "B"

    assert load_calls == [["A", "B"]]

    identity_loader.clear_all()

    a2, b2 = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a2 == "A"
    assert b2 == "B"

    assert load_calls == [["A", "B"], ["A", "B"]]


@mark.asyncio
async def test_allows_priming_the_cache():
    identity_loader, load_calls = id_loader()

    identity_loader.prime("A", "A")

    a, b = await gather(identity_loader.load("A"), identity_loader.load("B"))

    assert a == "A"
    assert b == "B"

    assert load_calls == [["B"]]


@mark.asyncio
async def test_does_not_prime_keys_that_already_exist():
    identity_loader, load_calls = id_loader()

    identity_loader.prime("A", "X")

    a1 = await identity_loader.load("A")
    b1 = await identity_loader.load("B")

    assert a1 == "X"
    assert b1 == "B"

    identity_loader.prime("A", "Y")
    identity_loader.prime("B", "Y")

    a2 = await identity_loader.load("A")
    b2 = await identity_loader.load("B")

    assert a2 == "X"
    assert b2 == "B"

    assert load_calls == [["B"]]


# Represents Errors


@mark.asyncio
async def test_resolves_to_error_to_indicate_failure():
    async def resolve(keys):
        mapped_keys = [
            key if key % 2 == 0 else Exception("Odd: {}".format(key)) for key in keys
        ]
        return mapped_keys

    even_loader, load_calls = id_loader(resolve=resolve)

    with raises(Exception) as exc_info:
        await even_loader.load(1)

    assert str(exc_info.value) == "Odd: 1"

    value2 = await even_loader.load(2)
    assert value2 == 2

    assert load_calls == [[1], [2]]


@mark.asyncio
async def test_can_represent_failures_and_successes_simultaneously():
    async def resolve(keys):
        mapped_keys = [
            key if key % 2 == 0 else Exception("Odd: {}".format(key)) for key in keys
        ]
        return mapped_keys

    even_loader, load_calls = id_loader(resolve=resolve)

    promise1 = even_loader.load(1)
    promise2 = even_loader.load(2)

    with raises(Exception) as exc_info:
        await promise1

    assert str(exc_info.value) == "Odd: 1"

    value2 = await promise2
    assert value2 == 2

    assert load_calls == [[1, 2]]


@mark.asyncio
async def test_caches_failed_fetches():
    async def resolve(keys):
        mapped_keys = [Exception("Error: {}".format(key)) for key in keys]
        return mapped_keys

    error_loader, load_calls = id_loader(resolve=resolve)

    with raises(Exception) as exc_info:
        await error_loader.load(1)

    assert str(exc_info.value) == "Error: 1"

    with raises(Exception) as exc_info:
        await error_loader.load(1)

    assert str(exc_info.value) == "Error: 1"

    assert load_calls == [[1]]


@mark.asyncio
async def test_caches_failed_fetches_2():
    identity_loader, load_calls = id_loader()

    identity_loader.prime(1, Exception("Error: 1"))

    with raises(Exception) as _:
        await identity_loader.load(1)

    assert load_calls == []


# It is resilient to job queue ordering


@mark.asyncio
async def test_batches_loads_occuring_within_promises():
    identity_loader, load_calls = id_loader()

    async def load_b_1():
        return await load_b_2()

    async def load_b_2():
        return await identity_loader.load("B")

    values = await gather(identity_loader.load("A"), load_b_1())

    assert values == ["A", "B"]

    assert load_calls == [["A", "B"]]


@mark.asyncio
async def test_catches_error_if_loader_resolver_fails():
    exc = Exception("AOH!")

    def do_resolve(x):
        raise exc

    a_loader, a_load_calls = id_loader(resolve=do_resolve)

    with raises(Exception) as exc_info:
        await a_loader.load("A1")

    assert exc_info.value == exc


@mark.asyncio
async def test_can_call_a_loader_from_a_loader():
    deep_loader, deep_load_calls = id_loader()
    a_loader, a_load_calls = id_loader(
        resolve=lambda keys: deep_loader.load(tuple(keys))
    )
    b_loader, b_load_calls = id_loader(
        resolve=lambda keys: deep_loader.load(tuple(keys))
    )

    a1, b1, a2, b2 = await gather(
        a_loader.load("A1"),
        b_loader.load("B1"),
        a_loader.load("A2"),
        b_loader.load("B2"),
    )

    assert a1 == "A1"
    assert b1 == "B1"
    assert a2 == "A2"
    assert b2 == "B2"

    assert a_load_calls == [["A1", "A2"]]
    assert b_load_calls == [["B1", "B2"]]
    assert deep_load_calls == [[("A1", "A2"), ("B1", "B2")]]


@mark.asyncio
async def test_dataloader_clear_with_missing_key_works():
    async def do_resolve(x):
        return x

    a_loader, a_load_calls = id_loader(resolve=do_resolve)

    assert a_loader.clear("A1") == a_loader


@@ -1,75 +1,9 @@
-from pytest import raises
-
 from .. import deprecated
-from ..deprecated import deprecated as deprecated_decorator
 from ..deprecated import warn_deprecation

 def test_warn_deprecation(mocker):
-    mocker.patch.object(deprecated.warnings, "warn")
+    mocker.patch.object(deprecated, "warn")

     warn_deprecation("OH!")
-    deprecated.warnings.warn.assert_called_with(
-        "OH!", stacklevel=2, category=DeprecationWarning
-    )
+    deprecated.warn.assert_called_with("OH!", stacklevel=2, category=DeprecationWarning)

-def test_deprecated_decorator(mocker):
-    mocker.patch.object(deprecated, "warn_deprecation")
-
-    @deprecated_decorator
-    def my_func():
-        return True
-
-    result = my_func()
-    assert result
-    deprecated.warn_deprecation.assert_called_with(
-        "Call to deprecated function my_func."
-    )
-
-def test_deprecated_class(mocker):
-    mocker.patch.object(deprecated, "warn_deprecation")
-
-    @deprecated_decorator
-    class X:
-        pass
-
-    result = X()
-    assert result
-    deprecated.warn_deprecation.assert_called_with("Call to deprecated class X.")
-
-def test_deprecated_decorator_text(mocker):
-    mocker.patch.object(deprecated, "warn_deprecation")
-
-    @deprecated_decorator("Deprecation text")
-    def my_func():
-        return True
-
-    result = my_func()
-    assert result
-    deprecated.warn_deprecation.assert_called_with(
-        "Call to deprecated function my_func (Deprecation text)."
-    )
-
-def test_deprecated_class_text(mocker):
-    mocker.patch.object(deprecated, "warn_deprecation")
-
-    @deprecated_decorator("Deprecation text")
-    class X:
-        pass
-
-    result = X()
-    assert result
-    deprecated.warn_deprecation.assert_called_with(
-        "Call to deprecated class X (Deprecation text)."
-    )
-
-def test_deprecated_other_object(mocker):
-    mocker.patch.object(deprecated, "warn_deprecation")
-
-    with raises(TypeError):
-        deprecated_decorator({})


@@ -9,6 +9,5 @@ def test_resolve_only_args(mocker):
 return root, args

 wrapped_resolver = resolve_only_args(resolver)
-assert deprecated.warn_deprecation.called

 result = wrapped_resolver(1, 2, a=3)
 assert result == (1, {"a": 3})


@@ -30,7 +30,7 @@ try:
 except ImportError:
     # backwards compatibility for v3.6
     from typing import Pattern
-from typing import Callable, Dict, List, Optional, Union
+from typing import Callable, Dict, List, Optional, Union, Tuple

 from graphql import GraphQLError
 from graphql.validation import ValidationContext, ValidationRule
@@ -53,7 +53,7 @@ IgnoreType = Union[Callable[[str], bool], Pattern, str]
 def depth_limit_validator(
     max_depth: int,
     ignore: Optional[List[IgnoreType]] = None,
-    callback: Callable[[Dict[str, int]], None] = None,
+    callback: Optional[Callable[[Dict[str, int]], None]] = None,
 ):
     class DepthLimitValidator(ValidationRule):
         def __init__(self, validation_context: ValidationContext):
@@ -82,7 +82,7 @@ def depth_limit_validator(
 def get_fragments(
-    definitions: List[DefinitionNode],
+    definitions: Tuple[DefinitionNode, ...],
 ) -> Dict[str, FragmentDefinitionNode]:
     fragments = {}
     for definition in definitions:
@@ -94,7 +94,7 @@ def get_fragments(
 # This will actually get both queries and mutations.
 # We can basically treat those the same
 def get_queries_and_mutations(
-    definitions: List[DefinitionNode],
+    definitions: Tuple[DefinitionNode, ...],
 ) -> Dict[str, OperationDefinitionNode]:
     operations = {}
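These hunks are pure typing fixes: a parameter defaulting to `None` must be annotated `Optional[...]` to pass strict mypy, and `document.definitions` in graphql-core is a tuple, not a list. A minimal sketch of the `Optional` fix (the function name below is illustrative, not the library's):

```python
from typing import Callable, Dict, Optional


# Hypothetical reduced version of the signature being fixed: with a default
# of None, strict type checkers require Optional[...] on the annotation.
def depth_limit_validator_sketch(
    max_depth: int,
    callback: Optional[Callable[[Dict[str, int]], None]] = None,
) -> None:
    depths = {"anonymous": max_depth}
    # Guarding on None is what the Optional annotation documents.
    if callback is not None:
        callback(depths)
```

Without `Optional`, mypy under `no_implicit_optional` (the default since mypy 0.990) rejects the `= None` default outright.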


@@ -1,12 +1,5 @@
-[flake8]
-exclude = setup.py,docs/*,*/examples/*,graphene/pyutils/*,tests
-max-line-length = 120
-
 [coverage:run]
 omit = graphene/pyutils/*,*/tests/*,graphene/types/scalars.py

-[isort]
-known_first_party=graphene
-
 [bdist_wheel]
 universal=1


@@ -45,48 +45,50 @@ class PyTest(TestCommand):
 tests_require = [
-    "pytest>=6,<7",
-    "pytest-benchmark>=3.4,<4",
-    "pytest-cov>=3,<4",
+    "pytest>=8,<9",
+    "pytest-benchmark>=4,<5",
+    "pytest-cov>=5,<6",
     "pytest-mock>=3,<4",
     "pytest-asyncio>=0.16,<2",
-    "snapshottest>=0.6,<1",
-    "coveralls>=3.3,<4",
-    "promise>=2.3,<3",
-    "mock>=4,<5",
-    "pytz==2022.1",
-    "iso8601>=1,<2",
+    "coveralls>=3.3,<5",
 ]

-dev_requires = ["black==22.3.0", "flake8>=4,<5"] + tests_require
+dev_requires = [
+    "ruff==0.5.0",
+    "types-python-dateutil>=2.8.1,<3",
+    "mypy>=1.10,<2",
+] + tests_require

 setup(
     name="graphene",
     version=version,
     description="GraphQL Framework for Python",
     long_description=codecs.open(
-        "README.rst", "r", encoding="ascii", errors="replace"
+        "README.md", "r", encoding="ascii", errors="replace"
     ).read(),
+    long_description_content_type="text/markdown",
     url="https://github.com/graphql-python/graphene",
     author="Syrus Akbary",
     author_email="me@syrusakbary.com",
     license="MIT",
     classifiers=[
-        "Development Status :: 3 - Alpha",
+        "Development Status :: 5 - Production/Stable",
         "Intended Audience :: Developers",
         "Topic :: Software Development :: Libraries",
-        "Programming Language :: Python :: 3.6",
-        "Programming Language :: Python :: 3.7",
         "Programming Language :: Python :: 3.8",
         "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
+        "Programming Language :: Python :: 3.11",
+        "Programming Language :: Python :: 3.12",
+        "Programming Language :: Python :: 3.13",
     ],
     keywords="api graphql protocol rest relay graphene",
     packages=find_packages(exclude=["examples*"]),
     install_requires=[
         "graphql-core>=3.1,<3.3",
         "graphql-relay>=3.1,<3.3",
-        "aniso8601>=8,<10",
+        "python-dateutil>=2.7.0,<3",
+        "typing-extensions>=4.7.1,<5",
     ],
     tests_require=tests_require,
     extras_require={"test": tests_require, "dev": dev_requires},

tox.ini

@@ -1,37 +1,27 @@
 [tox]
-envlist = py3{6,7,8,9,10}, flake8, mypy, pre-commit
+envlist = py3{8,9,10,11,12,13}, mypy, pre-commit
 skipsdist = true

 [testenv]
 deps =
     .[test]
-setenv =
-    PYTHONPATH = .:{envdir}
 commands =
-    py{36,37,38,39,310}: pytest --cov=graphene graphene examples {posargs}
+    pytest --cov=graphene graphene --cov-report=term --cov-report=xml examples {posargs}

 [testenv:pre-commit]
-basepython = python3.9
+basepython = python3.10
 deps =
-    pre-commit>=2.16,<3
+    pre-commit>=3.7,<4
 setenv =
     LC_CTYPE=en_US.UTF-8
 commands =
     pre-commit run --all-files --show-diff-on-failure

 [testenv:mypy]
-basepython = python3.9
+basepython = python3.10
 deps =
-    mypy>=0.950,<1
+    .[dev]
 commands =
     mypy graphene

-[testenv:flake8]
-basepython = python3.9
-deps =
-    flake8>=4,<5
-commands =
-    pip install --pre -e .
-    flake8 graphene

 [pytest]