Merge branch 'master' into apollo-docs

Jonas Helfer 2017-06-12 11:24:18 -07:00 committed by GitHub
commit 86f5cbc08e
68 changed files with 1628 additions and 478 deletions


@ -2,3 +2,4 @@ global-exclude tests/*
recursive-exclude tests *
recursive-exclude tests_py35 *
recursive-exclude examples *
include LICENSE


@ -8,7 +8,7 @@ Please read [UPGRADE-v1.0.md](/UPGRADE-v1.0.md) to learn how to upgrade to Graph
[Graphene](http://graphene-python.org) is a Python library for building GraphQL schemas/types fast and easily.
- **Easy to use:** Graphene helps you use GraphQL in Python without effort.
- **Relay:** Graphene has builtin support for Relay
- **Relay:** Graphene has builtin support for Relay.
- **Data agnostic:** Graphene supports any kind of data source: SQL (Django, SQLAlchemy), NoSQL, custom Python objects, etc.
We believe that by providing a complete API you could plug Graphene anywhere your data lives and make your data available
through GraphQL.
@ -25,6 +25,7 @@ Graphene has multiple integrations with different frameworks:
| Google App Engine | [graphene-gae](https://github.com/graphql-python/graphene-gae/) |
| Peewee | *In progress* ([Tracking Issue](https://github.com/graphql-python/graphene/issues/289)) |
Also, Graphene is fully compatible with the GraphQL spec, working seamlessly with all GraphQL clients, such as [Relay](https://github.com/facebook/relay), [Apollo](https://github.com/apollographql/apollo-client) and [gql](https://github.com/graphql-python/gql).
## Installation
@ -75,7 +76,7 @@ If you want to learn even more, you can also check the following [examples](exam
After cloning this repo, ensure dependencies are installed by running:
```sh
pip install .[test]
pip install -e ".[test]"
```
After developing, the full test suite can be evaluated by running:


@ -11,7 +11,7 @@ building GraphQL schemas/types fast and easily.
- **Easy to use:** Graphene helps you use GraphQL in Python without
effort.
- **Relay:** Graphene has builtin support for Relay
- **Relay:** Graphene has builtin support for Relay.
- **Data agnostic:** Graphene supports any kind of data source: SQL
(Django, SQLAlchemy), NoSQL, custom Python objects, etc. We believe
that by providing a complete API you could plug Graphene anywhere
@ -34,6 +34,12 @@ Graphene has multiple integrations with different frameworks:
| Peewee | *In progress* (`Tracking Issue <https://github.com/graphql-python/graphene/issues/289>`__) |
+---------------------+----------------------------------------------------------------------------------------------+
Also, Graphene is fully compatible with the GraphQL spec, working
seamlessly with all GraphQL clients, such as
`Relay <https://github.com/facebook/relay>`__,
`Apollo <https://github.com/apollographql/apollo-client>`__ and
`gql <https://github.com/graphql-python/gql>`__.
Installation
------------
@ -89,7 +95,7 @@ After cloning this repo, ensure dependencies are installed by running:
.. code:: sh
pip install .[test]
pip install -e ".[test]"
After developing, the full test suite can be evaluated by running:


@ -97,7 +97,7 @@ schema = graphene.Schema(
## Interfaces
For implementing an Interface in a ObjectType, you have to it onto `Meta.interfaces`.
For implementing an Interface in an ObjectType, you have to add it onto `Meta.interfaces`.
Like:
@ -142,7 +142,7 @@ class Query(ObjectType):
## Nodes
Apart of implementing as showed in the previous section, for use the node field you have to
Apart from implementing as shown in the previous section, to use the node field you have to
specify the node Type.
Example:
@ -155,16 +155,16 @@ class Query(ObjectType):
node = relay.Node.Field() # New way
```
Also, if wanted to create an `ObjectType` that implements `Node`, you have to do it
Also, if you wanted to create an `ObjectType` that implements `Node`, you have to do it
explicitly.
## Django
The Django integration with Graphene now have an independent package: `graphene-django`.
The Django integration with Graphene now has an independent package: `graphene-django`.
For installing, you have to replace the old `graphene[django]` with `graphene-django`.
* As the package is now independent, you have to import now from `graphene_django`.
* As the package is now independent, you now have to import from `graphene_django`.
* **DjangoNode no longer exists**, please use `relay.Node` instead:
```python
@ -178,7 +178,7 @@ For installing, you have to replace the old `graphene[django]` with `graphene-dj
## SQLAlchemy
The SQLAlchemy integration with Graphene now have an independent package: `graphene-sqlalchemy`.
The SQLAlchemy integration with Graphene now has an independent package: `graphene-sqlalchemy`.
For installing, you have to replace the old `graphene[sqlalchemy]` with `graphene-sqlalchemy`.
* As the package is now independent, you now have to import from `graphene_sqlalchemy`.


@ -0,0 +1,106 @@
Dataloader
==========
DataLoader is a generic utility to be used as part of your application's
data fetching layer to provide a simplified and consistent API over
various remote data sources such as databases or web services via batching
and caching.
Batching
--------
Batching is not an advanced feature, it's DataLoader's primary feature.
Create loaders by providing a batch loading function.
.. code:: python
from promise import Promise
from promise.dataloader import DataLoader
class UserLoader(DataLoader):
def batch_load_fn(self, keys):
# Here we return a promise that will resolve to the
# corresponding user for each key in keys
return Promise.resolve([get_user(id=key) for key in keys])
A batch loading function accepts a list of keys, and returns a ``Promise``
which resolves to a list of ``values``.
Then load individual values from the loader. ``DataLoader`` will coalesce all
individual loads which occur within a single frame of execution (executed once
the wrapping promise is resolved) and then call your batch function with all
requested keys.
.. code:: python
user_loader = UserLoader()
user_loader.load(1).then(lambda user: user_loader.load(user.best_friend_id))
user_loader.load(2).then(lambda user: user_loader.load(user.best_friend_id))
A naive application may have issued *four* round-trips to a backend for the
required information, but with ``DataLoader`` this application will make at most *two*.
``DataLoader`` allows you to decouple unrelated parts of your application without
sacrificing the performance of batch data-loading. While the loader presents
an API that loads individual values, all concurrent requests will be coalesced
and presented to your batch loading function. This allows your application to
safely distribute data fetching requirements throughout your application and
maintain minimal outgoing data requests.
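As a minimal sketch of that coalescing (reusing the ``UserLoader`` defined above), ``load_many`` hands all of its keys to a single ``batch_load_fn`` call:

.. code:: python

user_loader = UserLoader()

# The three keys below are coalesced into one batch_load_fn([1, 2, 3]) call,
# and the returned Promise resolves to the three matching users.
users_promise = user_loader.load_many([1, 2, 3])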
Using with Graphene
-------------------
DataLoader pairs nicely with Graphene/GraphQL. GraphQL fields are designed
to be stand-alone functions. Without a caching or batching mechanism, it's easy
for a naive GraphQL server to issue new database requests each time a field is resolved.
Consider the following GraphQL request:
.. code::
{
me {
name
bestFriend {
name
}
friends(first: 5) {
name
bestFriend {
name
}
}
}
}
Naively, if ``me``, ``bestFriend`` and ``friends`` each need to request the backend,
there could be at most 13 database requests: one for ``me``, one for its ``bestFriend``,
one for the ``friends`` list, and one each for the five friends and their best friends!
When using DataLoader, we could define the User type using our previous example with
leaner code and at most 4 database requests, and possibly fewer if there are cache hits.
.. code:: python
class User(graphene.ObjectType):
name = graphene.String()
best_friend = graphene.Field(lambda: User)
friends = graphene.List(lambda: User)
def resolve_best_friend(self, args, context, info):
return user_loader.load(self.best_friend_id)
def resolve_friends(self, args, context, info):
return user_loader.load_many(self.friend_ids)


@ -0,0 +1,32 @@
Executing a query
=================
To execute a query against a schema, you can directly call the ``execute`` method on it.
.. code:: python
schema = graphene.Schema(...)
result = schema.execute('{ name }')
``result`` represents the result of execution. ``result.data`` is the result of executing the query, ``result.errors`` is ``None`` if no errors occurred, and is a non-empty list if an error occurred.
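For illustration, a small sketch of inspecting a result (the query and field name here are just examples):

.. code:: python

result = schema.execute('{ name }')
if result.errors:
    # errors is a non-empty list of the exceptions raised during execution
    raise result.errors[0]
print(result.data['name'])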
Context
_______
You can pass context to a query via ``context_value``.
.. code:: python
class Query(graphene.ObjectType):
name = graphene.String()
def resolve_name(self, args, context, info):
return context.get('name')
schema = graphene.Schema(Query)
result = schema.execute('{ name }', context_value={'name': 'Syrus'})


@ -2,39 +2,9 @@
Execution
=========
To execute a query against a schema, you can directly call the ``execute`` method on it.
.. code:: python
schema = graphene.Schema(...)
result = schema.execute('{ name }')
``result`` represents the result of execution. ``result.data`` is the result of executing the query, ``result.errors`` is ``None`` if no errors occurred, and is a non-empty list if an error occurred.
Context
_______
You can pass context to a query via ``context_value``.
.. code:: python
class Query(graphene.ObjectType):
name = graphene.String()
def resolve_name(self, args, context, info):
return context.get('name')
schema = graphene.Schema(Query)
result = schema.execute('{ name }', context_value={'name': 'Syrus'})
Middleware
__________
.. toctree::
:maxdepth: 1
:maxdepth: 2
execute
middleware
dataloader


@ -30,15 +30,15 @@ This middleware only continues evaluation if the ``field_name`` is not ``'user'`
.. code:: python
class AuthorizationMiddleware(object):
def resolve(self, next, root, args, context, info):
if info.field_name == 'user':
return None
return next(root, args, context, info)
class AuthorizationMiddleware(object):
def resolve(self, next, root, args, context, info):
if info.field_name == 'user':
return None
return next(root, args, context, info)
And then execute it with:
.. code:: python
result = schema.execute('THE QUERY', middleware=[AuthorizationMiddleware()])
result = schema.execute('THE QUERY', middleware=[AuthorizationMiddleware()])


@ -11,6 +11,7 @@ Contents:
execution/index
relay/index
apollo/index
testing/index
Integrations
-----


@ -5,7 +5,7 @@ What is GraphQL?
----------------
For an introduction to GraphQL and an overview of its concepts, please refer
to `the official introduction <http://graphql.org/learn/>`.
to `the official introduction <http://graphql.org/learn/>`_.
Let's build a basic GraphQL schema from scratch.
@ -30,17 +30,17 @@ server with an associated set of resolve methods that know how to fetch
data.
We are going to create a very simple schema, with a ``Query`` with only
one field: ``hello``. And when we query it, it should return ``"World"``.
one field: ``hello``, which takes a ``name`` argument. And when we query it, it should return ``"Hello {name}"``.
.. code:: python
import graphene
class Query(graphene.ObjectType):
hello = graphene.String()
hello = graphene.String(name=graphene.Argument(graphene.String, default_value="stranger"))
def resolve_hello(self, args, context, info):
return 'World'
return 'Hello ' + args['name']
schema = graphene.Schema(query=Query)
@ -52,6 +52,6 @@ Then we can start querying our schema:
.. code:: python
result = schema.execute('{ hello }')
print result.data['hello'] # "World"
print result.data['hello'] # "Hello stranger"
Congrats! You got your first graphene schema working!


@ -3,7 +3,7 @@ Nodes
A ``Node`` is an Interface provided by ``graphene.relay`` that contains
a single field ``id`` (which is an ``ID!``). Any object that inherits
from it have to implement a ``get_node`` method for retrieving a
from it has to implement a ``get_node`` method for retrieving a
``Node`` by an *id*.
@ -26,8 +26,8 @@ Example usage (taken from the `Starwars Relay example`_):
return get_ship(id)
The ``id`` returned by the ``Ship`` type when you query it will be a
scalar which contains the enough info for the server for knowing its
type and its id.
scalar which contains enough info for the server to know its type and
its id.
For example, the instance ``Ship(id=1)`` will return ``U2hpcDox`` as the
id when you query it (which is the base64 encoding of ``Ship:1``), and
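A short sketch of how that id is built, using the ``graphql_relay`` helpers that Graphene relies on (the ``base64`` check is just for illustration):

.. code:: python

from base64 import b64encode
from graphql_relay import from_global_id, to_global_id

assert b64encode(b'Ship:1') == b'U2hpcDox'
assert to_global_id('Ship', 1) == 'U2hpcDox'
assert from_global_id('U2hpcDox') == ('Ship', '1')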
@ -77,7 +77,7 @@ Accessing node types
If we want to retrieve node instances from a ``global_id`` (scalar that identifies an instance by its type name and id),
we can simply do ``Node.get_node_from_global_id(global_id, context, info)``.
In the case we want to restrict the instnance retrieval to an specific type, we can do:
In the case we want to restrict the instance retrieval to a specific type, we can do:
``Node.get_node_from_global_id(global_id, context, info, only_type=Ship)``. This will raise an error
if the ``global_id`` doesn't correspond to a Ship type.
@ -98,4 +98,5 @@ Example usage:
# Should be CustomNode.Field() if we want to use our custom Node
node = relay.Node.Field()
.. _Relay specification: https://facebook.github.io/relay/docs/graphql-relay-specification.html
.. _Starwars Relay example: https://github.com/graphql-python/graphene/blob/master/examples/starwars_relay/schema.py


@ -1,2 +1,4 @@
# Required library
Sphinx==1.5.3
# Docs template
https://github.com/graphql-python/graphene-python.org/archive/docs.zip

docs/testing/index.rst (new file)

@ -0,0 +1,111 @@
===================
Testing in Graphene
===================
Automated testing is an extremely useful bug-killing tool for the modern developer. You can use a collection of tests (a test suite) to solve, or avoid, a number of problems:
- When you're writing new code, you can use tests to validate your code works as expected.
- When you're refactoring or modifying old code, you can use tests to ensure your changes haven't affected your application's behavior unexpectedly.
Testing a GraphQL application is a complex task, because a GraphQL application is made of several layers of logic: schema definition, schema validation, permissions and field resolution.
With Graphene's test-execution framework and assorted utilities, you can simulate GraphQL requests, execute mutations, inspect your application's output and generally verify your code is doing what it should be doing.
Testing tools
-------------
Graphene provides a small set of tools that come in handy when writing tests.
Test Client
~~~~~~~~~~~
The test client is a Python class that acts as a dummy GraphQL client, allowing you to test your views and interact with your Graphene-powered application programmatically.
Some of the things you can do with the test client are:
- Simulate Queries and Mutations and observe the response.
- Test that a given query request is rendered by a given Django template, with a template context that contains certain values.
Overview and a quick example
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To use the test client, instantiate ``graphene.test.Client`` and retrieve GraphQL responses:
.. code:: python
from graphene.test import Client
def test_hey():
client = Client(my_schema)
executed = client.execute('''{ hey }''')
assert executed == {
'data': {
'hey': 'hello!'
}
}
Execute parameters
~~~~~~~~~~~~~~~~~~
You can also add extra keyword arguments to the ``execute`` method, such as
``context_value``, ``root_value``, ``variable_values``, ...:
.. code:: python
from graphene.test import Client
def test_hey():
client = Client(my_schema)
executed = client.execute('''{ hey }''', context_value={'user': 'Peter'})
assert executed == {
'data': {
'hey': 'hello Peter!'
}
}
Snapshot testing
~~~~~~~~~~~~~~~~
As our APIs evolve, we need to know when our changes introduce breaking changes that might affect
some of the clients of our GraphQL app.
However, writing tests and replicating the exact response we expect from our GraphQL application can be
a tedious and repetitive task, and sometimes it's easier to skip this process.
Because of that, we recommend using `SnapshotTest <https://github.com/syrusakbary/snapshottest/>`_.
SnapshotTest lets us write these tests in a breeze, as it automatically creates the ``snapshots`` for us
the first time the test is executed.
Here is a simple example of how our tests will look if we use ``pytest``:
.. code:: python
def test_hey(snapshot):
client = Client(my_schema)
# This will create a snapshot dir and a snapshot file
# the first time the test is executed, with the response
# of the execution.
snapshot.assert_match(client.execute('''{ hey }'''))
If we are using ``unittest``:
.. code:: python
from snapshottest import TestCase
class APITestCase(TestCase):
def test_api_me(self):
"""Testing the API for /me"""
client = Client(my_schema)
self.assertMatchSnapshot(client.execute('''{ hey }'''))


@ -59,7 +59,33 @@ Notes
-----
``graphene.Enum`` uses |enum.Enum|_ internally (or a backport if
that's not available) and can be used in the exact same way.
that's not available) and can be used in a similar way, with the exception of
member getters.
In the Python ``Enum`` implementation you can access a member by initializing the ``Enum`` with its value.
.. code:: python
from enum import Enum
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert Color(1) == Color.RED
However, in Graphene ``Enum`` you need to call ``get`` to have the same effect:
.. code:: python
from graphene import Enum
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert Color.get(1) == Color.RED
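Members can also be looked up by name with item access, mirroring the ``__getitem__`` support exercised by the enum tests in this commit:

.. code:: python

from graphene import Enum

class Color(Enum):
    RED = 1
    GREEN = 2
    BLUE = 3

assert Color['RED'] == Color.RED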
.. |enum.Enum| replace:: ``enum.Enum``
.. _enum.Enum: https://docs.python.org/3/library/enum.html


@ -7,6 +7,7 @@ Types Reference
enums
scalars
list-and-nonnull
interfaces
abstracttypes
objecttypes


@ -0,0 +1,50 @@
Lists and Non-Null
==================
Object types, scalars, and enums are the only kinds of types you can
define in Graphene. But when you use the types in other parts of the
schema, or in your query variable declarations, you can apply additional
type modifiers that affect validation of those values.
NonNull
-------
.. code:: python
import graphene
class Character(graphene.ObjectType):
name = graphene.NonNull(graphene.String)
Here, we're using a ``String`` type and marking it as Non-Null by wrapping
it using the ``NonNull`` class. This means that our server always expects
to return a non-null value for this field, and if it ends up getting a
null value that will actually trigger a GraphQL execution error,
letting the client know that something has gone wrong.
The previous ``NonNull`` code snippet is also equivalent to:
.. code:: python
import graphene
class Character(graphene.ObjectType):
name = graphene.String(required=True)
List
----
.. code:: python
import graphene
class Character(graphene.ObjectType):
appears_in = graphene.List(graphene.String)
Lists work in a similar way: We can use a type modifier to mark a type as a
``List``, which indicates that this field will return a list of that type.
It works the same for arguments, where the validation step will expect a list
for that value.
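As a small sketch (the field and argument names here are illustrative), an argument can be declared as a list in the same way:

.. code:: python

import graphene

class Query(graphene.ObjectType):
    # The 'names' argument is validated as a list of strings
    hello_all = graphene.String(names=graphene.List(graphene.String))

    def resolve_hello_all(self, args, context, info):
        return 'Hello ' + ', '.join(args.get('names', []))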


@ -19,7 +19,8 @@ This example defines a Mutation:
ok = graphene.Boolean()
person = graphene.Field(lambda: Person)
def mutate(self, args, context, info):
@staticmethod
def mutate(root, args, context, info):
person = Person(name=args.get('name'))
ok = True
return CreatePerson(person=person, ok=ok)
@ -42,11 +43,16 @@ So, we can finish our schema like this:
class Person(graphene.ObjectType):
name = graphene.String()
age = graphene.Int()
class MyMutations(graphene.ObjectType):
create_person = CreatePerson.Field()
schema = graphene.Schema(mutation=MyMutations)
# We must define a query for our schema
class Query(graphene.ObjectType):
person = graphene.Field(Person)
schema = graphene.Schema(query=Query, mutation=MyMutations)
Executing the Mutation
----------------------
@ -96,11 +102,12 @@ To use an InputField you define an InputObjectType that specifies the structure
class CreatePerson(graphene.Mutation):
class Input:
person_data = graphene.InputField(PersonInput)
person_data = graphene.Argument(PersonInput)
person = graphene.Field(lambda: Person)
def mutate(self, args, context, info):
@staticmethod
def mutate(root, args, context, info):
p_data = args.get('person_data')
name = p_data.get('name')


@ -72,4 +72,4 @@ Types mounted in a ``Field`` act as ``Argument``\ s.
graphene.Field(graphene.String, to=graphene.String())
# Is equivalent to:
graphene.Field(graphene.String, to=graphene.Argument(graphene.String()))
graphene.Field(graphene.String, to=graphene.Argument(graphene.String))


@ -0,0 +1,202 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_hero_name_query 1'] = {
'data': {
'hero': {
'name': 'R2-D2'
}
}
}
snapshots['test_hero_name_and_friends_query 1'] = {
'data': {
'hero': {
'id': '2001',
'name': 'R2-D2',
'friends': [
{
'name': 'Luke Skywalker'
},
{
'name': 'Han Solo'
},
{
'name': 'Leia Organa'
}
]
}
}
}
snapshots['test_nested_query 1'] = {
'data': {
'hero': {
'name': 'R2-D2',
'friends': [
{
'name': 'Luke Skywalker',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
],
'friends': [
{
'name': 'Han Solo'
},
{
'name': 'Leia Organa'
},
{
'name': 'C-3PO'
},
{
'name': 'R2-D2'
}
]
},
{
'name': 'Han Solo',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
],
'friends': [
{
'name': 'Luke Skywalker'
},
{
'name': 'Leia Organa'
},
{
'name': 'R2-D2'
}
]
},
{
'name': 'Leia Organa',
'appearsIn': [
'NEWHOPE',
'EMPIRE',
'JEDI'
],
'friends': [
{
'name': 'Luke Skywalker'
},
{
'name': 'Han Solo'
},
{
'name': 'C-3PO'
},
{
'name': 'R2-D2'
}
]
}
]
}
}
}
snapshots['test_fetch_luke_query 1'] = {
'data': {
'human': {
'name': 'Luke Skywalker'
}
}
}
snapshots['test_fetch_some_id_query 1'] = {
'data': {
'human': {
'name': 'Luke Skywalker'
}
}
}
snapshots['test_fetch_some_id_query2 1'] = {
'data': {
'human': {
'name': 'Han Solo'
}
}
}
snapshots['test_invalid_id_query 1'] = {
'data': {
'human': None
}
}
snapshots['test_fetch_luke_aliased 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker'
}
}
}
snapshots['test_fetch_luke_and_leia_aliased 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker'
},
'leia': {
'name': 'Leia Organa'
}
}
}
snapshots['test_duplicate_fields 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine'
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan'
}
}
}
snapshots['test_use_fragment 1'] = {
'data': {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine'
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan'
}
}
}
snapshots['test_check_type_of_r2 1'] = {
'data': {
'hero': {
'__typename': 'Droid',
'name': 'R2-D2'
}
}
}
snapshots['test_check_type_of_luke 1'] = {
'data': {
'hero': {
'__typename': 'Human',
'name': 'Luke Skywalker'
}
}
}


@ -1,11 +1,12 @@
from graphene.test import Client
from ..data import setup
from ..schema import schema
setup()
client = Client(schema)
def test_hero_name_query():
def test_hero_name_query(snapshot):
query = '''
query HeroNameQuery {
hero {
@ -13,17 +14,11 @@ def test_hero_name_query():
}
}
'''
expected = {
'hero': {
'name': 'R2-D2'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_hero_name_and_friends_query():
def test_hero_name_and_friends_query(snapshot):
query = '''
query HeroNameAndFriendsQuery {
hero {
@ -35,23 +30,10 @@ def test_hero_name_and_friends_query():
}
}
'''
expected = {
'hero': {
'id': '2001',
'name': 'R2-D2',
'friends': [
{'name': 'Luke Skywalker'},
{'name': 'Han Solo'},
{'name': 'Leia Organa'},
]
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_nested_query():
def test_nested_query(snapshot):
query = '''
query NestedQuery {
hero {
@ -66,70 +48,10 @@ def test_nested_query():
}
}
'''
expected = {
'hero': {
'name': 'R2-D2',
'friends': [
{
'name': 'Luke Skywalker',
'appearsIn': ['NEWHOPE', 'EMPIRE', 'JEDI'],
'friends': [
{
'name': 'Han Solo',
},
{
'name': 'Leia Organa',
},
{
'name': 'C-3PO',
},
{
'name': 'R2-D2',
},
]
},
{
'name': 'Han Solo',
'appearsIn': ['NEWHOPE', 'EMPIRE', 'JEDI'],
'friends': [
{
'name': 'Luke Skywalker',
},
{
'name': 'Leia Organa',
},
{
'name': 'R2-D2',
},
]
},
{
'name': 'Leia Organa',
'appearsIn': ['NEWHOPE', 'EMPIRE', 'JEDI'],
'friends': [
{
'name': 'Luke Skywalker',
},
{
'name': 'Han Solo',
},
{
'name': 'C-3PO',
},
{
'name': 'R2-D2',
},
]
},
]
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_fetch_luke_query():
def test_fetch_luke_query(snapshot):
query = '''
query FetchLukeQuery {
human(id: "1000") {
@ -137,17 +59,10 @@ def test_fetch_luke_query():
}
}
'''
expected = {
'human': {
'name': 'Luke Skywalker',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_fetch_some_id_query():
def test_fetch_some_id_query(snapshot):
query = '''
query FetchSomeIDQuery($someId: String!) {
human(id: $someId) {
@ -158,17 +73,10 @@ def test_fetch_some_id_query():
params = {
'someId': '1000',
}
expected = {
'human': {
'name': 'Luke Skywalker',
}
}
result = schema.execute(query, None, variable_values=params)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query, variable_values=params))
def test_fetch_some_id_query2():
def test_fetch_some_id_query2(snapshot):
query = '''
query FetchSomeIDQuery($someId: String!) {
human(id: $someId) {
@ -179,17 +87,10 @@ def test_fetch_some_id_query2():
params = {
'someId': '1002',
}
expected = {
'human': {
'name': 'Han Solo',
}
}
result = schema.execute(query, None, variable_values=params)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query, variable_values=params))
def test_invalid_id_query():
def test_invalid_id_query(snapshot):
query = '''
query humanQuery($id: String!) {
human(id: $id) {
@ -200,15 +101,10 @@ def test_invalid_id_query():
params = {
'id': 'not a valid id',
}
expected = {
'human': None
}
result = schema.execute(query, None, variable_values=params)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query, variable_values=params))
def test_fetch_luke_aliased():
def test_fetch_luke_aliased(snapshot):
query = '''
query FetchLukeAliased {
luke: human(id: "1000") {
@ -216,17 +112,10 @@ def test_fetch_luke_aliased():
}
}
'''
expected = {
'luke': {
'name': 'Luke Skywalker',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_fetch_luke_and_leia_aliased():
def test_fetch_luke_and_leia_aliased(snapshot):
query = '''
query FetchLukeAndLeiaAliased {
luke: human(id: "1000") {
@ -237,20 +126,10 @@ def test_fetch_luke_and_leia_aliased():
}
}
'''
expected = {
'luke': {
'name': 'Luke Skywalker',
},
'leia': {
'name': 'Leia Organa',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_duplicate_fields():
def test_duplicate_fields(snapshot):
query = '''
query DuplicateFields {
luke: human(id: "1000") {
@ -263,22 +142,10 @@ def test_duplicate_fields():
}
}
'''
expected = {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine',
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_use_fragment():
def test_use_fragment(snapshot):
query = '''
query UseFragment {
luke: human(id: "1000") {
@ -293,22 +160,10 @@ def test_use_fragment():
homePlanet
}
'''
expected = {
'luke': {
'name': 'Luke Skywalker',
'homePlanet': 'Tatooine',
},
'leia': {
'name': 'Leia Organa',
'homePlanet': 'Alderaan',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_check_type_of_r2():
def test_check_type_of_r2(snapshot):
query = '''
query CheckTypeOfR2 {
hero {
@ -317,18 +172,10 @@ def test_check_type_of_r2():
}
}
'''
expected = {
'hero': {
'__typename': 'Droid',
'name': 'R2-D2',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_check_type_of_luke():
def test_check_type_of_luke(snapshot):
query = '''
query CheckTypeOfLuke {
hero(episode: EMPIRE) {
@ -337,12 +184,4 @@ def test_check_type_of_luke():
}
}
'''
expected = {
'hero': {
'__typename': 'Human',
'name': 'Luke Skywalker',
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))


@ -0,0 +1,32 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_correct_fetch_first_ship_rebels 1'] = {
'data': {
'rebels': {
'name': 'Alliance to Restore the Republic',
'ships': {
'pageInfo': {
'startCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'endCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'hasNextPage': True,
'hasPreviousPage': False
},
'edges': [
{
'cursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'node': {
'name': 'X-Wing'
}
}
]
}
}
}
}


@ -0,0 +1,62 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_mutations 1'] = {
'data': {
'introduceShip': {
'ship': {
'id': 'U2hpcDo5',
'name': 'Peter'
},
'faction': {
'name': 'Alliance to Restore the Republic',
'ships': {
'edges': [
{
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
},
{
'node': {
'id': 'U2hpcDoy',
'name': 'Y-Wing'
}
},
{
'node': {
'id': 'U2hpcDoz',
'name': 'A-Wing'
}
},
{
'node': {
'id': 'U2hpcDo0',
'name': 'Millenium Falcon'
}
},
{
'node': {
'id': 'U2hpcDo1',
'name': 'Home One'
}
},
{
'node': {
'id': 'U2hpcDo5',
'name': 'Peter'
}
}
]
}
}
}
}
}


@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['test_correctly_fetches_id_name_rebels 1'] = {
'data': {
'rebels': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
}
}
snapshots['test_correctly_refetches_rebels 1'] = {
'data': {
'node': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
}
}
snapshots['test_correctly_fetches_id_name_empire 1'] = {
'data': {
'empire': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
}
snapshots['test_correctly_refetches_empire 1'] = {
'data': {
'node': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
}
snapshots['test_correctly_refetches_xwing 1'] = {
'data': {
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
}
}


@ -1,10 +1,13 @@
from graphene.test import Client
from ..data import setup
from ..schema import schema
setup()
client = Client(schema)
def test_correct_fetch_first_ship_rebels():
def test_correct_fetch_first_ship_rebels(snapshot):
query = '''
query RebelsShipsQuery {
rebels {
@ -26,27 +29,4 @@ def test_correct_fetch_first_ship_rebels():
}
}
'''
expected = {
'rebels': {
'name': 'Alliance to Restore the Republic',
'ships': {
'pageInfo': {
'startCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'endCursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'hasNextPage': True,
'hasPreviousPage': False
},
'edges': [
{
'cursor': 'YXJyYXljb25uZWN0aW9uOjA=',
'node': {
'name': 'X-Wing'
}
}
]
}
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))


@ -1,10 +1,13 @@
from graphene.test import Client
from ..data import setup
from ..schema import schema
setup()
client = Client(schema)
def test_mutations():
def test_mutations(snapshot):
query = '''
mutation MyMutation {
introduceShip(input:{clientMutationId:"abc", shipName: "Peter", factionId: "1"}) {
@ -26,51 +29,4 @@ def test_mutations():
}
}
'''
expected = {
'introduceShip': {
'ship': {
'id': 'U2hpcDo5',
'name': 'Peter'
},
'faction': {
'name': 'Alliance to Restore the Republic',
'ships': {
'edges': [{
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
}, {
'node': {
'id': 'U2hpcDoy',
'name': 'Y-Wing'
}
}, {
'node': {
'id': 'U2hpcDoz',
'name': 'A-Wing'
}
}, {
'node': {
'id': 'U2hpcDo0',
'name': 'Millenium Falcon'
}
}, {
'node': {
'id': 'U2hpcDo1',
'name': 'Home One'
}
}, {
'node': {
'id': 'U2hpcDo5',
'name': 'Peter'
}
}]
},
}
}
}
result = schema.execute(query)
# raise result.errors[0].original_error, None, result.errors[0].stack
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))


@ -1,8 +1,11 @@
from graphene.test import Client
from ..data import setup
from ..schema import schema
setup()
client = Client(schema)
def test_str_schema():
assert str(schema) == '''schema {
@ -66,7 +69,7 @@ type ShipEdge {
'''
def test_correctly_fetches_id_name_rebels():
def test_correctly_fetches_id_name_rebels(snapshot):
query = '''
query RebelsQuery {
rebels {
@ -75,18 +78,10 @@ def test_correctly_fetches_id_name_rebels():
}
}
'''
expected = {
'rebels': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_rebels():
def test_correctly_refetches_rebels(snapshot):
query = '''
query RebelsRefetchQuery {
node(id: "RmFjdGlvbjox") {
@ -97,18 +92,10 @@ def test_correctly_refetches_rebels():
}
}
'''
expected = {
'node': {
'id': 'RmFjdGlvbjox',
'name': 'Alliance to Restore the Republic'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_correctly_fetches_id_name_empire():
def test_correctly_fetches_id_name_empire(snapshot):
query = '''
query EmpireQuery {
empire {
@ -117,18 +104,10 @@ def test_correctly_fetches_id_name_empire():
}
}
'''
expected = {
'empire': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_empire():
def test_correctly_refetches_empire(snapshot):
query = '''
query EmpireRefetchQuery {
node(id: "RmFjdGlvbjoy") {
@ -139,18 +118,10 @@ def test_correctly_refetches_empire():
}
}
'''
expected = {
'node': {
'id': 'RmFjdGlvbjoy',
'name': 'Galactic Empire'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))
def test_correctly_refetches_xwing():
def test_correctly_refetches_xwing(snapshot):
query = '''
query XWingRefetchQuery {
node(id: "U2hpcDox") {
@ -161,12 +132,4 @@ def test_correctly_refetches_xwing():
}
}
'''
expected = {
'node': {
'id': 'U2hpcDox',
'name': 'X-Wing'
}
}
result = schema.execute(query)
assert not result.errors
assert result.data == expected
snapshot.assert_match(client.execute(query))


@ -10,7 +10,7 @@ except NameError:
__SETUP__ = False
VERSION = (1, 1, 3, 'final', 0)
VERSION = (1, 4, 0, 'final', 0)
__version__ = get_version(VERSION)
@ -43,6 +43,7 @@ if not __SETUP__:
PageInfo
)
from .utils.resolve_only_args import resolve_only_args
from .utils.module_loading import lazy_import
__all__ = [
'AbstractType',
@ -72,4 +73,6 @@ if not __SETUP__:
'ClientIDMutation',
'Connection',
'ConnectionField',
'PageInfo']
'PageInfo',
'lazy_import',
]


@ -5,7 +5,7 @@ from functools import partial
import six
from graphql_relay import connection_from_list
from promise import is_thenable, promisify
from promise import Promise, is_thenable
from ..types import (AbstractType, Boolean, Enum, Int, Interface, List, NonNull, Scalar, String,
Union)
@ -143,7 +143,7 @@ class IterableConnectionField(Field):
on_resolve = partial(cls.resolve_connection, connection_type, args)
if is_thenable(resolved):
return promisify(resolved).then(on_resolve)
return Promise.resolve(resolved).then(on_resolve)
return on_resolve(resolved)


@ -5,6 +5,7 @@ import six
from graphql_relay import from_global_id, to_global_id
from ..types import ID, Field, Interface, ObjectType
from ..types.utils import get_type
from ..types.interface import InterfaceMeta
@ -64,17 +65,18 @@ class NodeField(Field):
name=None, **kwargs):
assert issubclass(node, Node), 'NodeField can only operate in Nodes'
self.node_type = node
# If we don't specify a type, the field type will be the node interface
field_type = type or node
self.field_type = type
super(NodeField, self).__init__(
field_type,
# If we don't specify a type, the field type will be the node interface
type or node,
description='The ID of the object',
id=ID(required=True),
resolver=partial(node.node_resolver, only_type=type)
id=ID(required=True)
)
def get_resolver(self, parent_resolver):
return partial(self.node_type.node_resolver, only_type=get_type(self.field_type))
class Node(six.with_metaclass(NodeMeta, Interface)):
'''An object with an ID'''


@ -45,6 +45,7 @@ class RootQuery(ObjectType):
first = String()
node = Node.Field()
only_node = Node.Field(MyNode)
only_node_lazy = Node.Field(lambda: MyNode)
schema = Schema(query=RootQuery, types=[MyNode, MyOtherNode])
@ -116,6 +117,23 @@ def test_node_field_only_type_wrong():
assert executed.data == { 'onlyNode': None }
def test_node_field_only_lazy_type():
executed = schema.execute(
'{ onlyNodeLazy(id:"%s") { __typename, name } } ' % Node.to_global_id("MyNode", 1)
)
assert not executed.errors
assert executed.data == {'onlyNodeLazy': {'__typename': 'MyNode', 'name': '1'}}
def test_node_field_only_lazy_type_wrong():
executed = schema.execute(
'{ onlyNodeLazy(id:"%s") { __typename, name } } ' % Node.to_global_id("MyOtherNode", 1)
)
assert len(executed.errors) == 1
assert str(executed.errors[0]) == 'Must receive an MyOtherNode id.'
assert executed.data == { 'onlyNodeLazy': None }
def test_str_schema():
assert str(schema) == """
schema {
@ -142,5 +160,6 @@ type RootQuery {
first: String
node(id: ID!): Node
onlyNode(id: ID!): MyNode
onlyNodeLazy(id: ID!): MyNode
}
""".lstrip()

graphene/test/__init__.py (new file)

@ -0,0 +1,39 @@
import six
from graphql.error import format_error as format_graphql_error
from graphql.error import GraphQLError
from graphene.types.schema import Schema
def default_format_error(error):
if isinstance(error, GraphQLError):
return format_graphql_error(error)
return {'message': six.text_type(error)}
def format_execution_result(execution_result, format_error):
if execution_result:
response = {}
if execution_result.errors:
response['errors'] = [format_error(e) for e in execution_result.errors]
if not execution_result.invalid:
response['data'] = execution_result.data
return response
class Client(object):
def __init__(self, schema, format_error=None, **execute_options):
assert isinstance(schema, Schema)
self.schema = schema
self.execute_options = execute_options
self.format_error = format_error or default_format_error
def execute(self, *args, **kwargs):
return format_execution_result(
self.schema.execute(*args, **dict(self.execute_options, **kwargs)),
self.format_error
)


@ -3,6 +3,9 @@
import graphene
from graphene import resolve_only_args
class Query(graphene.ObjectType):
rand = graphene.String()
class Success(graphene.ObjectType):
yeah = graphene.String()
@ -45,7 +48,7 @@ def test_create_post():
}
'''
schema = graphene.Schema(mutation=Mutations)
schema = graphene.Schema(query=Query, mutation=Mutations)
result = schema.execute(query_string)
assert not result.errors


@ -0,0 +1,53 @@
# https://github.com/graphql-python/graphene/issues/425
import six
from graphene.utils.is_base_type import is_base_type
from graphene.types.objecttype import ObjectTypeMeta, ObjectType
from graphene.types.options import Options
class SpecialObjectTypeMeta(ObjectTypeMeta):
@staticmethod
def __new__(cls, name, bases, attrs):
# Also ensure initialization is only performed for subclasses of
# DjangoObjectType
if not is_base_type(bases, SpecialObjectTypeMeta):
return type.__new__(cls, name, bases, attrs)
options = Options(
attrs.pop('Meta', None),
other_attr='default',
)
cls = ObjectTypeMeta.__new__(cls, name, bases, dict(attrs, _meta=options))
assert cls._meta is options
return cls
class SpecialObjectType(six.with_metaclass(SpecialObjectTypeMeta, ObjectType)):
pass
def test_special_objecttype_could_be_subclassed():
class MyType(SpecialObjectType):
class Meta:
other_attr = 'yeah!'
assert MyType._meta.other_attr == 'yeah!'
def test_special_objecttype_could_be_subclassed_default():
class MyType(SpecialObjectType):
pass
assert MyType._meta.other_attr == 'default'
def test_special_objecttype_inherit_meta_options():
class MyType(SpecialObjectType):
pass
assert MyType._meta.name == 'MyType'
assert MyType._meta.default_resolver == None
assert MyType._meta.interfaces == ()


@ -4,6 +4,7 @@ from itertools import chain
from .mountedtype import MountedType
from .structures import NonNull
from .dynamic import Dynamic
from .utils import get_type
class Argument(MountedType):
@ -15,10 +16,14 @@ class Argument(MountedType):
type = NonNull(type)
self.name = name
self.type = type
self._type = type
self.default_value = default_value
self.description = description
@property
def type(self):
return get_type(self._type)
def __eq__(self, other):
return isinstance(other, Argument) and (
self.name == other.name,


@ -9,10 +9,13 @@ class Dynamic(MountedType):
the schema. So we can have lazy fields.
'''
def __init__(self, type, _creation_counter=None):
def __init__(self, type, with_schema=False, _creation_counter=None):
super(Dynamic, self).__init__(_creation_counter=_creation_counter)
assert inspect.isfunction(type)
self.type = type
self.with_schema = with_schema
def get_type(self):
def get_type(self, schema=None):
if schema and self.with_schema:
return self.type(schema=schema)
return self.type()


@ -3,6 +3,7 @@ from collections import OrderedDict
import six
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .options import Options
from .unmountedtype import UnmountedType
@ -12,6 +13,12 @@ except ImportError:
from ..pyutils.enum import Enum as PyEnum
def eq_enum(self, other):
if isinstance(other, self.__class__):
return self is other
return self.value is other
class EnumTypeMeta(type):
def __new__(cls, name, bases, attrs):
@ -23,10 +30,11 @@ class EnumTypeMeta(type):
options = Options(
attrs.pop('Meta', None),
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
enum=None,
)
if not options.enum:
attrs['__eq__'] = eq_enum
options.enum = PyEnum(cls.__name__, attrs)
new_attrs = OrderedDict(attrs, _meta=options, **options.enum.__members__)
@ -35,11 +43,18 @@ class EnumTypeMeta(type):
def __prepare__(name, bases, **kwargs): # noqa: N805
return OrderedDict()
def get(cls, value):
return cls._meta.enum(value)
def __getitem__(cls, value):
return cls._meta.enum[value]
def __call__(cls, *args, **kwargs): # noqa: N805
if cls is Enum:
description = kwargs.pop('description', None)
return cls.from_enum(PyEnum(*args, **kwargs), description=description)
return super(EnumTypeMeta, cls).__call__(*args, **kwargs)
# return cls._meta.enum(*args, **kwargs)
def from_enum(cls, enum, description=None): # noqa: N805
meta_class = type('Meta', (object,), {'enum': enum, 'description': description})


@ -6,6 +6,7 @@ from .argument import Argument, to_arguments
from .mountedtype import MountedType
from .structures import NonNull
from .unmountedtype import UnmountedType
from .utils import get_type
base_type = type
@ -60,9 +61,7 @@ class Field(MountedType):
@property
def type(self):
if inspect.isfunction(self._type) or type(self._type) is partial:
return self._type()
return self._type
return get_type(self._type)
def get_resolver(self, parent_resolver):
return self.resolver or parent_resolver

graphene/types/generic.py (new file)

@ -0,0 +1,39 @@
from __future__ import unicode_literals
from graphql.language.ast import (BooleanValue, FloatValue, IntValue,
StringValue, ListValue, ObjectValue)
from graphene.types.scalars import MIN_INT, MAX_INT
from .scalars import Scalar
class GenericScalar(Scalar):
"""
The `GenericScalar` scalar type represents a generic
GraphQL scalar value that could be:
String, Boolean, Int, Float, List or Object.
"""
@staticmethod
def identity(value):
return value
serialize = identity
parse_value = identity
@staticmethod
def parse_literal(ast):
if isinstance(ast, (StringValue, BooleanValue)):
return ast.value
elif isinstance(ast, IntValue):
num = int(ast.value)
if MIN_INT <= num <= MAX_INT:
return num
elif isinstance(ast, FloatValue):
return float(ast.value)
elif isinstance(ast, ListValue):
return [GenericScalar.parse_literal(value) for value in ast.values]
elif isinstance(ast, ObjectValue):
return {field.name.value: GenericScalar.parse_literal(field.value) for field in ast.fields}
else:
return None


@ -1,5 +1,6 @@
from .mountedtype import MountedType
from .structures import NonNull
from .utils import get_type
class InputField(MountedType):
@ -11,7 +12,11 @@ class InputField(MountedType):
self.name = name
if required:
type = NonNull(type)
self.type = type
self._type = type
self.deprecation_reason = deprecation_reason
self.default_value = default_value
self.description = description
@property
def type(self):
return get_type(self._type)


@ -1,6 +1,7 @@
import six
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .abstracttype import AbstractTypeMeta
from .inputfield import InputField
from .options import Options
@ -19,7 +20,7 @@ class InputObjectTypeMeta(AbstractTypeMeta):
options = Options(
attrs.pop('Meta', None),
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
local_fields=None,
)


@ -1,6 +1,7 @@
import six
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .abstracttype import AbstractTypeMeta
from .field import Field
from .options import Options
@ -18,7 +19,7 @@ class InterfaceMeta(AbstractTypeMeta):
options = Options(
attrs.pop('Meta', None),
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
local_fields=None,
)


@ -3,6 +3,7 @@ from collections import OrderedDict
import six
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .abstracttype import AbstractTypeMeta
from .field import Field
from .interface import Interface
@ -19,13 +20,22 @@ class ObjectTypeMeta(AbstractTypeMeta):
return type.__new__(cls, name, bases, attrs)
_meta = attrs.pop('_meta', None)
options = _meta or Options(
attrs.pop('Meta', None),
defaults = dict(
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
interfaces=(),
possible_types=(),
default_resolver=None,
local_fields=OrderedDict(),
)
if not _meta:
options = Options(
attrs.pop('Meta', None),
**defaults
)
else:
options = _meta.extend_with_defaults(defaults)
options.base_fields = get_base_fields(bases, _as=Field)
if not options.local_fields:
@ -46,6 +56,11 @@ class ObjectTypeMeta(AbstractTypeMeta):
cls = type.__new__(cls, name, bases, dict(attrs, _meta=options))
assert not (options.possible_types and cls.is_type_of), (
'{}.Meta.possible_types will cause type collision with {}.is_type_of. '
'Please use one or other.'
).format(name, name)
for interface in options.interfaces:
interface.implements(cls)


@ -30,6 +30,12 @@ class Options(object):
)
)
def extend_with_defaults(self, defaults):
for attr_name, value in defaults.items():
if not hasattr(self, attr_name):
setattr(self, attr_name, value)
return self
def __repr__(self):
options_props = props(self)
props_as_attrs = ' '.join(['{}={}'.format(key, value) for key, value in options_props.items()])


@ -0,0 +1,19 @@
def attr_resolver(attname, default_value, root, args, context, info):
return getattr(root, attname, default_value)
def dict_resolver(attname, default_value, root, args, context, info):
return root.get(attname, default_value)
default_resolver = attr_resolver
def set_default_resolver(resolver):
global default_resolver
assert callable(resolver), 'Received non-callable resolver.'
default_resolver = resolver
def get_default_resolver():
return default_resolver


@ -1,9 +1,9 @@
import six
from graphql.language.ast import (BooleanValue, FloatValue, IntValue,
StringValue)
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .options import Options
from .unmountedtype import UnmountedType
@ -19,7 +19,7 @@ class ScalarTypeMeta(type):
options = Options(
attrs.pop('Meta', None),
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
)
return type.__new__(cls, name, bases, dict(attrs, _meta=options))


@ -1,3 +1,4 @@
import inspect
from graphql import GraphQLSchema, graphql, is_type
from graphql.type.directives import (GraphQLDirective, GraphQLIncludeDirective,
@ -7,6 +8,7 @@ from graphql.utils.introspection_query import introspection_query
from graphql.utils.schema_printer import print_schema
from .definitions import GrapheneGraphQLType
from .objecttype import ObjectType
from .typemap import TypeMap, is_graphene_type
@ -20,6 +22,9 @@ class Schema(GraphQLSchema):
def __init__(self, query=None, mutation=None, subscription=None,
directives=None, types=None, auto_camelcase=True):
assert inspect.isclass(query) and issubclass(query, ObjectType), (
'Schema query must be Object Type but got: {}.'
).format(query)
self._query = query
self._mutation = mutation
self._subscription = subscription
@ -77,7 +82,10 @@ class Schema(GraphQLSchema):
return graphql(self, *args, **kwargs)
def introspect(self):
return self.execute(introspection_query).data
instrospection = self.execute(introspection_query)
if instrospection.errors:
raise instrospection.errors[0]
return instrospection.data
def __str__(self):
return print_schema(self)
@ -94,4 +102,4 @@ class Schema(GraphQLSchema):
]
if self.types:
initial_types += self.types
self._type_map = TypeMap(initial_types, auto_camelcase=self.auto_camelcase)
self._type_map = TypeMap(initial_types, auto_camelcase=self.auto_camelcase, schema=self)


@ -1,4 +1,5 @@
from .unmountedtype import UnmountedType
from .utils import get_type
class Structure(UnmountedType):
@ -18,7 +19,11 @@ class Structure(UnmountedType):
cls_name,
of_type_name,
))
self.of_type = of_type
self._of_type = of_type
@property
def of_type(self):
return get_type(self._of_type)
def get_type(self):
'''


@ -1,4 +1,5 @@
import pytest
from functools import partial
from ..argument import Argument, to_arguments
from ..field import Field
@ -61,3 +62,15 @@ def test_to_arguments_raises_if_inputfield():
to_arguments(args)
assert str(exc_info.value) == 'Expected arg_string to be Argument, but received InputField. Try using Argument(String).'
def test_argument_with_lazy_type():
MyType = object()
arg = Argument(lambda: MyType)
assert arg.type == MyType
def test_argument_with_lazy_partial_type():
MyType = object()
arg = Argument(partial(lambda: MyType))
assert arg.type == MyType


@ -111,3 +111,52 @@ def test_enum_value_as_unmounted_argument():
unmounted_field = unmounted.Argument()
assert isinstance(unmounted_field, Argument)
assert unmounted_field.type == RGB
def test_enum_can_be_compared():
class RGB(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert RGB.RED == 1
assert RGB.GREEN == 2
assert RGB.BLUE == 3
def test_enum_can_be_initialzied():
class RGB(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert RGB.get(1) == RGB.RED
assert RGB.get(2) == RGB.GREEN
assert RGB.get(3) == RGB.BLUE
def test_enum_can_retrieve_members():
class RGB(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert RGB['RED'] == RGB.RED
assert RGB['GREEN'] == RGB.GREEN
assert RGB['BLUE'] == RGB.BLUE
def test_enum_to_enum_comparison_should_differ():
class RGB1(Enum):
RED = 1
GREEN = 2
BLUE = 3
class RGB2(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert RGB1.RED != RGB2.RED
assert RGB1.GREEN != RGB2.GREEN
assert RGB1.BLUE != RGB2.BLUE


@ -1,9 +1,11 @@
import pytest
from functools import partial
from ..argument import Argument
from ..field import Field
from ..structures import NonNull
from ..scalars import String
from .utils import MyLazyType
class MyInstance(object):
@ -66,6 +68,17 @@ def test_field_with_lazy_type():
assert field.type == MyType
def test_field_with_lazy_partial_type():
MyType = object()
field = Field(partial(lambda: MyType))
assert field.type == MyType
def test_field_with_string_type():
field = Field("graphene.types.tests.utils.MyLazyType")
assert field.type == MyLazyType
def test_field_not_source_and_resolver():
MyType = object()
with pytest.raises(Exception) as exc_info:


@ -0,0 +1,96 @@
from ..generic import GenericScalar
from ..objecttype import ObjectType
from ..schema import Schema
class Query(ObjectType):
generic = GenericScalar(input=GenericScalar())
def resolve_generic(self, args, context, info):
input = args.get('input')
return input
schema = Schema(query=Query)
def test_generic_query_variable():
for generic_value in [
1,
1.1,
True,
'str',
[1, 2, 3],
[1.1, 2.2, 3.3],
[True, False],
['str1', 'str2'],
{
'key_a': 'a',
'key_b': 'b'
},
{
'int': 1,
'float': 1.1,
'boolean': True,
'string': 'str',
'int_list': [1, 2, 3],
'float_list': [1.1, 2.2, 3.3],
'boolean_list': [True, False],
'string_list': ['str1', 'str2'],
'nested_dict': {
'key_a': 'a',
'key_b': 'b'
}
},
None
]:
result = schema.execute(
'''query Test($generic: GenericScalar){ generic(input: $generic) }''',
variable_values={'generic': generic_value}
)
assert not result.errors
assert result.data == {
'generic': generic_value
}
def test_generic_parse_literal_query():
result = schema.execute(
'''
query {
generic(input: {
int: 1,
float: 1.1
boolean: true,
string: "str",
int_list: [1, 2, 3],
float_list: [1.1, 2.2, 3.3],
boolean_list: [true, false]
string_list: ["str1", "str2"],
nested_dict: {
key_a: "a",
key_b: "b"
},
empty_key: undefined
})
}
'''
)
assert not result.errors
assert result.data == {
'generic': {
'int': 1,
'float': 1.1,
'boolean': True,
'string': 'str',
'int_list': [1, 2, 3],
'float_list': [1.1, 2.2, 3.3],
'boolean_list': [True, False],
'string_list': ['str1', 'str2'],
'nested_dict': {
'key_a': 'a',
'key_b': 'b'
},
'empty_key': None
}
}


@ -0,0 +1,30 @@
import pytest
from functools import partial
from ..inputfield import InputField
from ..structures import NonNull
from .utils import MyLazyType
def test_inputfield_required():
MyType = object()
field = InputField(MyType, required=True)
assert isinstance(field.type, NonNull)
assert field.type.of_type == MyType
def test_inputfield_with_lazy_type():
MyType = object()
field = InputField(lambda: MyType)
assert field.type == MyType
def test_inputfield_with_lazy_partial_type():
MyType = object()
field = InputField(partial(lambda: MyType))
assert field.type == MyType
def test_inputfield_with_string_type():
field = InputField("graphene.types.tests.utils.MyLazyType")
assert field.type == MyLazyType


@ -173,3 +173,38 @@ def test_objecttype_container_benchmark(benchmark):
@benchmark
def create_objecttype():
Container(field1='field1', field2='field2')
def test_generate_objecttype_description():
class MyObjectType(ObjectType):
'''
Documentation
Documentation line 2
'''
assert MyObjectType._meta.description == "Documentation\n\nDocumentation line 2"
def test_objecttype_with_possible_types():
class MyObjectType(ObjectType):
class Meta:
possible_types = (dict, )
assert MyObjectType._meta.possible_types == (dict, )
def test_objecttype_with_possible_types_and_is_type_of_should_raise():
with pytest.raises(AssertionError) as excinfo:
class MyObjectType(ObjectType):
class Meta:
possible_types = (dict, )
@classmethod
def is_type_of(cls, root, context, info):
return False
assert str(excinfo.value) == (
'MyObjectType.Meta.possible_types will cause type collision with '
'MyObjectType.is_type_of. Please use one or other.'
)


@ -0,0 +1,48 @@
import pytest
from ..resolver import attr_resolver, dict_resolver, get_default_resolver, set_default_resolver
args = {}
context = None
info = None
demo_dict = {
'attr': 'value'
}
class demo_obj(object):
attr = 'value'
def test_attr_resolver():
resolved = attr_resolver('attr', None, demo_obj, args, context, info)
assert resolved == 'value'
def test_attr_resolver_default_value():
resolved = attr_resolver('attr2', 'default', demo_obj, args, context, info)
assert resolved == 'default'
def test_dict_resolver():
resolved = dict_resolver('attr', None, demo_dict, args, context, info)
assert resolved == 'value'
def test_dict_resolver_default_value():
resolved = dict_resolver('attr2', 'default', demo_dict, args, context, info)
assert resolved == 'default'
def test_get_default_resolver_is_attr_resolver():
assert get_default_resolver() == attr_resolver
def test_set_default_resolver_works():
default_resolver = get_default_resolver()
set_default_resolver(dict_resolver)
assert get_default_resolver() == dict_resolver
set_default_resolver(default_resolver)

View File

@ -1,7 +1,9 @@
import pytest
from functools import partial
from ..structures import List, NonNull
from ..scalars import String
from .utils import MyLazyType
def test_list():
@ -17,6 +19,23 @@ def test_list_with_unmounted_type():
assert str(exc_info.value) == 'List could not have a mounted String() as inner type. Try with List(String).'
def test_list_with_lazy_type():
MyType = object()
field = List(lambda: MyType)
assert field.of_type == MyType
def test_list_with_lazy_partial_type():
MyType = object()
field = List(partial(lambda: MyType))
assert field.of_type == MyType
def test_list_with_string_type():
field = List("graphene.types.tests.utils.MyLazyType")
assert field.of_type == MyLazyType
def test_list_inherited_works_list():
_list = List(List(String))
assert isinstance(_list.of_type, List)
@ -35,6 +54,23 @@ def test_nonnull():
assert str(nonnull) == 'String!'
def test_nonnull_with_lazy_type():
MyType = object()
field = NonNull(lambda: MyType)
assert field.of_type == MyType
def test_nonnull_with_lazy_partial_type():
MyType = object()
field = NonNull(partial(lambda: MyType))
assert field.of_type == MyType
def test_nonnull_with_string_type():
field = NonNull("graphene.types.tests.utils.MyLazyType")
assert field.of_type == MyLazyType
def test_nonnull_inherited_works_list():
_list = NonNull(List(String))
assert isinstance(_list.of_type, List)

View File

@ -183,3 +183,18 @@ def test_objecttype_camelcase_disabled():
assert foo_field.args == {
'bar_foo': GraphQLArgument(GraphQLString, out_name='bar_foo')
}
def test_objecttype_with_possible_types():
class MyObjectType(ObjectType):
'''Description'''
class Meta:
possible_types = (dict, )
foo_bar = String()
typemap = TypeMap([MyObjectType])
graphql_type = typemap['MyObjectType']
assert graphql_type.is_type_of
assert graphql_type.is_type_of({}, None, None) is True
assert graphql_type.is_type_of(MyObjectType(), None, None) is False

View File

@ -0,0 +1 @@
MyLazyType = object()

View File

@ -21,6 +21,7 @@ from .field import Field
from .inputobjecttype import InputObjectType
from .interface import Interface
from .objecttype import ObjectType
from .resolver import get_default_resolver
from .scalars import ID, Boolean, Float, Int, Scalar, String
from .structures import List, NonNull
from .union import Union
@ -43,16 +44,23 @@ def resolve_type(resolve_type_func, map, type_name, root, context, info):
if inspect.isclass(_type) and issubclass(_type, ObjectType):
graphql_type = map.get(_type._meta.name)
assert graphql_type and graphql_type.graphene_type == _type
assert graphql_type and graphql_type.graphene_type == _type, (
'The type {} does not match with the associated graphene type {}.'
).format(_type, graphql_type.graphene_type)
return graphql_type
return _type
def is_type_of_from_possible_types(possible_types, root, context, info):
return isinstance(root, possible_types)
class TypeMap(GraphQLTypeMap):
def __init__(self, types, auto_camelcase=True):
def __init__(self, types, auto_camelcase=True, schema=None):
self.auto_camelcase = auto_camelcase
self.schema = schema
super(TypeMap, self).__init__(types)
def reducer(self, map, type):
@ -70,23 +78,31 @@ class TypeMap(GraphQLTypeMap):
if type._meta.name in map:
_type = map[type._meta.name]
if isinstance(_type, GrapheneGraphQLType):
assert _type.graphene_type == type
assert _type.graphene_type == type, (
'Found different types with the same name in the schema: {}, {}.'
).format(_type.graphene_type, type)
return map
if issubclass(type, ObjectType):
return self.construct_objecttype(map, type)
if issubclass(type, InputObjectType):
return self.construct_inputobjecttype(map, type)
if issubclass(type, Interface):
return self.construct_interface(map, type)
if issubclass(type, Scalar):
return self.construct_scalar(map, type)
if issubclass(type, Enum):
return self.construct_enum(map, type)
if issubclass(type, Union):
return self.construct_union(map, type)
return map
internal_type = self.construct_objecttype(map, type)
elif issubclass(type, InputObjectType):
internal_type = self.construct_inputobjecttype(map, type)
elif issubclass(type, Interface):
internal_type = self.construct_interface(map, type)
elif issubclass(type, Scalar):
internal_type = self.construct_scalar(map, type)
elif issubclass(type, Enum):
internal_type = self.construct_enum(map, type)
elif issubclass(type, Union):
internal_type = self.construct_union(map, type)
else:
raise Exception("Expected Graphene type, but received: {}.".format(type))
return GraphQLTypeMap.reducer(map, internal_type)
def construct_scalar(self, map, type):
# We have a mapping to the original GraphQL types
# so there are no collisions.
_scalars = {
String: GraphQLString,
Int: GraphQLInt,
@ -95,18 +111,17 @@ class TypeMap(GraphQLTypeMap):
ID: GraphQLID
}
if type in _scalars:
map[type._meta.name] = _scalars[type]
else:
map[type._meta.name] = GrapheneScalarType(
graphene_type=type,
name=type._meta.name,
description=type._meta.description,
return _scalars[type]
serialize=getattr(type, 'serialize', None),
parse_value=getattr(type, 'parse_value', None),
parse_literal=getattr(type, 'parse_literal', None),
)
return map
return GrapheneScalarType(
graphene_type=type,
name=type._meta.name,
description=type._meta.description,
serialize=getattr(type, 'serialize', None),
parse_value=getattr(type, 'parse_value', None),
parse_literal=getattr(type, 'parse_literal', None),
)
def construct_enum(self, map, type):
values = OrderedDict()
@ -117,92 +132,104 @@ class TypeMap(GraphQLTypeMap):
description=getattr(value, 'description', None),
deprecation_reason=getattr(value, 'deprecation_reason', None)
)
map[type._meta.name] = GrapheneEnumType(
return GrapheneEnumType(
graphene_type=type,
values=values,
name=type._meta.name,
description=type._meta.description,
)
return map
def construct_objecttype(self, map, type):
if type._meta.name in map:
_type = map[type._meta.name]
if isinstance(_type, GrapheneGraphQLType):
assert _type.graphene_type == type
return map
map[type._meta.name] = GrapheneObjectType(
assert _type.graphene_type == type, (
'Found different types with the same name in the schema: {}, {}.'
).format(_type.graphene_type, type)
return _type
def interfaces():
interfaces = []
for interface in type._meta.interfaces:
self.graphene_reducer(map, interface)
internal_type = map[interface._meta.name]
assert internal_type.graphene_type == interface
interfaces.append(internal_type)
return interfaces
if type._meta.possible_types:
is_type_of = partial(is_type_of_from_possible_types, type._meta.possible_types)
else:
is_type_of = type.is_type_of
return GrapheneObjectType(
graphene_type=type,
name=type._meta.name,
description=type._meta.description,
fields=None,
is_type_of=type.is_type_of,
interfaces=None
fields=partial(self.construct_fields_for_type, map, type),
is_type_of=is_type_of,
interfaces=interfaces
)
interfaces = []
for i in type._meta.interfaces:
map = self.reducer(map, i)
interfaces.append(map[i._meta.name])
map[type._meta.name]._provided_interfaces = interfaces
map[type._meta.name]._fields = self.construct_fields_for_type(map, type)
# self.reducer(map, map[type._meta.name])
return map
def construct_interface(self, map, type):
if type._meta.name in map:
_type = map[type._meta.name]
if isinstance(_type, GrapheneInterfaceType):
assert _type.graphene_type == type, (
'Found different types with the same name in the schema: {}, {}.'
).format(_type.graphene_type, type)
return _type
_resolve_type = None
if type.resolve_type:
_resolve_type = partial(resolve_type, type.resolve_type, map, type._meta.name)
map[type._meta.name] = GrapheneInterfaceType(
return GrapheneInterfaceType(
graphene_type=type,
name=type._meta.name,
description=type._meta.description,
fields=None,
fields=partial(self.construct_fields_for_type, map, type),
resolve_type=_resolve_type,
)
map[type._meta.name]._fields = self.construct_fields_for_type(map, type)
# self.reducer(map, map[type._meta.name])
return map
def construct_inputobjecttype(self, map, type):
map[type._meta.name] = GrapheneInputObjectType(
return GrapheneInputObjectType(
graphene_type=type,
name=type._meta.name,
description=type._meta.description,
fields=None,
fields=partial(self.construct_fields_for_type, map, type, is_input_type=True),
)
map[type._meta.name]._fields = self.construct_fields_for_type(map, type, is_input_type=True)
return map
def construct_union(self, map, type):
_resolve_type = None
if type.resolve_type:
_resolve_type = partial(resolve_type, type.resolve_type, map, type._meta.name)
types = []
for i in type._meta.types:
map = self.construct_objecttype(map, i)
types.append(map[i._meta.name])
map[type._meta.name] = GrapheneUnionType(
def types():
union_types = []
for objecttype in type._meta.types:
self.graphene_reducer(map, objecttype)
internal_type = map[objecttype._meta.name]
assert internal_type.graphene_type == objecttype
union_types.append(internal_type)
return union_types
return GrapheneUnionType(
graphene_type=type,
name=type._meta.name,
types=types,
resolve_type=_resolve_type,
)
map[type._meta.name].types = types
return map
def get_name(self, name):
if self.auto_camelcase:
return to_camel_case(name)
return name
def default_resolver(self, attname, default_value, root, *_):
return getattr(root, attname, default_value)
def construct_fields_for_type(self, map, type, is_input_type=False):
fields = OrderedDict()
for name, field in type._meta.fields.items():
if isinstance(field, Dynamic):
field = get_field_as(field.get_type(), _as=Field)
field = get_field_as(field.get_type(self.schema), _as=Field)
if not field:
continue
map = self.reducer(map, field.type)
@ -257,13 +284,12 @@ class TypeMap(GraphQLTypeMap):
if resolver:
return get_unbound_function(resolver)
return partial(self.default_resolver, name, default_value)
default_resolver = type._meta.default_resolver or get_default_resolver()
return partial(default_resolver, name, default_value)
def get_field_type(self, map, type):
if isinstance(type, List):
return GraphQLList(self.get_field_type(map, type.of_type))
if isinstance(type, NonNull):
return GraphQLNonNull(self.get_field_type(map, type.of_type))
if inspect.isfunction(type):
type = type()
return map.get(type._meta.name)

View File

@ -1,6 +1,7 @@
import six
from ..utils.is_base_type import is_base_type
from ..utils.trim_docstring import trim_docstring
from .options import Options
from .unmountedtype import UnmountedType
@ -16,7 +17,7 @@ class UnionMeta(type):
options = Options(
attrs.pop('Meta', None),
name=name,
description=attrs.get('__doc__'),
description=trim_docstring(attrs.get('__doc__')),
types=(),
)

View File

@ -1,5 +1,9 @@
import inspect
from collections import OrderedDict
from functools import partial
from six import string_types
from ..utils.module_loading import import_string
from .mountedtype import MountedType
from .unmountedtype import UnmountedType
@ -62,3 +66,11 @@ def yank_fields_from_attrs(attrs, _as=None, delete=True, sort=True):
if sort:
fields_with_names = sorted(fields_with_names, key=lambda f: f[1])
return OrderedDict(fields_with_names)
def get_type(_type):
if isinstance(_type, string_types):
return import_string(_type)
if inspect.isfunction(_type) or type(_type) is partial:
return _type()
return _type
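As a quick illustration of the new get_type helper, the sketch below is an editorial aside rather than part of this commit; the import paths graphene.types.utils.get_type and graphene.types.tests.utils.MyLazyType are assumed from the files touched above.

from functools import partial

from graphene.types.tests.utils import MyLazyType  # fixture added in this commit
from graphene.types.utils import get_type

# A dotted-path string is resolved through import_string.
assert get_type("graphene.types.tests.utils.MyLazyType") is MyLazyType
# Plain functions and functools.partial objects are called lazily.
assert get_type(lambda: MyLazyType) is MyLazyType
assert get_type(partial(lambda: MyLazyType)) is MyLazyType
# Anything else is returned unchanged.
assert get_type(MyLazyType) is MyLazyType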

View File

@ -0,0 +1,44 @@
from functools import partial
from importlib import import_module
def import_string(dotted_path, dotted_attributes=None):
"""
Import a dotted module path and return the attribute/class designated by the
last name in the path. When a dotted attribute path is also provided, it is
applied to the attribute/class retrieved in the first step, and the value
designated by that attribute path is returned. Raise ImportError if the
import failed.
"""
try:
module_path, class_name = dotted_path.rsplit('.', 1)
except ValueError:
raise ImportError("%s doesn't look like a module path" % dotted_path)
module = import_module(module_path)
try:
result = getattr(module, class_name)
except AttributeError:
raise ImportError('Module "%s" does not define a "%s" attribute/class' % (
module_path, class_name)
)
if not dotted_attributes:
return result
else:
attributes = dotted_attributes.split('.')
traveled_attributes = []
try:
for attribute in attributes:
traveled_attributes.append(attribute)
result = getattr(result, attribute)
return result
except AttributeError:
raise ImportError('Module "%s" does not define a "%s" attribute inside attribute/class "%s"' % (
module_path, '.'.join(traveled_attributes), class_name
))
def lazy_import(dotted_path, dotted_attributes=None):
return partial(import_string, dotted_path, dotted_attributes)
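A short usage sketch of import_string and lazy_import follows; it is an editorial aside mirroring the tests below, and the module path graphene.utils.module_loading is assumed from those tests' relative imports.

from graphene import String
from graphene.utils.module_loading import import_string, lazy_import

# Resolve a dotted module path, optionally followed by an attribute path.
assert import_string('graphene.String') is String
ObjectTypeMeta = import_string('graphene.ObjectType', '__class__')

# lazy_import only wraps the arguments in a partial; the import happens on call.
deferred = lazy_import('graphene.String')
assert deferred() is String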

View File

@ -0,0 +1,57 @@
from pytest import raises
from graphene import String
from graphene.types.objecttype import ObjectTypeMeta
from ..module_loading import lazy_import, import_string
def test_import_string():
MyString = import_string('graphene.String')
assert MyString == String
MyObjectTypeMeta = import_string('graphene.ObjectType', '__class__')
assert MyObjectTypeMeta == ObjectTypeMeta
def test_import_string_module():
with raises(Exception) as exc_info:
import_string('graphenea')
assert str(exc_info.value) == 'graphenea doesn\'t look like a module path'
def test_import_string_class():
with raises(Exception) as exc_info:
import_string('graphene.Stringa')
assert str(exc_info.value) == 'Module "graphene" does not define a "Stringa" attribute/class'
def test_import_string_attributes():
with raises(Exception) as exc_info:
import_string('graphene.String', 'length')
assert str(exc_info.value) == 'Module "graphene" does not define a "length" attribute inside attribute/class ' \
'"String"'
with raises(Exception) as exc_info:
import_string('graphene.ObjectType', '__class__.length')
assert str(exc_info.value) == 'Module "graphene" does not define a "__class__.length" attribute inside ' \
'attribute/class "ObjectType"'
with raises(Exception) as exc_info:
import_string('graphene.ObjectType', '__classa__.__base__')
assert str(exc_info.value) == 'Module "graphene" does not define a "__classa__" attribute inside attribute/class ' \
'"ObjectType"'
def test_lazy_import():
f = lazy_import('graphene.String')
MyString = f()
assert MyString == String
f = lazy_import('graphene.ObjectType', '__class__')
MyObjectTypeMeta = f()
assert MyObjectTypeMeta == ObjectTypeMeta

View File

@ -0,0 +1,21 @@
from ..trim_docstring import trim_docstring
def test_trim_docstring():
class WellDocumentedObject(object):
"""
This object is very well-documented. It has multiple lines in its
description.

Multiple paragraphs too
"""
pass
assert (trim_docstring(WellDocumentedObject.__doc__) ==
"This object is very well-documented. It has multiple lines in its\n"
"description.\n\nMultiple paragraphs too")
class UndocumentedObject(object):
pass
assert trim_docstring(UndocumentedObject.__doc__) is None

View File

@ -0,0 +1,9 @@
import inspect
def trim_docstring(docstring):
# Cleans up whitespace from an indented docstring
#
# See https://www.python.org/dev/peps/pep-0257/
# and https://docs.python.org/2/library/inspect.html#inspect.cleandoc
return inspect.cleandoc(docstring) if docstring else None
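For illustration (not part of this commit), trim_docstring behaves like inspect.cleandoc on an indented docstring and passes None through unchanged; the path graphene.utils.trim_docstring is assumed from the test's relative import.

from graphene.utils.trim_docstring import trim_docstring

class Example(object):
    """
    First line of the description,
    continued on a second line.

    A second paragraph.
    """

# Leading/trailing blank lines and common indentation are removed.
assert trim_docstring(Example.__doc__) == (
    "First line of the description,\ncontinued on a second line.\n\nA second paragraph."
)
# Objects without a docstring simply yield None.
assert trim_docstring(None) is None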

View File

@ -41,6 +41,7 @@ tests_require = [
'pytest>=2.7.2',
'pytest-benchmark',
'pytest-cov',
'snapshottest',
'coveralls',
'six',
'mock',
@ -81,9 +82,9 @@ setup(
install_requires=[
'six>=1.10.0',
'graphql-core>=1.0.1',
'graphql-core>=1.1',
'graphql-relay>=0.4.5',
'promise>=1.0.1',
'promise>=2.0',
],
tests_require=tests_require,
extras_require={

View File

@ -5,7 +5,8 @@ skipsdist = true
[testenv]
deps=
pytest>=2.7.2
graphql-core>=1.0.1
graphql-core>=1.1
promise>=2.0
graphql-relay>=0.4.5
six
blinker