mirumee / ariadne

Python library for implementing GraphQL servers using schema-first approach.

Home Page: https://ariadnegraphql.org
License: BSD 3-Clause "New" or "Revised" License
Python variables are usually named using snake_case, whereas the majority of JS written today uses camelCase. Ariadne will need to provide a way to resolve one to the other.
The idea is to create a resolve_to function taking a single name argument and returning a resolver for the specified attr/key name, enabling explicit mapping as such:
resolvers_map = {
    "User": {
        "lastVisitedOn": resolve_to("last_visited_on")
    }
}
Another advantage of having such a utility would be supporting arbitrary field names:
resolvers_map = {
    "User": {
        "lastVisitedOn": resolve_to("last_visit")
    }
}
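One possible sketch of such a resolve_to helper (the dict/attribute fallback shown here is an assumption, not a settled design):

```python
def resolve_to(name):
    """Return a resolver reading the given attr/key name from the parent object."""
    def resolver(obj, info, **kwargs):
        # Parents may be dicts (eg. raw query results) or objects (eg. models).
        if isinstance(obj, dict):
            return obj.get(name)
        return getattr(obj, name, None)
    return resolver
```

A resolver built this way can then be dropped into the resolvers map exactly as in the examples above.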
Hi, I keep getting a 405 when using the example code, and I'm wondering if it's a CORS issue, since my frontend and backend are running on different ports. If so, how do I enable CORS with Ariadne?
Currently we require schema to be explicitly declared:
schema {
    query: Query
    mutation: Mutation
}
This goes against the grain when it comes to GraphQL libraries and guides, which overwhelmingly assume that the schema type is implicit and can be, but doesn't have to be, declared as long as either Query or Mutation exists.
Currently, syntax errors in SDL are raised by graphql(), which itself is called deep in GraphQL server initialization logic. This makes it trickier to trace them back to invalid type definitions, especially in complex projects.
Ariadne could provide a gql() utility function that would allow developers to mark pieces of code as SDL and validate those SDL strings on script execution, raising exceptions with a traceback pointing to the point of definition, making those easier to find and debug:
from ariadne import gql
type_defs = gql("""
    type User {
        username: String!
        posts: [Post!]
        threads: [Thread!]
        follows [User!]
        likes: [Post!]
    }
""")
The above code would raise GraphQLSyntaxError (here, because of the missing colon after follows) with a traceback pointing to type_defs, instead of somewhere in GraphQL server initialization logic.
This approach adds one more advantage: if we know that strings wrapped with gql() are SDL, we can develop a pattern for GraphQL syntax highlighters in IDEs to use to color GraphQL syntax in Python projects.
Lastly, gql() should pass the strings through unchanged, because our current modularization implementation is very simple: internally it joins strings using \n as glue. We'd rather avoid reimplementing it as a custom mechanism operating on ASTs.
Hi, it's a very useful library, thanks.
Do you plan to implement support for extending a schema, and documentation for it?
In my case there is a default schema and extensions. I need to extend the schema with some extensions according to some rule.
Now that Ariadne requires Python 3.6 or greater, we can replace our assertions with built-ins like assert_called_once.
The schema object returned by graphql-core after parsing SDL and putting it through build_ast_schema has no resolvers. This is by design, as graphql-core is a Python reimplementation of the GraphQL reference implementation, graphql-js.
Apollo-Server gets around this by implementing the addResolveFunctionsToSchema utility function that walks through the schema and assigns resolver functions to it from a resolvers map.
Possibly related to #91, I'd love to see what an implementation of SQLAlchemy bindings might look like. Or perhaps bindings isn't the right term; I'm thinking of how graphene-sqlalchemy works. Should I be doing this model<->resolver mapping myself?
Thank you!
Hi,
What is the difference between Ariadne and Graphene (python-graphql)?
And can we use Graphene-Django in Saleor?
Currently Ariadne supports enums that are represented as strings mapped 1:1 to what they are in SDL. This should take care of the majority of use cases, but there may be edge cases and legacy systems where an enum is defined for an existing value that may be an integer, and updating the Python implementation is out of the question.
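One possible shape for such a binding (the EnumType name and API here are assumptions, not a committed design) would map SDL member names to arbitrary existing Python values:

```python
class EnumType:
    """Hypothetical binding mapping SDL enum members to existing Python values."""

    def __init__(self, name, values):
        self.name = name
        self.values = dict(values)

    def resolve_value(self, member):
        # Translate an SDL member name (eg. "PAID") to its internal value,
        # which may be an integer in a legacy system.
        return self.values[member]

order_status = EnumType("OrderStatus", {"NEW": 0, "PAID": 1, "SHIPPED": 2})
```

The string-to-string default would remain the common path; this only covers systems where the internal values can't be changed.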
The bulk of the logic living in GraphQLMiddleware could be extracted and reused in server implementations like django_ariadne.
We should have release notes in our docs that provide information about new features and how to upgrade from previous versions.
This task is the opposite of #21, eg:
type_defs = """
input UserInput {
    contactEmail: String
}
"""
resolvers = {
    "UserInput": {
        "contactEmail": resolve_to('contact_email'),
    },
}
Alternatively, we would like to explore converting inputs from dicts to custom data objects.
We could provide a utility function that would take a filesystem path, find all .graphql files under it, validate each file, and concatenate them into a single SDL string. That way users could easily use a single schema definition for both the GraphQL API and frontend tools.
We originally planned to have a separate django_ariadne repo to keep bindings for Django, but we've discussed this internally and decided to give ariadne.contrib.django a try instead. This has the advantage of keeping all features in a single repo, but it may be trickier to test if we add more bindings for other projects to the repo.
If the "application/graphql" Content-Type header is present, treat the HTTP POST body contents as the GraphQL query string.
Due to the lack of support for variable type annotations, I suggest dropping support for Python 3.5. This is already a problem in #30, where either the code or mypy fails and the only solution is to remove the hints.
We might also consider testing Ariadne on 3.7 to future-proof the project.
We've changed the API for our server utils in #100, requiring users to create the executable schema on their own outside of those. The readme example should be updated to reflect this.
Ariadne is based on the idea of the schema SDL being the source of truth, so it is not generated from Python code the way Graphene does it. There are very good reasons why we decided to build it this way, but one of the few inevitable downsides of this decision is the problem of mapping input and output data typing, validated by the GraphQL abstraction, to its internal representation in business logic.
This might not seem like a big deal at first, but a few problems arise quickly in service development:
Example scenario no. 1: a mutation input type is extended with a new field, leading to data being ignored or to inconsistent runtime errors.
Example scenario no. 2: an enumeration type in an interface is updated; can you handle the new/changed value, and have you covered the new branches in your logic?
So, what solutions do we have? We would like to make use of modern language concepts, like dataclasses and type hints, to enforce type safety. After discussing it internally, these appeared to be the main candidates:
- Schema.get_type("SomeEnum"), or exposing all types through other means, like import hooks
This is a complex issue with no obvious solution visible, so I would like us to discuss the pros and cons of each one, and maybe even come up with yet another one if needed. Keep in mind there is no need to find a single champion: there might be no need for any enhancement, or there may be a place for multiple complementary mechanisms.
We should have a test harness to see how the context passed to graphql() gets changed and passed around to resolvers.
def test_context_is_passed_as_is_from_graphql_to_resolver():
    context = {"request": "https://example.com/graphql/"}

    def resolve_field_using_context(_, info):
        assert info.context is context
        return True

    # run test query against field using context
Hey there! I ported the custom server example from django to flask.
It's quite short and it would be nice to add it to the docs.
from ariadne import gql, ResolverMap, make_executable_schema
from ariadne.constants import PLAYGROUND_HTML
from graphql import format_error, graphql_sync
from flask import request, jsonify
from flask.views import MethodView
type_defs = gql("""
    type Query {
        hello: String!
    }
""")
query = ResolverMap("Query")
@query.field("hello")
def resolve_hello(_, info):
    request = info.context
    print(request.headers)
    user_agent = request.headers.get("User-Agent", "Guest")
    return "Hello, %s!" % user_agent
schema = make_executable_schema(type_defs, query)
class GraphQlView(MethodView):
    def get(self):
        return PLAYGROUND_HTML, 200

    def post(self):
        data = request.get_json()
        if data is None or not isinstance(data, dict):
            return 'Bad Request', 400
        variables = data.get('variables')
        if variables and not isinstance(variables, dict):
            return 'Bad Request', 400
        # Note: Passing the request to the context is optional. In Flask, the
        # current request is always accessible as flask.request.
        result = graphql_sync(
            schema,
            data.get('query'),
            context_value=request,
            variable_values=variables,
            operation_name=data.get('operationName')
        )
        response = {"data": result.data}
        if result.errors:
            response["errors"] = [format_error(e) for e in result.errors]
        return jsonify(response)
The Apollo doc for the default resolver says that if field_name resolves to a function, it will be called with the query arguments:
"Calls a function on obj with the relevant field name and passes the query arguments into that function"
This can be useful in situations when the parent resolver returned an object with getter functions.
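A sketch of what an Apollo-style default resolver could look like on the Ariadne side (the exact lookup behaviour here is an assumption based on the quoted doc, not Ariadne's current implementation):

```python
def default_resolver(obj, info, **kwargs):
    """Resolve info.field_name from obj; call it with query args if callable."""
    if isinstance(obj, dict):
        value = obj.get(info.field_name)
    else:
        value = getattr(obj, info.field_name, None)
    if callable(value):
        # The field resolved to a getter function: pass the query arguments in.
        return value(**kwargs)
    return value
```

With this in place, a parent resolver could return an object whose getters receive the field's query arguments directly.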
Starlette is a lightweight framework for implementing asynchronous APIs/apps in Python. It already has GraphQL support (via Graphene), but the feature and its dependencies are opt-in, leaving space for an alternative GraphQL implementation in the project.
We've discussed this with @tomchristie, who leads work on Starlette, and the path is open for us to introduce built-in support for Ariadne in Starlette via a PR. This issue is for tracking a possible ariadne.contrib.starlette module/package that would provide the necessary bindings on our side.
Debugging resolver exceptions can become quite difficult, as they are caught by Ariadne, and I found no reasonable way to get a traceback to where the exception is happening.
# Execute the query
result = graphql_sync(
    schema,
    data.get("query"),
    context_value=request,  # expose request as info.context
    variable_values=data.get("variables"),
    operation_name=data.get("operationName"),
)
log.exception('', exc_info=result.errors[0].original_error)
Output:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/graphql/execution/execute.py", line 664, in complete_value_catching_error
return_type, field_nodes, info, path, result
File "/usr/local/lib/python3.6/site-packages/graphql/execution/execute.py", line 731, in complete_value
raise result
graphql.error.graphql_error.GraphQLError: My exception
If I catch it manually in the resolver and do the log output there:
@query.field("hello")
def resolve_hello(_, info):
    try:
        raise Exception('My exception')
        return "Hello, %s!" % 'asdf'
    except Exception as e:
        log.exception('', exc_info=e)
        raise e
it gets a bit better: I get the file and line number of the error, but still quite a limited traceback:
Traceback (most recent call last):
File "/app/webapp/core/views.py", line 35, in resolve_hello
raise Exception('My exception')
Exception: My exception
I would suggest an option to switch on raising exceptions, eg. in development environments. Or is there already some mechanism for tracing errors in resolvers during development? I couldn't find anything in the docs.
Thanks
Getting started with Ariadne could be made even simpler by providing a shortcut function abstracting the GraphQLMiddleware away on first contact, saving users possible confusion about what they are really doing.
Ariadne 0.2 will introduce features that can be enhanced by the Apollo-GraphQL plugin for VSC. Project docs should demonstrate how to do that:
- gql for basic schema coloring
- apollo.config.js for Python
- gql suggestions
- gql file
- gql directory
Spec for interfaces: https://graphql.org/learn/schema/#interfaces
Spec for unions: https://graphql.org/learn/schema/#union-types
Apollo Server: https://www.apollographql.com/docs/apollo-server/features/unions-interfaces.html
Currently Ariadne relies on inheritance for customizing GraphQL behaviour. This could be replaced with a list of kwargs containing configuration options, eg:
app = GraphQL(
    schema,
    context=None,  # Optional[Any] or Callable
    root_resolver=None,  # Optional[Callable]
    error_handler=None,  # Optional[Callable]
    debug=False,
    validators=[
        validate_max_query_depth(10),
        validate_max_query_width(80),
        validate_max_query_cost(100),
    ],
)
Our current type for Resolver is Callable[..., Any], accepting any arguments and returning anything.
This is because it's currently impossible to type a Callable that accepts *args or **kwargs.
This issue is known to the MyPy authors (python/typing#264), but at the time of writing no solution is available.
This is related to #79
We'll need to set up Sphinx docs and start documenting our features, as well as providing guides for common use cases.
GraphQL supports item descriptions, but currently Ariadne provides no way to set those, and neither does the GraphQL-Core version we are using.
Ideally, we should provide two ways to set item descriptions:
- a description= kwarg to make_executable_schema & friends that would take a dict of dicts and override item descriptions based on it. We could read a special key (eg. __description) to get the description for a type. This approach should also support modularization.

Currently Ariadne's Scalar support is limited to serializing Python types to JSON before returning them to the client, but we also want to support using custom scalars for input.
Our add_resolve_functions_to_scalar utility could support the following use cases:
The code below results in a one-way-only scalar:
type_defs = {'Scalar': {'serialize': callable}}
And this code results in a two-way scalar, with explicit syntax for both directions:
type_defs = {'Scalar': {'serialize': callable, 'parse_value': callable, 'parse_literal': callable}}

This is the white whale of GraphQL server implementations: a way to detect maliciously constructed queries and reject their execution before even touching the data.
We haven't moved on to implementing anything yet, but the current idea for achieving this is creating a copy of the "real" schema, replacing all scalars with Int. We could then re-use the existing query execution logic to get a data structure containing the price of each individual field in the result. One potential issue I'm seeing here would be counting fields returning types.
The alternative approach would involve implementing a custom query executor which would traverse the fields defined in the query, keeping track of its position and depth, and counting the score on the fly. This approach may not be as scary as it appears, since we wouldn't need to implement the entire execution flow, just walk the fields in the query.
GraphQL-Core-Next already implements custom wrappers for graphql.execute: graphql() and graphql_sync().
Ariadne could replace those wrappers with custom ones, giving us proper control over query parsing, hooking in TRACE measurements, and custom validators or extensions.
Relevant official doc on input types.
We've discussed the way that resolvers currently work in Ariadne, and have reached the conclusion that there is space for improvement:
- a way to map sdlFieldName to python_field_name will be added

from ariadne import ResolverMap, start_simple_server
type_defs = """
    type User {
        username: String!
        firstName: String
        posts: [Post]!
        link: String
    }
"""
query = ResolverMap("Query")

@query.field("people")
def resolve_people(*_):
    return [
        {"first_name": "John", "last_name": "Doe", "years": 21},
        {"first_name": "Bob", "last_name": "Boberson", "years": 24},
    ]

person = ResolverMap("Person")
person.alias("age", "years")

@person.field("fullName")
def resolve_person_fullname(person, *_):
    return "%s %s" % (person["first_name"], person["last_name"])

start_simple_server(type_defs, [query, person])
In the above example, ResolverMap is used as a container for resolvers, providing additional features:
- ResolverMap validates that the schema defines the expected type and fields
- make_executable_schema delegates to dedicated resolver-handling classes
- the add_resolve_functions_to_schema utility (and all the complexity beneath it) is gone

Currently the Ariadne ResolverMap field attribute can be used both as a decorator and as a function:
@resolver_map.field("fieldname")
def some_resolver():
    ...

resolver_map.field("fieldname", resolver=some_resolver)
This name works when used as a decorator, but is confusing when the procedural approach is used instead. Because we have two uses for the same function, we also have a complex implementation and type annotations. Splitting field into two functions would greatly simplify things for us.
We would also have to check Scalar and update it accordingly (but perhaps use a set_ prefix?)
CC @patrys
The following idea was brought up in the discussion for #24:
"Maybe we could default to calling parse_value with ast.value when only one function is provided?"
This requires further study. IntValue, StringValue and friends are obvious to deal with, but complex types like ListValue may require some extra unpacking magic.
Still, if it is possible to pull off, it could be an excellent convenience for developers creating custom scalars, saving them from potentially maintaining two very similar implementations, one doing isinstance(value, str) and the other isinstance(value, StringValue).
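As a concrete illustration of the function pair involved, here is a sketch of a hypothetical Date scalar, where parse_literal delegates to parse_value exactly as proposed above (the AST node is assumed to expose its raw literal via a value attribute, as StringValue does):

```python
from datetime import date

def serialize_date(value):
    # Python date -> JSON-safe string sent to the client
    return value.isoformat()

def parse_date_value(value):
    # string from query variables -> Python date
    return date.fromisoformat(value)

def parse_date_literal(ast_node):
    # AST literal -> delegate to parse_value on the node's raw value;
    # this is the convenience the quoted suggestion would automate
    return parse_date_value(ast_node.value)
```

For scalar literals like strings and ints this delegation is mechanical; list and object literals are where the extra unpacking magic would be needed.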
Hello!
I have a question about resolving a nested field. I can't tell whether this is a bug or whether that functionality doesn't exist yet.
I have this schema and resolver:
from ariadne import ResolverMap, gql, start_simple_server, snake_case_fallback_resolvers
type_defs = """
    schema {
        query: RootQuery
    }

    type RootQuery {
        public: PublicEntryPoint
    }

    type PublicEntryPoint {
        users: [UserSchema]
    }

    type UserSchema {
        userId: Int
    }
"""
def resolve_users(obj, info):
    print('RESOLVING USERS!')
    return [{'user_id': 1}, {'user_id': 2}]
users_map = ResolverMap("PublicEntryPoint")
users_map.field("users", resolver=resolve_users)
start_simple_server(type_defs, [users_map, snake_case_fallback_resolvers])
But when I make the query to receive data
query {
    public {
        users {
            userId
        }
    }
}
I get
{
    "data": {
        "public": null
    }
}
And the resolver function is not called! Is this correct?
Thanks
The GraphQL spec describes a way of executing queries over GET, via the query= querystring parameter.
Quick research shows that this usage is supported by Apollo Server, but only for query operations. This protects it from 2005 happening all over again, with ?action=delete&target=123 link traps being posted around the internet, but it requires additional logic in Ariadne to detect and block mutations and subscriptions.
I'm not an expert in the python ecosystem, and am not sure what's required from me to support asynchronous resolvers in an Ariadne schema. If it's possible, I'd love a brief example illustrating a working async resolver!
What I've tried so far (using python 3.7):
from aiodataloader import DataLoader
from ariadne import ResolverMap, gql, start_simple_server
# DATA LOADERS
class HeroesLoader(DataLoader):
    async def batch_load_fn(self, keys):
        # batch_get_heroes is assumed to fetch all heroes for the given keys
        return await batch_get_heroes(keys)

heroesLoader = HeroesLoader()

# SCHEMA DEFINITION
type_defs = gql("""
    type Query {
        heroes: [Hero!]!
    }

    type Hero {
        name: String!
        shortName: String
        title: String
    }
""")
# RESOLVERS
query = ResolverMap("Query")
hero = ResolverMap("Hero")
@query.field("heroes")
async def resolve_heroes(*_):
    return await heroesLoader.load('_')
start_simple_server(type_defs, [query, hero])
The make_executable_schema utility should optionally take a list of dicts of dicts (AKA a "resolvers map"); this would allow larger projects to easily split and compose resolvers as needed:
from ariadne import make_executable_schema
from products.graphql import resolvers as products_resolvers
from users.graphql import resolvers as users_resolvers
typedefs = "..."
resolvers = [products_resolvers, users_resolvers]
schema = make_executable_schema(typedefs, resolvers)
This task will likely require #13 to be done first, so we are 100% certain that all resolver mappings are dicts.
Our initial readme should provide:
The make_executable_schema utility should take a string or list of strings with SDL type definitions, and a dict of dicts with resolvers as its second argument; parse each one, merge them, then, if resolvers are given, add them to the schema returned by the build_ast_schema util from graphql-core, and finally return the resulting executable schema that may then be passed to graphql().
GraphQL 2.1.0 is out, and brings quite a few goodies to the table:
It also changes the signatures of a few functions that we rely on, like graphql().
It is standard among GraphQL frameworks to ship a built-in GraphQL API explorer. We should consider adding one in Ariadne as well:
We should probably choose the second one, as it allows setting headers and keeps query history.