99designs / gqlgen
go generate based graphql server library
Home Page: https://gqlgen.com
License: MIT License
When we first started using gqlgen, if your Go model did not have a publicly accessible field matching a field specified in your schema, the generator would emit a resolver method for you to handle it manually.
// schema.graphql
schema {
    query: Query
}

type Query {
    cars: [Car]!
}

type Car {
    name: String
    model: String
    created: Time
    new: Boolean
}
// model.go
package model

import "time"

type Car struct {
    Name    string    `db:"name"`
    Model   string    `db:"model"`
    Created time.Time `db:"created"`
}
// types.json
{
"Car": "path/to/model.Car"
}
// generated.go
type Resolvers interface {
Query_cars(ctx context.Context) ([]model.Car, error)
Field_car_new(ctx context.Context, car model.Car) (bool, error) // This is missing
}
Unfortunately, this functionality disappeared when the model generator was introduced.
This was a really nice feature when your GraphQL model has fields that are computed by business logic. For the example above, something like:
// resolver.go
func (r *QueryResolver) Field_car_new(ctx context.Context, car model.Car) (bool, error) {
    weekAgo := time.Now().AddDate(0, 0, -7) // seven days ago
    return car.Created.After(weekAgo), nil
}
I understand the answer is probably to separate the database Go models from the GraphQL Go models, but for an example like this, where most of the fields are duplicated, the added overhead and complexity don't seem worth it.
example:

input A {
    id: Int!
}

input B {
    a: A!
}

type Mutation {
    C(b: B!): Boolean
}

type Query {
}

schema {
    mutation: Mutation
    query: Query
}
This is not strictly an issue, just a question about usage.
I saw in the doc you can point, through types.json, to a model struct and define it yourself.
I was wondering whether it's a good idea to point it at, and hence reuse, the same model structs we're already using for the domain logic.
The app already has a REST interface and we want to add a separate GraphQL interface to it, so it would be very handy not to have to duplicate the domain objects across the two layers.
Do you see any reasons not to?
Thanks
I added a custom enum type to types.json and used it in my GraphQL spec.
Expected: gqlgen uses the custom type instead of generating the enum into models_gen.go.
Actual: the enum code is generated into models_gen.go anyway, and a type mismatch error is reported:
type mismatch on Foo.state, expected <package>/foo.FooState got <package>/foo/custom.FooState
graph.graphql:

type Query {
    foo: Foo!
}

type Foo {
    state: FooState!
}

enum FooState {
    BIN
    BAR
    BAZ
}
types.json:
{
"FooState": "<package>/foo/custom.FooState"
}
custom/foo.go:
package custom
import (
"fmt"
"io"
"strconv"
)
type FooState string
const (
FooStateBin FooState = "BIN"
FooStateBar FooState = "BAR"
FooStateBaz FooState = "BAZ"
)
func (e FooState) IsValid() bool {
switch e {
case FooStateBin, FooStateBar, FooStateBaz:
return true
}
return false
}
func (e FooState) String() string {
return string(e)
}
func (e *FooState) UnmarshalGQL(v interface{}) error {
str, ok := v.(string)
if !ok {
return fmt.Errorf("enums must be strings")
}
*e = FooState(str)
if !e.IsValid() {
return fmt.Errorf("%s is not a valid FooState", str)
}
return nil
}
func (e FooState) MarshalGQL(w io.Writer) {
fmt.Fprint(w, strconv.Quote(e.String()))
}
Implementation could be as simple as returning a channel as the result:
func (r *Resolvers) Subscription_messageAdded(ctx context.Context, chatroomId int) (chan Message, error) {
result := make(chan Message)
go func() {
// keep emitting Message events until ctx.Done()
}()
return result, nil
}
As of version 0.12.3 the default style of describing a type or field is to add a string literal instead of a comment.
https://github.com/graphql/graphql-js/blob/master/src/utilities/extendSchema.js#L44
What I did: parsed a schema that uses string literals as descriptions and generated the Go files.
What happened: the generator crashed due to invalid syntax.
"This is a description."
schema {
query: Query
}
Results in unable to parse schema: graphql: syntax error: unexpected "\"This is a comment.\"", expecting Ident (line 1, column 1)
in codegen/models_build.go:69
Why aren't recursive type definitions supported? We need them.

input types work:

input R {
    id: Int
    items: [R]
}

type does not work:

type R {
    id: Int
    items: [R]
}
If a query contains both a non-aliased field and aliased fields for the same resolver, the aliased fields are unexpectedly omitted.
Here is a starwars example to reproduce.
{
    reviews(episode: EMPIRE) {
        stars
    }
    jediReviews: reviews(episode: JEDI) {
        stars
    }
}
Expected:

{
    "data": {
        "reviews": [],
        "jediReviews": []
    }
}

Actual:

{
    "data": {
        "reviews": []
    }
}
In addition, the resolver for the jediReviews field is never invoked.
I would expect that having a model defined as:
type Model struct {
    Interval uint32
}
and a schema
type Model {
    interval: Int!
}
and running:

gqlgen -typemap types.json -schema schema.graphql

would not produce this error:

type mismatch on Model.interval, expected int got uint32

It seems that the generator doesn't support unsigned integers, but maybe I'm missing something.
I wonder if this helps when working with dgraph ?
I tried to define my structs using embedding to keep the definitions DRY.
The code compiles and the server runs, but gqlgen itself reports errors.
Can I use gqlgen without any errors?
unable to bind Droid.id to anything, github.com/vektah/gqlgen/example/starwars.Droid has no suitable fields or methods
unable to bind Droid.name to anything, github.com/vektah/gqlgen/example/starwars.Droid has no suitable fields or methods
unable to bind Droid.appearsIn to anything, github.com/vektah/gqlgen/example/starwars.Droid has no suitable fields or methods
unable to bind Human.id to anything, github.com/vektah/gqlgen/example/starwars.Human has no suitable fields or methods
unable to bind Human.name to anything, github.com/vektah/gqlgen/example/starwars.Human has no suitable fields or methods
unable to bind Human.appearsIn to anything, github.com/vektah/gqlgen/example/starwars.Human has no suitable fields or methods
Here is an example to repro.
OK: go run ./server/server.go
NG: gqlgen -out generated.go -package starwars -typemap types.json
type CharacterFields struct {
    ID        string
    Name      string
    FriendIds []string
    AppearsIn []string
}

type Human struct {
    CharacterFields
    StarshipIds  []string
    heightMeters float64
    Mass         float64
}

type Droid struct {
    CharacterFields
    PrimaryFunction string
}
I am hitting an issue where I cannot get the gqlgen-generated code to build when I map a schema interface type to a Go interface.
Below is an example of what I'm talking about. Full code and detailed instructions to reproduce are at https://github.com/ereyes01/gqlgen-interface-bug
Suppose I have this schema:
schema {
    query: Query
}

type Query {
    shapes: [Shape]
}

interface Shape {
    area: Float
}

type Circle implements Shape {
    radius: Float
    area: Float
}

type Rectangle implements Shape {
    length: Float
    width: Float
    area: Float
}
And suppose I have the following corresponding Go types:
package shapes

import "math"

type Shape interface {
    Area() float64
}

type Circle struct {
    Radius float64
}

func (c *Circle) Area() float64 {
    return math.Pi * c.Radius * c.Radius
}

type Rectangle struct {
    Length, Width float64
}

func (r *Rectangle) Area() float64 {
    return r.Length * r.Width
}
So let's map our types as follows:
{
"Shape": "github.com/ereyes01/gqlgen-interface-bug/shapes.Shape",
"Circle": "github.com/ereyes01/gqlgen-interface-bug/shapes.Circle",
"Rectangle": "github.com/ereyes01/gqlgen-interface-bug/shapes.Rectangle"
}
NOTE: ^^ This uses the path to my repository given above, as if I'd obtained it via go get.
So from this we can now create our generated.go as follows:
$ gqlgen -out generated.go -package shapes -typemap types.json schema.graphql
... and that works fine. We now implement the needed resolver:
package shapes

import "context"

type ShapeResolver struct{}

func (r *ShapeResolver) Query_shapes(ctx context.Context) ([]Shape, error) {
    return []Shape{
        &Circle{Radius: 10.0},
        &Rectangle{Length: 1.0, Width: 10.0},
        &Rectangle{Length: 10.0, Width: 10.0},
    }, nil
}
... And I confirmed that the code generated for Shape.area() looks correct: it just calls the Area() Go method on the Go type.
So now let's say that inside shapes/server we place a simple Playground server:
package main

import (
    "log"
    "net/http"

    "github.com/ereyes01/gqlgen-interface-bug/shapes"
    "github.com/vektah/gqlgen/handler"
)

func main() {
    http.Handle("/", handler.Playground("Shapes", "/query"))
    http.Handle("/query", handler.GraphQL(shapes.MakeExecutableSchema(new(shapes.ShapeResolver))))
    log.Fatal(http.ListenAndServe(":9090", nil))
}
If I try to build that via go install, I get a compile error in the generated code:
$ go install github.com/ereyes01/gqlgen-interface-bug/shapes/server
# github.com/ereyes01/gqlgen-interface-bug/shapes
shapes/generated.go:711:2: impossible type switch case: *obj (type Shape) cannot have dynamic type Circle (Area method has pointer receiver)
shapes/generated.go:716:2: impossible type switch case: *obj (type Shape) cannot have dynamic type Rectangle (Area method has pointer receiver)
If we go have a look at that part of the generated code, we have:
func (ec *executionContext) _Shape(sel []query.Selection, obj *Shape) graphql.Marshaler {
    switch obj := (*obj).(type) {
    case nil:
        return graphql.Null
    case Circle:
        return ec._Circle(sel, &obj)
    case *Circle:
        return ec._Circle(sel, obj)
    case Rectangle:
        return ec._Rectangle(sel, &obj)
    case *Rectangle:
        return ec._Rectangle(sel, obj)
    default:
        panic(fmt.Errorf("unexpected type %T", obj))
    }
}
Of course, the case Circle and case Rectangle arms are not valid when type-switching on the Shape type, since those value types do not implement the Shape interface (only pointers to them do).
I could work around this by commenting out the case Circle and case Rectangle sections, after which everything works fine.
I investigated why this code was generated, and I found that in the template for the interface type, it just blindly writes out a case for both the Go type and its pointer:
{{- $interface := . }}
func (ec *executionContext) _{{$interface.GQLType}}(sel []query.Selection, obj *{{$interface.FullName}}) graphql.Marshaler {
switch obj := (*obj).(type) {
case nil:
return graphql.Null
{{- range $implementor := $interface.Implementors }}
case {{$implementor.FullName}}:
return ec._{{$implementor.GQLType}}(sel, &obj)
case *{{$implementor.FullName}}:
return ec._{{$implementor.GQLType}}(sel, obj)
{{- end }}
default:
panic(fmt.Errorf("unexpected type %T", obj))
}
}
This generated code will only ever work if the interface type is mapped to interface{}, which doesn't sound right if you're going to allow the interface type to be matched to the user's own type.
I'm not immediately sure what the best way to fix this might be. One way might be to look into whether the Go type system can tell you which are all the implementing types of an interface (I haven't really investigated that rabbit hole).
Another approach might be to let us declare the implementing types in the types.json file. Of course, that would further complicate the format of that JSON file, but the format otherwise seems robust at first glance.
Thanks for taking a look.
If I have this GraphQL definition:

type User {
    id: Int!          # user id
    name: String      # user name
    login_count: Int! # user login count
}

it generates this Go struct:

type User struct {
    ID          int
    Name        *string
    Login_count int
}

I would like it to generate:

type User struct {
    ID          int     `gql:"id,desc:'user id'"`
    Name        *string `gql:"name,desc:'user name'"`
    Login_count int     `gql:"login_count,desc:'user login count'"`
}

attaching more info in the field tag, so that the desc tag of each field can be returned when __schema is requested.
Hello, first thank you for a great project! Great job!
I have a question: say I am using Postgres as the data source for my resolvers, and I want to take advantage of the fact that GraphQL lets clients select a subset of fields. I want to query only the fields I need, to optimize the SQL.
Now how would I go about that when implementing resolvers? For example:
query findTodos {
    todos {
        text
        done
    }
}

func (r *TodoResolver) Query_todos(ctx context.Context) ([]Todo, error) {
    // how do I access which of the fields were selected in the query here?
    rows, err := r.db.Query("SELECT text, done FROM todos")
    // ...
}
Thank you for any pointers :) Keep up the great job!
Hey, thanks for getting custom types in so quickly!
I was wondering if you could extend your built-in Time scalar to support more formats similar to the one in neelance's library.
Not sure if there was a particular reason you didn't want to do this. If you're OK with it, I can open an MR.
From neelance/graphql-go
func (t *Time) UnmarshalGraphQL(input interface{}) error {
    switch input := input.(type) {
    case time.Time:
        t.Time = input
        return nil
    case string:
        var err error
        t.Time, err = time.Parse(time.RFC3339, input)
        return err
    case int:
        t.Time = time.Unix(int64(input), 0)
        return nil
    case float64:
        t.Time = time.Unix(int64(input), 0)
        return nil
    default:
        return fmt.Errorf("wrong type")
    }
}
gqlgen generates the following model from your Getting Started example schema:

// This file was generated by github.com/vektah/gqlgen, DO NOT EDIT

package gql

type Todo struct {
    ID     string
    Text   string
    Done   bool
    UserID int
}

type User struct {
    ID   string
    Name string
}
UserID is an int, while the ID field in User is a string. This seems to be a regression, since the Getting Started example code needs to be modified to compile. Even if not, I question whether it is a good idea to represent an automatically generated ID as an integer.
How do I add an authorization layer to a generated GraphQL server?
I think we need a hook called before each query/mutation, something like:
type Event interface {
    BeforeQuery(ctx context.Context)
    BeforeEachQuery(ctx context.Context, opName string, opParams map[string]interface{})
    BeforeEachMutation(ctx context.Context, opName string, opParams map[string]interface{})
    AfterQuery(ctx context.Context)
    AfterEachQuery(ctx context.Context, opName string, opParams map[string]interface{}, output interface{})
    AfterEachMutation(ctx context.Context, opName string, opParams map[string]interface{}, output interface{})
}
schema.graphql
schema {
    query: Query
    mutation: Mutation
}

type Query {
    project(id: ID!): Project
}

type Mutation {
    updateProject(name: String!, metadata: [String]): Project
}

type Project {
    id: ID
    name: String
    metadata: [String]
}
When I invoke the command gqlgen -out generated.go with an empty types.json ({}), I get this result:
var arg1 []string
if tmp, ok := field.Args["metadata"]; ok {
    tmp2, err := coerceString(tmp)
    if err != nil {
        ec.Error(err)
        continue
    }
    arg1 = tmp2 // this is wrong: coerceString returns a string, not []string
}
The full code is here
// This file was generated by github.com/vektah/gqlgen, DO NOT EDIT
package graphql1
import (
context "context"
fmt "fmt"
io "io"
reflect "reflect"
strconv "strconv"
strings "strings"
sync "sync"
time "time"
mapstructure "github.com/mitchellh/mapstructure"
jsonw "github.com/vektah/gqlgen/jsonw"
errors "github.com/vektah/gqlgen/neelance/errors"
introspection "github.com/vektah/gqlgen/neelance/introspection"
query "github.com/vektah/gqlgen/neelance/query"
schema "github.com/vektah/gqlgen/neelance/schema"
validation "github.com/vektah/gqlgen/neelance/validation"
)
type Resolvers interface {
Mutation_updateProject(ctx context.Context, name string, metadata []string) (*interface{}, error)
Project_id(ctx context.Context, it *interface{}) (*string, error)
Project_name(ctx context.Context, it *interface{}) (*string, error)
Project_metadata(ctx context.Context, it *interface{}) ([]string, error)
Query_project(ctx context.Context, id string) (*interface{}, error)
}
func NewExecutor(resolvers Resolvers) func(context.Context, string, string, map[string]interface{}, io.Writer) []*errors.QueryError {
return func(ctx context.Context, document string, operationName string, variables map[string]interface{}, w io.Writer) []*errors.QueryError {
doc, qErr := query.Parse(document)
if qErr != nil {
return []*errors.QueryError{qErr}
}
errs := validation.Validate(parsedSchema, doc)
if len(errs) != 0 {
return errs
}
op, err := doc.GetOperation(operationName)
if err != nil {
return []*errors.QueryError{errors.Errorf("%s", err)}
}
c := executionContext{
resolvers: resolvers,
variables: variables,
doc: doc,
ctx: ctx,
}
var data jsonw.Writer
if op.Type == query.Query {
data = c._query(op.Selections, nil)
} else if op.Type == query.Mutation {
data = c._mutation(op.Selections, nil)
} else {
return []*errors.QueryError{errors.Errorf("unsupported operation type")}
}
c.wg.Wait()
result := &jsonw.OrderedMap{}
result.Add("data", data)
if len(c.Errors) > 0 {
result.Add("errors", errors.ErrorWriter(c.Errors))
}
result.WriteJson(w)
return nil
}
}
type executionContext struct {
errors.Builder
resolvers Resolvers
variables map[string]interface{}
doc *query.Document
ctx context.Context
wg sync.WaitGroup
}
var mutationImplementors = []string{"Mutation"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) _mutation(sel []query.Selection, it *interface{}) jsonw.Writer {
fields := ec.collectFields(sel, mutationImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "updateProject":
var arg0 string
if tmp, ok := field.Args["name"]; ok {
tmp2, err := coerceString(tmp)
if err != nil {
ec.Error(err)
continue
}
arg0 = tmp2
}
var arg1 []string
if tmp, ok := field.Args["metadata"]; ok {
tmp2, err := coerceString(tmp)
if err != nil {
ec.Error(err)
continue
}
arg1 = tmp2
}
res, err := ec.resolvers.Mutation_updateProject(ec.ctx, arg0, arg1)
if err != nil {
ec.Error(err)
continue
}
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec._project(field.Selections, res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var projectImplementors = []string{"Project"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) _project(sel []query.Selection, it *interface{}) jsonw.Writer {
fields := ec.collectFields(sel, projectImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "id":
ec.wg.Add(1)
go func(i int, field collectedField) {
defer ec.wg.Done()
res, err := ec.resolvers.Project_id(ec.ctx, it)
if err != nil {
ec.Error(err)
return
}
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
}(i, field)
case "name":
ec.wg.Add(1)
go func(i int, field collectedField) {
defer ec.wg.Done()
res, err := ec.resolvers.Project_name(ec.ctx, it)
if err != nil {
ec.Error(err)
return
}
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
}(i, field)
case "metadata":
ec.wg.Add(1)
go func(i int, field collectedField) {
defer ec.wg.Done()
res, err := ec.resolvers.Project_metadata(ec.ctx, it)
if err != nil {
ec.Error(err)
return
}
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
tmp1 = jsonw.String(res[idx1])
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
}(i, field)
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var queryImplementors = []string{"Query"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) _query(sel []query.Selection, it *interface{}) jsonw.Writer {
fields := ec.collectFields(sel, queryImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "project":
var arg0 string
if tmp, ok := field.Args["id"]; ok {
tmp2, err := coerceString(tmp)
if err != nil {
ec.Error(err)
continue
}
arg0 = tmp2
}
ec.wg.Add(1)
go func(i int, field collectedField) {
defer ec.wg.Done()
res, err := ec.resolvers.Query_project(ec.ctx, arg0)
if err != nil {
ec.Error(err)
return
}
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec._project(field.Selections, res)
}
}(i, field)
case "__schema":
res := ec.introspectSchema()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Schema(field.Selections, res)
}
case "__type":
var arg0 string
if tmp, ok := field.Args["name"]; ok {
tmp2, err := coerceString(tmp)
if err != nil {
ec.Error(err)
continue
}
arg0 = tmp2
}
res := ec.introspectType(arg0)
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __DirectiveImplementors = []string{"__Directive"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___Directive(sel []query.Selection, it *introspection.Directive) jsonw.Writer {
fields := ec.collectFields(sel, __DirectiveImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "name":
res := it.Name()
out.Values[i] = jsonw.String(res)
case "description":
res := it.Description()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "locations":
res := it.Locations()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
tmp1 = jsonw.String(res[idx1])
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "args":
res := it.Args()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___InputValue(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __EnumValueImplementors = []string{"__EnumValue"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___EnumValue(sel []query.Selection, it *introspection.EnumValue) jsonw.Writer {
fields := ec.collectFields(sel, __EnumValueImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "name":
res := it.Name()
out.Values[i] = jsonw.String(res)
case "description":
res := it.Description()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "isDeprecated":
res := it.IsDeprecated()
out.Values[i] = jsonw.Bool(res)
case "deprecationReason":
res := it.DeprecationReason()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __FieldImplementors = []string{"__Field"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___Field(sel []query.Selection, it *introspection.Field) jsonw.Writer {
fields := ec.collectFields(sel, __FieldImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "name":
res := it.Name()
out.Values[i] = jsonw.String(res)
case "description":
res := it.Description()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "args":
res := it.Args()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___InputValue(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "type":
res := it.Type()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
case "isDeprecated":
res := it.IsDeprecated()
out.Values[i] = jsonw.Bool(res)
case "deprecationReason":
res := it.DeprecationReason()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __InputValueImplementors = []string{"__InputValue"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___InputValue(sel []query.Selection, it *introspection.InputValue) jsonw.Writer {
fields := ec.collectFields(sel, __InputValueImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "name":
res := it.Name()
out.Values[i] = jsonw.String(res)
case "description":
res := it.Description()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "type":
res := it.Type()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
case "defaultValue":
res := it.DefaultValue()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __SchemaImplementors = []string{"__Schema"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___Schema(sel []query.Selection, it *introspection.Schema) jsonw.Writer {
fields := ec.collectFields(sel, __SchemaImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "types":
res := it.Types()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___Type(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "queryType":
res := it.QueryType()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
case "mutationType":
res := it.MutationType()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
case "subscriptionType":
res := it.SubscriptionType()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
case "directives":
res := it.Directives()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___Directive(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var __TypeImplementors = []string{"__Type"}
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) ___Type(sel []query.Selection, it *introspection.Type) jsonw.Writer {
fields := ec.collectFields(sel, __TypeImplementors, map[string]bool{})
out := jsonw.NewOrderedMap(len(fields))
for i, field := range fields {
out.Keys[i] = field.Alias
out.Values[i] = jsonw.Null
switch field.Name {
case "kind":
res := it.Kind()
out.Values[i] = jsonw.String(res)
case "name":
res := it.Name()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "description":
res := it.Description()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = jsonw.String(*res)
}
case "fields":
var arg0 bool
if tmp, ok := field.Args["includeDeprecated"]; ok {
tmp2, err := coerceBool(tmp)
if err != nil {
ec.Error(err)
continue
}
arg0 = tmp2
}
res := it.Fields(arg0)
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___Field(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "interfaces":
res := it.Interfaces()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___Type(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "possibleTypes":
res := it.PossibleTypes()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___Type(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "enumValues":
var arg0 bool
if tmp, ok := field.Args["includeDeprecated"]; ok {
tmp2, err := coerceBool(tmp)
if err != nil {
ec.Error(err)
continue
}
arg0 = tmp2
}
res := it.EnumValues(arg0)
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___EnumValue(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "inputFields":
res := it.InputFields()
arr1 := jsonw.Array{}
for idx1 := range res {
var tmp1 jsonw.Writer
if res[idx1] == nil {
tmp1 = jsonw.Null
} else {
tmp1 = ec.___InputValue(field.Selections, res[idx1])
}
arr1 = append(arr1, tmp1)
}
out.Values[i] = arr1
case "ofType":
res := it.OfType()
if res == nil {
out.Values[i] = jsonw.Null
} else {
out.Values[i] = ec.___Type(field.Selections, res)
}
default:
panic("unknown field " + strconv.Quote(field.Name))
}
}
return out
}
var parsedSchema = schema.MustParse("schema {\n\tquery: Query\n\tmutation: Mutation\n}\n\n\ntype Query {\n project(id: ID!): Project\n}\n\ntype Mutation {\n updateProject(name: String!, metadata: [String]): Project\n}\n\n\ntype Project {\n id: ID\n name: String\n metadata: [String]\n}\n\n")
func (ec *executionContext) introspectSchema() *introspection.Schema {
return introspection.WrapSchema(parsedSchema)
}
func (ec *executionContext) introspectType(name string) *introspection.Type {
t := parsedSchema.Resolve(name)
if t == nil {
return nil
}
return introspection.WrapType(t)
}
func instanceOf(val string, satisfies []string) bool {
for _, s := range satisfies {
if val == s {
return true
}
}
return false
}
func (ec *executionContext) collectFields(selSet []query.Selection, satisfies []string, visited map[string]bool) []collectedField {
var groupedFields []collectedField
for _, sel := range selSet {
switch sel := sel.(type) {
case *query.Field:
f := getOrCreateField(&groupedFields, sel.Name.Name, func() collectedField {
f := collectedField{
Alias: sel.Alias.Name,
Name: sel.Name.Name,
}
if len(sel.Arguments) > 0 {
f.Args = map[string]interface{}{}
for _, arg := range sel.Arguments {
f.Args[arg.Name.Name] = arg.Value.Value(ec.variables)
}
}
return f
})
f.Selections = append(f.Selections, sel.Selections...)
case *query.InlineFragment:
if !instanceOf(sel.On.Ident.Name, satisfies) {
continue
}
for _, childField := range ec.collectFields(sel.Selections, satisfies, visited) {
f := getOrCreateField(&groupedFields, childField.Name, func() collectedField { return childField })
f.Selections = append(f.Selections, childField.Selections...)
}
case *query.FragmentSpread:
fragmentName := sel.Name.Name
if _, seen := visited[fragmentName]; seen {
continue
}
visited[fragmentName] = true
fragment := ec.doc.Fragments.Get(fragmentName)
if fragment == nil {
ec.Errorf("missing fragment %s", fragmentName)
continue
}
if !instanceOf(fragment.On.Ident.Name, satisfies) {
continue
}
for _, childField := range ec.collectFields(fragment.Selections, satisfies, visited) {
f := getOrCreateField(&groupedFields, childField.Name, func() collectedField { return childField })
f.Selections = append(f.Selections, childField.Selections...)
}
default:
panic(fmt.Errorf("unsupported %T", sel))
}
}
return groupedFields
}
type collectedField struct {
Alias string
Name string
Args map[string]interface{}
Selections []query.Selection
}
func decodeHook(sourceType reflect.Type, destType reflect.Type, value interface{}) (interface{}, error) {
if destType.PkgPath() == "time" && destType.Name() == "Time" {
if dateStr, ok := value.(string); ok {
return time.Parse(time.RFC3339, dateStr)
}
return nil, errors.Errorf("time should be an RFC3339 formatted string")
}
return value, nil
}
// nolint: deadcode, megacheck
func unpackComplexArg(result interface{}, data interface{}) error {
    decoder, err := mapstructure.NewDecoder(&mapstructure.DecoderConfig{
        TagName:     "graphql",
        ErrorUnused: true,
        Result:      result,
        DecodeHook:  decodeHook,
    })
    if err != nil {
        panic(err)
    }
    return decoder.Decode(data)
}
func getOrCreateField(c *[]collectedField, name string, creator func() collectedField) *collectedField {
    for i, cf := range *c {
        if cf.Alias == name {
            return &(*c)[i]
        }
    }
    f := creator()
    *c = append(*c, f)
    return &(*c)[len(*c)-1]
}
// nolint: deadcode, megacheck
func coerceString(v interface{}) (string, error) {
    switch v := v.(type) {
    case string:
        return v, nil
    case int:
        return strconv.Itoa(v), nil
    case float64:
        return fmt.Sprintf("%f", v), nil
    case bool:
        if v {
            return "true", nil
        }
        return "false", nil
    case nil:
        return "null", nil
    default:
        return "", fmt.Errorf("%T is not a string", v)
    }
}
// nolint: deadcode, megacheck
func coerceBool(v interface{}) (bool, error) {
    switch v := v.(type) {
    case string:
        return strings.ToLower(v) == "true", nil
    case int:
        return v != 0, nil
    case bool:
        return v, nil
    default:
        return false, fmt.Errorf("%T is not a bool", v)
    }
}
// nolint: deadcode, megacheck
func coerceInt(v interface{}) (int, error) {
    switch v := v.(type) {
    case string:
        return strconv.Atoi(v)
    case int:
        return v, nil
    case float64:
        return int(v), nil
    default:
        return 0, fmt.Errorf("%T is not an int", v)
    }
}
// nolint: deadcode, megacheck
func coercefloat64(v interface{}) (float64, error) {
    switch v := v.(type) {
    case string:
        return strconv.ParseFloat(v, 64)
    case int:
        return float64(v), nil
    case float64:
        return v, nil
    default:
        return 0, fmt.Errorf("%T is not a float", v)
    }
}
Validation isn't firing when posting an enum value in a variable:
query = mutation setState($state: State!) {
setState(state: $state) { id }
}
variables = {"state": "INVALID_STATE"}
results in
"errors": []
should be something like
errors: [
{
"message": "Argument \"state\" has invalid value {state: INVALID_STATE}.\nIn field \"state\": Expected type \"State\", found INVALID_STATE.",
"locations": [
{
"line": 14,
"column": 22
}
]
}
]
Not really a bug, but more of an annoyance when working with the database/sql Scanner/Valuer interfaces.
I am working with a custom type that is defined in graphql as
type Thing {
listOfEnums: [FooEnum]!
}
enum FooEnum {
bin, bar, baz
}
the type generated in go is
type FooEnum string

type Thing struct {
    ListOfEnums []FooEnum
}
The trouble I am running into is this: the value for ListOfEnums is stored in the database as the string bin,bar,baz.
For the Valuer interface I can't use []FooEnum as the type, but need to define a list type like:
type ListFooEnum []FooEnum
func (f ListFooEnum) Value() (driver.Value, error) {/* logic to convert the list to a string... */}
func (f *ListFooEnum) Scan(value interface{}) (error) {/* logic to convert the string into struct... */}
Currently I am working around this by changing the type in the model file to []FooEnum when running gqlgen and then changing it back to ListFooEnum afterwards.
I understand I could probably use an intermediary step of scanning into ListFooEnum and casting to []FooEnum. I am wondering whether there could be an annotation in the future to set the generated model value to the list version instead of the slice version.
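The Value/Scan pair sketched above might be fleshed out like this, assuming the column really does store a plain comma-separated string (the bodies are illustrative, not the actual workaround code):

```go
package main

import (
	"database/sql/driver"
	"fmt"
	"strings"
)

type FooEnum string

// ListFooEnum wraps []FooEnum so it can satisfy driver.Valuer and sql.Scanner.
type ListFooEnum []FooEnum

// Value joins the enum values into the comma-separated string the column stores.
func (f ListFooEnum) Value() (driver.Value, error) {
	parts := make([]string, len(f))
	for i, v := range f {
		parts[i] = string(v)
	}
	return strings.Join(parts, ","), nil
}

// Scan splits the stored comma-separated string back into the slice.
func (f *ListFooEnum) Scan(value interface{}) error {
	s, ok := value.(string)
	if !ok {
		return fmt.Errorf("expected string, got %T", value)
	}
	*f = (*f)[:0]
	for _, part := range strings.Split(s, ",") {
		*f = append(*f, FooEnum(part))
	}
	return nil
}

func main() {
	var l ListFooEnum
	_ = l.Scan("bin,bar,baz")
	v, _ := l.Value()
	fmt.Println(l, v) // [bin bar baz] bin,bar,baz
}
```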
:)
When attempting to generate code from a schema, a panic is thrown in import_build.go when resolving imports. The import path for all of my types matches the name of one of the core imports the generated code relies on, i.e. my package /path/to/package/introspection conflicts with the internal package github.com/vektah/gqlgen/neelance/introspection (they have the same base name). I think this is down to line 40 in import_build.go:
for imp := imports.findByName(localName); imp != nil && imp.Package != t.Package; localName = filepath.Base(t.Package) + strconv.Itoa(i) { ... }
As for the example above: on the first iteration it will find a match with the package github.com/vektah/gqlgen/neelance/introspection, but that match will fail the imp != nil && imp.Package != t.Package condition. On any subsequent iteration, however, the new name (i.e. introspection1, introspection2, ...) will never match a package, thus causing the panic("too many collisions") error.
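A collision-avoiding alias loop that always terminates might look like this (the helper name and map shape are hypothetical, not the actual import_build.go code):

```go
package main

import (
	"fmt"
	"strconv"
)

// uniqueAlias is a hypothetical helper: it appends an increasing numeric
// suffix to base until the alias is not already taken, so the loop always
// terminates instead of panicking with "too many collisions".
func uniqueAlias(taken map[string]bool, base string) string {
	alias := base
	for i := 1; taken[alias]; i++ {
		alias = base + strconv.Itoa(i)
	}
	return alias
}

func main() {
	taken := map[string]bool{"introspection": true}
	fmt.Println(uniqueAlias(taken, "introspection")) // introspection1
}
```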
hashing incoming queries and caching the document would improve parse times for large queries.
It would also serve as a great point to whitelist allowed queries.
Open question: how should the whitelist be specified?
Perhaps Persisted Queries?
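The hash-and-cache idea could be sketched like this; the parse function and the string "document" stand in for the real parser and AST types, and the same hash lookup is a natural hook for a whitelist check:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"sync"
)

// queryCache memoizes parsed documents keyed by the SHA-256 of the raw query.
type queryCache struct {
	mu    sync.RWMutex
	docs  map[string]string // hash -> parsed document (placeholder type)
	parse func(string) string
}

func hashQuery(query string) string {
	sum := sha256.Sum256([]byte(query))
	return hex.EncodeToString(sum[:])
}

// get returns the cached document, parsing and storing it on a miss.
func (c *queryCache) get(query string) string {
	key := hashQuery(query)
	c.mu.RLock()
	doc, ok := c.docs[key]
	c.mu.RUnlock()
	if ok {
		return doc
	}
	doc = c.parse(query)
	c.mu.Lock()
	c.docs[key] = doc
	c.mu.Unlock()
	return doc
}

func main() {
	calls := 0
	c := &queryCache{docs: map[string]string{}, parse: func(q string) string {
		calls++
		return "parsed:" + q
	}}
	c.get("{ me { id } }")
	c.get("{ me { id } }")
	fmt.Println(calls) // 1
}
```

For persisted/whitelisted queries, the server could refuse any hash not already present in `docs` instead of parsing on a miss.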
Hello,
Thanks again for this library. It seems that default arguments in the graphql schema aren't implemented?
blockchain(limit: Int = 10): BlockchainConnection!
This generates code that has limit as *int, but I would expect it to be an int set to 10 or to the passed-in value.
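Until schema defaults are applied by the generated code, a small shim can apply them in the resolver; a sketch, where 10 mirrors the default declared above:

```go
package main

import "fmt"

// limitOrDefault applies the schema default for an optional Int argument
// that gqlgen currently surfaces as *int.
func limitOrDefault(limit *int) int {
	if limit == nil {
		return 10 // default from the schema: limit: Int = 10
	}
	return *limit
}

func main() {
	fmt.Println(limitOrDefault(nil)) // 10
	n := 25
	fmt.Println(limitOrDefault(&n)) // 25
}
```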
Automatic generation of models from the schema would make getting started much easier. Mapping onto user types is only convenient when you already have them.
Hi @vektah, as requested here is a sample schema for the issue I saw. It is an input type that recursively references itself.
input CatalogFilter {
OR: [CatalogFilter!]
AND: [CatalogFilter!]
id: IntFilter
}
I pasted the generated code in the other issue; I think the problem is somewhere in the unmarshal func on line 84 in type.go.
Would it be possible to generate code using string as the underlying type of GQL ID? Currently it is generated as int, which is more cumbersome for my databases.
Hello @vektah,
I've noticed that everything sent via websocket is printed to the console: https://github.com/vektah/gqlgen/blob/master/handler/websocket.go#L194.
Could you please disable it, or enable it only in info mode?
Thank you!
type Foo {
when: Time
}
Expected: when marshals as null if the time is zeroed.
Actual: when marshals as 0000/00/00 00:00:00:00.
Follow the conventions in https://github.com/graphcool/graphql-import to stitch together multiple schemas.
Prereq for Schema Stitching
Hi @vektah, great job on the code generator.
I wonder if you have plans to migrate your code to the new repo graph-gophers/graphql-go?
neelance/graphql-go (@neelance) has been transferred to graph-gophers/graphql-go, and @tonyghita is actively working on and maintaining it.
Is there a good way to handle objects which are plain old Go objects that do not require a Resolver func?
For example, my schema has some objects like this:
type Widget {
foo: String!
bar: String!
children: [Tuple!]!
}
type Tuple {
frob: String!
noz: Int
}
On my backend, the real computation work is for producing a Widget... and it comes complete with all the Tuples. I want the Tuple fields to be all listed out in my graphql schema and queries so that I'm taking advantage of all the type checking... but there's nothing interesting to put in for "resolver" logic -- the Widget resolver already did all the work.
So, ideally, I'd like gqlgen's model generation to make a Widget type in Go that contains the whole thing at once:
type Widget struct {
    Foo      string
    Bar      string
    Children []Tuple
}
Correspondingly, my Resolver interface need not have a Widget_children() method; gqlgen should be able to generate an executionContext._Widget_children method that does all the remaining recursion itself.
I'm not sure if this relates to custom scalars. My intuition is "no" because I don't really consider these things scalars -- it is still desirable for the GQL query to specify fields inside these objects. This specification of types for trees of fields is most of the reason we're using graphQL schemas, after all! The example I gave here with just one nested struct and a few primitives is a simple one; I have this effect for objects around 4~5 layers deep.
Supporting this kind of superscalar/resolverless object would cover, honestly, a lot of the objects in the graphql schemas I'm working with. It would drastically reduce the number of methods on my Resolver interface (and thus drastically reduce the desire for resolver generators like the one described in #9, for example).
Is there any feature like this in gqlgen I haven't seen yet? Do you think it sounds possible to add codegen that has this kind of behavior? (Or am I just using graphql "wrong"?)
schema {
query: Query
mutation: Mutation
}
interface Node {
id: ID!
}
type Query {
node(id: ID!): Node
nodes(ids: [ID!]!): [Node]!
}
type Mutation {
createCity(name: String!): City
}
type City implements Node {
id: ID!
name: String!
}
Generates:
Query_node(ctx context.Context, id string) (, error)
Query_nodes(ctx context.Context, ids []string) ([], error)
No return type is emitted for the Node interface.
I see that the ResolverError field on QueryError is not set. What is the plan for this? The Builder for errors currently only takes the error message and creates a QueryError from that.
I was a bit surprised to see that by default the generated execution context drops the original error and exposes the raw message error in the GraphQL response. What are your thoughts for this?
I can imagine one wants to be able to control how errors thrown from the resolvers are rendered. I'm using a custom written HTTP handler and would like to be able to log the original errors. Do you think it's better to do logging etc. of the errors inside the resolvers and return a custom 'public' error?
Thanks for the awesome lib! Please let us know if there are any areas where you could use some help!
If a string argument to a query is declared optional, it's a pointer type. But it looks like if it's not passed in, the generated code marshals it into a string with the value "null", so the pointer is never actually nil. I was doing nil checks to see if a string was sent, but this breaks, as the pointer is always set to "null".
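For reference, this is the nil check the report expects to work (a sketch of the intended contract, not the current behavior):

```go
package main

import "fmt"

// describe shows the intended contract: a missing optional String argument
// arrives as a nil pointer, not as a pointer to the literal string "null".
func describe(name *string) string {
	if name == nil {
		return "no name given"
	}
	return "name: " + *name
}

func main() {
	fmt.Println(describe(nil)) // no name given
}
```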
Expected: no type mismatch.
Actual: type mismatch on SomeInput.id, expected gopkg.in/mgo.v2/bson.ObjectId, got my/repo/path/vendor/gopkg.in/mgo.v2/bson.ObjectId (the vendored copy of the same package).
input SomeInput {
id: ID!
}
// generated type
type SomeInput struct {
    ID bson.ObjectId
}

type ID = bson.ObjectId

func MarshalID(id bson.ObjectId) graphql.Marshaler {
    return graphql.WriterFunc(func(w io.Writer) {
        w.Write([]byte(id.Hex()))
    })
}

func UnmarshalID(v interface{}) (bson.ObjectId, error) {
    str, ok := v.(string)
    if !ok {
        return bson.NewObjectId(), fmt.Errorf("ids must be strings")
    }
    if !bson.IsObjectIdHex(str) {
        return bson.NewObjectId(), fmt.Errorf("ids must be valid")
    }
    id := bson.ObjectIdHex(str)
    return id, nil
}
{
"ID": "gopkg.in/mgo.v2/bson.ObjectId"
}
type FieldA {
id : Int!
name: String!
}
type ComplexField {
Items:[FieldA]
}
will be parsed to
type ComplexField struct {
    ItemsID int
}

type FieldA struct {
    ID   int
    Name string
}
If FieldA relates to another model, that's fine, but it's not good when FieldA is just a field of ComplexField (MySQL 5.7 supports a JSON data type). I want a way to generate it as:
type ComplexField struct {
    Items []FieldA
}

type FieldA struct {
    ID   int
    Name string
}
Hello @vektah, firstly I have to say that this library is awesome!
I've run into an issue with a CORS-enabled server using gqlgen. All OPTIONS requests from the client fail (Bad request - {"data":null,"errors":[{"message":"json body could not be decoded: EOF"}]}), because currently only GET and POST requests are supported.
if r.Method == "GET" {
    ...
} else {
    // Assumes that this is POST request
    if err := json.NewDecoder(r.Body).Decode(&reqParams); err != nil {
        sendErrorf(w, http.StatusBadRequest, "json body could not be decoded: "+err.Error())
        return
    }
}
Can you please add OPTIONS request support? Thank you very much in advance!
So sometimes when I'm running the generation I get this error,
expected Address to be a named struct, instead found string
and the interface will generate all the fields; but sometimes the generation runs successfully with no errors and uses the type I put in types.json. I have been trying to figure out why this is happening, any clues?
Hey,
Thank you for your great work; this library has a lot of potential. Sorry, I am not sure where else to ask this question. I am curious to know why you decided to create a new library. Why not just hard-fork neelance's go library?
At my work, we are using neelance's go library and have even written our own code generator. We are also looking at its internal code and have hard-forked it to experiment with changes (i.e. making internal code exportable, using fields as resolvers instead of methods, access to requested fields, adding missing security features, etc.)
It would be nice if the generated code had a corresponding _test.go file with it, so that we could have coverage when including the generated code file in our codebase.
I'm getting a compiler error when gqlgen creates resolvers based on a field that has a wrapped builtin type. For example:
schema.graphql
schema {
query: Query
}
type Query {
foo: Foo
}
type Foo {
id: String!
}
model.go
package tmp
type Identifier string
type Foo struct {
    ID Identifier
}
types.json
{
"Foo": "tmp.Foo"
}
$ gqlgen -out gen.go
$ go build
# tmp
./gen.go:84:41: cannot use res (type Identifier) as type string in argument to graphql.MarshalString
Because the generated function uses MarshalString(res) instead of MarshalString(string(res)):
// nolint: gocyclo, errcheck, gas, goconst
func (ec *executionContext) _foo(sel []query.Selection, it *Foo) graphql.Marshaler {
    fields := graphql.CollectFields(ec.doc, sel, fooImplementors, ec.variables)
    out := graphql.NewOrderedMap(len(fields))
    for i, field := range fields {
        out.Keys[i] = field.Alias
        out.Values[i] = graphql.Null
        switch field.Name {
        case "__typename":
            out.Values[i] = graphql.MarshalString("Foo")
        case "id":
            badArgs := false
            if badArgs {
                continue
            }
            res := it.ID
            out.Values[i] = graphql.MarshalString(res) // <-- invalid call
        default:
            panic("unknown field " + strconv.Quote(field.Name))
        }
    }
    return out
}
Hi,
I am trying to figure out how to use custom scalars with this library. I added one to types.json and I'm still getting a "panic: unknown scalar". Looking at the code, it wasn't immediately apparent how to do this.
Thanks!
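For reference, the convention used elsewhere in this tracker is a Marshal&lt;Name&gt;/Unmarshal&lt;Name&gt; pair living next to the type mapped in types.json. A sketch of just the unmarshal side for a hypothetical UUID scalar (the length check is a stand-in for real validation):

```go
package main

import "fmt"

// UnmarshalUUID sketches the unmarshal hook for a hypothetical `scalar UUID`;
// a matching MarshalUUID returning a graphql.Marshaler would handle the
// response side.
func UnmarshalUUID(v interface{}) (string, error) {
	s, ok := v.(string)
	if !ok {
		return "", fmt.Errorf("uuids must be strings, got %T", v)
	}
	if len(s) != 36 {
		return "", fmt.Errorf("uuids must be 36 characters, got %d", len(s))
	}
	return s, nil
}

func main() {
	id, err := UnmarshalUUID("123e4567-e89b-12d3-a456-426614174000")
	fmt.Println(id, err)
}
```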
gqlgen -out generated.go -package graphql
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x121635f]
goroutine 1 [running]:
github.com/vektah/gqlgen/codegen.Bind(0xc420116af0, 0xc420140bd0, 0xc424bb6280, 0x3b, 0x1, 0xc4230de660, 0xb)
/Users/buchanae/src/github.com/vektah/gqlgen/codegen/build.go:87 +0x45f
main.main()
/Users/buchanae/src/github.com/vektah/gqlgen/main.go:75 +0x3d6
schema
scalar Time
type Workflow {
id: ID!
inputs: [WorkflowInput!]!
outputs: [WorkflowOutput!]!
steps: [Step!]!
runs: [WorkflowRun!]!
}
type WorkflowInput {
id: ID!
}
type WorkflowOutput {
id: ID!
}
type Step {
id: ID!
name: String!
inputs: [StepInput!]!
outputs: [StepOutput!]!
order: Int!
workflow: Workflow!
}
type StepInput {
id: ID!
}
type StepOutput {
id: ID!
}
enum RunState {
Complete
Running
Error
Idle
}
type WorkflowRun {
id: ID!
total: Int!
idle: Int!
running: Int!
error: Int!
complete: Int!
state: RunState!
done: Boolean!
startTime: Time
endTime: Time
workflow: Workflow!
stepRuns: [StepRun!]!
}
type StepRun {
id: ID!
state: RunState!
task: Task!
step: Step!
}
enum TaskState {
Unknown
Queued
Initializing
Running
ExecutorError
SystemError
Complete
}
type Task {
id: ID!
state: TaskState!
}
Generating a model for enums would be great, it probably only needs to be a set of string constants:
type State string

const (
    StateOk      State = "OK"
    StateFailed  State = "FAILED"
    StateUnknown State = "UNKNOWN"
)
maybe a few utility methods too:
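For instance, the utility methods could include a validity check and a Stringer; a sketch of what generation might emit (the State declarations are repeated here so the example is self-contained):

```go
package main

import "fmt"

type State string

const (
	StateOk      State = "OK"
	StateFailed  State = "FAILED"
	StateUnknown State = "UNKNOWN"
)

// IsValid reports whether s is one of the declared enum values,
// useful when validating client-supplied variables.
func (s State) IsValid() bool {
	switch s {
	case StateOk, StateFailed, StateUnknown:
		return true
	}
	return false
}

func (s State) String() string { return string(s) }

func main() {
	fmt.Println(StateOk.IsValid(), State("NOPE").IsValid()) // true false
}
```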
It would be convenient if there was a command to generate the resolver skeleton and server writing for you from schema.
Combined with Model Generation you could have a working server pretty quickly.
Can it also update the existing resolver signatures if the schema has changed?
It would be neat if you could take a schema and generate a strictly typed client from it:
type Schema {
getUser(id: Int) User
}
type User {
...
}
generate-client schema.graphql -package myclient
It would generate a client allowing easy querying of schemas, carrying selection set information forward in context:
func (r *Resolver) Profile_user(ctx context.Context, parent *Profile) (myclient.User, error) {
return r.myclient.GetUser(ctx, parent.UserID)
}
Under the hood the client would generate the query based on the selection set (does it need any type information to generate the query?) and unmarshal the result, looking at __typename where appropriate to create the correct concrete types.
What about https://github.com/shurcooL/githubql?
It relies heavily on reflection to generate the query from the struct, but the shape isn't known until runtime so there is nothing to reflect.
schema.graphql
type TestCase {
id: ID!
name: String
}
type TestCycle {
cases: [TestCase]
}
models_gen.go will look like this:
type TestCase struct {
    ID   string
    Name *string
}

type TestCycle struct {
    CasesID int
}
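Presumably the expected generation, in line with the other nested-type report above, would embed the slice directly; a sketch:

```go
package main

import "fmt"

type TestCase struct {
	ID   string
	Name *string
}

// TestCycle should reference the nested objects directly instead of a
// synthesized CasesID field.
type TestCycle struct {
	Cases []TestCase
}

func main() {
	c := TestCycle{Cases: []TestCase{{ID: "tc-1"}}}
	fmt.Println(len(c.Cases)) // 1
}
```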
I would expect the following mutation to be working:
{
"query": "mutation createApp($app: AppInput!) {\\n createApp(app: $app) {\\n}\\n}\\n",
"variables": {
"appID": "foobar"
}
}
This is the body of the JSON mutation sent to gqlgen.
Unfortunately the actual "model" in the resolver has that field empty.
If the mutation is sent as:
mutation createApp {
createApp(app: {appID: "foobar"}) {
}
}
then it works.
I debugged a bit and I noticed that in the first case the variables are in the context of the mutation:
func (r *Resolver) Mutation_createApp(ctx context.Context, app models.App) (*models.App, error) { // nolint: golint
    reqContext := graphql.GetRequestContext(ctx)
    fmt.Println(reqContext.Variables)
    runtime.Breakpoint()
    // .....
}
This prints all the variables I sent using the client, but the actual model struct is empty. With the second method there are no variables, and the model struct is filled with the values.
Am I doing something wrong?
I would expect the mutation to receive the struct filled with the variables sent on the wire. In the meantime I'm filling the model struct manually from the request context.
type Mutation {
createApp(app: AppInput!): App
}
type App {
appID: ID!
}
input AppInput {
appID: ID!
}
and the model:
type App struct {
    AppID string `sql:",notnull" json:"appID"`
}
Expected: default variables are generated and used when declared on an input.
Actual: default variables aren't generated and are ignored.
schema {
query: Query
}
type Query {
Test(input: DateFilter): Boolean
}
enum DATE_FILTER_OP {
EQ
NEQ
GT
GTE
LT
LTE
}
input DateFilter {
value: String!
timezone: String = "UTC"
op: DATE_FILTER_OP!
}
Snippet of the generated model:
type DateFilter struct {
    Value    string
    Timezone *string
    Op       DATE_FILTER_OP
}